Description
Coralogix is a modern, full-stack observability platform transforming how businesses process and understand their data. Our unique architecture powers in-stream analytics without reliance on expensive indexing or hot storage. We specialize in comprehensive monitoring of logs, metrics, traces, and security events with features such as APM, RUM, SIEM, Kubernetes monitoring, and more, all enhancing operational efficiency and reducing observability spend by up to 70%.
We’re looking for engineers who love Rust to join our team and develop modern, cloud-native production systems that process terabytes of data daily. In this role, you will design and develop our backend services using Rust along with other exciting technologies, all deployed in AWS on Kubernetes.
*This opportunity is available in a hybrid model from our TLV site or remotely within EU time zones.
Requirements
- At least 5 years of software development experience
- At least 2 years of experience developing and operating Rust-based systems in production (must)
- Experience working on data-intensive applications
- Experience with Kafka
- Excellent written and verbal communication skills
- Experience using ClickHouse in production (an advantage)
Cultural Fit
We’re seeking candidates who are hungry, humble, and smart. Coralogix fosters a culture of innovation and continuous learning, where team members are encouraged to challenge the status quo and contribute to our shared mission. If you thrive in dynamic environments and are eager to shape the future of observability solutions, we’d love to hear from you.
Coralogix is an equal opportunity employer and encourages applicants from all backgrounds to apply.
What We Do
We’re rebuilding the path to observability using a real-time streaming analytics pipeline that provides monitoring, visualization, and alerting capabilities without the burden of indexing.
By enabling users to define different data pipelines per use case, we provide deep insights for less than half the cost.
In short, we are streaming the future of data.