Qualifications
Responsibilities:
Design and develop event-driven architectures using Apache Kafka and Kafka Streams.
Build and maintain stream processing applications and data pipelines.
Design and implement ELK pipelines, indexes, dashboards, and alerts.
Lead architecture discussions and contribute to the design of Kafka-based event streaming systems.
Configure Kafka connectors and manage Kafka topics.
Develop scalable event-based processing applications according to architecture and design specifications.
Support development and QA teams throughout the software development lifecycle.
Contribute to monitoring, observability, and reliability of streaming platforms.
Requirements:
4+ years of experience building Kafka-based applications, with strong familiarity with Kafka ecosystem components such as Kafka Streams (including Processor API), Kafka Connect, and Schema Registry.
4+ years of experience with Java and Spring, specifically implementing Kafka Streams applications.
Hands-on experience with the ELK stack (Elasticsearch, Logstash, Kibana) for monitoring and observability.
Proficiency in writing Logstash pipeline configurations (inputs, filters, outputs).
Solid understanding of SQL and Oracle PL/SQL, with a focus on query optimization.
Experience working with REST APIs.
Familiarity with Kafka CLI tools and REST APIs for managing Kafka resources.
Competence in Linux/Unix environments.
Experience with IntelliJ IDEA.
Experience contributing to DevOps processes for Kafka and ELK is highly valued.
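To illustrate the kind of Logstash pipeline configuration referenced above, a minimal sketch might look like the following (the broker address, topic name, Elasticsearch endpoint, and index pattern are all hypothetical placeholders):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # hypothetical broker address
    topics => ["app-logs"]                  # hypothetical Kafka topic
  }
}

filter {
  json {
    source => "message"                     # parse the JSON payload into structured fields
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]      # hypothetical Elasticsearch endpoint
    index => "app-logs-%{+YYYY.MM.dd}"      # daily index, a common ELK convention
  }
}
```

A pipeline like this reads events from a Kafka topic, parses JSON messages, and writes them into date-stamped Elasticsearch indices ready for Kibana dashboards and alerts.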
Nice to Have:
Experience managing and maintaining ELK servers.
About the job
Join our dynamic development team as an experienced Backend Engineer, where you will play a crucial role in building large-scale event-driven systems.
We are collaborating with a forward-thinking integrated shipping services company known for its innovative approach to global logistics. The company maintains a smart, efficient worldwide network, providing reliable shipping services that keep its customers' operations running smoothly around the globe.
Your primary responsibilities will include designing and building event-driven solutions using Kafka and related technologies, with a strong emphasis on stream processing applications, data pipelines, and monitoring solutions built on the ELK stack.