About the job
- Design and develop efficient ETL/ELT processes to ensure optimal data processing and analysis.
- Build and maintain data pipelines delivering real-time data, and assist teams in creating data-driven products such as personalization, recommendation, and tracking systems.
- Develop interfaces for external systems and ensure reliable integration of new data sources (e.g., REST APIs, Google Analytics).
- Automate testing and deployment processes for data models, ETL jobs, and machine learning applications.
- Manage infrastructure components for Big Data and streaming technologies (e.g., ClickHouse, Kinesis), ensuring high availability and efficiency.
- Ensure high data quality and develop strategies for continuous system optimization.
- Contribute to the development and integration of machine learning models to enhance personalization and recommendation systems.
- Actively participate in our DevOps team and take responsibility for on-call duties.
- Document technical processes to ensure transparency and traceability of all data operations.

