About the job
- Enhance and maintain existing data processes to ensure their reliability, performance, and scalability.
- Support and improve Python-based services while also developing new services as needed.
- Design, implement, and optimize sustainable data architecture to accommodate growth and efficiency.
- Drive improvements in system reliability and performance across all data workflows.
- Modernize data processes by integrating cutting-edge tools and technologies.
- Oversee data flow between local storage, AWS, and Snowflake, with an emphasis on performance and cost efficiency.
- Lead the migration of data pipelines and services to Snowflake to improve performance and maintainability.
- Enhance the data orchestration layer (Airflow) to ensure stable operations and scalability.
- Collaborate with cross-functional teams to ensure architectural coherence and the adoption of best practices.