About the job
What you'll do:
- Data Pipeline Development: You will design, develop, and refine data pipelines that efficiently extract, transform, and load data from diverse client systems into our data lake and data marts.
- Data Quality Assurance: You will implement processes and tools to ensure our data is accurate, complete, and consistent.
- Collaborative Projects: You will collaborate closely with data scientists, architects, and client technical teams to deliver impactful data solutions.
- Innovative Contributions: You will help build and maintain the data infrastructure that supports predictive modeling and descriptive analytics.
- Continuous Learning: You will stay current with emerging technologies in the data engineering domain.
What you bring:
- Data Engineering Foundation: You possess a solid understanding of data warehousing principles, ETL processes, and data modeling techniques.
- Programming Skills: You are proficient in Java and Python, with hands-on experience in SQL database design.
- Cloud Computing Experience: You have experience working with at least one major cloud platform (AWS, GCP, or Azure).
- Team Player: You excel in a collaborative environment and enjoy engaging with varied stakeholders.
- Analytical Mindset: You are a creative problem solver, capable of tackling complex data challenges.
- Willingness to Learn: You are passionate about data and eager to advance your knowledge and skill set.
Nice to have:
- Familiarity with ETL tools/technologies (e.g., Informatica, Talend, Snowflake).
- Experience with data visualization tools.
Why join us:
- Make an Impact: Your contributions will directly influence our clients' success and help them achieve their business objectives.
- Cutting-Edge Technology: You will work with the latest tools and technologies in data engineering.
- Career Growth: We offer a supportive and collaborative environment for you to learn and grow your skills.