About the job
Shift Technology stands at the forefront of innovation in the insurance sector, harnessing the power of artificial intelligence. Our platform integrates generative, agentic, and predictive AI to revolutionize underwriting, claims management, and fraud detection, thereby enhancing operational efficiency and delivering exceptional customer experiences.
With a diverse team from over 50 countries, we are united by a commitment to reshape the insurance landscape through our SaaS solutions.
Our engineering team is pivotal to our success, tackling intricate challenges and collaborating across departments to deliver high-quality results.
We are redefining how data influences decision-making by developing robust, scalable software and data platforms. Whether your passion lies in distributed systems, API creation, or facilitating data-driven insights, you will discover rewarding opportunities here.
Our technology stack encompasses .NET Core (C#), Python, MS SQL Server, Spark, Azure Data Factory (ADF), Databricks, GoodData, Terraform, and more. We are advancing toward distributed data systems on cloud platforms, offering intriguing challenges in both batch and real-time processing.
Key Responsibilities:
- Design and maintain scalable data models and data-access frameworks that underpin centralized databases and downstream applications.
- Construct and enhance ETL pipelines to ingest, transform, and integrate diverse data sources, ensuring data integrity, quality, and accessibility through tools like Apache Spark, .NET libraries, Azure Data Factory, and DBT.
- Engage in Agile methodologies by contributing to sprint planning, daily stand-ups, and retrospectives to ensure continuous delivery aligned with team objectives.
- Collaborate with product teams to translate roadmap priorities into actionable technical designs and deliverables that satisfy business and customer demands.
- Work closely with Data Scientists and Engineers to create frameworks that simplify customer data integration and bolster advanced analytics.
- Participate in code reviews, deployment activities, and monitoring to ensure high-quality production releases and seamless operation of data pipelines and frameworks.

