Experience Level
Manager
Qualifications
Proven experience in data engineering and team leadership. Strong understanding of data architecture, ETL processes, and data modeling. Familiarity with big data technologies and cloud solutions. Excellent problem-solving skills and a passion for mentoring junior engineers. Ability to communicate complex technical concepts to non-technical stakeholders.
About the job
Join our innovative team as a Data Engineering Manager at Stratacareers, where you will lead a dynamic group of data engineers dedicated to building robust and scalable data infrastructure that empowers leading healthcare providers across the nation.
In this pivotal role, you won’t just be responsible for coding; you will also define strategic initiatives, mentor your team, and drive the transformation of data into actionable insights that propel our clients’ missions forward. You will play a key role in scaling our data platform, integrating cutting-edge technologies, and collaborating across departments to ensure our data engineering efforts are efficient and aligned with future goals.
If you thrive on mentoring engineers, tackling technical challenges, and enhancing processes while staying hands-on with data, we would love to hear from you!
About Strata Careers
Stratacareers is a forward-thinking company committed to transforming healthcare through innovative data solutions. We pride ourselves on a collaborative environment where creativity and teamwork are valued, and every team member plays a crucial role in our mission to enhance the quality of care through robust data infrastructure.
Join Valorem Reply, an award-winning digital transformation agency dedicated to driving solutions in data-centric enterprises, IT modernization, customer experience, product transformation, and digital workplace innovation through the power of Microsoft technologies. We specialize in hyper-scale and agile delivery of unique digital business services, strategic business models, and design-focused user interactions, ensuring our clients experience secure and rapid transformations in their operations.

We are currently seeking a passionate and skilled Manager for our Data & AI team. In this role, you will primarily focus on developing modern data platforms utilizing Microsoft cloud technologies such as Fabric and Databricks for our diverse clientele. As a technical leader, you will help chart the technical course of our practice while staying abreast of the latest trends in data-focused technology.

Strong communication skills are essential, as you will act as a trusted partner to our customers. A solid grasp of data governance and the ability to guide clients on frameworks are critical to success in this role. This position is ideal for individuals who thrive in collaborative settings, appreciate flexibility and autonomy in their work, and enjoy tackling complex challenges.
About Us: At Carefeed, we empower senior living and long-term care providers with a comprehensive platform that streamlines operations, enhancing the experience for both staff and residents. Our innovative solution replaces outdated paper processes, phone calls, and disparate systems with an integrated digital approach, allowing teams to focus on what truly matters—caring for residents and their families.

We seamlessly integrate with existing EHR and HR systems, reducing operational strain while keeping communities organized. Carefeed is designed for ease of use, functionality, and the realities of multi-community care, helping providers maintain efficiency and confidence in their operations. With thousands of communities in the US and Canada relying on us, Carefeed is dedicated to supporting organizations in delivering exceptional experiences for residents, families, and caregivers alike.

About the Opportunity: We are seeking a Senior Data Engineer to spearhead the development of our data infrastructure and enhance our developer tools. In this pivotal role, you will design and implement robust data systems, develop tools that improve developer productivity, and create the foundational technology for impactful data-driven products.

This position is perfect for an experienced data engineer ready to transition into infrastructure management. Your primary project will involve designing and constructing a central data lake for 2026, alongside the necessary pipelines (e.g., SQS, Kafka) to support it. As the inaugural data engineer at Carefeed, you will play a key role in guiding design and tooling decisions and executing those plans. As part of our Infrastructure team, you will also engage in standard infrastructure tasks, including observability and deployments.
Data Engineer
Employment Type: Full-Time, Mid-Level
Department: Business Intelligence

CGS Federal is in search of a dedicated and innovative Data Engineer to bolster our expanding Data Analytics and Business Intelligence platform. This platform is designed to provide federal clients with the essential tools and capabilities to transform data into actionable insights. The ideal candidate will possess strong analytical skills and a love for continuous learning, eager to navigate and develop expertise across a variety of technologies while addressing some of our clients' most challenging problems.

At CGS, we unite motivated, skilled, and creative individuals to tackle the government's most pressing challenges with state-of-the-art technology. We are looking for candidates who are enthusiastic about contributing to government innovation, value teamwork, and can proactively address the needs of others. We foster an environment where our employees feel supported and promote professional development through diverse learning opportunities.

Key Responsibilities:
- Develop and maintain data pipelines to effectively store, manage, and provide data to users.
- Actively participate in an Agile/Scrum team, adhering to best practices.
- Write efficient code to ensure reliable data extraction and processing.
- Facilitate continuous automation of data ingestion processes.
- Promote technical excellence by following lean-agile engineering principles, including API-first design, simple designs, continuous integration, version control, and automated testing.
- Collaborate with program management and engineers to understand and document complex, evolving requirements.
- Foster an environment that encourages customer service excellence and teamwork.
- Work collaboratively within a cross-functional team, including user experience researchers, product managers, and other specialists.

Qualifications:
- U.S. Citizenship is required.
- Ability to obtain a Public Trust Clearance.
- A minimum of 7 years of IT experience, focusing on the design and management of large, complex datasets and models.
- Proven experience developing data pipelines from various sources, including structured and unstructured data formats.
- Proficient in creating ETL processes and conducting testing and validation.
- Strong skills in data manipulation using tools such as Python, R, SQL, or SAS.
Are you an innovative Data Engineer looking to make a significant impact in a dynamic environment? Join Reply, a leading technology consulting firm, where you will be at the forefront of data transformation and analytics. As a Data Engineer, you will design and implement robust data pipelines, ensuring that our clients leverage data effectively to drive business decisions.
Join Our Award-Winning Technology Team as a Senior Data Engineer at RAPP Chicago

About Us: At RAPP, we are pioneers in driving growth with precision and empathy on a global scale. As a leading next-generation precision marketing agency, we blend data, creativity, technology, and empathy to stimulate client growth. We pride ourselves on creating tailored marketing solutions that resonate with the individuality of our clients and their audiences. We are committed to fostering an inclusive workplace that prioritizes personal well-being.

Our Approach: Our dynamic team of superconnectors excels in delivering value from personal brand experiences by focusing on three primary areas: connected data, connected content, and connected decision-making. Our data analysts identify individual insights, our strategists decode client needs, and our talented technologists and creatives craft solutions for authentic customer engagement. Part of Omnicom’s Precision Marketing Group, RAPP comprises over 2,000 creatives, technologists, strategists, and data scientists across more than 15 global markets.

Your Role: We are seeking a highly skilled Senior Data Engineer with extensive experience in creating scalable, cloud-native data pipelines and platforms. The ideal candidate will be proficient in Python, Apache Airflow, AWS Lambda, DynamoDB, and dbt and possess a track record in designing reliable data workflows that facilitate advanced analytics, reporting, and machine learning applications. A keen attention to detail, a passion for information management, and the ability to collaborate effectively with creative teams to optimize asset workflows are essential.

Your Responsibilities:
Data Pipeline Development
- Design, implement, and maintain robust ETL/ELT pipelines leveraging Python and Airflow.
- Craft serverless workflows using AWS Lambda for scalable, event-driven data processing.
- Optimize dbt models for analytics and transformations.
Data Architecture & Storage
- Design schemas and manage data in DynamoDB and other cloud-native storage solutions.
- Ensure high availability and performance of data storage solutions.
Join RAPP Chicago as a Data Engineer!

About Us: At RAPP, we stand at the forefront of precision marketing, leading the charge in fostering growth with empathy and precision on a global scale. Our innovative approach combines data, creativity, and technology to deliver tailored marketing solutions that resonate with individual client needs. We believe in nurturing a diverse and inclusive workplace that prioritizes personal well-being and growth.

Our Methodology: Our dynamic team is composed of fearless superconnectors who harness the power of connected data, content, and decision-making to create value from personal brand experiences. We strive to build authentic customer connections through meticulous data analysis, strategic insights, and cutting-edge technology.

Your Role: We are on the lookout for a passionate Data Engineer who is eager to learn and contribute to the design and implementation of scalable, cloud-native data pipelines and platforms. The ideal candidate will possess foundational Python knowledge and a keen interest in utilizing modern technologies such as Apache Airflow, AWS Lambda, DynamoDB, and dbt to enhance data workflows. Curiosity about data systems and a commitment to mastering data engineering best practices are essential as you gain hands-on experience.
Join the innovative team at Reply as a Senior Data Engineer, where you will play a crucial role in developing and maintaining data architectures that drive business intelligence and analytics. You will collaborate with cross-functional teams to streamline data processes and ensure robust data governance. Your expertise in data modeling, ETL processes, and cloud technologies will be essential in helping us harness the power of data to make informed business decisions.
Join m1finance as a Senior Data Engineer and play a vital role in transforming our data architecture and analytics capabilities. You will collaborate with cross-functional teams to design and implement robust, scalable data solutions that empower our organization to make data-driven decisions.
Contract|$60/hr - $120/hr|Remote|Chicago, Illinois, United States
Notice: We are currently not accepting new applications for this position.

Position: Contract Data Engineer - Civic & Political Data

Role Overview: As a Contract Data Engineer, you will play a critical role in enhancing and maintaining the essential data infrastructure and tools at Climate Cabinet. This position is key in identifying opportunities to support the election of pro-climate candidates and the advancement of impactful pro-climate legislation across various state houses.

Location: Remote, available to candidates across the United States.
Application Deadline: March 17, 2026
Expected Start Date: March 2026
Contract Duration: 6 months with an option to extend.
Hours: Flexible commitment ranging from 15 to 40 hours per week, with project assignments tailored to contractor availability.
Compensation: Anticipated hourly rates between $60 and $120, commensurate with experience and demonstrated skill level. Those with extensive experience in political data systems, AI infrastructure, and independent project leadership may receive compensation at the higher end of this range.
Join our dynamic team at Jane Street as a Data Center Engineer. In this role, you will be instrumental in maintaining the reliability and efficiency of our data center operations. You will work closely with cross-functional teams to ensure optimal performance and uptime of our systems. Your responsibilities will include monitoring and troubleshooting data center equipment, implementing best practices for infrastructure management, and collaborating on various projects to enhance our data center capabilities.
Full-time|On-site|San Francisco, CA; Sunnyvale, CA; Seattle, WA; New York, NY; Chicago, IL; Austin, TX
Role Overview
DoorDash is hiring a Data Science Manager to lead a team of data scientists working on projects that use data to improve DoorDash’s services. This role involves managing team members, setting direction for analytical model development, and shaping strategies that influence decisions across the company.

What You Will Do
- Oversee and mentor a team of data scientists
- Guide the development of analytical models and data-driven strategies
- Work with teams from different departments to spot opportunities for improvement
- Deliver insights that help DoorDash connect people with top local options

Locations
San Francisco, CA; Sunnyvale, CA; Seattle, WA; New York, NY; Chicago, IL; Austin, TX
Full-time|Remote|Netherlands, Remote; Spain, Remote; United Kingdom, Remote; United States, Atlanta; United States, Boston; United States, Charlotte; United States, Chicago; United States, Cincinnati; United States, Miami; United States, Milwaukee; United States, Minneapolis; United States, Philadelphia; United States, Raleigh; United States, St. Louis; United States, Tampa
About Dataiku
Dataiku provides a platform for building, deploying, and managing AI and analytics at scale. The company helps organizations design and operate analytics, machine learning, and AI agents with transparency and control. Dataiku connects the enterprise AI stack, supporting centralized governance and multi-vendor environments. Learn more on the Dataiku blog, LinkedIn, X, and YouTube.

Role Overview
Dataiku is hiring a Data Engineer II for the Enterprise Data and Analytics (EDA) team. This role focuses on delivering reliable data to drive analytics, AI, and insights for teams across the company. The Data Engineer II will play a central part in supporting the Data Platform, which serves centralized analytics, Generative AI engineering, embedded analytics teams, and self-service users.

What You Will Do
- Contribute technical expertise to the Data Platform, working with Snowflake, Dataiku, and GitHub.
- Develop solutions using Python and SQL, with DataOps processes integrated into GitHub Actions and Dataiku.
- Collaborate with engineers from other teams to deliver solutions across various technical domains.
- Promote engineering best practices within the EDA team and the broader Analytics Community.
- Support analytics and AI initiatives for a wide range of stakeholders.

What We Look For
- Experience with Snowflake, Dataiku, GitHub, Python, and SQL.
- Understanding of the software development lifecycle and DataOps methodologies.
- Strong collaboration skills and the ability to work across teams.
- Clear verbal and written communication.
- Analytical mindset and curiosity.

Locations
This position is open to candidates in the Netherlands (remote), Spain (remote), United Kingdom (remote), and the following US cities: Atlanta, Boston, Charlotte, Chicago, Cincinnati, Miami, Milwaukee, Minneapolis, Philadelphia, Raleigh, St. Louis, and Tampa.
About Akuna Capital:
Akuna Capital is a forward-thinking trading firm dedicated to collaboration, innovative technologies, data-driven solutions, and automation. We excel as an options market-maker, committed to providing competitive quotes for buying and selling. To achieve this, we develop and implement our own low-latency technologies, trading strategies, and mathematical models.

Founded by our partners in Sydney, we established our first office in 2011 in Chicago, the heart of the derivatives industry and options capital worldwide. Today, Akuna proudly operates additional offices in Sydney, Shanghai, London, and Singapore.

Your Role as a Software Engineer in Data Engineering at Akuna:
As a data-centric organization, we leverage our data as a critical competitive asset. The Akuna Data Engineering team comprises world-class professionals responsible for designing, building, and maintaining the systems, applications, and infrastructure necessary for collecting, storing, processing, managing, and querying our data assets. This team plays a vital role in ensuring trustworthy data is accessible and reliable to support various initiatives within Akuna’s Quant, Trading, and Business Operations units.

Key Responsibilities:
- Contribute to the growth of our Data Engineering division, supporting the strategic importance of data at Akuna.
- Lead the design and enhancement of our data platform across diverse data sources, facilitating multiple streaming, operational, and research workflows.
- Collaborate closely with Trading, Quant, Technology, and Business Operations teams to identify data production and consumption processes, and define impactful projects.
- Develop and deploy batch and streaming pipelines to collect and transform our expanding Big Data set within a hybrid cloud architecture, utilizing Kubernetes/EKS, Kafka/MSK, and Databricks/Spark.
- Mentor junior engineers on software and data engineering best practices.
- Create clean, well-tested, and well-documented code to support mission-critical applications.
- Implement automated data validation test suites to ensure data is processed and published according to established Service Level Agreements (SLAs) regarding data quality.
Full-time|$130K/yr - $145K/yr|Hybrid|Chicago, IL; Boston, MA
Join Our Team at Kalderos

At Kalderos, we are dedicated to developing innovative technologies that foster transparency, trust, and equity across the healthcare landscape, with a particular emphasis on pharmaceutical pricing. Our true measure of success lies in empowering the healthcare community to prioritize improving the health of individuals. Our achievements are driven by our greatest asset—our people. We thrive on problem-solving, innovation, and the constructive feedback from our colleagues. We are passionate about our mission and are seeking like-minded individuals to join our dynamic team.

We are currently in search of a collaborative Data Engineer II. You should be eager to work in a rapidly growing and evolving environment, possessing the skills and experience necessary to excel. Being familiar with the fast-paced and often unpredictable nature of operations will enable you to deliver meaningful results.

This is a full-time, hybrid position that can be based in either Chicago, IL or Boston, MA. Please note that relocation assistance will not be provided.

Expected Salary Range: $130,000 - $145,000 base salary + bonus
Full-time|$207K/yr - $284K/yr|On-site|Chicago, Illinois; New York, New York; Washington, DC
Discover Okta
Okta is recognized as the leading identity management solution, empowering individuals to securely access technology anytime, anywhere, across any device or application. Our versatile products, namely the Okta Platform and Auth0 Platform, ensure secure access, robust authentication, and streamlined automation, placing identity at the forefront of business security and growth.

At Okta, we embrace diverse perspectives and experiences. We're not searching for someone who fits a specific mold – we value lifelong learners and those who can enhance our team with their unique insights. Join us in creating a world where identity is truly yours.

Are you driven by the challenge of solving intricate data problems and wish to make a significant impact? Do you want to collaborate with a dynamic team of cloud engineers and architects? If so, we want to meet you!

The Auth0 platform handles over 100 million logins daily for customers worldwide, and our growth trajectory is remarkable! The Data Platform team is responsible for building and managing essential data services that underpin this platform, ensuring scalability, reliability, efficiency, and operational excellence. As the Senior Manager for this team, you'll collaborate with engineers across the organization, influence the platform roadmap, and create foundational infrastructure that will support Auth0's growth for years to come. If you thrive in nurturing high-performing teams and excel in cross-organizational collaboration, you will find this role to be a perfect match!

Your Responsibilities:
- Lead and cultivate an inclusive, diverse agile software development team focused on delivering value while possessing deep expertise in distributed systems, cloud infrastructure, and site reliability engineering.
- Foster a culture of discovery, learning, and experimentation within a geo-distributed team through continuous coaching and mentoring.
- Work closely with architects and engineers to design highly scalable, robust, and extendable services using modern and proven technologies such as Go, Node.js, Kubernetes, Docker, AWS, and Azure.
- Build and manage data streaming teams utilizing an event-driven architecture and Kafka.
- Collaborate with product management and engineering leadership to define a platform roadmap that drives the next generation of identity products. Lead the planning, execution, and delivery of data platform services.
- Implement process improvements that enhance operational excellence and drive efficiency during a significant growth period.
Full-time|$180K/yr - $245K/yr|On-site|Chicago, United States
Global Market Data Manager

The Global Market Data Manager is pivotal in defining and implementing IMC’s market data strategy globally. This role ensures that data is sourced, governed, and delivered to effectively support trading, research, and risk management initiatives. Positioned at the crossroads of trading, technology, compliance, and vendor relations, you will be responsible for promoting standardization, transparency, and scalability while being attentive to regional requirements.

As IMC continues to expand its global trading operations, you will spearhead efforts to centralize market data practices, enhance data usage tracking and compliance, and assess opportunities for building or acquiring data solutions. The ideal candidate will possess extensive industry experience, robust external relationships, and the credibility to serve as a trusted advisor to both traders and technologists.
Join Above Lending as a Senior Data Engineer, where you'll play a pivotal role in transforming our data architecture to enhance decision-making processes. In this position, you will work closely with data scientists and analysts to develop robust data pipelines and ensure data integrity across our systems. Your expertise will drive the optimization of our data storage and retrieval processes, ultimately contributing to our mission of providing superior lending solutions.
Full-time|$185K/yr - $225K/yr|On-site|Chicago, United States
Join our dynamic team at IMC Trading as a Data Platform Operations Engineer focused on performance. In this pivotal role, you'll leverage your data expertise and operational skills to enhance the reliability, accuracy, and efficiency of our data systems that underpin performance analysis and strategic decision-making. As a production-facing engineer, you will collaborate closely with the Performance Team and key stakeholders to ensure that our data workflows are robust, observable, and efficient.

Key Responsibilities:
- Oversee the operational health of the Performance Team's data platform.
- Monitor and maintain ETL processes and data pipelines to ensure correctness, timeliness, and stability.
- Design and implement data quality checks, reports, and alerts tailored to performance data.
- Analyze and resolve data issues and pipeline failures, identifying root causes to prevent recurrence.
- Manage and troubleshoot GitLab CI/CD pipeline challenges related to performance data workflows.
- Oversee workloads and resource usage, identifying and rectifying imbalances and bottlenecks.
- Develop and maintain operational tools, automation, and runbooks.
- Continuously optimize data pipelines and workloads to enhance performance and resource efficiency.
Join a leading investment bank as a Big Data Engineer in vibrant Chicago, IL! Our client seeks a talented individual to enhance their team, focusing on streamlining data access for clients. Currently, support analysts spend approximately one hour daily accessing various data sources to answer client queries. This project aims to empower clients by granting them direct access to the necessary data.

Your Responsibilities:
- Collaborate with business stakeholders to gather and analyze technical requirements.
- Design and implement comprehensive end-to-end solutions.
- Select the most suitable technology stack for effective project execution.

Qualifications:
- Proficient in tools within the Big Data ecosystem.
- Experience in developing Kafka streaming applications using Java.
- Hands-on expertise in building Spark applications with Java or Scala.
- Familiarity with NoSQL databases.
- Full stack Java development experience.
- Proficient in developing production applications using Node.js and React.

Project Background: Cross Asset Query Tool – designed to enable low-latency query capabilities across our Cross Asset data.