Experience Level
Manager
Qualifications
The ideal candidate will possess:
- A strong background in data engineering with proven experience in Python, AWS, Airflow, and Snowflake
- Demonstrated leadership skills with the ability to inspire and guide a team
- Excellent problem-solving skills and a strategic mindset
- Strong communication and collaboration skills, with the ability to work effectively across various teams
- A degree in Computer Science, Data Science, or a related field
About the job
Join dev2 as a Manager of Data Engineering and lead our innovative data engineering team. In this pivotal role, you will leverage your expertise in Python, AWS, Airflow, and Snowflake to design and implement robust data solutions that enhance our operational capabilities.
As a key member of our leadership, you will be responsible for driving the strategic vision of our data engineering initiatives, mentoring a talented team, and collaborating with cross-functional partners to ensure data integrity and availability. This hybrid position offers the flexibility of remote work combined with the dynamic environment of our Paris office.
About dev2
dev2 is a leading player in the tech industry, specializing in delivering cutting-edge solutions that help organizations harness the power of data. Our commitment to innovation, excellence, and customer satisfaction sets us apart as a trusted partner for businesses worldwide.
Role overview
Jobgether seeks a Senior Python Data Scraping Engineer in France for a partner company. This freelance, remote position centers on building and maintaining large-scale web data extraction systems that power both AI and human-driven workflows. The role requires a strong grasp of scalable solutions and a focus on data quality. What you will do De…
Contract|Remote — Lyon, Auvergne-Rhône-Alpes, France
Join Toloka AI as a Freelance Data Scraping Engineer specializing in Python. In this dynamic remote role, you will be responsible for extracting and processing data from various web sources to support our innovative AI projects. Your expertise in Python and web scraping will play a crucial role in enhancing our machine learning models and ensuring data accuracy.
We are looking for a senior consultant specializing in Data Engineering, with deep expertise in AWS technologies (Kinesis, DMS, S3, etc.), Airflow, Terraform, GitLab CI/CD, and Python, as well as a solid command of databases such as Exadata, Oracle, SQL Server, and Snowflake. The consultant will design and develop data replication and processing pipelines in a cloud environment while actively contributing to the reliability, automation, and scalability of our data platform. Join us and thrive.
Your missions:
- Develop ETL/ELT pipelines with Airflow, Python, and SQL
- Connect to source databases (Exadata, Oracle, etc.)
- Ingest data into AWS (S3, Kinesis, DMS, Lambda, etc.)
- Clean, map, deduplicate, and enrich data
- Collaborate with architects to apply the target data models
- Maintain technical documentation and reference data
- Design and implement replication flows
- Harmonize and transform data
- Automate data pipelines
- Build GitLab CI/CD pipelines
- Develop IaC for sophisticated full-AWS architectures with Terraform, Ansible, and Python
- Contribute to FinOps
You are the right candidate if:
- Attitude: dynamic, proactive, autonomous, and open-minded; a good communicator with strong scientific curiosity
- Very good command of the AWS ecosystem (S3, Kinesis, DMS, Lambda, etc.)
- Very good command of databases and data warehouses (Exadata, Oracle, SQL Server, Snowflake, etc.)
- Proficiency in SQL and Python
- Proficiency with processing and orchestration tools (Apache Airflow, dbt, Python, SQL)
- Solid experience with Terraform for IaC
- Knowledge of GitLab CI/CD for deployment automation
- Experience with CRMs
- Knowledge of data quality, modeling, monitoring, and alerting concerns
- Experience: at least 6 years as a Data Engineer or Data Integration Engineer in similar environments
- One or more certifications (AWS, Snowflake, Terraform, etc.) are a strong plus
- Languages: fluent English
What we offer: CAPFI is a vibrant company that puts collective spirit and well-being at the heart of its culture.
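The cleaning, mapping, and deduplication step described above can be sketched in plain Python. This is a minimal illustration under assumed conventions, not CAPFI's actual pipeline; the field names (`customer_id`, `email`, `updated_at`) are hypothetical examples.

```python
# Minimal sketch of a cleaning/deduplication step such a pipeline might run.
# Field names (customer_id, email, updated_at) are hypothetical examples.

def clean_record(record: dict) -> dict:
    """Normalize whitespace and casing so duplicates compare equal."""
    return {
        "customer_id": record["customer_id"].strip(),
        "email": record["email"].strip().lower(),
        "updated_at": record["updated_at"],
    }

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep only the most recent record per customer_id (ISO dates sort lexically)."""
    latest: dict[str, dict] = {}
    for rec in map(clean_record, records):
        key = rec["customer_id"]
        if key not in latest or rec["updated_at"] > latest[key]["updated_at"]:
            latest[key] = rec
    return list(latest.values())

rows = [
    {"customer_id": " 42", "email": "Ana@Example.com ", "updated_at": "2024-01-01"},
    {"customer_id": "42",  "email": "ana@example.com",  "updated_at": "2024-03-01"},
]
result = deduplicate(rows)
# One record remains: the 2024-03-01 version, with a normalized email.
```

In a real Airflow deployment, a function like `deduplicate` would typically run inside a task between the extraction and load steps, with the same logic applied batch by batch.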
Contract|$58/hr|Remote — Lyon, Auvergne-Rhône-Alpes, France
Please submit your CV in English and indicate your English proficiency level. Mindrift connects experienced professionals with project-based work focused on testing, evaluating, and improving AI systems for top technology companies. This position is contract-based and not permanent.
Role overview
The Freelance Data Science Engineer (Python & SQL) works on a variety of projects, each with its own set of tasks. Assignments focus on designing and validating real-world data science challenges that reflect business analytics across sectors like telecom, finance, government, e-commerce, and healthcare.
- Design computational problems that require advanced Python programming, using libraries such as Pandas, Numpy, Scipy, Scikit-learn, Statsmodels, Matplotlib, and Seaborn.
- Create tasks that are computationally intensive and not solvable manually within a reasonable time.
- Develop challenges involving complex data processing, statistical analysis, feature engineering, predictive modeling, and insight extraction.
- Ensure all problems are deterministic and reproducible, minimizing randomness or using fixed seeds.
- Base scenarios on authentic business cases, including customer analytics, risk assessment, fraud detection, forecasting, optimization, and operational efficiency.
- Design end-to-end problems covering the full data science pipeline: data ingestion, cleaning, exploratory data analysis, modeling, validation, and deployment considerations.
- Incorporate big data scenarios that require scalable computational approaches.
- Validate solutions in Python using standard libraries and statistical methods.
- Document problem statements clearly within realistic business contexts and provide validated answers.
Requirements
- Minimum 5 years of hands-on data science experience with measurable business results.
- Portfolio of completed projects or publications that demonstrate real-world problem-solving.
- Advanced Python programming skills for data science (Pandas, Numpy, Scipy, Scikit-learn, Statsmodels).
- Strong foundation in statistical analysis and machine learning, with practical application of algorithms and methods.
- Proficiency in SQL and database operations for data analysis.
- Experience working with Generative AI technologies, including LLMs, RAG, prompt engineering, and vector databases.
- Understanding of MLOps practices and model deployment workflows.
- Familiarity with frameworks such as TensorFlow, PyTorch, or LangChain.
- Excellent written English at C1 level or above.
How to join
Apply, pass the qualifications, join a project, complete assigned tasks, and receive payment.
Location: Remote — Lyon, Auvergne-Rhône-Alpes, France
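The requirement that tasks be deterministic and reproducible usually comes down to fixing random seeds so that every run produces the same verifiable answer. A minimal sketch using only Python's standard library; the dataset parameters and the "canonical answer" here are invented for illustration.

```python
import random

def generate_task_data(seed: int = 42, n: int = 100) -> list[float]:
    """Generate a synthetic dataset that is identical on every run
    because the random generator is seeded explicitly."""
    rng = random.Random(seed)  # local generator: no hidden global state
    return [rng.gauss(50.0, 10.0) for _ in range(n)]

def solve_task(data: list[float]) -> float:
    """Toy 'validated answer': the mean of the dataset, rounded so that
    graders can compare against a single canonical value."""
    return round(sum(data) / len(data), 6)

# Two independent runs must agree exactly; that exact agreement is what
# makes the problem deterministic and its answer verifiable.
answer_a = solve_task(generate_task_data())
answer_b = solve_task(generate_task_data())
```

Using a local `random.Random(seed)` instance rather than the module-level functions keeps reproducibility intact even if other code touches the global generator.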
About Us
Photoroom was founded in 2020 after gaining acceptance into Y Combinator, rapidly becoming the leading AI photo editing platform globally. Our mission is to empower users to generate studio-quality product images within minutes. With over 300 million downloads and over 5 billion images processed annually, we cater to both individual creators and significant enterprises, including Amazon, DoorDash, and Decathlon, through our B2C application and B2B API offerings. As a profitable, remote-friendly organization backed by Series B funding, we are aiming for a remarkable 40% year-over-year growth. Our team of 100+ dedicated professionals emphasizes innovation, craftsmanship, and collaboration, making a significant impact for entrepreneurs and businesses worldwide.
We are in search of a talented Python engineer who will take charge of our public API, which serves as the backbone for how developers incorporate Photoroom into their applications. This pivotal role sits at the crossroads of developer experience and applied AI, where you will influence the interface used by both self-service users and large corporate clients.
Salary Range: 75k – 110k* + Stock Options/BSPCE. Enjoy the flexibility of working from anywhere in Europe, with fully covered monthly trips to Paris, or feel free to visit our office more frequently. We offer relocation assistance (up to €10k), which includes support for visa and housing arrangements. Engage in company socials, retreats, offsites, and regular team events. Join a diverse international team in an English-speaking environment, with optional language classes available. *We can accommodate higher salaries for exceptional candidates and adjust for cost of living where necessary.
About the Role
You will conceptualize, construct, and enhance our public API product, the central interface through which developers harness Photoroom's AI capabilities. You will deliver features used by both self-service developers and large enterprise clients, ensuring the API remains dependable, scalable, and user-friendly. You will manage the API surface from design choices (naming, versioning, structure) through implementation, performance, and long-term sustainability. You will iterate rapidly based on real-world usage, leveraging data and user feedback; we deploy multiple times a week.
Join Sopra Steria as a Big Data Engineer Intern in the vibrant aerospace sector! This role is perfect for motivated individuals looking to gain hands-on experience in cutting-edge technologies such as Python, SQL, and AWS. You will be part of a dynamic team, contributing to innovative projects that shape the future of aviation.
Jobgether is looking for a Senior Full-Stack Engineer with strong experience in Python and React. This position is based in France and focuses on building and improving applications that support the company's growth.
Role overview
This role centers on developing and maintaining full-stack features using Python and React. The Senior Full-Stack Engineer will work closely with colleagues from various departments, contributing to both technical discussions and hands-on implementation.
What you will do
- Collaborate with cross-functional teams to deliver new features and improvements
- Participate in architecture discussions and help shape technical decisions
- Develop and maintain applications that improve user experience
Requirements
- Strong background in Python and React
- Experience building and maintaining full-stack applications
- Ability to work effectively with other teams and contribute to product direction
Role Overview
Join our dynamic team at Mistral as a Data and Analytics Engineer. In this pivotal role, you will play a crucial part in designing, optimizing, and managing our data infrastructure. You will handle vast datasets, enabling our product teams to access secure and reliable data swiftly. Your expertise will be instrumental in enhancing our cutting-edge AI models and empowering business users to make data-driven decisions.
Key Responsibilities
• Design, construct, and maintain robust data pipelines, ETL workflows, and analytics frameworks. Implement automation for data quality assessments and validation processes.
• Collaborate with cross-functional teams to identify data requirements and deliver actionable, high-quality solutions, specifically partnering with machine learning teams to facilitate model training, deployment pipelines, and feature stores.
• Enhance data storage, retrieval, processing, and query performance while ensuring scalability and cost-effectiveness.
• Establish and uphold data governance, metadata management, and data lineage protocols.
• Safeguard data integrity, security, and compliance with prevailing industry standards.
Your Profile
• Master's degree in Computer Science, Engineering, Statistics, or a related discipline.
• A minimum of 3 years of experience in data engineering, analytics engineering, or a similar position.
• Proficient in Python and SQL.
• Familiarity with dbt.
• Hands-on experience with cloud platforms (e.g., AWS, GCP, Azure) and data warehousing solutions (e.g., Snowflake, BigQuery, Redshift, Clickhouse).
• Excellent analytical and problem-solving skills with strong attention to detail.
• Proven ability to communicate complex data concepts effectively to both technical and non-technical stakeholders.
Preferred Qualifications
• Experience with data visualization tools (e.g., Tableau, Looker) is a plus.
• Familiarity with machine learning concepts and practices.
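Automated data-quality assessments like those mentioned above often start as simple assertion rules applied to each batch before it is loaded. A minimal, tool-agnostic sketch; the column names (`id`, `amount`) and the rules are hypothetical examples, not Mistral's actual checks.

```python
# Minimal sketch of automated data-quality validation on a batch of rows.
# Column names (id, amount) and the rules are hypothetical examples.

def validate_batch(rows: list[dict]) -> list[str]:
    """Return human-readable rule violations; an empty list means the batch passes."""
    errors: list[str] = []
    seen_ids: set = set()
    for i, row in enumerate(rows):
        # Rule 1: every row needs a unique, non-null id.
        if row.get("id") is None:
            errors.append(f"row {i}: missing id")
        elif row["id"] in seen_ids:
            errors.append(f"row {i}: duplicate id {row['id']}")
        else:
            seen_ids.add(row["id"])
        # Rule 2: amount must be a non-negative number.
        if not isinstance(row.get("amount"), (int, float)) or row["amount"] < 0:
            errors.append(f"row {i}: amount must be a non-negative number")
    return errors

good = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": 0}]
bad = [{"id": 1, "amount": 9.5}, {"id": 1, "amount": -3}]
```

Tools like dbt tests or Great Expectations formalize the same idea, but the underlying contract is the same: declarative rules evaluated per batch, failing the pipeline when violations appear.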
Join Meritis as a Senior Python Developer / DevOps Engineer and become a key player in our dynamic team. We are looking for an experienced professional who is passionate about software development and cloud technologies. In this role, you will design, develop, and maintain robust Python applications while leveraging DevOps practices to enhance our deployment processes.
Jobgether is seeking a Senior Data Engineer based in France. This position centers on shaping and advancing the company's data architecture and engineering projects.
Role overview
This role involves building and optimizing data solutions that support business operations. Collaboration with other skilled professionals is a core part of the work, with a focus on using modern technologies to address data challenges.
What you will do
- Drive key data engineering initiatives
- Contribute to the design and improvement of data architecture
- Work closely with other team members to deliver reliable data solutions
Location
This position is based in France.
Your Impact
Join our dynamic team as a Senior Software Engineer in the AI Health Companion division, where your expertise will help us revolutionize healthcare. You will play a crucial role in evolving the Doctolib application into a comprehensive health companion powered by artificial intelligence, enabling patients to manage their health with precision and compassion.
Your primary responsibility will be to collaborate within a feature team to develop an AI-driven chat interface designed to address patients' inquiries regarding their health and the health of their loved ones. Your contributions will directly enhance the experience of healthcare teams and patients alike.
Role overview
Alten is looking for a Senior Data Engineer to join the team in Paris. This role focuses on designing and building data solutions that support business intelligence and analytics efforts.
What you will do
- Design and implement data architectures and pipelines
- Support business intelligence and analytics projects with reliable data solutions
- Optimize data systems for performance and integrity
Location
This position is based in Paris.
Join our dynamic team as a Senior Data Engineer at Sopra Steria in Courbevoie, France. In this pivotal role, you will leverage your expertise to design, implement, and maintain robust data solutions that drive business intelligence and analytics. Collaborate with cross-functional teams to ensure our data architecture is optimized for performance and scalability.
Join Sopra Steria, a leading European consulting, digital services, and software development company, as a Senior Data Engineer in our DCoE France team. We are seeking a highly skilled and motivated data engineer to enhance our data processing capabilities and drive innovation within our projects.
As a Python Data Scientist Developer, you will play a pivotal role in enhancing our Python platform. Your core responsibilities will include:
- Development and Programming: Contribute to the evolution of the architecture, develop new modules and libraries, and maintain existing software.
- Performance Improvement: Engage in continuous performance optimization, including prompt engineering and the development of AI processing chains.
- Deployment Pipelines: Master deployment pipelines to ensure seamless integration.
- Code Reviews: Play an active role in code reviews, fostering a culture of quality and collaboration.
- Collaboration: Work closely with business teams and analysts to translate business needs into innovative technical solutions.
- Project Leadership: Independently manage technical project oversight.
- Research and Innovation: Stay up to date with the latest advances in generative AI and propose innovative ideas for projects and enhancements.
- Documentation: Create clear and comprehensive technical documentation.
About Teads
Teads is a premier omnichannel advertising platform dedicated to delivering exceptional outcomes for brand and performance advertisers across various screens. Our mission is to create significant business results for branding and performance goals by employing advanced predictive AI technology. By connecting high-quality media with captivating brand creatives, we ensure context-driven addressability and measurement. Teads collaborates directly with over 10,000 publishers and 20,000 advertisers around the globe. Our headquarters is located in New York, New York, and we are proud to have a diverse team of approximately 1,700 professionals across more than 30 countries. For more details, visit www.teads.com.
Our Engineering Challenges at Teads
- Develop highly efficient and user-friendly web products used by thousands of professionals in the world's leading publishing, advertising, and agency sectors.
- Leverage a rich and diverse tech stack and system architecture optimized for performance, scalability, resiliency, and cost-effectiveness, primarily built on Scala and TypeScript.
- Operate in a high-traffic environment (2.2 billion users monthly, 100 billion events daily) while ensuring low latency and high availability (2 million requests per second with responses under 150 milliseconds).
- Manage large datasets with millisecond access times to support near real-time complex auction resolution algorithms (18 million predictions per second).
- Adapt swiftly to a fast-paced environment while collaborating closely with Product teams to continuously refine our Cloud infrastructure in line with new features.
Role Overview
Alten is hiring a Python / Vue.js Developer focused on Data and DevOps for the Boulogne-Billancourt office. This position centers on building applications and tools that improve data management and streamline deployment processes.
What You Will Do
- Develop applications and tools using Python and Vue.js to support data analytics and DevOps workflows
- Work closely with teams from different disciplines to deliver reliable, maintainable software
- Contribute to projects that aim to improve operational efficiency for clients
Collaboration
This role involves regular interaction with cross-functional colleagues to ensure software solutions align with client requirements and quality standards.
About Us
At MARGO, our consultants focus on what truly matters: engaging in complex projects that blend intellectual challenge with real business impact. We support the most prominent players in finance, industry, and technology by designing and developing high-performance software solutions, including distributed architectures, cloud platforms, real-time systems, and large-scale critical applications.
Why Join the Software Engineering Practice?
You will be part of a team led by Paul Blois, who has been the Practice Manager at MARGO for five years. Working alongside him means benefiting from an in-depth understanding of the company, receiving close mentorship in your career development, and contributing to demanding IT projects with significant impact for our clients.
Your Responsibilities
You will join a team of 12 professionals tasked with translating complex financial mechanisms into an elegant and resilient software architecture. At the heart of a leading energy company's market activities, you will design the contract modeling engine that manages the group's risk exposure.
As a Senior Python Developer, your challenges will include:
- Architecture & Design: Developing and evolving APIs and microservices capable of modeling rich and varied contractual life cycles.
- Engineering Excellence: Ensuring code quality and maintainability (Clean Code, extensive unit and integration testing) in an environment where even a minor miscalculation can have direct financial repercussions.
- Domain Modeling: Collaborating with business experts to convert complex management rules (gas purchases, P&L tracking, European regulatory constraints) into high-performing data models.
- Performance & Scalability: Optimizing services for aggregating the massive data volumes required by real-time risk management tools.