Experience Level: Mid to Senior
About the job
Join our innovative team at Moving Walls India Pvt Ltd as a Data Engineer, where you will play a crucial role in driving data solutions and analytics to support our business objectives. We are looking for a passionate individual who thrives in a fast-paced environment and is eager to tackle complex data challenges.
About Gen Digital
Gen Digital is a global company focused on digital freedom and security. Our brands include Norton, Avast, LifeLock, and MoneyLion, serving nearly 500 million users in over 150 countries. We provide cybersecurity, online privacy, identity protection, and financial wellness products. Our mission centers on helping people manage and secure their digital and financial lives. We value diverse experiences and ideas, and we see AI as a partner for innovation. Gen Digital encourages autonomy, supports career growth, and offers flexible work options, generous time off, competitive pay, and wellness programs. The company culture emphasizes customer satisfaction, open discussion, experimentation, and continuous learning. Team members collaborate in an environment that respects and values differences as strengths.

Senior Staff Data Engineer – Role Overview
The Senior Staff Data Engineer will serve as a senior technical leader within the organization. This role focuses on designing and implementing large-scale data solutions that support Gen Digital’s cybersecurity platform strategy. The position combines deep technical skill with organizational influence.

Key responsibilities include:
- Designing complex data architectures for enterprise-scale needs
- Implementing solutions that support a multi-petabyte data infrastructure
- Mentoring and guiding engineering teams
- Shaping the technical vision for data systems serving millions of users

Location: Chennai, India
About Us
BigID is a pioneering tech startup specializing in cutting-edge solutions for data security, compliance, privacy, and AI data management. We are at the forefront of the data landscape, empowering our customers to mitigate risks, foster business innovation, achieve compliance, build trust, make informed decisions, and maximize the value of their data. We are committed to building a global team united by a passion for innovation and advanced technology.

BigID has received numerous accolades, including:
- Named a Hot Company in Artificial Intelligence and Machine Learning at the Global InfoSec Awards
- Listed in Citizens JMP Cyber 66 as one of the Hottest Privately Held Cybersecurity Companies
- Recognized on the CRN 100 list as one of the 20 Coolest Identity Access Management and Data Protection Companies for three consecutive years
- Ranked among the DUNS 100 Best Tech Companies to Work For
- Featured as a Top 3 Big Data and AI Vendor to Watch in the 2023 BigDATAwire Readers' and Editors' Choice Awards
- Included in the 2024 Inc. 5000 list for the fourth consecutive year
- Shortlisted for the 2024 AI Awards in the Best Use of AI in Cybersecurity category

At BigID, our team is the cornerstone of our success. Join our dynamic, people-centric culture, where you’ll have the opportunity to collaborate with some of the most talented professionals in the industry, who prioritize innovation, diversity, integrity, and teamwork.

Who We Are Looking For
We are on the hunt for a Senior Data Platform Engineer to enhance our Data Platform team. The ideal candidate will possess substantial experience in data engineering, particularly with Kafka and Elasticsearch, to design and maintain our robust data platforms.
You will collaborate closely with cross-functional teams to ensure the scalability and reliability of our data solutions.

Role Overview
As a Senior Data Platform Engineer, you will be instrumental in the design, development, maintenance, troubleshooting, and implementation of our big data architecture. Your proficiency in Elastic, Kafka, and Node.js will play a vital role in ensuring the scalability and performance of our data systems.

Key Responsibilities
- Develop data processing pipelines utilizing Kafka for real-time data streaming.
- Enhance and manage search functionalities leveraging Elastic technologies.
- Work alongside product managers, data analysts, and stakeholders to gather requirements and translate them into technical specifications.
- Lead code reviews and promote best practices in coding and data handling.
At PDI Technologies, we empower many of the world’s leading convenience retail and petroleum brands through innovative technology solutions that foster growth and enhance operational efficiency. By connecting convenience globally, we enable businesses to boost productivity, make informed decisions, and engage customers rapidly through loyalty programs, shopper insights, and unparalleled real-time market intelligence via mobile applications like GasBuddy. We are a global team dedicated to excellence, collaboration, and delivering real impact. Join us and be part of a company that prioritizes diversity, integrity, and growth.

Role Overview
PDI Technologies is in search of a Product Owner specializing in Data to enhance the data capabilities of our loyalty solutions. This role is perfect for a passionate Product Owner with experience in driving data-centric products and enabling teams that deliver scalable data services, analytics, reporting capabilities, and data pipelines.

As the Product Owner for the Loyalty Data team, you will collaborate with engineering, Product Managers, analysts, customers, and business stakeholders to define and implement high-impact data features that enhance consumer-facing loyalty experiences, operational reporting, segmentation, personalization, and downstream integrations. You will be responsible for driving product requirements related to data ingestion, data modeling, data quality, APIs, event streams, and insights, ensuring that the Loyalty Platform remains robust, reliable, and optimized for data-driven business outcomes. Prior experience with data platforms, data engineering workflows, analytics tools, or consumer data systems (CRM, loyalty, personalization engines) is highly preferred.
Join our dynamic team as a Technology Engineer specializing in DevOps, containerization, and big data technologies. In this pivotal role, you will drive enterprise-level digital and data platform initiatives, ensuring the design, implementation, and optimization of scalable infrastructure and data solutions.

Key Responsibilities

DevOps & CI/CD
- Design, implement, and maintain robust CI/CD pipelines leveraging tools such as Jenkins and GitOps.
- Automate build, deployment, and release processes to enhance operational efficiency and reliability.

Containerization & Orchestration
- Deploy and manage containerized applications utilizing Kubernetes and OpenShift.
- Ensure high availability, scalability, and resilience of applications.

Infrastructure as Code (IaC)
- Develop and manage infrastructure using Terraform, Ansible, or comparable tools.
- Maintain version-controlled infrastructure to promote consistency and scalability.

Big Data Engineering
- Architect and implement data solutions with Hadoop, Spark, and Kafka.
- Manage large-scale data processing and streaming pipelines.

Distributed Systems
- Design and oversee distributed data architectures.
- Optimize data storage and processing performance across various systems.

Collaboration
- Engage closely with engineering, DevOps, and data teams to deliver comprehensive solutions.
- Translate business and technical requirements into scalable implementations.

Monitoring & Performance Optimization
- Implement monitoring, logging, and alerting solutions.
- Continuously enhance system performance, reliability, and cost-efficiency.

Security & Compliance
- Ensure that infrastructure and data platforms adhere to security best practices.
- Maintain compliance with enterprise and regulatory standards.
Join our dynamic team at minderacraft as a Senior Data Engineer, where your expertise will be pivotal in shaping our data infrastructure. We are seeking a highly skilled individual with a deep understanding of big data technologies, ETL/ELT processes, and data modeling methodologies. Your primary focus will be to design, optimize, and maintain robust data pipelines, ensuring the integrity of our data and supporting our analytics initiatives.
About Us
At Arcadia, we are at the forefront of empowering energy innovators and consumers in combating the climate crisis. Our cutting-edge software and APIs are transforming an industry constrained by outdated systems, granting unprecedented access to the data and clean energy necessary for a decarbonized energy grid.

Since our inception in 2014, we have been dedicated to dismantling the fossil fuel monopoly by breaking down institutional barriers that hinder decarbonization. To date, we've connected hundreds of thousands of consumers and small businesses with premium clean energy options. Today, we are expanding our vision even further with the launch of Arc, a groundbreaking SaaS platform that enables developers and energy innovators to create tailored energy experiences, accelerating the transition from traditional energy systems to a digitized network.

We believe that solving one of the world's most significant challenges requires innovative thinking and diverse perspectives. We are building a team of individuals from various backgrounds, industries, and educational experiences. If you share our commitment to ushering in the era of clean energy, we would love to see what unique qualities you can bring to Arcadia! Visit us at www.arcadia.com.

Position Overview
The Custom Data Integration (CDI) team is the operational backbone that allows Arcadia to efficiently ingest and standardize utility data, surpassing the limitations of our core product and automation capabilities. This team addresses non-standard customer requirements, including integrations with sustainability platforms (e.g., Envizi, ESPM, NZC), custom data deliveries, reports, and the processing of various bill formats through specialized internal workflows.

As the Team Lead for CDI, you will have complete ownership of the Chennai CDI pod, overseeing throughput, quality, team capabilities, and operational alignment with Product, Engineering, Customer Success, Project Management, and Data Engineering teams in both India and the US. Your responsibility includes transforming complex, unstructured requirements into standardized, audit-ready outputs that facilitate customer sustainability reporting, billing audit workflows, and compliance needs.

This is a working-leadership role that combines hands-on project management with team leadership. You will personally oversee the most intricate projects from start to finish while guiding Data Analysts and Senior Data Analysts through simpler projects under your mentorship. Additionally, you will design SOPs, onboarding processes, QA frameworks, and delivery standards for both one-off and recurring projects, consistently identifying opportunities for automation to minimize manual interventions.
Full-time | On-site | Bengaluru, Karnataka, India; Chennai, Tamil Nadu, India
About Zuora
Zuora helps businesses adapt and grow by offering solutions that support modern revenue models. The Zuora platform handles everything from subscription services and usage-based pricing to AI-powered offerings, making it easier for companies to introduce new products, manage complex billing, and build steady, recurring revenue streams. With over ten years of experience leading the Subscription Economy, Zuora continues to evolve its quote-to-cash platform. The company focuses on giving organizations a flexible, AI-ready foundation to monetize products and services.

Role Overview: Senior Data Scientist
Zuora is looking for a Senior Data Scientist in Bengaluru or Chennai to help shape AI and machine learning solutions that drive real business outcomes. This role suits someone who combines strong technical expertise with a strategic approach to solving business challenges.

What You Will Do
- Spot opportunities where AI and machine learning can solve customer and business problems with measurable results
- Work closely with stakeholders to guide decisions using data-driven insights
- Design and deploy machine learning models, including large language models, using both structured and unstructured data
- Partner with engineering teams to build scalable data and ML pipelines
- Take ownership of projects from initial concept through to production, focusing on business impact

Who Thrives Here
This position is a strong fit for those who bring deep technical skills and business awareness, and who enjoy turning complex problems into practical, production-ready AI solutions.
Role Overview
minderacraft is hiring a Data Engineer in Chennai, Tamil Nadu, India. This position focuses on building and improving the company’s data infrastructure using AWS, Snowflake, and dbt. The role centers on designing, developing, and maintaining data pipelines that deliver reliable, high-quality data for business needs.

What You Will Do
- Design and build data pipelines with AWS, Snowflake, and dbt
- Maintain and optimize existing data workflows to ensure timely and accurate data delivery
- Work with teams across the company to understand their data requirements and translate them into technical solutions
- Identify and resolve data issues to improve data quality
- Support and implement data governance practices

What Helps You Succeed
- Hands-on experience with AWS, Snowflake, and dbt
- Strong analytical and problem-solving abilities
- Ability to communicate with both technical and non-technical teams
- Attention to data quality and process improvement
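The pipeline responsibilities in roles like this one usually reduce to a small set of recurring patterns. One of them is the idempotent incremental load: merging a fresh extract into an existing table so that re-running the job never duplicates or regresses data. As a minimal sketch of that idea, with no AWS, Snowflake, or dbt dependency, here is the pattern in plain Python; the function name and sample records are invented for illustration.

```python
def merge_incremental(target, new_rows, key="id", version="updated_at"):
    """Merge new_rows into target keyed by `key`; the row with the
    greater `version` value wins. Re-running with the same input
    leaves the result unchanged (idempotent)."""
    merged = {row[key]: row for row in target}
    for row in new_rows:
        current = merged.get(row[key])
        if current is None or row[version] >= current[version]:
            merged[row[key]] = row
    # Return rows in a stable order for downstream consumers.
    return sorted(merged.values(), key=lambda r: r[key])

# Hypothetical sample data: an existing table and a fresh extract.
target = [
    {"id": 1, "status": "active",  "updated_at": "2024-01-01"},
    {"id": 2, "status": "active",  "updated_at": "2024-01-01"},
]
extract = [
    {"id": 2, "status": "churned", "updated_at": "2024-02-01"},
    {"id": 3, "status": "active",  "updated_at": "2024-02-01"},
]
result = merge_incremental(target, extract)
```

In a real stack this logic would typically live in a dbt incremental model or a Snowflake MERGE statement rather than application code; the sketch only shows the invariant such a pipeline must preserve.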
Join our dynamic Data Team at Mindera, where, as a Senior Data Engineer, you will play a pivotal role in creating the data pipelines and tables that drive our business-critical dashboards, empower self-service analytics, and support advanced machine learning models and real-time data products. Utilizing state-of-the-art tools such as dbt, Spark, and Airflow, you will convert high-volume raw event data into user-friendly, impactful datasets.

You will collaborate cross-functionally with Machine Learning Engineers, Data Scientists, and BI Developers to facilitate data-driven decision-making throughout the organization. Our engineers benefit from a culture of autonomy, innovation, and continuous learning, supported by structured career progression paths and access to training resources.

As a Senior Data Engineer, your responsibilities will include:
- Designing and constructing scalable data pipelines, models, and feature stores to support analytics and machine learning workloads.
- Deploying and managing cloud-native data applications on AWS, leveraging CI/CD pipelines to automate builds, tests, and releases.
- Ensuring the technical quality, performance, and reliability of production-grade data pipelines through robust observability and engineering best practices.
Join our dynamic team at gsstech-group as a Data Engineer specializing in real-time streaming and event-driven architectures. We seek a talented individual who will take charge of creating scalable data pipelines, enhancing streaming systems, and achieving optimal performance in distributed environments.

Key Responsibilities
- Design and implement real-time data streaming pipelines utilizing technologies such as Apache Flink, Kafka, and Java
- Build and sustain event-driven architectures for extensive distributed systems
- Conduct JVM tuning and performance optimization for streaming applications
- Develop and deploy applications utilizing containerization tools (Docker, Kubernetes)
- Utilize the Cloudera platform for data engineering and pipeline orchestration
- Apply robust design patterns while upholding high coding standards
- Troubleshoot and resolve challenges within a distributed systems ecosystem
- Collaborate with DevOps teams to manage CI/CD pipelines (GitHub, Jenkins)
- Operate on Linux-based systems, including configuration and shell scripting
- Enhance data processing through caching mechanisms (e.g., Redis; considered a plus)

Required Skills & Experience
- Extensive hands-on experience in real-time streaming (Flink / Kafka / Java)
- Comprehensive understanding of event-driven architecture
- Experience with JVM performance tuning
- Proficiency in Docker and Kubernetes
- Strong background in Linux OS and shell scripting
- Knowledge of design patterns and scalable system design
- Familiarity with CI/CD tools such as GitHub and Jenkins
- Practical troubleshooting experience in distributed systems

Nice to Have
- Familiarity with Redis or other caching systems
- Exposure to Cloudera Data Platform engineering
- Prior experience in the banking or financial domain
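A core pattern behind the streaming work described in this kind of role is windowed aggregation over an unbounded event stream, which is what frameworks like Flink and Kafka Streams provide at scale. The following is a dependency-free Python sketch of the tumbling-window idea only: events carrying an epoch timestamp are bucketed into fixed, non-overlapping windows and counted per key. The function name and event fields are illustrative, not part of any framework API.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Count events per (window start, key) over fixed, non-overlapping
    windows: the 'tumbling window' model used by Flink and Kafka Streams."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

# Illustrative events as (epoch seconds, event key) pairs.
events = [
    (100, "login"), (104, "login"), (107, "click"),
    (112, "login"), (119, "click"), (121, "login"),
]
counts = tumbling_window_counts(events, window_secs=10)
```

A production pipeline adds what this sketch omits: out-of-order events, watermarks, state checkpointing, and exactly-once delivery, which is precisely where Flink and Kafka earn their place.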
Vultr is expanding its global cloud infrastructure with a new data center in Chennai. The Data Center Operations Engineer will play a central role in preparing this facility for launch and ensuring it meets Vultr’s operational standards from day one. This position acts as the on-site operational lead during the initial ramp-up phase, before permanent staff are in place. The engineer will coordinate readiness activities, make deployment decisions, and serve as Vultr’s technical and operational point of contact at the new site. The role offers significant autonomy and direct influence on how the Chennai data center comes online and supports Vultr’s growth in new markets.

What you will do
- Act as the primary operational owner for the Chennai data center during launch and ramp-up phases
- Evaluate site conditions to apply Vultr’s infrastructure standards and operational procedures
- Plan, coordinate, and sequence deployment activities to bring systems online safely and efficiently
- Track and report on operational performance metrics throughout the launch process

Employee benefits
- Annual medical insurance stipend
- 9 paid company holidays
- Generous leave policy, including a 1-month paid sabbatical every 5 years and an anniversary bonus each year
- Professional development reimbursement
- Fitness membership reimbursement
- Company-sponsored Wellable subscription
ValGenesis builds digital validation platforms for life sciences organizations. Its products support pharmaceutical and biotech companies as they move toward digital processes, maintain regulatory compliance, and ensure manufacturing quality throughout the product lifecycle. Thirty of the top fifty global firms in this industry use ValGenesis solutions. More details about the company's work in paperless validation are available at valgenesis.com/about.

Role overview
The Senior Software Engineer - Data Engineering position is based in Chennai. This role focuses on developing and maintaining data engineering solutions to support ValGenesis platforms. Work will center on building systems that help life sciences clients manage and analyze data for compliance and quality throughout their operations.
We are seeking a talented Senior Data Engineer with extensive knowledge of real-time data streaming and distributed data processing to architect, develop, and enhance state-of-the-art data platforms. This pivotal role is essential for advancing event-driven architecture and real-time analytics within critical banking systems, particularly in the risk and compliance domains.

In this position, you will work synergistically with data architects, platform engineers, and business stakeholders to create low-latency, high-throughput data pipelines that empower sophisticated analytics and informed decision-making.

Key Responsibilities
- Design, develop, and maintain robust real-time streaming pipelines utilizing Apache Kafka, PySpark, and Flink
- Construct scalable and fault-tolerant event-driven data architectures
- Handle high-volume streaming data, ensuring low latency and high reliability
- Integrate diverse data sources into centralized data platforms (Data Lake / Lakehouse)
- Enhance data pipelines for performance, scalability, and cost-effectiveness
- Uphold data quality, governance, and compliance in line with banking regulations
- Collaborate with cross-functional teams to convert business needs into technical solutions
- Monitor and debug streaming jobs and production pipelines

Required Skills & Experience
- 5+ years of experience in Data Engineering
- Demonstrated proficiency in PySpark / Spark Streaming, Apache Kafka (producers, consumers, Kafka Streams), and Apache Flink or other real-time processing frameworks
- Proven experience in building real-time / near real-time data pipelines
- Strong understanding of distributed systems and event-driven architecture
- Proficiency in Python / Java / Scala
- Experience with data lakes, ETL/ELT pipelines, and big data ecosystems
- Familiarity with cloud platforms (AWS / Azure / GCP) is advantageous
- Knowledge of banking, risk, or compliance data systems is highly preferred

Preferred Qualifications
- Experience in the financial services or banking domain
- Exposure to data governance, regulatory reporting, or compliance systems
- Understanding of CI/CD pipelines and DevOps practices for data platforms
We are on the lookout for an exceptional Senior Data Engineer to architect, construct, and sustain scalable data pipelines for our enterprise-level data platforms focused on the Risk & Compliance sector. The successful candidate will possess a deep understanding of PySpark, Python, and data engineering best practices, emphasizing data quality, governance, and security.

Key Responsibilities
- Design, develop, and enhance scalable data pipelines leveraging PySpark and Python
- Create robust ETL/ELT workflows to manage substantial volumes of both structured and unstructured data
- Collaborate with data scientists, analysts, and business stakeholders to produce high-quality datasets
- Guarantee data integrity, accuracy, and reliability through comprehensive validation frameworks and monitoring
- Implement data security and access control mechanisms that align with compliance standards
- Partner closely with Risk & Compliance teams to fulfill regulatory and reporting obligations
- Optimize the performance of data processing jobs and queries
- Maintain and upgrade existing data architecture and pipelines

Required Skills & Experience
- 6+ years of experience in Data Engineering
- Extensive hands-on experience with PySpark and Python
- Solid background in SQL and Oracle databases
- Proven experience in constructing and managing large-scale data pipelines
- Strong understanding of data warehousing concepts and ETL frameworks
- Experience with data validation, data quality, and governance frameworks
- Familiarity with cloud platforms (AWS/Azure/GCP) is a plus
- Experience in the banking, financial services, or risk & compliance domain is preferred

Key Competencies
- Strong analytical and problem-solving skills
- Adept at working in a fast-paced, collaborative environment
- Excellent communication and stakeholder management abilities
- Meticulous attention to detail, especially regarding data quality and security

Nice to Have
- Experience with Big Data ecosystems (Hadoop, Spark)
- Knowledge of data security and regulatory compliance frameworks
- Prior experience with enterprise data platforms
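The validation and monitoring duties that recur across these risk-and-compliance roles share one underlying pattern: run each record through a set of named checks and route failures aside with a reason, so they can be reported rather than silently dropped or allowed to corrupt downstream tables. Here is a minimal, framework-free Python sketch of that pattern; the check names and record shape are invented for illustration, and a production version would typically run as PySpark transformations.

```python
def validate(records, checks):
    """Split records into (valid, rejected) according to named checks.
    Each rejected record is paired with the names of the checks it
    failed, so failures can be audited instead of silently dropped."""
    valid, rejected = [], []
    for rec in records:
        failures = [name for name, check in checks.items() if not check(rec)]
        if failures:
            rejected.append((rec, failures))
        else:
            valid.append(rec)
    return valid, rejected

# Hypothetical checks for a payments-style feed.
checks = {
    "has_account": lambda r: bool(r.get("account_id")),
    "positive_amount": lambda r: r.get("amount", 0) > 0,
}
records = [
    {"account_id": "A1", "amount": 250.0},
    {"account_id": "",   "amount": 99.0},
    {"account_id": "A3", "amount": -5.0},
]
valid, rejected = validate(records, checks)
```

Keeping the checks as named, data-driven rules (rather than inline conditionals) is what makes the rejection report auditable, which matters when the downstream consumer is a regulatory filing.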
Join our dynamic team as a Senior AI Data Scientist, where you will leverage your expertise in artificial intelligence and data science to drive innovative solutions for our clients. You will be responsible for developing advanced machine learning models, conducting data analysis, and collaborating with cross-functional teams to implement AI-driven strategies. This role is ideal for a passionate and skilled data scientist eager to make a significant impact.
Speechify’s mission is to remove reading barriers from learning. With a user base of over 50 million, our products convert materials like PDFs, books, Google Docs, news articles, and websites into audio. People use Speechify to read faster, understand more, and remember what matters. Our platform includes apps for iOS, Android, Mac, Chrome, and the web. Recent recognition includes Chrome Extension of the Year from Google and Apple’s 2025 Design Award for Inclusivity. Our distributed team includes nearly 200 professionals: frontend and backend engineers, AI researchers, and others from companies such as Amazon, Microsoft, and Google, as well as alumni of top PhD programs and founders of startups like Stripe and Vercel.

Role Overview
Speechify is hiring a Software Engineer to join the AI team’s data division in Chennai, India. This role centers on data collection for model training. The team’s focus: building high-quality datasets at petabyte scale while keeping costs low through a blend of infrastructure, engineering, and research.

What You Will Do
- Find and integrate new audio data sources into the ingestion pipeline.
- Manage and improve cloud infrastructure for data ingestion, using GCP and Terraform.
- Work with scientists to push the boundaries of cost, throughput, and data quality for next-generation models.
- Collaborate with AI team members and leadership to shape the dataset roadmap for future consumer and enterprise products.

Qualifications
- BS, MS, or PhD in Computer Science or a related discipline.
- At least 5 years of software development experience.
- Strong skills in bash and Python scripting on Linux.
- Expertise with Docker and Infrastructure-as-Code, plus hands-on work with at least one major cloud provider (GCP preferred).
- Background with web crawlers and large-scale data processing is a plus.
- Comfort with multitasking and shifting priorities.
- Clear written and verbal communication skills.
About Gen Digital Inc.
Gen Digital Inc. brings together trusted consumer brands like Norton, Avast, LifeLock, and MoneyLion, serving nearly 500 million users in over 150 countries. The company’s mission centers on digital freedom, cybersecurity, online privacy, identity protection, and financial wellness. Gen’s legacy is built on helping people secure and manage their digital and financial lives. Employees at Gen benefit from flexible work options, comprehensive support, and resources to help them succeed. The company values open communication, experimentation, and continuous learning, and actively welcomes diverse backgrounds and perspectives. Gen offers competitive pay, benefits, and wellness programs to support work-life balance.

Role Overview: Senior Data Platform Engineer
Location: Chennai, India

The Senior Data Platform Engineer joins the Data Platform Operations team, focusing on the daily management and monitoring of Gen’s data platforms and pipelines. This role is key to ensuring smooth operations, ongoing improvements, and reliable maintenance of data infrastructure.

What You Will Do
- Oversee daily operations and health of data platforms and pipelines
- Drive enhancements for operational efficiency and platform maintenance
- Work closely with data engineers, analysts, and platform teams
- Maintain platform stability, observability, and cost-effectiveness
- Support readiness for new data use cases and business needs

What Gen Values
- Customer focus and a collaborative approach
- Openness to new ideas and continuous improvement
- Respect for diverse experiences and backgrounds
- Supportive teamwork and recognition of individual strengths

If the mission and values at Gen resonate, consider exploring a career with the team in Chennai.
About ValGenesis
ValGenesis stands at the forefront of digital validation solutions for the life sciences sector. Our comprehensive suite of products empowers 30 of the top 50 global pharmaceutical and biotech firms to embrace digital transformation, ensure total compliance, and achieve excellence in manufacturing intelligence across their product lifecycle.

Discover the opportunity to be part of ValGenesis, the leading standard for paperless validation in the Life Sciences industry: valgenesis.com/about

About the Role
As a Senior Software Engineer in Data Engineering, you will play a pivotal role in developing scalable data solutions that support our innovative products.
AECOM, a global leader in infrastructure and environmental services, is seeking a Principal Engineer specializing in Structures for our Data Center projects. In this role, you will leverage your expertise to design and analyze innovative structural solutions that meet project specifications and regulatory requirements. Your responsibilities will include collaborating with multidisciplinary teams, overseeing project execution, and ensuring that all engineering practices align with AECOM's standards of quality and safety.
Join our innovative team at Moving Walls India Pvt Ltd as a Data Engineer, where you will play a crucial role in driving data solutions and analytics to support our business objectives. We are looking for a passionate individual who thrives in a fast-paced environment and is eager to tackle complex data challenges.
About Gen Digital Gen Digital is a global company focused on digital freedom and security. Our brands include Norton, Avast, LifeLock, and MoneyLion, serving nearly 500 million users in over 150 countries. We provide cybersecurity, online privacy, identity protection, and financial wellness products. Our mission centers on helping people manage and secure their digital and financial lives. We value diverse experiences and ideas, and we see AI as a partner for innovation. Gen Digital encourages autonomy, supports career growth, and offers flexible work options, generous time off, competitive pay, and wellness programs. The company culture emphasizes customer satisfaction, open discussion, experimentation, and continuous learning. Team members collaborate in an environment that respects and values differences as strengths. Senior Staff Data Engineer – Role Overview The Senior Staff Data Engineer will serve as a senior technical leader within the organization. This role focuses on designing and implementing large-scale data solutions that support Gen Digital’s cybersecurity platform strategy. The position combines deep technical skill with organizational influence. Key responsibilities include: Designing complex data architectures for enterprise-scale needs Implementing solutions that support a multi-petabyte data infrastructure Mentoring and guiding engineering teams Shaping the technical vision for data systems serving millions of users Location Chennai, India
About Us:BigID is a pioneering tech startup specializing in cutting-edge solutions for data security, compliance, privacy, and AI data management. We are at the forefront of the data landscape, empowering our customers to mitigate risks, foster business innovation, achieve compliance, build trust, make informed decisions, and maximize the value of their data.We are committed to building a global team united by a passion for innovation and advanced technology. BigID has received numerous accolades, including:Named a Hot Company in Artificial Intelligence and Machine Learning at the Global InfoSec AwardsListed in Citizens JMP Cyber 66 as one of the Hottest Privately Held Cybersecurity CompaniesCRN 100 list recognizes BigID as one of the 20 Coolest Identity Access Management and Data Protection Companies for three consecutive yearsRanked among the DUNS 100 Best Tech Companies to Work forFeatured as a Top 3 Big Data and AI Vendor to Watch in the 2023 BigDATAwire Readers' and Editors' Choice AwardsIncluded in the 2024 Inc. 5000 list for the fourth consecutive year!Shortlisted for the 2024 AI Awards in the Best Use of AI in Cybersecurity categoryAt BigID, our team is the cornerstone of our success. Join our dynamic, people-centric culture where you’ll have the opportunity to collaborate with some of the most talented professionals in the industry who prioritize innovation, diversity, integrity, and teamwork.Who We Are Looking For:We are on the hunt for a Senior Data Platform Engineer to enhance our Data Platform team. The ideal candidate will possess substantial experience in data engineering, particularly with Kafka and Elasticsearch, to design and maintain our robust data platforms. 
You will collaborate closely with cross-functional teams to ensure the scalability and reliability of our data solutions.

Role Overview
As a Senior Data Platform Engineer, you will be instrumental in the design, development, maintenance, troubleshooting, and implementation of our big data architecture. Your proficiency in Elastic, Kafka, and Node.js will play a vital role in ensuring the scalability and performance of our data systems.

Key Responsibilities
- Develop data processing pipelines utilizing Kafka for real-time data streaming.
- Enhance and manage search functionalities leveraging Elastic technologies.
- Work alongside product managers, data analysts, and stakeholders to gather requirements and translate them into technical specifications.
- Lead code reviews and promote best practices in coding and data handling.
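The consume-transform-index loop behind a Kafka-to-Elastic pipeline like the one above can be modeled in plain Python. This is an illustrative sketch only: a real deployment would use a Kafka consumer client and the Elasticsearch bulk API, and every name here (the `transform` schema, the stand-in stream and index objects) is an assumption, not BigID's actual design.

```python
def transform(event):
    """Normalize a raw event before indexing (hypothetical schema)."""
    return {"id": event["id"], "text": event["text"].strip().lower()}

def run_pipeline(stream, index, batch_size=2):
    """Consume events in micro-batches and index them.

    `stream` stands in for a Kafka topic and `index` for an Elastic
    index; both are plain Python objects in this sketch.
    """
    buffer = []
    for event in stream:
        buffer.append(transform(event))
        if len(buffer) >= batch_size:
            for doc in buffer:  # stand-in for one bulk index request
                index[doc["id"]] = doc
            buffer.clear()
    for doc in buffer:  # flush the final partial batch
        index[doc["id"]] = doc

events = [{"id": 1, "text": " Hello "},
          {"id": 2, "text": "World"},
          {"id": 3, "text": "Kafka "}]
index = {}
run_pipeline(events, index)
```

Batching amortizes per-request overhead, which is why real pipelines favor bulk indexing over one write per event.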
At PDI Technologies, we empower many of the world's leading convenience retail and petroleum brands through innovative technology solutions that foster growth and enhance operational efficiency. By connecting convenience globally, we enable businesses to boost productivity, make informed decisions, and engage customers rapidly through loyalty programs, shopper insights, and unparalleled real-time market intelligence via mobile applications like GasBuddy. We are a global team dedicated to excellence, collaboration, and delivering real impact. Join us and be part of a company that prioritizes diversity, integrity, and growth.

Role Overview
PDI Technologies is in search of a Product Owner specializing in Data to enhance the data capabilities of our loyalty solutions. This role is perfect for a passionate Product Owner with experience in driving data-centric products and enabling teams that deliver scalable data services, analytics, reporting capabilities, and data pipelines. As the Product Owner for the Loyalty Data team, you will collaborate with engineering, Product Managers, analysts, customers, and business stakeholders to define and implement high-impact data features that enhance consumer-facing loyalty experiences, operational reporting, segmentation, personalization, and downstream integrations. You will be responsible for driving product requirements related to data ingestion, data modeling, data quality, APIs, event streams, and insights, ensuring that the Loyalty Platform remains robust, reliable, and optimized for data-driven business outcomes. Prior experience with data platforms, data engineering workflows, analytics tools, or consumer data systems (CRM, loyalty, personalization engines) is highly preferred.
Join our dynamic team as a Technology Engineer specializing in DevOps, containerization, and big data technologies. In this pivotal role, you will drive enterprise-level digital and data platform initiatives, ensuring the design, implementation, and optimization of scalable infrastructure and data solutions.

Key Responsibilities

DevOps & CI/CD
- Design, implement, and maintain robust CI/CD pipelines leveraging tools such as Jenkins and GitOps.
- Automate build, deployment, and release processes to enhance operational efficiency and reliability.

Containerization & Orchestration
- Deploy and manage containerized applications utilizing Kubernetes and OpenShift.
- Ensure high availability, scalability, and resilience of applications.

Infrastructure as Code (IaC)
- Develop and manage infrastructure using Terraform, Ansible, or comparable tools.
- Maintain version-controlled infrastructure to promote consistency and scalability.

Big Data Engineering
- Architect and implement data solutions with Hadoop, Spark, and Kafka.
- Manage large-scale data processing and streaming pipelines.

Distributed Systems
- Design and oversee distributed data architectures.
- Optimize data storage and processing performance across various systems.

Collaboration
- Engage closely with engineering, DevOps, and data teams to deliver comprehensive solutions.
- Translate business and technical requirements into scalable implementations.

Monitoring & Performance Optimization
- Implement monitoring, logging, and alerting solutions.
- Continuously enhance system performance, reliability, and cost-efficiency.

Security & Compliance
- Ensure that infrastructure and data platforms adhere to security best practices.
- Maintain compliance with enterprise and regulatory standards.
Join our dynamic team at minderacraft as a Senior Data Engineer, where your expertise will be pivotal in shaping our data infrastructure. We are seeking a highly skilled individual with a deep understanding of big data technologies, ETL/ELT processes, and data modeling methodologies. Your primary focus will be to design, optimize, and maintain robust data pipelines, ensuring the integrity of our data and supporting our analytics initiatives.
About Us
At Arcadia, we are at the forefront of empowering energy innovators and consumers in combating the climate crisis. Our cutting-edge software and APIs are transforming an industry constrained by outdated systems, granting unprecedented access to the data and clean energy necessary for a decarbonized energy grid.

Since our inception in 2014, we have been dedicated to dismantling the fossil fuel monopoly by breaking down institutional barriers that hinder decarbonization. To date, we've connected hundreds of thousands of consumers and small businesses with premium clean energy options. Today, we are expanding our vision even further with the launch of Arc, a groundbreaking SaaS platform that enables developers and energy innovators to create tailored energy experiences, accelerating the transition from traditional energy systems to a digitized network.

We believe that solving one of the world's most significant challenges requires innovative thinking and diverse perspectives. We are building a team of individuals from various backgrounds, industries, and educational experiences. If you share our commitment to ushering in the era of clean energy, we would love to see what unique qualities you can bring to Arcadia! Visit us at www.arcadia.com.

Position Overview
The Custom Data Integration (CDI) team is the operational backbone that allows Arcadia to efficiently ingest and standardize utility data, surpassing the limitations of our core product and automation capabilities.
This team addresses non-standard customer requirements, including integrations with sustainability platforms (e.g., Envizi, ESPM, NZC), custom data deliveries, reports, and the processing of various bill formats through specialized internal workflows.

As the Team Lead for CDI, you will have complete ownership of the Chennai CDI pod, overseeing throughput, quality, team capabilities, and operational alignment with Product, Engineering, Customer Success, Project Management, and Data Engineering teams in both India and the US. Your responsibility includes transforming complex, unstructured requirements into standardized, audit-ready outputs that facilitate customer sustainability reporting, billing audit workflows, and compliance needs.

This is a working-leadership role that combines hands-on project management with team leadership. You will personally oversee the most intricate projects from start to finish while guiding Data Analysts and Senior Data Analysts through simpler projects under your mentorship. Additionally, you will design SOPs, onboarding processes, QA frameworks, and delivery standards for both one-off and recurring projects, consistently identifying opportunities for automation to minimize manual interventions.
Full-time | On-site | Bengaluru, Karnataka, India; Chennai, Tamil Nadu, India
About Zuora
Zuora helps businesses adapt and grow by offering solutions that support modern revenue models. The Zuora platform handles everything from subscription services and usage-based pricing to AI-powered offerings, making it easier for companies to introduce new products, manage complex billing, and build steady, recurring revenue streams. With over ten years of experience leading the Subscription Economy, Zuora continues to evolve its quote-to-cash platform. The company focuses on giving organizations a flexible, AI-ready foundation to monetize products and services.

Role Overview: Senior Data Scientist
Zuora is looking for a Senior Data Scientist in Bengaluru or Chennai to help shape AI and machine learning solutions that drive real business outcomes. This role suits someone who combines strong technical expertise with a strategic approach to solving business challenges.

What You Will Do
- Spot opportunities where AI and machine learning can solve customer and business problems with measurable results
- Work closely with stakeholders to guide decisions using data-driven insights
- Design and deploy machine learning models, including large language models, using both structured and unstructured data
- Partner with engineering teams to build scalable data and ML pipelines
- Take ownership of projects from initial concept through to production, focusing on business impact

Who Thrives Here
This position is a strong fit for those who bring deep technical skills and business awareness, and who enjoy turning complex problems into practical, production-ready AI solutions.
Role Overview
minderacraft is hiring a Data Engineer in Chennai, Tamil Nadu, India. This position focuses on building and improving the company's data infrastructure using AWS, Snowflake, and dbt. The role centers on designing, developing, and maintaining data pipelines that deliver reliable, high-quality data for business needs.

What You Will Do
- Design and build data pipelines with AWS, Snowflake, and dbt
- Maintain and optimize existing data workflows to ensure timely and accurate data delivery
- Work with teams across the company to understand their data requirements and translate them into technical solutions
- Identify and resolve data issues to improve data quality
- Support and implement data governance practices

What Helps You Succeed
- Hands-on experience with AWS, Snowflake, and dbt
- Strong analytical and problem-solving abilities
- Ability to communicate with both technical and non-technical teams
- Attention to data quality and process improvement

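The data-quality work this role describes is usually codified as simple column-level checks; dbt, for instance, ships generic `not_null` and `unique` tests that compile to SQL. The pure-Python sketch below models the same two checks on in-memory rows. The table and column names are assumptions for illustration, not anything from the posting.

```python
def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Hypothetical staging table with one null amount and a duplicated key.
orders = [{"order_id": 1, "amount": 10.0},
          {"order_id": 2, "amount": None},
          {"order_id": 2, "amount": 5.0}]

null_violations = check_not_null(orders, "amount")
dupe_violations = check_unique(orders, "order_id")
```

In a dbt project the same intent would live in a schema YAML file and run against Snowflake directly; the point of the sketch is only the shape of the checks.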
Join our dynamic Data Team at Mindera, where as a Senior Data Engineer, you will play a pivotal role in creating the data pipelines and tables that drive our business-critical dashboards, empower self-service analytics, and support advanced machine learning models and real-time data products. Utilizing state-of-the-art tools such as DBT, Spark, and Airflow, you will convert high-volume raw event data into user-friendly, impactful datasets.

You will collaborate cross-functionally with Machine Learning Engineers, Data Scientists, and BI Developers to facilitate data-driven decision-making throughout the organization. Our engineers benefit from a culture of autonomy, innovation, and continuous learning, supported by structured career progression paths and access to training resources.

As a Senior Data Engineer, your responsibilities will include:
- Designing and constructing scalable data pipelines, models, and feature stores to support analytics and machine learning workloads.
- Deploying and managing cloud-native data applications on AWS, leveraging CI/CD pipelines to automate builds, tests, and releases.
- Ensuring the technical quality, performance, and reliability of production-grade data pipelines through robust observability and engineering best practices.
Join our dynamic team at gsstech-group as a Data Engineer specializing in real-time streaming and event-driven architectures. We seek a talented individual who will take charge of creating scalable data pipelines, enhancing streaming systems, and achieving optimal performance in distributed environments.

Key Responsibilities
- Design and implement real-time data streaming pipelines utilizing technologies such as Apache Flink, Kafka, and Java
- Build and sustain event-driven architectures for extensive distributed systems
- Conduct JVM tuning and performance optimization for streaming applications
- Develop and deploy applications utilizing containerization tools (Docker, Kubernetes)
- Utilize the Cloudera platform for data engineering and pipeline orchestration
- Apply robust design patterns while upholding high coding standards
- Troubleshoot and resolve challenges within a distributed systems ecosystem
- Collaborate with DevOps teams to manage CI/CD pipelines (GitHub, Jenkins)
- Operate on Linux-based systems, including configuration and shell scripting
- Enhance data processing through caching mechanisms (e.g., Redis, considered a plus)

Required Skills & Experience
- Extensive hands-on experience in Real-Time Streaming (Flink / Kafka / Java)
- Comprehensive understanding of event-driven architecture
- Experience with JVM performance tuning
- Proficiency in Docker and Kubernetes
- Strong background in Linux OS and shell scripting
- Knowledge of design patterns and scalable system design
- Familiarity with CI/CD tools such as GitHub and Jenkins
- Practical troubleshooting experience in distributed systems

Nice to Have
- Familiarity with Redis or other caching systems
- Exposure to Cloudera Data Platform engineering
- Prior experience in the banking or financial domain
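A core pattern behind the Flink work named above is windowed aggregation over a stream. The sketch below models a tumbling (fixed, non-overlapping) window without any streaming framework, so the bucketing arithmetic is visible; the event timestamps, keys, and window size are invented for illustration.

```python
def tumbling_window_counts(events, window_seconds):
    """Count events per key inside fixed, non-overlapping time windows.

    Each event is a (timestamp_seconds, key) pair. A timestamp lands in
    the window starting at floor(ts / window_seconds) * window_seconds,
    which is how tumbling windows assign events in frameworks like Flink.
    """
    windows = {}
    for ts, key in events:
        window_start = (ts // window_seconds) * window_seconds
        bucket = windows.setdefault(window_start, {})
        bucket[key] = bucket.get(key, 0) + 1
    return windows

# Hypothetical click-stream: windows of 5 seconds start at t=0, 5, 10.
events = [(0, "login"), (3, "login"), (7, "click"), (12, "login")]
counts = tumbling_window_counts(events, window_seconds=5)
```

A real Flink job adds what this sketch omits: event-time watermarks, late-data handling, and checkpointed state, which is where most of the tuning effort in such roles goes.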
Vultr is expanding its global cloud infrastructure with a new data center in Chennai. The Data Center Operations Engineer will play a central role in preparing this facility for launch and ensuring it meets Vultr's operational standards from day one. This position acts as the on-site operational lead during the initial ramp-up phase, before permanent staff are in place. The engineer will coordinate readiness activities, make deployment decisions, and serve as Vultr's technical and operational point of contact at the new site. The role offers significant autonomy and direct influence on how the Chennai data center comes online and supports Vultr's growth in new markets.

What you will do
- Act as the primary operational owner for the Chennai data center during launch and ramp-up phases
- Evaluate site conditions to apply Vultr's infrastructure standards and operational procedures
- Plan, coordinate, and sequence deployment activities to bring systems online safely and efficiently
- Track and report on operational performance metrics throughout the launch process

Employee benefits
- Annual medical insurance stipend
- 9 paid company holidays
- Generous leave policy, including a 1-month paid sabbatical every 5 years and an anniversary bonus each year
- Professional development reimbursement
- Fitness membership reimbursement
- Company-sponsored Wellable subscription
ValGenesis builds digital validation platforms for life sciences organizations. Its products support pharmaceutical and biotech companies as they move toward digital processes, maintain regulatory compliance, and ensure manufacturing quality throughout the product lifecycle. Thirty of the top fifty global firms in this industry use ValGenesis solutions. More details about the company's work in paperless validation are available at valgenesis.com/about. Role overview The Senior Software Engineer - Data Engineering position is based in Chennai. This role focuses on developing and maintaining data engineering solutions to support ValGenesis platforms. Work will center on building systems that help life sciences clients manage and analyze data for compliance and quality throughout their operations.
We are seeking a talented Senior Data Engineer with extensive knowledge of real-time data streaming and distributed data processing to architect, develop, and enhance state-of-the-art data platforms. This pivotal role is essential for advancing event-driven architecture and real-time analytics within critical banking systems, particularly in risk and compliance domains. In this position, you will work synergistically with data architects, platform engineers, and business stakeholders to create low-latency, high-throughput data pipelines that empower sophisticated analytics and informed decision-making.

Key Responsibilities
- Design, develop, and maintain robust real-time streaming pipelines utilizing Apache Kafka, PySpark, and Flink
- Construct scalable and fault-tolerant event-driven data architectures
- Handle high-volume streaming data ensuring low latency and high reliability
- Integrate diverse data sources into centralized data platforms (Data Lake / Lakehouse)
- Enhance data pipelines for performance, scalability, and cost-effectiveness
- Uphold data quality, governance, and compliance in line with banking regulations
- Collaborate with cross-functional teams to convert business needs into technical solutions
- Monitor and debug streaming jobs and production pipelines

Required Skills & Experience
- 5+ years of experience in Data Engineering
- Demonstrated proficiency in PySpark / Spark Streaming, Apache Kafka (Producers, Consumers, Kafka Streams), and Apache Flink or other real-time processing frameworks
- Proven experience in building real-time / near real-time data pipelines
- Strong understanding of distributed systems and event-driven architecture
- Proficiency in Python / Java / Scala
- Experience with data lakes, ETL/ELT pipelines, and big data ecosystems
- Familiarity with cloud platforms (AWS / Azure / GCP) is advantageous
- Knowledge of banking, risk, or compliance data systems is highly preferred

Preferred Qualifications
- Experience in the financial services or banking domain
- Exposure to data governance, regulatory reporting, or compliance systems
- Understanding of CI/CD pipelines and DevOps practices for data platforms
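Pipelines like the ones described above typically have to tolerate replayed events, since Kafka's default delivery guarantee is at-least-once. A common answer is an idempotent consumer that tracks processed event IDs. The sketch below models that pattern with plain Python objects; in a real system the `seen_ids` set would be a durable store (for example a key-value database or compacted topic), and all names here are assumptions.

```python
def process_events(events, seen_ids, sink):
    """Apply each event at most once by tracking processed event IDs.

    A redelivered event (same "id") is skipped, so downstream state in
    `sink` stays correct even when the broker replays messages.
    """
    for event in events:
        if event["id"] in seen_ids:
            continue  # replayed duplicate: already applied, skip it
        seen_ids.add(event["id"])
        sink.append(event["value"])

sink, seen = [], set()
process_events([{"id": "a", "value": 1},
                {"id": "b", "value": 2},
                {"id": "a", "value": 1}],  # duplicate delivery of "a"
               seen, sink)
```

For a consumer to be safe, recording the ID and applying the effect must commit together; if they can diverge (e.g., a crash between the two), duplicates or losses reappear, which is why production systems put both in one transaction.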
We are on the lookout for an exceptional Senior Data Engineer to architect, construct, and sustain scalable data pipelines for our enterprise-level data platforms focused on the Risk & Compliance sector. The successful candidate will possess a deep understanding of PySpark, Python, and data engineering best practices, emphasizing data quality, governance, and security.

Key Responsibilities
- Design, develop, and enhance scalable data pipelines leveraging PySpark and Python
- Create robust ETL/ELT workflows to manage substantial volumes of both structured and unstructured data
- Collaborate with data scientists, analysts, and business stakeholders to produce high-quality datasets
- Guarantee data integrity, accuracy, and reliability through comprehensive validation frameworks and monitoring
- Implement data security and access control mechanisms that align with compliance standards
- Partner closely with Risk & Compliance teams to fulfill regulatory and reporting obligations
- Optimize the performance of data processing jobs and queries
- Maintain and upgrade existing data architecture and pipelines

Required Skills & Experience
- 6+ years of experience in Data Engineering
- Extensive hands-on experience with PySpark and Python
- Solid background in SQL and Oracle databases
- Proven experience in constructing and managing large-scale data pipelines
- Strong understanding of data warehousing concepts and ETL frameworks
- Experience with data validation, data quality, and governance frameworks
- Familiarity with cloud platforms (AWS/Azure/GCP) is a plus
- Experience in the banking, financial services, or risk & compliance domain is preferred

Key Competencies
- Strong analytical and problem-solving skills
- Adept at working in a fast-paced, collaborative environment
- Excellent communication and stakeholder management abilities
- Meticulous attention to detail, especially regarding data quality and security

Nice to Have
- Experience with Big Data ecosystems (Hadoop, Spark)
- Knowledge of data security and regulatory compliance frameworks
- Prior experience with enterprise data platforms
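The validation frameworks this posting emphasizes usually start with a declarative schema check applied to every record before it enters the warehouse. The sketch below shows one minimal shape for such a check in plain Python; the field names, types, and required flags are invented for illustration, and a PySpark pipeline would express the same rules as DataFrame schema and filter operations instead.

```python
def validate_row(row, schema):
    """Return human-readable errors for one record.

    `schema` maps field name -> (expected type, required flag). Missing
    optional fields pass; missing required fields and type mismatches
    are reported so bad records can be quarantined, not silently loaded.
    """
    errors = []
    for field, (expected_type, required) in schema.items():
        if field not in row or row[field] is None:
            if required:
                errors.append(f"{field}: missing required field")
            continue
        if not isinstance(row[field], expected_type):
            errors.append(f"{field}: expected {expected_type.__name__}")
    return errors

# Hypothetical trade record schema for a risk feed.
schema = {"trade_id": (str, True),
          "notional": (float, True),
          "desk": (str, False)}

good = validate_row({"trade_id": "T1", "notional": 100.0}, schema)
bad = validate_row({"trade_id": "T2", "notional": "100"}, schema)
```

Collecting all errors per record, rather than failing on the first, is what makes the resulting rejection reports useful for the monitoring and audit obligations the role mentions.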
Join our dynamic team as a Senior AI Data Scientist, where you will leverage your expertise in artificial intelligence and data science to drive innovative solutions for our clients. You will be responsible for developing advanced machine learning models, conducting data analysis, and collaborating with cross-functional teams to implement AI-driven strategies. This role is ideal for a passionate and skilled data scientist eager to make a significant impact.
Speechify's mission is to remove reading barriers from learning. With a user base of over 50 million, our products convert materials like PDFs, books, Google Docs, news articles, and websites into audio. People use Speechify to read faster, understand more, and remember what matters. Our platform includes apps for iOS, Android, Mac, Chrome, and the web. Recent recognition includes Chrome Extension of the Year from Google and Apple's 2025 Design Award for Inclusivity. Our distributed team includes nearly 200 professionals: frontend and backend engineers, AI researchers, and others from companies such as Amazon, Microsoft, and Google, as well as alumni of top PhD programs and founders of startups like Stripe and Vercel.

Role Overview
Speechify is hiring a Software Engineer to join the AI team's data division in Chennai, India. This role centers on data collection for model training. The team's focus: building high-quality datasets at petabyte scale while keeping costs low through a blend of infrastructure, engineering, and research.

What You Will Do
- Find and integrate new audio data sources into the ingestion pipeline.
- Manage and improve cloud infrastructure for data ingestion, using GCP and Terraform.
- Work with scientists to push the boundaries of cost, throughput, and data quality for next-generation models.
- Collaborate with AI team members and leadership to shape the dataset roadmap for future consumer and enterprise products.

Qualifications
- BS, MS, or PhD in Computer Science or a related discipline.
- At least 5 years of software development experience.
- Strong skills in bash and Python scripting on Linux.
- Expertise with Docker and Infrastructure-as-Code, plus hands-on work with at least one major cloud provider (GCP preferred).
- Background with web crawlers and large-scale data processing is a plus.
- Comfort with multitasking and shifting priorities.
- Clear written and verbal communication skills.
About Gen Digital Inc.
Gen Digital Inc. brings together trusted consumer brands like Norton, Avast, LifeLock, and MoneyLion, serving nearly 500 million users in over 150 countries. The company's mission centers on digital freedom, cybersecurity, online privacy, identity protection, and financial wellness. Gen's legacy is built on helping people secure and manage their digital and financial lives. Employees at Gen benefit from flexible work options, comprehensive support, and resources to help them succeed. The company values open communication, experimentation, and continuous learning, and actively welcomes diverse backgrounds and perspectives. Gen offers competitive pay, benefits, and wellness programs to support work-life balance.

Role Overview: Senior Data Platform Engineer
Location: Chennai, India

The Senior Data Platform Engineer joins the Data Platform Operations team, focusing on the daily management and monitoring of Gen's data platforms and pipelines. This role is key to ensuring smooth operations, ongoing improvements, and reliable maintenance of data infrastructure.

What You Will Do
- Oversee daily operations and health of data platforms and pipelines
- Drive enhancements for operational efficiency and platform maintenance
- Work closely with data engineers, analysts, and platform teams
- Maintain platform stability, observability, and cost-effectiveness
- Support readiness for new data use cases and business needs

What Gen Values
- Customer focus and a collaborative approach
- Openness to new ideas and continuous improvement
- Respect for diverse experiences and backgrounds
- Supportive teamwork and recognition of individual strengths

If the mission and values at Gen resonate, consider exploring a career with the team in Chennai.
About ValGenesis
ValGenesis stands at the forefront of digital validation solutions for the life sciences sector. Our comprehensive suite of products empowers 30 of the top 50 global pharmaceutical and biotech firms to embrace digital transformation, ensure total compliance, and achieve manufacturing intelligence excellence across their product lifecycle. Discover the opportunity to be part of ValGenesis, the leading standard for paperless validation in the Life Sciences industry: valgenesis.com/about

About the Role
As a Senior Software Engineer in Data Engineering, you will play a pivotal role in developing scalable data solutions that support our innovative products.
AECOM, a global leader in infrastructure and environmental services, is seeking a Principal Engineer specializing in Structures for our Data Center projects. In this role, you will leverage your expertise to design and analyze innovative structural solutions that meet project specifications and regulatory requirements. Your responsibilities will include collaborating with multidisciplinary teams, overseeing project execution, and ensuring that all engineering practices align with AECOM's standards of quality and safety.