
Principal Engineer - Data DevOps

Paytm · Noida, Uttar Pradesh
On-site · Full-time



Experience Level

Manager

Qualifications

  • Experience: 8+ years in DevOps/Data DevOps or related fields, with a minimum of 4 years in a leadership capacity.
  • Proven experience managing large-scale big data infrastructure and leading engineering teams.
  • Strong hands-on proficiency with AWS services and infrastructure automation tools (Terraform, Ansible, CloudFormation).
  • Extensive knowledge and practical experience with Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Jupyter Notebooks, and Looker.
  • Proficiency in Kubernetes/EKS, Docker, ECS, and CI/CD tools.
  • In-depth understanding of networking, cloud security, and compliance requirements.
  • Excellent communication, stakeholder management, and decision-making skills.
  • Familiarity with SQL and data query optimization is a plus.

About the job

Role Overview

Paytm seeks a Principal Engineer - Data DevOps for its Noida, Uttar Pradesh office. This role leads the development, management, and optimization of secure, large-scale big data platforms. The position calls for a blend of technical depth and leadership, driving best practices in cloud infrastructure, automation, CI/CD, and big data technologies. The Principal Engineer will guide cross-functional goals, mentor engineers, and ensure delivery of scalable data solutions.

Main Responsibilities

  • Lead, mentor, and develop a high-performing Data DevOps team with a focus on technical excellence and accountability.
  • Direct the architecture, design, and implementation of cloud and data infrastructures to meet scalability, performance, and security needs.
  • Work closely with Data Engineering, Data Science, Analytics, and Product teams to deliver reliable and efficient data platforms.
  • Manage and optimize AWS-based infrastructure, including VPC, EC2, S3, EMR, EKS, SageMaker, Lambda, CloudFront, CloudWatch, and IAM.
  • Scale and oversee big data platforms using Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Looker, and Jupyter Notebooks.
  • Establish and maintain CI/CD pipelines and infrastructure automation with Terraform, Ansible, and CloudFormation.
  • Ensure observability, proactive incident management, and compliance with SLAs.
  • Promote cloud security practices, including API security, TLS/HTTPS, and access control policies.
  • Collaborate with stakeholders to set priorities, manage budgets, and optimize cloud and operational spending.

Required Qualifications

  • At least 8 years of experience in DevOps, Data DevOps, or related fields, with a minimum of 4 years in a leadership role.
  • Demonstrated success managing large-scale big data infrastructure and leading engineering teams.
  • Hands-on expertise with AWS services and infrastructure automation tools such as Terraform, Ansible, and CloudFormation.
  • Extensive experience with Kafka, Hive HMS, Apache Ranger, Apache Airflow, EMR, Spark, Trino, Jupyter Notebooks, and Looker.
  • Proficiency with Kubernetes/EKS, Docker, ECS, and CI/CD tools.
  • Strong understanding of networking, cloud security, and compliance requirements.
  • Excellent communication, stakeholder management, and decision-making skills.
  • Familiarity with SQL and data query optimization is considered a plus.

About Paytm

Paytm is a leading digital payments platform, providing innovative financial solutions and services to millions of users across India. With a strong commitment to technology and customer satisfaction, Paytm is at the forefront of creating a cashless economy.
