Senior GCP Data Engineer Consultant
Experience Level
Senior
About Lingarogroup
Lingarogroup is a leading provider of innovative data solutions. Our mission is to empower businesses with the tools and insights they need to thrive in a data-driven world. We value creativity, collaboration, and continuous learning, and we are dedicated to fostering a culture that supports professional growth.
Similar jobs
Search for Senior GCP Data Engineer Consultant
16,625 results
Lingarogroup
We are seeking a talented and experienced Senior GCP Data Engineer Consultant to join our dynamic team. In this role, you will leverage your expertise in Google Cloud Platform to design, develop, and optimize data engineering solutions that drive business value. You will work collaboratively with cross-functional teams to implement data pipelines, ensure dat…
Join Blend360 as a Senior Data Engineer specializing in Google Cloud Platform (GCP). In this pivotal role, you will drive the development and optimization of data pipelines, ensuring efficient data processing and management. Collaborate with cross-functional teams to leverage data for strategic decision-making and enhance business outcomes.
Jobs for Humanity
Join our innovative team as a Senior GCP Data Engineer at Jobs for Humanity, where we prioritize inclusivity and empowerment. This role is designed for talented individuals with disabilities who are eager to make a significant impact in the field of data engineering.

As a Senior GCP Data Engineer, you will leverage your expertise in Google Cloud Platform to design, implement, and maintain data processing systems that support our mission of creating equitable job opportunities. Your contributions will enhance our data architecture and facilitate seamless data flow across various applications.
Smart Working Solutions
Smart Working Solutions brings together skilled professionals and global teams, focusing on building long-term careers in a remote-first setting. The company emphasizes professional growth, well-being, and a supportive culture. Based in Ahmedabad, the team works without geographical barriers and is recognized for a strong commitment to employee success.

Role overview
The Senior Data Engineer (GCP, BigQuery, Looker) joins the data platform group, working remotely with an Ahmedabad-based team. This is a senior, ongoing role intended for someone interested in strategic, long-term contributions rather than short-term contracts. The position involves close collaboration with engineering, analytics, commercial, product, and marketing teams. The main focus: keep data systems reliable, scalable, and actionable, while taking substantial ownership over standards and architecture.

What you will do
• Manage every stage of the data lifecycle, including ingestion, transformation, modeling, and visualization
• Build and maintain data pipelines using Google Cloud Platform, BigQuery, and Looker
• Work with cross-functional teams to ensure data quality and relevance
• Help define and enforce standards for data integrity and architecture
• Contribute to a collaborative, remote-first engineering culture

Requirements
• Hands-on experience with GCP, BigQuery, and Looker
• Ability to work independently and take ownership of projects
• Interest in shaping data architecture and setting standards
• Strong communication skills for working with both technical and non-technical colleagues

This remote position is designed for those seeking long-term growth and meaningful impact, with opportunities to influence how data shapes business decisions and company standards.
Capco
Role Overview
Capco is hiring a GCP Data Architect in Pune, India. This role focuses on designing, building, and managing data solutions on Google Cloud Platform (GCP). The position involves working closely with teams across different functions to create data architectures that support business goals.

What You Will Do
• Design and implement scalable data solutions on GCP
• Work with cross-functional teams to deliver cloud-based data architectures
• Help turn raw data into actionable insights for better decision-making
Capco
Role Overview
Capco is hiring a GCP Data Architect in Pune, India. This role focuses on designing and implementing scalable data solutions using Google Cloud Platform. The position involves working closely with teams from different disciplines to shape data strategy and improve data architecture for business needs.
Orion Innovation
Orion Innovation is a distinguished, award-winning global business and technology services firm. We specialize in delivering transformative business solutions and innovative product development, rooted in digital strategy, experience design, and engineering. Our unique blend of agility, scale, and maturity allows us to cater to a diverse clientele across various industries, including financial services, telecommunications, media, consumer products, automotive, professional sports, life sciences, e-commerce, and education.

Position: Data Consultant (Manager / Senior Manager)
Experience Required: 7+ years

Role Overview: We are looking for a seasoned Data Consultant who will support the delivery of enterprise-grade data products. This pivotal role bridges the gap between business needs, data architecture, and engineering, facilitating the transformation of business requirements into high-quality, reusable data assets.

The ideal candidate is a proactive self-starter capable of independently driving discussions, identifying gaps in requirements, and ensuring that data products are designed to deliver clear business value, maintain high data quality, and ensure end-to-end data lineage.
Sutherland
Role Overview
Sutherland is hiring a Databricks & GCP Data Platform Architect in Hyderabad. This architect will design and build data solutions using Databricks and Google Cloud Platform (GCP). The work centers on shaping data architecture, optimizing data workflows, and delivering analytics platforms that help clients make better decisions.

What You Will Do
• Design and implement data solutions with Databricks and GCP technologies
• Optimize data processes for performance and scalability
• Develop analytics platforms to support business needs
• Work closely with clients to understand requirements and deliver effective solutions

Who We’re Looking For
• Experience architecting data platforms with Databricks and GCP
• Strong background in designing scalable and reliable data solutions
• Comfortable working in a collaborative, changing environment
• Motivated by using data to solve real business problems
Join Blend360 as a Senior Data Engineer specializing in Google Cloud Platform (GCP). In this role, you will leverage your expertise to design, build, and maintain scalable data pipelines, ensuring that our data architecture is efficient and robust. You will collaborate with cross-functional teams to implement data solutions that drive business intelligence and analytics initiatives.
Bosch Group
Bosch Group seeks a GCP Consultant based in Bangalore. This position centers on supporting organizations as they transition to Google Cloud Platform.

Role overview
The GCP Consultant will collaborate with internal teams and clients to design cloud architecture and implement workable cloud solutions. The role involves providing guidance throughout the migration process and ensuring that solutions align with project needs.

Key responsibilities
• Guide organizations through the adoption of Google Cloud Platform
• Work with teams to shape and refine cloud architecture
• Deliver practical, effective cloud solutions

Location
This role is based in Bangalore.
Join Capco as a GCP Platform Engineer, where you will play a vital role in designing, implementing, and maintaining robust cloud solutions on the Google Cloud Platform. You will collaborate with cross-functional teams to optimize cloud infrastructure, ensuring high availability, performance, and security for our clients. Your expertise will be instrumental in driving innovation and delivering exceptional value.
TTEC Digital
At TTEC Digital, our mission is to empower clients to create a workplace where employees feel appreciated and supported, because we believe that exceptional customer experiences begin with a satisfied workforce. Our vision is to cultivate an environment where every individual can flourish.

Position Overview
As a Senior Data Engineer specializing in Google Cloud Platform (GCP), you will be an integral member of our Data & Analytics team. Your primary responsibilities will include the design, construction, and maintenance of scalable data pipelines and cloud-based data platforms. This role emphasizes the transformation of raw data into dependable and efficient data systems that facilitate advanced analytics, insightful reporting, and informed decision-making.

In this position, you will work closely with Data Scientists, Architects, Project Managers, and Business Stakeholders to drive implementation, migration, modernization, and optimization initiatives.

Key Responsibilities
• Architect, develop, and sustain scalable data pipelines on GCP
• Create and enhance batch and real-time data processing solutions
• Construct and oversee data lakes and data warehouses
• Generate and maintain datasets for analytics and reporting
• Enhance data quality, reliability, and operational efficiency
• Produce solution designs and technical documentation
• Collaborate autonomously and within cross-functional teams
• Support pre-sales activities with effort estimates and technical insights
• Partner with Project Management to ensure timely and budget-friendly delivery
• Implement DevOps and CI/CD practices for data pipelines

Competencies
Personal
• Strong analytical and problem-solving skills
• High energy, ownership mentality, and accountability
• Adaptable and results-focused
• Excellent communication and stakeholder management skills

Leadership
• Ability to mentor junior team members
• Collaborate effectively with diverse technical and business teams
• Provide technical guidance and best practice recommendations

Operations
• Effectively manage multiple projects and priorities
• Deliver customer-centric solutions
• Optimize performance and resource utilization

Technical
• Profound understanding of cloud-native data architecture
• Ability to convey complex technical concepts clearly
• Familiarity with Agile/Scrum methodologies
Lingarogroup
Join our dynamic team as a Senior Azure Data Engineer Consultant, where you will leverage your expertise in data engineering and cloud solutions to drive impactful business outcomes. You will work with cutting-edge Azure technologies to design, build, and optimize data pipelines, ensuring data integrity and availability. Collaborate with cross-functional teams to implement data strategies that align with organizational goals, while mentoring junior engineers and promoting best practices in data management.
Tech Holding
About Us:
At Tech Holding, we believe that a job is more than just a position; it’s a chance to contribute to something significant. As a comprehensive consulting firm, we are committed to delivering consistent results and high-quality solutions for our clients. Our founders and team members bring extensive industry experience, having held senior roles in diverse organizations, from innovative startups to prominent Fortune 50 companies. We've harnessed this collective expertise to craft a distinctive approach based on deep knowledge, integrity, transparency, and reliability.

About the Role:
We are on the lookout for a Senior DevOps Engineer with over 5 years of experience, specializing in Google Cloud Platform (GCP) and possessing a solid understanding of AWS and/or GCP. This critical role involves leading comprehensive cloud infrastructure and DevOps practices.

The focus of this position will be on designing and managing secure, scalable, and production-quality cloud environments, leveraging GCP-native services, Kubernetes, Infrastructure as Code, CI/CD, observability, and cloud governance. You will collaborate closely with engineering teams and clients to deliver systems that are highly available, cost-effective, and compliant.
Join Capco as a GCP Platform Engineer, where you will be at the forefront of innovative cloud solutions. In this role, you'll leverage your expertise in Google Cloud Platform to architect, develop, and optimize cloud environments. You will collaborate with cross-functional teams to design scalable and secure cloud solutions that meet our clients' evolving business needs.
DemandMatrix
Role Overview:
We are on the lookout for a dynamic Big Data Lead who will play a pivotal role in taking our data capabilities to the next level. Your primary focus will be on solving intricate data challenges using cutting-edge big data technologies, dedicating almost 50% of your time to hands-on coding tasks.

Key Responsibilities:
• Develop and manage large-scale text data processing and event-driven data pipelines, optimizing performance across CPU, network IO, and disk IO.
• Utilize cloud-native services on platforms such as AWS and GCP to enhance data processing and storage capabilities.

About You:
• You possess a strong foundation in computer engineering, with expertise in Unix, data structures, and algorithms.
• You have successfully designed and implemented multiple big data modules and data pipelines capable of handling vast volumes of data.
• Passionate about technology, you have experience in initiating and executing projects from the ground up.
Join our dynamic team as a GCP Infrastructure Engineer at Endava in Bengaluru. We are looking for a skilled professional who is passionate about cloud technologies and infrastructure, particularly Google Cloud Platform (GCP). You will play a key role in designing, implementing, and managing GCP solutions to enhance our client delivery services.
Endava
Join our dynamic team at Endava as an AI/ML GCP Engineer. In this role, you will leverage your expertise in artificial intelligence and machine learning to design and implement innovative solutions on the Google Cloud Platform. You will collaborate with cross-functional teams to enhance our client delivery services and drive impactful projects that utilize cutting-edge technologies.
Smart Working Solutions
Join our dynamic team at Smart Working Solutions as a Senior Backend Engineer. In this fully remote position, you will leverage your expertise in Python, PostgreSQL, and GCP to design and implement scalable backend systems that drive our innovative solutions.
T-Systems Information and Communication Technology India Private Limited
Role: Senior Cloudera Developer (Data Engineer)
Experience: 6 to 10 Years
Location: Pune

Job Description:
• Expertise in Spark programming and architecture, with a solid understanding of fault tolerance mechanisms.
• Proficient in utilizing Spark DataFrames and Spark SQL for querying structured datasets.
• Experience in optimizing Spark execution plans is advantageous.
• Skilled in performing Extract, Transform, Load (ETL) processes using Spark.
• Experience integrating Spark Streaming with technologies like Kafka is a plus.
• Familiarity with the Hadoop ecosystem, including HDFS, Hive, and the Cloudera stack, is beneficial.
• Experience in deploying and managing Spark applications on Hadoop clusters or GCP Dataproc.
• Strong proficiency in Python, with experience in Java being a bonus.
• Familiar with DevOps tools and practices, including CI/CD and Docker.
• Hands-on experience with GCP services such as Dataproc, Cloud Functions, Cloud Run, Pub/Sub, and BigQuery.

Responsibilities:
• Design and implement data solutions leveraging Cloudera technologies like Hadoop, Spark, and Hive.
• Collaborate with data engineering teams to enhance data pipelines and processing workflows.
• Work closely with data analysts and scientists to ensure data quality and integrity.
• Diagnose and resolve issues related to data processing and storage systems.
• Stay abreast of the latest developments and best practices in Cloudera development.
• Participate in code reviews and contribute constructive feedback to peers.