Qualifications
To excel in this role, you should possess:
- Proven experience in Google Cloud Platform (GCP) services and architecture.
- Strong understanding of cloud security practices and compliance standards.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
- Ability to communicate effectively with technical and non-technical stakeholders.
About the job
Join Capco as a GCP Platform Engineer, where you will be at the forefront of innovative cloud solutions. In this role, you'll leverage your expertise in Google Cloud Platform to architect, develop, and optimize cloud environments. You will collaborate with cross-functional teams to design scalable and secure cloud solutions that meet our clients' evolving business needs.
About Capco
Capco is a global technology and consulting firm dedicated to the financial services industry. We are passionate about driving innovation and delivering exceptional client value. Our team of experts works collaboratively to provide tailored solutions that empower our clients to navigate complex challenges and seize new opportunities.
Role Overview
Capco is hiring a GCP Data Architect in Pune, India. This role focuses on designing, building, and managing data solutions on Google Cloud Platform (GCP). The position involves working closely with teams across different functions to create data architectures that support business goals.
What You Will Do
- Design and implement scalable data solutions on GCP
- Work with cross-functional teams to deliver cloud-based data architectures
- Help turn raw data into actionable insights for better decision-making
Role Overview
metromakro is hiring a Data Architect in Pune. This role shapes the design and implementation of data solutions that support business growth. The Data Architect defines and maintains the data architecture, making sure it meets standards for performance, scalability, and security.
What You Will Do
- Design and implement data architectures that align with company objectives
- Guide the overall data strategy for the organization
- Work with cross-functional teams to improve data infrastructure
- Enable data-driven decision-making by ensuring reliable and accessible data systems
Who We’re Looking For
The ideal candidate brings leadership and vision to data architecture. Strong collaboration skills are essential to work across teams and build solutions that support business needs.
Join our innovative team at Metromakro as a Lead Enterprise Data Architect. In this pivotal role, you will design and implement data architecture strategies that align with our business objectives. You will work closely with cross-functional teams to ensure data integrity and drive data-driven decision-making across the organization. Your expertise will help shape our data landscape and propel our analytics capabilities to new heights.
Position Title: Azure Data Architect
Experience Level: 8 to 14 Years
Role Overview: As an Azure Data Architect, you will be responsible for designing and implementing comprehensive data solutions utilizing Microsoft Azure. This includes crafting data lakes, data warehouses, and managing ETL/ELT processes. You will develop scalable data architectures to support extensive data processing and analytics workloads while ensuring performance, security, and compliance across Azure data solutions.
Key Responsibilities:
- Design end-to-end data solutions on Microsoft Azure, focusing on data lakes, warehouses, and ETL/ELT processes.
- Create efficient data architectures that facilitate large-scale data processing and analytics.
- Maintain high standards of performance, security, and compliance in Azure data solutions.
- Implement data architecture techniques such as lakehouse and warehouse.
- Evaluate and select suitable Azure services, including Azure SQL Database, Azure Synapse Analytics, Azure Data Lake Storage, Azure Databricks, Unity Catalog, and Azure Data Factory, with deep expertise in these services.
- Ideally, possess knowledge of and experience with Microsoft Fabric.
- Collaborate with business and technical teams to translate data needs into robust architecture solutions.
- Ensure compliance with data governance and privacy regulations.
- Exhibit strong communication skills and work effectively with cross-functional teams.
- Lead and mentor the development team in implementing data engineering solutions.
- Coordinate with Data Scientists and Analysts to align data architectures with business objectives.
- Optimize cloud-based data infrastructure for enhanced performance, cost-effectiveness, and scalability.
- Analyze and optimize data workloads for performance tuning and cost management.
- Monitor performance and availability issues in cloud data solutions.
- Proficient in programming languages such as SQL, Python, and Scala, with hands-on experience in MS SQL Server, Oracle, or similar RDBMS platforms.
- Experience with Azure DevOps and CI/CD pipeline development.
- High-level architectural experience in data science or related fields.
- Strong understanding of database structures and principles.
- Experience in distributed data processing for big data pipelines, both batch and streaming.
- Familiarity with data visualization tools like Power BI and Tableau.
- Proficient in data modeling, with the ability to convert OLTP data structures into Star Schema; experience with DBT preferred.
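The posting above asks for the ability to convert OLTP data structures into a Star Schema. As a minimal illustrative sketch (the table and column names are hypothetical, not from the posting), the core move is splitting denormalized transaction rows into dimension tables with surrogate keys plus a fact table of keys and additive measures:

```python
# Illustrative sketch: flattening OLTP order rows into a star schema --
# one fact table plus dimension tables. All names are hypothetical.

def to_star_schema(orders):
    """Split denormalized order rows into dim_customer, dim_product, fact_sales."""
    dim_customer, dim_product, fact_sales = {}, {}, []
    for row in orders:
        # Assign (or reuse) a surrogate key for each distinct dimension value.
        cust_key = dim_customer.setdefault(row["customer"], len(dim_customer) + 1)
        prod_key = dim_product.setdefault(row["product"], len(dim_product) + 1)
        # The fact table keeps only foreign keys and additive measures.
        fact_sales.append({
            "customer_key": cust_key,
            "product_key": prod_key,
            "quantity": row["quantity"],
            "amount": row["quantity"] * row["unit_price"],
        })
    return dim_customer, dim_product, fact_sales

orders = [
    {"customer": "Acme", "product": "Widget", "quantity": 2, "unit_price": 10.0},
    {"customer": "Acme", "product": "Gadget", "quantity": 1, "unit_price": 25.0},
    {"customer": "Bolt", "product": "Widget", "quantity": 5, "unit_price": 10.0},
]
dims_c, dims_p, facts = to_star_schema(orders)
```

In practice this shaping would be expressed in SQL or DBT models rather than Python, but the key/measure split is the same.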
Join Databricks as a Solutions Architect specializing in Data and AI, where you will play a pivotal role in designing innovative solutions that solve complex business challenges. Collaborate with cross-functional teams to harness the power of data and artificial intelligence, driving impactful results for our clients.
Join Capco as a GCP Platform Engineer, where you will play a vital role in designing, implementing, and maintaining robust cloud solutions on the Google Cloud Platform. You will collaborate with cross-functional teams to optimize cloud infrastructure, ensuring high availability, performance, and security for our clients. Your expertise will be instrumental in driving innovation and delivering exceptional value.
Join Capco as a GCP Technical Lead, where you will spearhead cloud transformation initiatives and guide your team through the design and implementation of innovative cloud solutions. Your expertise in Google Cloud Platform will drive our projects to success, ensuring excellence in delivery and client satisfaction.
Join Capco as a GCP Technical Lead and play a pivotal role in driving cloud solutions for our clients. You will lead a talented team to design and implement Google Cloud Platform architectures, ensuring the highest standards of performance and scalability.
Role Overview:
We are on the lookout for a dynamic Big Data Lead who will play a pivotal role in taking our data capabilities to the next level. Your primary focus will be on solving intricate data challenges using cutting-edge big data technologies, dedicating almost 50% of your time to hands-on coding tasks.
Key Responsibilities:
- Develop and manage large-scale text data processing and event-driven data pipelines, optimizing performance across CPU, network IO, and disk IO.
- Utilize cloud-native services on platforms such as AWS and GCP to enhance data processing and storage capabilities.
About You:
- You possess a strong foundation in computer engineering, with expertise in Unix, data structures, and algorithms.
- You have successfully designed and implemented multiple big data modules and data pipelines capable of handling vast volumes of data.
- Passionate about technology, you have experience in initiating and executing projects from the ground up.
About Us
At Hitachi Digital Services, we are a pioneering global digital solutions and transformation firm, driven by an ambitious vision for the potential of our world. Our people-centric approach empowers us to create a positive impact daily, whether by future-proofing urban spaces, conserving natural resources, safeguarding rainforests, or saving lives. We leverage innovation, technology, and profound expertise to transition our clients from their current state to a transformative future, fueled by the power of acceleration.
We recognize that the diversity of life experiences, character, perspectives, and a shared passion for meaningful accomplishments are just as crucial as technical skills. We welcome unique individuals who can contribute to our mission.
Our Team
We take pride in being leaders in pioneering innovations, harnessing the transformative capabilities of cloud technologies, and offering converged and hyperconverged solutions. Our goal is to empower clients to securely store, manage, and modernize their digital core, unlocking valuable insights and driving data-centric value.
This collaborative, diverse, and robust team of technology professionals partners with various departments to assist our clients in storing, enriching, activating, and monetizing their data, ensuring that every aspect of their business derives value.
Position Overview
We are in search of a seasoned Data Architect specializing in Workday Reporting and data automation. The ideal candidate will possess 10-12 years of experience, demonstrating a strong foundation in data architecture, reporting, and process automation.
Senior Software Engineer - Data Architect
Join Convera, a leader in Foreign Exchange payments processing, as a Senior Software Engineer. This role allows for Work From Home options in Pune. We are looking for a seasoned Senior Software Engineer to spearhead the development and optimization of our data systems. Reporting to the Senior Manager, you will be part of a vibrant team dedicated to enhancing data architecture strategies and implementing robust data governance across our organization. This position demands an in-depth comprehension of business processes, technology, data management, and compliance with regulatory standards. You will collaborate closely with business and IT leaders to align the enterprise data architecture with our business objectives while ensuring adherence to data governance policies. Your tasks will involve working alongside data modelers, data engineers, analysts, and cross-functional teams to build a new data platform, integrate diverse data sources, and guarantee data availability for various applications and reporting needs. Familiarity with AI/ML technologies and collaboration with data scientists to fulfill their data requirements is also expected.
Key Responsibilities:
- Design and Develop Data Architecture Solutions: Create scalable and efficient data architecture solutions on the enterprise data platform using tools like Snowflake, Tableau, and AWS.
- Collaborate on Data Models: Engage with stakeholders to understand data requirements. Lead the architecture and design of data models, schemas, data mappings, and transformations that meet business and analytical needs.
- Data Integration and Governance: Work with teams to design and maintain data integration solutions, ensuring high data integrity and compliance with security regulations.
- Business Intelligence Development: Implement semantic models to enhance business intelligence capabilities and reporting solutions.
T-Systems Information and Communication Technology India Private Limited
Contract|On-site|Pune
Role: Senior Cloudera Developer (Data Engineer)
Experience: 6 to 10 Years
Location: Pune
Job Description:
- Expertise in Spark programming and architecture, with a solid understanding of fault tolerance mechanisms.
- Proficient in utilizing Spark DataFrames and Spark SQL for querying structured datasets.
- Experience in optimizing Spark execution plans is advantageous.
- Skilled in performing Extract, Transform, Load (ETL) processes using Spark.
- Experience integrating Spark Streaming with technologies like Kafka is a plus.
- Familiarity with the Hadoop ecosystem, including HDFS, Hive, and the Cloudera stack, is beneficial.
- Experience in deploying and managing Spark applications on Hadoop clusters or GCP Dataproc.
- Strong proficiency in Python, with experience in Java being a bonus.
- Familiar with DevOps tools and practices, including CI/CD and Docker.
- Hands-on experience with GCP services such as Dataproc, Cloud Functions, Cloud Run, Pub/Sub, and BigQuery.
Responsibilities:
- Design and implement data solutions leveraging Cloudera technologies like Hadoop, Spark, and Hive.
- Collaborate with data engineering teams to enhance data pipelines and processing workflows.
- Work closely with data analysts and scientists to ensure data quality and integrity.
- Diagnose and resolve issues related to data processing and storage systems.
- Stay abreast of the latest developments and best practices in Cloudera development.
- Participate in code reviews and contribute constructive feedback to peers.
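The ETL pattern this role centers on can be sketched in plain Python as a minimal stand-in. In the actual stack the same three stages would run on Spark DataFrames / Spark SQL on Dataproc; the field names below are hypothetical, chosen only to make the example self-contained:

```python
# Minimal plain-Python stand-in for the extract-transform-load pattern.
# The comments note what each stage would be in a real Spark pipeline.

def extract(source):
    # In Spark: spark.read.parquet(...) or a Kafka/Spark Streaming source.
    return list(source)

def transform(records):
    # In Spark: DataFrame filter() and withColumn() expressions.
    return [
        {**r, "revenue": r["units"] * r["price"]}
        for r in records
        if r["units"] > 0          # drop empty orders
    ]

def load(records, sink):
    # In Spark: df.write to BigQuery, Hive, or HDFS.
    sink.extend(records)
    return len(records)

source = [
    {"sku": "A", "units": 3, "price": 2.0},
    {"sku": "B", "units": 0, "price": 9.0},   # filtered out by transform()
]
sink = []
loaded = load(transform(extract(source)), sink)
```

The value of Spark over this sketch is that each stage is distributed and fault-tolerant; the shape of the pipeline is the same.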
T-Systems Information and Communication Technology India Private Limited
Full-time|On-site|Pune
We are actively looking for a talented and driven AI Architect to enhance our expanding data team. In this role, you will collaborate with diverse teams to transform data into key insights that inform strategic choices and drive operational advancements. This position demands exceptional analytical abilities, hands-on experience in statistical modeling, and a robust understanding of machine learning principles.
Key Responsibilities:
- Develop, test, and implement predictive models and machine learning algorithms utilizing both structured and unstructured data.
- Conduct exploratory data analysis (EDA) to uncover trends, patterns, and opportunities for enhancement.
- Design and execute experiments (e.g., A/B tests) to support business decision-making processes.
- Work closely with business stakeholders, engineers, and product managers to define data challenges and provide actionable insights.
- Effectively communicate complex data findings in a clear and succinct manner to non-technical stakeholders while maintaining and optimizing data pipelines.
- Ensure data integrity and consistency while contributing to the development of internal tools, reusable code, and model deployment processes.
- Monitor model performance over time and retrain models as needed to ensure sustained accuracy.
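One responsibility named above, executing A/B tests, can be illustrated with a two-proportion z-test, a common (though not the only) way to analyze such an experiment. This is a stdlib-only sketch; the conversion counts are made-up numbers, not data from the posting:

```python
# Hedged sketch: two-sided two-proportion z-test for an A/B experiment.
# Uses only the standard library; Phi is computed via math.erf.
from math import sqrt, erf

def ab_test_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for H0: the two conversion rates are equal."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via erf; double the upper tail for two-sided.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical experiment: variant B converts 260/2400 vs A's 200/2400.
p = ab_test_p_value(conv_a=200, n_a=2400, conv_b=260, n_b=2400)
```

With these numbers the p-value lands well below 0.05, so the variant's lift would be judged significant at the conventional threshold; in production one would also pre-register the sample size and check test assumptions.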
About Us
Capco, a proud member of the Wipro family, stands as a leading global technology and management consulting firm. We have been honored with the Consultancy of the Year award at the British Bank Awards and recognized among the Top 100 Best Companies for Women in India 2022 by Avtar & Seramount. With a footprint in 32 cities globally, we serve over 100 clients across banking, financial services, and energy sectors. Our expertise lies in delivering transformative execution and results.
Why Join Capco?
At Capco, you will engage in stimulating projects with some of the largest international and local banks, insurance companies, payment service providers, and other key industry players. These projects are set to revolutionize the financial services landscape.
Make an Impact
We prioritize innovative thinking, excellence in delivery, and thought leadership to facilitate our clients' transformations. Collaborating with our clients and industry partners, we create disruptive solutions that are reshaping energy and financial services.
#BeYourselfAtWork
Capco fosters an open, inclusive culture that celebrates diversity and creativity.
Career Advancement
With a flat organizational structure, everyone at Capco has the opportunity to thrive and shape their career path as we grow together.
Diversity & Inclusion
We firmly believe that diverse perspectives and backgrounds give us a competitive edge.
Role: GCP Data Engineer
Job Summary:
We are seeking a passionate GCP Data Engineer to join our innovative and expanding project team. In this role, you will hone your skills while contributing to large-scale data initiatives on the Google Cloud Platform. Collaborating closely with senior engineers and data analysts, you will build and maintain robust data pipelines, transforming raw data into actionable insights. This position offers an exceptional opportunity to kickstart your career in data engineering within a supportive and collaborative environment.
Key Responsibilities:
- Design, develop, test, and maintain data pipelines and ETL/ELT processes utilizing Python, SQL, and GCP services.
- Construct and optimize complex SQL queries in BigQuery for data transformation, extraction, and analysis.
- Collaborate on the design and implementation of data models within our data warehouse.
As a Power BI Architect, you will leverage your expertise in data modeling and analysis to develop high-quality datasets and create visually compelling reports. Your role will involve designing and developing data models that effectively support business requirements, ensuring the accuracy and reliability of the data presented in dashboards and reports. You will need to be proficient in Power BI Desktop and have a strong command of SQL and DAX. Your projects may vary from short-term individual client engagements to extensive multiyear delivery projects involving large, diverse teams.
Key Responsibilities:
- Design and implement data models to support reporting needs.
- Develop and maintain ETL processes for data cleansing and preparation.
- Collaborate with stakeholders to gather requirements and deliver actionable solutions.
- Stay updated with best practices and innovations from the Power BI community and documentation.
- Communicate effectively with both technical and non-technical teams.
- Utilize Azure data platforms, including ADLS, SQL Server, ADF, and Databricks.
About Us
At Abacus Insights, we are revolutionizing the management of healthcare data for health plans. Our mission is straightforward: to make healthcare data actionable, empowering those responsible for care and cost decisions to act swiftly and confidently.
We facilitate the dismantling of data silos, fostering a unified and trustworthy data foundation that enhances decision-making, improves outcomes, reduces waste, and enriches experiences for both members and providers.
Supported by $100 million from leading investors, we are addressing significant challenges in an industry ripe for transformation. Our platform is designed for GenAI applications, delivering clean, connected, and reliable healthcare data that underpins automation, prioritization, and decision workflows, making us pioneers in this space.
Our innovation is driven by our people. We value boldness, curiosity, and collaboration, believing that the best ideas emerge from teamwork. Are you ready to contribute to our mission? Join us in shaping the future.
About the Role
We are seeking a knowledgeable Solution Architect to join our Solution Architecture team. In this role, you will be instrumental in supporting existing and new implementations as we anticipate significant growth. Collaborating with clients and various internal teams, you will design highly scalable, flexible, and resilient cloud architectures that address customer business challenges and promote the adoption of Abacus’ solutions. You will exemplify best practices in advanced cloud solutions and healthcare payer systems, including claims processing (Medical, Vision, Prescription), clinical informatics (Lab Results), and provider systems like EHR, along with CMS Interoperability solutions. As a trusted advisor, you will guide clients in effectively adopting Abacus’ core data management solutions.
Your Day-to-Day Responsibilities
- Develop project architecture design documents, including data flows, sequence diagrams, and overall system architecture, primarily focusing on CMS Interoperability solutions, alongside other data solutions.
- Engage with clients to understand their requirements and translate them into effective architectural solutions.
- Collaborate with cross-functional teams to ensure the successful implementation of designed architectures.
- Provide ongoing support and optimization for existing solutions to enhance performance and user satisfaction.
Join Assent as a Technical Lead in Data Engineering and play a pivotal role in transforming our data architecture. You will lead a talented team of data engineers, driving innovative solutions that enhance our data capabilities. Collaborate across teams to design, develop, and implement robust data pipelines and frameworks, ensuring data integrity and availability for analytical and operational needs.
Role Overview
Beghou Consulting is seeking a Senior Consultant with deep experience in Snowflake architecture. This position focuses on designing and implementing data warehousing and analytics solutions for clients. The goal: help organizations build scalable, efficient, and secure data platforms that support informed decision-making.
Apr 16, 2026