Experience Level
Entry Level
Key Responsibilities
Analyze large datasets to identify trends and patterns
Develop predictive models and machine learning algorithms
Collaborate with cross-functional teams to implement data-driven solutions
Communicate findings to stakeholders clearly and effectively
Qualifications
Proven experience in data analysis and statistical modeling
Strong programming skills in Python or R
Familiarity with data visualization tools (e.g., Tableau, Power BI)
Ability to work in a team and communicate complex concepts
About the job
Deutsche Telekom IT Solutions seeks a Data Scientist (REF5387K) to join its team in Budapest or Debrecen. The role centers on analyzing complex datasets, developing statistical models, and applying machine learning methods to support decisions across several business areas.
Key Responsibilities
Examine large and diverse data sets to spot trends and patterns
Develop and implement statistical models and machine learning algorithms
Convert data insights into practical recommendations for various business teams
Collaborate with colleagues to address challenging data questions
Who Thrives Here
This position fits those who value teamwork and are eager to tackle complex challenges using data-driven approaches.
About Deutsche Telekom IT Solutions
Deutsche Telekom IT Solutions is a leading provider of innovative IT services and solutions, dedicated to enhancing the digital transformation of businesses. We pride ourselves on our commitment to quality and customer satisfaction, leveraging cutting-edge technology to deliver exceptional results for our clients.
The Opportunity
Join Wealthmonitor, a leading division of ION Analytics, where we empower the wealth management and banking sectors with critical intelligence and data solutions. Our mission is to help organizations connect with high-net-worth individuals by pinpointing those who stand to gain from significant liquidity events, particularly in the area of M&A activity.
We are currently looking for enthusiastic entry-level Data Associates fluent in English plus one or more of the following: Spanish, German, or a Nordic language. The role involves conducting comprehensive research on individuals and corporations, utilizing a blend of manual and automated research techniques. A keen eye for detail and the ability to interpret nuanced information are crucial for maintaining and enhancing our database. You will need a proactive and methodical approach to continuously refine our data gathering and management processes. Strong analytical and research skills, coupled with a problem-solving mindset, are essential for contributing to our business growth. Excellent communication and organizational skills are also vital for success in this position.
Join metgroup as a Data Architect and play a pivotal role in shaping our data architecture and strategy. You will collaborate with cross-functional teams to design and implement robust data solutions that drive our business forward. Your expertise in data modeling, database design, and data integration will be essential in optimizing our data systems.
The Exciting Opportunity
This position plays a vital role in architecting and enhancing our platform to meet business demands while optimizing our systems. In this role, you will have the opportunity to develop new data pipelines, manage platforms hosted on data streams for both batch and real-time loading, and create real-time visualizations.
Key Responsibilities
Maintain and enhance our existing data platform
Develop processes to ingest data from Kafka, APIs, and databases using AWS MSK Connect
Design and maintain real-time data processing applications utilizing frameworks such as Spark Structured Streaming and Kafka Streams
Implement transformations on data streams
Participate in data modeling adhering to standards like Inmon, Kimball, and Data Vault
Ensure data quality by verifying consistency and accuracy
Stay current with research and advancements in technology to improve our data platform
Possess an investigative mindset to troubleshoot issues creatively and manage incidents effectively
Take full ownership of assigned projects and tasks while collaborating within a team environment
Document processes thoroughly and conduct knowledge-sharing sessions
What We're Looking For: Essential Qualifications
Proven experience with modern cloud database technologies, especially Snowflake
Expertise in orchestrating data pipelines using Airflow
Proficient in AWS Glue
Familiarity with Apache Iceberg
Strong experience with SQL and Data Integration Tools
Proficiency in programming languages such as Python or Scala
Knowledge of AWS Services like S3, Lambda, API Gateways, DMS, and RDS
Development experience in Microsoft and Linux/Cloud environments
Exceptional analytical and problem-solving skills
Full-time|Hybrid|Budapest, London, Manchester, Amsterdam, Rotterdam, Dublin, Zagreb, Split
Join our team as a Lead Data Engineer and play a pivotal role in shaping data engineering strategies at DEPT®. Our hybrid work environment spans across vibrant cities including Budapest, London, Manchester, Amsterdam, Rotterdam, Dublin, Zagreb, and Split.
At DEPT®, we empower the world’s most ambitious brands to accelerate their growth, blending technology and marketing through our expert team of over 4,000 specialists. We are proud to partner with industry leaders such as Google, Lufthansa, Meta, eBay, and OpenAI, and have maintained our B Corp and Climate Neutral certifications since 2021.
In our Data & AI practice, we are dedicated to producing groundbreaking work that leverages Data & AI across various sectors. As a member of our EMEA Data craft team, you will collaborate with data strategists, scientists, and analysts to tackle complex challenges faced by beloved global brands.
As a Lead Data Engineer, you will guide your team in delivering innovative, enterprise-scale data solutions, combining your technical expertise with strong business insights to meet client objectives.
Full-time|On-site|Budapest, Hungary; Munich, Germany; Tel Aviv, Israel
Role Overview
Tulip is hiring a Data Operations Engineer to help manage and improve data operations. This role focuses on maintaining data quality, refining workflows, and supporting the efficiency of daily processes. The position is based in Budapest, Munich, or Tel Aviv.
What You Will Do
Work closely with teams across the company to ensure data remains accurate and reliable
Identify opportunities to streamline data-related processes
Support ongoing efforts to improve the efficiency of data operations
Location
Budapest, Hungary
Munich, Germany
Tel Aviv, Israel
Join Kpler as a Senior BI Data Engineer and be a pivotal force in shaping our data architecture. In this vital role within our Business Intelligence & Insights team, you will develop scalable data pipelines, create robust data models, and establish reliable infrastructure that empowers teams across the organization with access to high-quality, accurate data. This position is ideal for someone who thrives in a fast-paced, international environment and enjoys addressing complex data challenges while designing systems that drive insights, reporting, and machine learning at scale. You will report directly to the Director of BI, collaborating closely with cross-functional teams to enhance data-driven decision-making across the company.
Bosch Group seeks a Data Engineering Intern based in Budapest. This internship provides practical experience in data management, analytics, and the development of data pipelines.
What you will do
Support the team in managing and organizing data
Assist with analytics tasks and reporting
Help build and maintain data pipelines for ongoing projects
Work closely with experienced data engineers on real-world assignments
Requirements
Interest in data engineering and analytics
Willingness to learn new data technologies
Ability to collaborate with team members
Based in Budapest or able to work from this location
Join our dynamic team at NielsenIQ as a Junior Data Operations Analyst where you will play a crucial role in managing and analyzing data operations. This entry-level position is perfect for detail-oriented individuals who are eager to kickstart their careers in data analytics and operations.
As a Junior Data Operations Analyst, you will be responsible for supporting data processes, ensuring data accuracy and integrity, and assisting in the development of data solutions. You will work closely with various departments to enhance data flow and reporting capabilities.
Established in 1999 in Vienna, the Qualysoft Group is a distinguished IT consulting and services firm that operates independently of any manufacturer. We pride ourselves on delivering innovative IT solutions that enhance the competitiveness and economic efficiency of our international clientele.
Our areas of expertise include financial services, telecommunications, the automotive sector, and energy services. With a dedicated team of over 400 professionals across six subsidiaries, we collaborate to provide cutting-edge solutions tailored to our clients' needs.
We are currently seeking enthusiastic new members to join our Qualysoft teams, engaging in diverse projects that foster continuous learning and professional growth. Our shared mission is to cultivate an environment of integrity, development, and stability while exploring the latest technologies. We eagerly await your application for the role outlined below!
Join our esteemed client, a well-established IT consulting firm with over two decades of expertise in executing large-scale technology projects across various sectors. Their enduring collaborations with major organizations demonstrate their technical prowess and financial reliability, bolstered by consistently high external credit ratings. Operating with a streamlined core team, the company boasts low turnover rates and fosters a collaborative culture built on trust, long-term partnerships, and a commitment to high-quality deliverables. With engagements spanning telecommunications, healthcare, and academia, this role offers engineers the chance to work on diverse and intricate projects while enjoying a flexible and supportive work environment.
We are seeking a Senior Data Engineer to help manage extensive data environments characterized by high-volume datasets, intricate integrations, and analytical applications. This hands-on engineering position emphasizes the creation and upkeep of data pipelines, in-depth understanding of end-to-end data flows, and resolution of data quality concerns across distributed systems.
You will collaborate closely with engineers, analysts, and stakeholders to ensure the delivery of reliable, scalable, and well-structured data solutions that enhance analytics and machine learning efforts.
Key Responsibilities
Design, construct, and sustain scalable data ingestion and ETL/ELT pipelines
Manage large datasets and interfaces that feed into enterprise-level databases
Examine data flows and diagnose inconsistencies across systems
Conduct root cause analyses to trace erroneous outputs back to original source systems
Prepare datasets for analytics and machine learning applications
Participate in data modeling and mapping activities
Assist machine learning initiatives through data preparation, evaluation, and optimization
Establish and implement data quality monitoring frameworks
Create visualizations to facilitate data interpretation and insights
Join Us as a Field Data Collector!
Are you passionate about agriculture and technology? Terry Soot Management Group (TSMG), a leader in field data collection since 2017, is looking for dedicated individuals to join our team in Budapest. Our mission is to collect essential data where automation falls short, empowering informed decisions through meticulous counting, photography, videography, and area scanning.
Project Overview
The project aims to gather images of agricultural fields to assist farmers in their transition to regenerative agriculture, promoting sustainability and profitability. Utilizing a smartphone and a user-friendly app, you will receive comprehensive training to ensure your success. Data collectors will be assigned specific areas for photography, and the project is estimated to last approximately two weeks.
If you have a car (compensation provided per kilometer), know the local area well, and are a responsible, reliable individual, we want to hear from you!
Qualysoft Group, established in 1999 in Vienna, is a leading independent IT consulting and services provider, dedicated to enhancing the competitiveness and economic efficiency of its international clientele through innovative IT solutions. Our expertise spans various sectors, including financial services, telecommunications, automotive, and energy services. With a workforce of over 400 professionals across six subsidiaries, we collaborate to deliver state-of-the-art solutions tailored to our clients' needs. We invite enthusiastic individuals to join our Qualysoft teams, where diverse projects offer continuous learning and professional growth. Our collective aim is to foster an environment of integrity, development, and stability, while staying abreast of the latest technologies. We eagerly await your application for the role outlined below!
We are in search of a seasoned IT Manager who will also take on the responsibilities of a Data Protection Officer (DPO). This pivotal role encompasses overseeing the organization's IT operations, leading technical teams, ensuring efficient incident and ticket management, as well as maintaining compliance with data protection regulations. The ideal candidate will possess robust leadership capabilities, hands-on technical expertise, and a thorough understanding of audit processes and data privacy.
Join our dynamic team as a Senior Data Engineer where you will play a pivotal role in shaping our data architecture and driving innovative solutions. You will be responsible for designing, building, and maintaining robust data pipelines that ensure our data is accessible, reliable, and ready for analysis. Your expertise will directly contribute to enhancing our data-driven decision-making processes.
Role Overview
Betsson Group is hiring a Lead Data DevOps Engineer to guide the Data DevOps team in Budapest. This position focuses on team leadership, technical direction, and ensuring smooth data operations within an agile setup.
Key Responsibilities
Lead and mentor a team of Data DevOps professionals
Work closely with cross-functional teams to align data operations with business needs
Shape and improve data architecture and data pipelines
Implement and promote strong data management and DevOps practices
Streamline workflows to deliver insights that support company goals
Location
This role is based in Budapest.
Join our team as a Senior Data Engineer, where you will play a pivotal role in building and managing scalable data ingestion and Change Data Capture (CDC) capabilities on our Azure-based Lakehouse platform. Your expertise will drive our engineering maturity as we deliver ingestion and CDC preparation through Python projects and reusable frameworks. We are seeking a professional who applies best software engineering practices, including clean architecture, rigorous testing, code reviews, effective packaging, CI/CD, and operational excellence.
Our platform emphasizes batch-first processing, allowing for the landing of streaming sources in their raw form while processing them in batch. We are selective in our evolution towards streaming as necessary.
As part of the Common Data Intelligence Hub, you will collaborate closely with data architects, analytics engineers, and solution designers to create robust data products and ensure governed data flows across the enterprise. Your team is responsible for end-to-end ingestion and CDC engineering, including design, build, operation, observability, reliability, and reusable components. You will contribute to the development of platform standards, including contracts, layer semantics, and readiness criteria. While you will not primarily manage cloud infrastructure provisioning, you will work with the platform team to define requirements, review changes, and maintain deployable code for pipelines and jobs.
Platform Data Engineering & Delivery
Design and develop ingestion pipelines utilizing Azure and Databricks services, including Azure Data Factory pipelines and Databricks notebooks/jobs/workflows
Implement and manage CDC patterns for inserts, updates, and deletes, accommodating late-arriving data and reprocessing strategies
Structure and maintain bronze and silver Delta Lake datasets, focusing on schema enforcement, de-duplication, and performance tuning
Create “transformation-ready” datasets and interfaces with stable schemas, contracts, and metadata expectations for analytics engineers and downstream modeling
Adopt a batch-first approach for data ingestion, ensuring raw landing, replayability, and idempotent batch processing while progressing towards true streaming as required
Software Engineering for Data Frameworks
Develop and maintain Python-based ingestion and CDC components as production-grade software, focusing on modules, packaging, versioning, and releases
Implement engineering best practices such as code reviews, unit/integration tests, static analysis, formatting/linting, type hints, and comprehensive documentation
Establish and enhance CI/CD pipelines for data engineering code and pipeline assets, covering build, testing, security checks, deployment, and rollback patterns
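The CDC work described in this posting (applying inserts, updates, and deletes while accommodating late-arriving data and reprocessing) relies on making change application idempotent. As a rough, generic illustration only — not the platform's actual Databricks/Delta implementation, and with all names (`apply_changes`, `seq`, `op`) invented for the sketch — a sequence-guarded CDC merge in plain Python might look like:

```python
# Illustrative CDC merge: apply change events to a keyed table.
# Each event carries a per-key sequence number, so replaying a batch
# (or receiving a late event) is idempotent: an event is applied only
# if its sequence is newer than the last one applied for that key.

def apply_changes(table, applied_seq, events):
    """Apply insert/update/delete events in place.

    table: dict mapping primary key -> row dict
    applied_seq: dict mapping primary key -> last applied sequence number
    events: iterable of dicts with 'key', 'seq', 'op' ('upsert'|'delete'), 'row'
    """
    for ev in events:
        key, seq = ev["key"], ev["seq"]
        if seq <= applied_seq.get(key, -1):
            continue  # stale or replayed event: skip (idempotency)
        applied_seq[key] = seq
        if ev["op"] == "delete":
            table.pop(key, None)  # delete is a no-op if the row is gone
        else:
            table[key] = ev["row"]  # insert and update are the same upsert
    return table

table, applied = {}, {}
batch = [
    {"key": 1, "seq": 0, "op": "upsert", "row": {"name": "a"}},
    {"key": 1, "seq": 2, "op": "upsert", "row": {"name": "b"}},
    {"key": 1, "seq": 1, "op": "upsert", "row": {"name": "late"}},  # late, ignored
    {"key": 2, "seq": 0, "op": "upsert", "row": {"name": "c"}},
    {"key": 2, "seq": 1, "op": "delete", "row": None},
]
apply_changes(table, applied, batch)
apply_changes(table, applied, batch)  # replaying the whole batch changes nothing
```

The same guard is what makes "raw landing plus idempotent batch processing" safe: a failed job can simply be rerun over the landed data.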
Overview of the Role
Join KYC6, a distinguished division of ION Analytics, recognized as a leading independent provider of essential intelligence for professionals in Anti-Money Laundering (AML), Anti-Corruption, and Cyber Security. We maintain a comprehensive database drawing from over 240 countries, empowering our clients to make informed decisions throughout the due diligence process while effectively managing risks associated with financial crimes, money laundering, and terrorist financing.
In light of our growth initiatives, we are actively seeking talented Data Associates proficient in one or more of the following languages: French, Arabic, Portuguese, Spanish, Italian, or German, in addition to English. This role involves meticulous research on individuals, corporations, and their networks, utilizing both manual methodologies and automated processes. Candidates must possess a keen eye for detail and the ability to discern nuanced information, which is critical for the maintenance and enhancement of our database.
The ideal candidate will adopt a systematic and proactive approach to ongoing improvements in data collection, management, and analysis. Strong research capabilities and a solution-oriented mindset are essential to facilitate the expansion of our business. Effective communication skills are a must, enabling collaboration with teams across various technical and business functions and global locations, alongside strong organizational abilities.
As a Data Associate, you will develop expertise in AML compliance, actively participating in data innovation to enhance value for our clients while contributing to a culture of inclusivity and openness.
About Us
Terry Soot Management Group (TSMG) is a pioneering field data collection company established in 2017, dedicated to delivering high-quality data collection services in scenarios where automation is not feasible. We specialize in counting features, capturing images and videos, recording audio, and meticulously scanning areas to provide you with the essential details required for informed decision-making. Our talented teams operate across Europe and North America, embracing new challenges as they arise.
Project Overview
Our current project focuses on gathering high-resolution visuals of streets, notable landmarks, and public spaces throughout EU countries. Utilizing vehicles equipped with advanced 360° cameras and LiDAR technology, drivers will document comprehensive images of their surroundings. This data will be securely stored on onboard SSD drives and subsequently utilized to enhance one of the world's leading online mapping platforms.
Data collectors will navigate pre-assigned routes, concentrating on public thoroughfares, commercial districts, and places of interest. To ensure optimal image quality, certain locations may need to be revisited under favorable weather conditions.
The duration of the project varies depending on the specific data collection tasks required in each area, with assignments typically lasting between 3 and 6 months.
Work Schedule: Full-time engagement during standard business hours (8 AM - 7 PM).
Recruitment Process: Candidates for the Driver position will undergo a safety driving test, an interview with a recruiter, and an onboarding process.
Join a global leader in technology dedicated to energy innovation for a sustainable future. Our client is at the forefront of transforming energy solutions with advanced gas engine technologies, digital platforms, and energy services. With operations in over 100 countries, they focus on advancing engineering while prioritizing the well-being of people and the planet.
The Senior Data Engineer plays a crucial role in collaboration with Data Analysts, BI Developers, and Requirements Engineers to lay the groundwork for all analytical projects. This position entails building and managing data pipelines that extract, transform, and load (ETL) data from various sources into a centralized repository, subsequently facilitating project-specific data delivery.
Your Responsibilities
Architect and deploy scalable and resilient data pipelines that meet the analytics and data processing requirements
Design and maintain robust database architectures, including data lakes and warehouses
Ensure data integrity and consistency through meticulous data cleaning, transformation, and validation processes
Engage with Data Analysts, BI Developers, and Requirements Engineers to gather project requirements and provide data solutions aligned with business goals
Enhance data retrieval processes by developing pipelines and physical data models tailored for reports and various analytical projects
Implement data security and privacy protocols to ensure compliance with legal and regulatory standards
Document all created data pipelines comprehensively
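The cleaning, transformation, and validation step that this posting emphasizes is the heart of any ETL pipeline: malformed or rule-violating rows should be rejected explicitly rather than loaded silently. The following is a generic, minimal sketch in plain Python, not the client's actual pipeline; the field names (`id`, `kwh`) and the domain rule are invented for illustration:

```python
# Illustrative transform/validate stage of an ETL pipeline.
# Raw extracted rows (strings, as if from CSV) are type-coerced and
# checked against a domain rule; failures are collected separately so
# the load step only ever receives clean, typed records.

def transform(raw_rows):
    clean, rejected = [], []
    for row in raw_rows:
        try:
            rec = {"id": int(row["id"]), "kwh": float(row["kwh"])}
        except (KeyError, ValueError):
            rejected.append(row)  # missing field or non-numeric value
            continue
        if rec["kwh"] < 0:
            rejected.append(row)  # hypothetical rule: consumption >= 0
            continue
        clean.append(rec)
    return clean, rejected

raw = [
    {"id": "1", "kwh": "10.5"},
    {"id": "2", "kwh": "-3"},   # violates the domain rule
    {"id": "x", "kwh": "7"},    # malformed id
]
clean, rejected = transform(raw)
```

Keeping the rejected rows around (rather than dropping them) is what makes the root-cause analysis mentioned elsewhere on this page possible: bad outputs can be traced back to the specific source records that produced them.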
At Kpler, we empower organizations to navigate complex markets effortlessly. Our innovative approach simplifies global trade information, providing essential insights that drive informed decisions in the commodities, energy, and maritime industries.
Founded in 2014, Kpler has continually delivered premier intelligence through intuitive platforms. With a dedicated team of over 700 professionals from more than 35 countries, we transform intricate data into actionable strategies, ensuring our clients remain competitive in a rapidly evolving market. Join us to harness cutting-edge innovation for impactful outcomes and receive unmatched support on your path to success.
We are seeking an enthusiastic and skilled Senior Data Analyst to become a pivotal member of our Business Intelligence & Insights team. In this influential role, you will help shape the data architecture that underpins Kpler's commercial and strategic decisions. Reporting directly to the Head of BI, you will be responsible for managing essential data pipelines, designing scalable Looker solutions, and serving as a trusted data partner for stakeholders across the organization.
This role is ideal for an individual who excels at the intersection of data engineering and business analytics, capable of building robust, production-quality infrastructure one day and translating complex datasets into actionable insights for commercial teams the next. If you are passionate about working with real-time commodity flow data and eager to influence the BI strategy of a dynamic B2B SaaS company, we encourage you to apply.