Senior Software Engineer - Data at TrustYou | Remote
Experience Level
Senior
About TrustYou
TrustYou is an innovative leader in the hospitality industry, leveraging AI technology to enhance guest experiences and empower businesses to achieve remarkable growth. Our commitment to customer satisfaction and continuous improvement shapes our dynamic culture, fostering collaboration and excellence among our diverse, remote workforce.
Similar jobs
About Dataiku
Dataiku provides a unified platform for building, deploying, and managing AI and analytics across the enterprise. The platform connects teams and tools, supporting transparency, collaboration, and centralized governance. Organizations use Dataiku to run analytics, machine learning, and AI projects across multiple vendors and cloud environments.…
LITIT is a joint venture between NTT DATA and Reiz Tech, focused on delivering IT solutions in the DACH region. The company brings together German precision, Japanese work ethics, and Lithuanian talent to provide IT services and support.

This remote Data Analytics Engineer position centers on building and improving the IoT Insurance Data Platform (IDP) using AWS. The role involves designing, implementing, and maintaining scalable data pipelines and shared platform services that support analytics, data products, and machine learning applications for industrial IoT and insurance clients.

Key responsibilities
- Architect, deploy, and manage cloud-native data pipelines on AWS.
- Develop and maintain scalable ETL workflows, data lakes, and data mesh components.
- Create and optimize PySpark jobs for processing large-scale and time-series data.
- Manage schemas, tables, and metadata using AWS Glue Data Catalog and Lake Formation.
- Collaborate with data platform, analytics, and product teams.

Desired qualifications
- Extensive hands-on experience with AWS services, especially Lambda, Glue, and S3; familiarity with Athena, Lake Formation, Step Functions, and DynamoDB is a plus.
- Strong background in data engineering, including designing and executing ETL/ELT pipelines for large-scale or streaming data.
- Proficiency in Spark or PySpark for distributed data processing.
- Understanding of modern data formats such as Apache Iceberg and Parquet.
- Proven ability to deliver production-grade, enterprise-level data solutions.
- Experience with API integration, including AWS API Gateway and data exchange APIs.
- Familiarity with CI/CD pipelines and automated deployment processes.
- Experience working in cross-functional Scrum teams and an agile mindset.
- Willingness to travel domestically or internationally as project or client needs arise, sometimes on short notice.

Compensation and development
- Salary range: €4,700-€5,700 gross per month.
- Opportunities for learning and continuous professional growth.
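The PySpark responsibilities above center on rolling up large time-series datasets into aggregates. As a rough sketch of the underlying transform, written in plain Python rather than PySpark and using a hypothetical record shape rather than the IDP's actual schema, an hourly rollup might look like:

```python
from collections import defaultdict
from datetime import datetime

def hourly_averages(readings):
    """Average (timestamp, sensor_id, value) readings per sensor per hour.

    A plain-Python stand-in for the kind of time-series rollup the
    posting's PySpark jobs would perform at scale; the record shape
    is a hypothetical example, not the platform's actual schema.
    """
    buckets = defaultdict(list)
    for ts, sensor_id, value in readings:
        # Truncate the timestamp to the start of its hour.
        hour = datetime.fromisoformat(ts).replace(minute=0, second=0, microsecond=0)
        buckets[(sensor_id, hour.isoformat())].append(value)
    return {key: sum(vals) / len(vals) for key, vals in buckets.items()}

readings = [
    ("2024-05-01T10:05:00", "s1", 2.0),
    ("2024-05-01T10:45:00", "s1", 4.0),
    ("2024-05-01T11:10:00", "s1", 6.0),
]
# Two hourly buckets for sensor s1: mean 3.0 for 10:00, 6.0 for 11:00.
print(hourly_averages(readings))
```

In a Glue/PySpark setting the same grouping would typically be expressed as a DataFrame groupBy over Parquet or Iceberg tables registered in the Glue Data Catalog.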
Orcrist Technologies
Data Engineer (Python)

Company Overview
Orcrist Technologies is at the forefront of innovation with the Orcrist Intelligence Platform (OIP), a cutting-edge data intelligence system built on Kubernetes. Our platform is available as a SaaS solution or can be deployed on-premises, including air-gapped setups. We manage both streaming and batch data pipelines that power search functionality, machine learning enrichment, and investigative workflows for our mission-critical clientele.

Role Summary
As a Data Engineer, you will play a pivotal role in quickly validating new data initiatives from inception to deployment, ensuring they are adoptable and scalable. You will prototype effective connectors and pipelines, generate performance assessments, and create handoff packages for productization by our Foundation or delivery team.

Key Responsibilities
- Prototype ingestion and connector patterns (batch and streaming) using NiFi, Kafka, Kafka Connect/Streams, and Change Data Capture approaches.
- Design schemas and data models that are both prototype-grade and easily adoptable, ensuring semantic clarity and a disciplined approach to evolution.
- Develop incremental lakehouse datasets using Hudi, Iceberg, and Delta patterns, producing outputs for real-world latency and throughput evaluations.
- Implement data quality and provenance considerations early in the process, incorporating checks, metadata hooks, and operational basics.
- Containerize and deploy prototypes on Kubernetes, providing minimal runbooks and configurations for seamless adoption.
- Create adoption artifacts including schemas, reference implementations, technical design notes, and a backlog for integration.

Qualifications
- Minimum of 3 years of experience in data engineering, with a proven track record of delivering real-world data pipelines beyond ad-hoc scripts.
- Proficient in Python and SQL; skilled in building transformations, validation tools, and pipeline integration code.
- Solid understanding of streaming and Change Data Capture fundamentals, along with experience in the Kafka ecosystem.
- Familiar with lakehouse architectures and query layers (e.g., Hudi, Iceberg, Delta, Trino, Hive, Postgres) and their role in making datasets accessible.
- Comfortable working in Kubernetes and container environments; able to document technical decisions clearly.
- Must be eligible to work in Germany; EU/NATO citizenship is preferred, and export-control screening will apply.

Preferred Qualifications
- Experience with data quality tools such as Great Expectations or metadata/lineage platforms (OpenMetadata, DataHub, Atlas).
- Experience with on-premises or air-gapped deployments and awareness of governance and policy for regulated environments.
- Proficiency in German (B1+) and familiarity with OSINT, GEOINT, or multi-INT data structures.

What We Offer
A modern data stack with real-world constraints: Kafka, NiFi, and more.
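The posting's emphasis on implementing data quality checks early in the process can be sketched as a simple row-level validator. The rule set and field names below are illustrative assumptions, not Orcrist's actual checks:

```python
def validate_records(records, required_fields=(), non_negative=()):
    """Split records into (valid, errors) using basic row-level checks.

    A minimal sketch of 'quality checks early in the pipeline':
    required fields must be present and non-empty, and numeric fields
    listed in non_negative must not be below zero. The rules and field
    names are illustrative, not a specific platform's checks.
    """
    valid, errors = [], []
    for i, rec in enumerate(records):
        problems = [f for f in required_fields if rec.get(f) in (None, "")]
        problems += [
            f for f in non_negative
            if isinstance(rec.get(f), (int, float)) and rec[f] < 0
        ]
        if problems:
            errors.append((i, problems))  # keep the row index for provenance
        else:
            valid.append(rec)
    return valid, errors
```

For example, validating `[{"id": 1, "amount": 5}, {"id": None, "amount": -2}]` with `required_fields=("id",)` and `non_negative=("amount",)` passes the first record and flags the second for both rules. Libraries such as Great Expectations generalize this pattern into declarative, reusable expectation suites.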
publiccloudgroup
Join publiccloudgroup as a Data Engineer (w/m/d) and take the next step in your career while working remotely from Germany. In this role, you will be responsible for designing, implementing, and maintaining robust data pipelines that support our cloud-based solutions. Collaborate with cross-functional teams to leverage data for business insights and drive decision-making processes.
Join our team as an AI Data Specialist and help enhance AI-generated content in German.

Job Type: Freelance
Location: Work from home
Work Schedule: Part-time, 10+ hours per week. Enjoy the flexibility to work whenever you want.
Start Date: Immediate
Duration: To be confirmed
Rate: Rates vary by location.

Help Shape the Future of AI
Are you a student, recent graduate, stay-at-home parent, or a professional seeking a flexible work opportunity? Do you want to influence the development and safety of AI models today?

Your Responsibilities
This role involves various data-related tasks, including:
- Data collection, evaluation, and annotation.
- Conducting pairwise comparisons.
- Counting tasks.
- Labeling and tagging objects across different content types (audio, video, images, and datasets).

Qualifications
- Native-level fluency in German.
- English proficiency: fluent or advanced (B2-C2).
- AI and data experience (preferred): familiarity with machine learning tasks, data collection, preprocessing, evaluation, and annotation.

What We Offer
- Flexible working hours
- Opportunity to earn extra income
- Timely payments
- Ideal for students, part-timers, or stay-at-home parents
O-I Glass, Inc.
As a Shift Leader at O-I, you will play a crucial role in overseeing operations to ensure efficiency and safety within our production facility. Your leadership will drive our team towards achieving production targets while maintaining the highest quality standards.
Why Join Nebius
Nebius is at the forefront of cloud computing, catering to the global AI economy. We empower our clients with innovative tools and resources to tackle real-world problems and revolutionize industries, all while minimizing infrastructure expenses and the need for extensive in-house AI/ML teams. Our workforce operates at the cutting edge of AI cloud infrastructure, collaborating with some of the most experienced and visionary leaders and engineers in the industry.

Your Work Environment
With our headquarters in Amsterdam and a listing on Nasdaq, Nebius has a global presence with research and development hubs throughout Europe, North America, and Israel. Our team of over 1,400 employees includes more than 400 highly skilled engineers with extensive knowledge of both hardware and software engineering, supported by an in-house AI R&D division.

The Role
The Data Engineering team creates and sustains the data infrastructure that drives analytics and business intelligence across Nebius. Our responsibilities include designing and implementing scalable data pipelines, optimizing data storage and processing, and facilitating data-driven decision-making across the organization. The position involves close collaboration with product teams and business stakeholders to ensure alignment with corporate objectives. As a Data Engineer, you will design, build, and maintain our data infrastructure and pipelines. Your work will include processing large-scale datasets, optimizing data workflows, and enabling analytics capabilities that support our rapidly expanding cloud platform.

Your Responsibilities
- Design, develop, and maintain scalable data pipelines.
- Build and optimize data infrastructure.
- Implement data quality monitoring and validation frameworks.
- Enhance data storage, processing, and query performance for large-scale datasets.
Qualifyze GmbH
About Qualifyze
Founded in 2019, Qualifyze has quickly established itself as a premier provider of supply chain compliance management in the Life Sciences sector, earning the trust of over 1,500 pharmaceutical and healthcare firms worldwide. Our digital suite of solutions connects manufacturers, suppliers, and a global network of more than 250 auditors and quality professionals.

With a portfolio of over 4,500 audits conducted across 85+ countries, and the largest, most accurate supplier network along with advanced data analytics tools, Qualifyze is a comprehensive partner for quality compliance and supply chain risk mitigation in the Life Sciences industry.
publiccloudgroup
Join the innovative team at publiccloudgroup as a Data & AI Engineer! In this fully remote role, you will leverage your expertise in data engineering and artificial intelligence to develop cutting-edge solutions for our clients. Collaborate with cross-functional teams to design, implement, and optimize data pipelines, ensuring the seamless integration of AI technologies into our cloud infrastructure.
TrustYou stands at the forefront of AI-driven hospitality solutions, committed to enhancing guest experiences while empowering businesses to excel. Our diverse team of over 120 talented professionals collaborates remotely from various locations worldwide, united in a mission to help companies achieve exceptional customer satisfaction.

At TrustYou, our culture is dynamic, shaped by the contributions of our team members. We value open feedback and are dedicated to continuous improvement and excellence in customer service. Every individual's unique perspective enriches our collective success, fostering an environment of experimentation, learning, and growth.

Our innovative products are designed to enhance customer satisfaction, increase customer lifetime value, and minimize unnecessary expenditures.

Customer Experience Platform (CXP): Gain AI-powered insights that elevate guest experiences. Enhance service quality using feedback from surveys and reviews, respond to all comments with AI assistance, and strengthen your brand reputation.

Customer Data Platform (CDP): Transform customer data management with AI for more direct bookings. Integrate and master customer data, manage consent, and convert insights into tailored marketing strategies and personalized journeys.

AI Agents: Our intelligent, always-on agents enhance productivity and reduce operational costs. Available around the clock, they offer immediate, personalized recommendations and streamline direct booking.

Discover more about TrustYou at www.trustyou.com. If you are passionate about enhancing customer happiness and making a significant impact, you belong here. Join us in our journey to innovate and excel in the hospitality industry.

Position: Senior Software Engineer - Data
Location: Germany / Spain / Romania (Remote)

As a Senior Software Engineer - Data, you will play a pivotal role in the complete development and technical execution of our core data infrastructure. Your primary focus will be on crafting clean, efficient code for high-performance, data-intensive APIs and the processing pipelines that serve as the backbone of our data products. This position is tailored for a technical specialist who thrives on hands-on work with distributed systems and takes full accountability for implementing and delivering robust data solutions.
GetYourGuide
Transform the Travel Experience
Embark on an exciting journey with GetYourGuide, where we strive to connect travelers with unforgettable experiences worldwide. With millions relying on us for unique and trustworthy activities, we are dedicated to making every journey remarkable, including yours. Are you ready to unleash your potential within a community of fellow adventurers? Discover your next opportunity at our headquarters in Berlin or in one of our local offices across the globe, from New York to Bangkok. Visit getyourguide.careers to get started.

Team Mission
As part of the Traveler Data Engineering team within the strategic Flywheel Data Engineering group, our mission is to establish robust data foundations and self-service capabilities for the traveler side of our marketplace. We ingest, integrate, and structure crucial internal and external datasets into reliable tables, metrics, and data products that enhance the customer experience throughout their journey. Collaboration with Product and Data teams is key to enabling faster and more confident decision-making.

Your Responsibilities
- Independently build end-to-end data solutions: create reliable, high-quality datasets and pipelines that support traveler-related decision-making, including acquisition, conversion, engagement, and retention.
- Act as a trusted thought partner: collaborate closely with Product and Data teams to translate business needs into actionable outcomes and promote the adoption of self-serve data capabilities.
- Deliver high technical craft: implement best practices in code quality, data modeling, testing, and monitoring; contribute to the operational support of your delivered solutions and enhance their reliability over time.
- Enhance existing production systems: proactively refactor and simplify existing pipelines/models, address data quality issues at their source, and implement targeted performance and cost improvements.
- Contribute to team outcomes: engage in planning, roadmapping, code reviews, and knowledge sharing to boost team effectiveness.
- Maintain a strong operational mindset: balance operational duties with the development of new solutions, using team SLOs as guidance.
Join n8n as a Senior Engineer I-II and play a pivotal role in enhancing our Core Workflow Engine. As part of our dynamic team, you will have the opportunity to work remotely while collaborating with talented professionals across Europe. Your contributions will help shape the future of our innovative platform, ensuring seamless automation and integration for our users.
GetYourGuide
Transform the Future of Travel
Join GetYourGuide on an exciting journey to connect travelers with unforgettable experiences worldwide. Our platform is trusted by millions seeking unique activities, and we are dedicated to making every journey extraordinary, including yours! Are you ready to unlock your potential with a community of fellow explorers? Explore opportunities at our Berlin headquarters or in one of our local offices worldwide, from New York to Bangkok. Visit getyourguide.careers to embark on your next adventure.

Team Mission
Within the Growth Data Engineering team, part of the strategic Flywheel Data Engineering group, our mission is to establish trusted data foundations and self-service capabilities for customer growth in our marketplace. We ingest, integrate, and structure essential internal and external datasets into dependable tables, metrics, and data products, empowering GetYourGuide to acquire and retain customers efficiently. Our close collaboration with Growth Analytics, Product, and Data teams enables accelerated and confident decision-making.

Your Mission
- Create end-to-end data solutions independently: provide reliable, high-quality datasets and pipelines that support traveler-facing decisions (e.g., acquisition, conversion, engagement, retention).
- Act as a trusted thought partner: collaborate with Product and Data teams to translate business needs into actionable outcomes and promote the adoption of self-service data capabilities.
- Exhibit technical excellence: implement best practices in code quality, data modeling, testing, and monitoring; contribute to operational support to enhance reliability over time.
- Enhance existing production systems: pragmatically refactor and simplify current pipelines/models, address data quality issues at their source, and execute targeted performance and cost optimizations within your scope.
- Contribute to team success: engage in planning, roadmaps, code reviews, and knowledge sharing to elevate team effectiveness.
- Maintain an operational mindset: balance operational responsibilities with new solution development, prioritizing improvements based on team service-level objectives.
Bluefish AI
About the Position: We are experiencing rapid growth at Bluefish AI, and our Engineering team is set to expand significantly in the next three months. We are seeking an Engineering Manager to join our esteemed engineering leadership team. This is not merely a leadership position; it presents a unique opportunity to influence the foundational aspects of a dynamic, product-oriented startup during a pivotal growth phase. In this role, you will oversee our Data Engineering team, tasked with maintaining the pipelines and infrastructure that drive our AI-powered commerce intelligence platform. Our systems handle and process hundreds of millions of synthetic prompts and signals, with expectations for substantial scaling as our traffic surges in the forthcoming months. A key focus of this position will be to guide the evolution of our data platform towards a scalable data lake architecture while ensuring reliability for the critical production pipelines utilized by our enterprise customers. You will spearhead efforts in execution, quality assurance, and cultivating a positive team culture. Your leadership will be essential in transforming early-stage velocity into mature, high-performing teams, all while preserving the agility and pace that define exceptional startups.
Roland Berger seeks a Junior Data Engineer to join the IT department in Munich. This position plays a part in building and maintaining data solutions that support ongoing projects and daily operations.

Role overview
This role involves supporting team efforts to develop and improve data systems. Collaboration and a willingness to learn from colleagues are central to the work. The environment encourages teamwork and open knowledge sharing.

What you will do
- Assist in creating and maintaining data solutions for business needs
- Work alongside team members on data-driven initiatives
- Help improve existing data systems and processes

Location
This position is based in Munich.
Statista Inc.
As the Head of Data Engineering at Statista, you will lead our data engineering team to architect and implement scalable data solutions that empower our data-driven decision-making process. You will collaborate closely with cross-functional teams to ensure that our data infrastructure supports the growing needs of the organization, while also mentoring and developing your team to foster a culture of innovation and excellence.
We are seeking a skilled DWH Architect / Senior Data Engineer to join our client's team, a prominent insurance company based in Switzerland. If you are ready to tackle complex challenges and actively shape modern data platforms, we would love to hear from you! This position allows for fully remote work from Germany.

Your Responsibilities
- Design and implement advanced Data Warehouse architectures.
- Lead migrations and modernization of existing DWH systems.
- Analyze and translate business and technical requirements into scalable, architecture-compliant solutions.
- Collaborate closely with internal stakeholders and cross-functional teams.
- Ensure data governance, security, and compliance with regulatory requirements in Switzerland.
- Continuously develop and optimize data platforms and reporting solutions.
- Establish and enforce naming conventions, self-service approaches, system delineation, tool recommendations, and guidelines.
SumUp
Join our dynamic Platform tribe at SumUp, where your expertise will help us create an innovative self-service, AI-ready Data Platform. Our Data Platform team plays a crucial role in supporting our ambitious ventures into data, AI, and real-time analytics. We are committed to building robust and scalable infrastructure that empowers our global data community.

Your Responsibilities
- Design and sustain an exceptional data infrastructure that underpins vital processes.
- Enhance platform self-service capabilities by developing features for ETL and orchestration.
- Streamline processes to maximize compute resource efficiency and minimize costs.
- Implement automation solutions to ensure our infrastructure's availability around the clock.
- Participate in the design and rollout of new services within our platform ecosystem.
Mindrift is seeking talented Python Data Scraping Engineers to contribute to the Tendem project, enhancing our hybrid AI-and-human system through specialized data scraping workflows. In this role, which we call an AI Pilot at Mindrift, you will work alongside Tendem Agents that execute repetitive tasks, while you apply critical thinking, domain expertise, and quality assurance skills to deliver precise, actionable data. This part-time remote position is well suited to technical experts with substantial experience in web scraping, data extraction, and processing.

About Us
The Mindrift platform bridges specialists with AI projects from leading technology innovators. Our mission is to harness the power of Generative AI by leveraging expertise from specialists around the world.

Role Overview
This freelance role is part of the Tendem project. As a Python Data Scraping Engineer, you will execute data scraping tasks with a focus on technical accuracy in web extraction and processing, employing tools such as Apify and OpenRouter along with your own methods.

Key Responsibilities
- Manage end-to-end data extraction across intricate websites, ensuring thorough coverage, accuracy, and dependable delivery of structured datasets.
- Use internal tools (Apify, OpenRouter) and custom workflows to expedite data collection, validation, and task execution while adhering to defined specifications.
- Ensure reliable extraction from dynamic and interactive web sources, adapting strategies to JavaScript-rendered content and evolving site behavior.
- Maintain data quality through validation checks, cross-source consistency controls, adherence to formatting guidelines, and systematic verification before delivery.
- Scale scraping operations for large datasets through efficient batching or parallelization, monitor failures, and keep runs stable against minor changes in site structure.

Compensation
Contributors can earn up to $32 per hour, depending on experience and contribution speed. Compensation varies by project based on scope, complexity, and required expertise.

How to Apply
Apply to this posting and qualify, and you will have the opportunity to work on projects that match your technical skills, on your own schedule. From coding and automation to refining AI outputs, you will play a pivotal role in advancing AI capabilities and real-world applications.
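The core of such work is turning raw HTML into structured records and validating them before delivery. A rough sketch of that extraction-plus-validation pattern, using only Python's standard library (real pipelines would add fetching, retries, and JavaScript rendering, e.g. via Apify actors; the validation rules here are assumed for illustration):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect (href, text) pairs from anchor tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.links = []
        self._href = None   # href of the <a> tag currently open, if any
        self._text = []     # text fragments seen inside that tag

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def extract_links(html):
    """Parse HTML, then run a validation pass before 'delivery':
    drop entries with a missing href or empty anchor text."""
    parser = LinkExtractor()
    parser.feed(html)
    return [(href, text) for href, text in parser.links if href and text]
```

Calling `extract_links('<a href="/jobs/1">Data Engineer</a><a href="">Broken</a>')` keeps only the first, well-formed entry. The point of the sketch is the separation of concerns: extraction produces candidate records, and a distinct validation step decides what is fit to deliver.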
Start your career as a process mechanic for glass technology (Verfahrensmechaniker für Glastechnik) at O-I! You will be responsible for all aspects of glass production, from raw material intake through to packaging of the finished product.

This exciting field offers a wide range of tasks, including:
- Retooling, setting up, and operating production machines.
- Starting up and monitoring the machines that produce hollow glass containers.
- Performing regular quality checks and the associated documentation.
- Quality inspections of the finished product, such as the mouth and glass thickness.
