Your Responsibilities
- Build and maintain distributed data pipelines using Scala, Spark, and cloud technologies.
- Work collaboratively with engineers, data scientists, and product teams to deliver reliable, scalable data systems.
- Design and optimize data ingestion and transformation workflows across both blockchain and traditional datasets.
- Ensure system accuracy, scalability, and efficiency while processing hundreds of millions of daily data points.
- Evaluate design options and trade-offs in terms of performance, scalability, reliability, and cost.
- Contribute to the full lifecycle of data platform development, from design and deployment to ongoing enhancements.
- Improve pipeline reliability, observability, and automation through code and tooling improvements.
- Expand your influence and take on increasing responsibility as you gain a deeper understanding of our distributed systems and platform architecture.

Technical Environment
Technologies we use include: Scala, Spark, Databricks, AWS, Airflow, Kubernetes, Terraform, and functional programming. (No Scala experience yet? If you are an experienced data engineer eager to learn, we will support you.)

Ideal Candidate Attributes
- Passionate about writing clean, well-tested, and efficient code.
- Uses data and experimentation to guide informed decisions.
- Thrives in a collaborative environment and embraces challenges with enthusiasm.
About the job
Join Us in Shaping the Future of Blockchain Intelligence
At Elliptic, we are pioneering the intelligence framework for the future of finance. Our dedicated teams are committed to transforming intricate blockchain and off-chain data into actionable insights, equipping financial institutions, regulators, and businesses with the confidence to innovate. Our mission is to make digital-asset intelligence seamlessly accessible, as we design and scale the data streams and services that drive Elliptic’s analytics and decision-making products.
As a Data Engineer, you will architect and optimize systems that process vast blockchain and off-chain datasets, enabling organizations across the globe to make informed, data-driven decisions.
Your role will involve engaging with platform-focused teams or those directly handling product data, as you tackle challenges related to batch and streaming processing while developing high-quality, scalable solutions in a rapidly changing ecosystem.
About Elliptic
Elliptic is at the forefront of blockchain intelligence, dedicated to building the intelligence layer for the future of finance. We empower organizations with insights derived from complex blockchain data, enabling them to innovate with confidence.
As a Data Engineer at dev2, you will play a pivotal role in designing, building, and maintaining data pipelines that enable our teams to make data-driven decisions. You will work closely with data scientists and analysts to ensure the data infrastructure is robust and scalable. Your responsibilities will include optimizing data flow, ensuring data quality, an…
Join dev2 as an Engineering Manager in the dynamic FinTech sector. In this pivotal role, you will lead a team of talented engineers, driving innovation and excellence in financial technology solutions. Your expertise will shape the development of cutting-edge products that transform the financial landscape. As a key leader, you will be responsible for overseeing engineering projects, ensuring they meet high-quality standards and align with strategic business objectives. You will foster a collaborative environment, mentoring team members and promoting best practices in software development.
Join our culinary team as a Head Chef at Dev2, where your creativity and leadership will shine. As the Head Chef, you will oversee kitchen operations, manage staff, and ensure the highest quality of culinary standards. We are looking for a passionate individual who can bring innovative ideas to our menu and create delightful dishes that will impress our guests.
Join our dynamic team at dev2 as a Senior Sales Recruiter. In this pivotal role, you will be responsible for attracting top talent to our sales division. You will collaborate closely with hiring managers to understand their needs and deliver exceptional candidates that align with our company culture and objectives. Your expertise in sales recruitment will be vital in shaping our team and driving the success of our organization. If you are passionate about recruitment and excited by the challenges of finding the best talent, we would love to hear from you!
Join our dynamic team at dev2 as a Senior Product Designer and play a pivotal role in creating innovative product solutions. Collaborate with cross-functional teams to design user-centric experiences that drive engagement and satisfaction.
About YouLend
YouLend is an innovative and rapidly expanding FinTech company, recognized as the leading embedded financing platform for top-tier e-commerce platforms, technology firms, and Payment Service Providers. Our cutting-edge software empowers partners to enhance their offerings by providing customized financing solutions to their merchants under their own branding, all while mitigating capital risk. Backed by EQT, a prominent Private Equity firm, YouLend has achieved remarkable growth, exceeding 100% year-over-year since 2020. Based in London, we also operate across various European countries and the United States, serving renowned partners such as eBay, Amazon, Just Eat, Shopify, and Stripe.

Position Overview
We are looking for a Data Engineer to become an integral part of our expanding Data Engineering & Platform team. This role is pivotal, bridging infrastructure, DevOps, and advanced data tooling, with a primary focus on enabling rapid, secure, and scalable analytics. You will play a crucial role in building and scaling a premier data platform that supports a wide array of functions, including dashboards, experimentation, machine learning, and compliance.

Key Responsibilities
- Develop and oversee the infrastructure for our data platform, using technologies such as AWS, Snowflake, dbt, and Airflow.
- Design and execute CI/CD pipelines for dbt and other data workflows.
- Automate data platform operations using Python and infrastructure-as-code tools such as Pulumi and Terraform.
- Collaborate with analytics, machine learning, product, and engineering teams to enhance data solutions.
- Maintain data quality, lineage, and governance through rigorous testing and monitoring.
- Use cost observability tools to promote efficient platform usage.

Qualifications
The ideal candidate will possess the following qualifications:
- Demonstrable experience with cloud-based data platforms such as Snowflake, Redshift, or BigQuery.
- Strong proficiency in Python and SQL for automation and analytics.
- Familiarity with CI/CD processes, particularly for dbt or similar data platforms.
- Practical experience with Infrastructure as Code (IaC) tools including Pulumi, Terraform, or CloudFormation.
- Solid understanding of orchestration tools such as Airflow.
- Exceptional communication skills with a history of cross-functional collaboration.

Desirable Skills
- Experience with AWS services (S3, Lambda, MWAA) and Azure DevOps.
- Familiarity with monitoring tools such as DataDog.

Why Choose YouLend?
Recognized as one of the "Best Places to Work in 2024", YouLend offers a dynamic workplace environment.
Join dev2 as an SCL Shift Engineer, where you will play a crucial role in maintaining and optimizing our systems. In this dynamic position, you will work closely with our technical team to ensure operational efficiency and reliability.
Join our dynamic team at OctoEnergy as a Data Engineer, where you will play a pivotal role in transforming data into actionable insights. We are seeking a motivated and innovative individual who thrives on tackling complex challenges and delivering high-quality solutions.
ASOS Plc seeks a Data Engineer in London to support data processing and analytics for its online fashion platform. This role centers on improving how data is managed and used across the company.

Role overview
The Data Engineer will work with modern data technologies, contributing to projects that influence data-driven decisions throughout the business. The position involves collaborating with teams to strengthen data workflows and ensure reliable analytics.

Location
This is a London-based role, working onsite with ASOS Plc's technology and analytics teams.
About Swap
At Swap, we are revolutionizing modern commerce with our unique AI-native platform that seamlessly integrates backend operations with an innovative storefront experience. Our solution is designed for brands aspiring to sell anything, anywhere, by centralizing global operations and enabling intelligent workflows. With real-time data and capabilities, our products cover cross-border transactions, tax management, returns, demand planning, and our state-of-the-art agentic storefront. This empowers merchants with complete transparency and the confidence to make informed decisions. We are fostering a culture at Swap that emphasizes clarity, creativity, and a shared sense of ownership as we redefine the landscape of global commerce.

About the Role
We are seeking enthusiastic, detail-oriented, and adaptable Data Engineers to join our platform team during an exciting phase of growth. This role is ideal for proactive engineers who thrive in dynamic environments and are eager to take charge, contribute significantly, and collaborate closely with product teams. In this pivotal position, you will play a key role in designing, optimizing, and scaling our premier data platform, which fuels our advanced customer-facing agentic systems. You will engage proactively with Product Managers and stakeholders to advocate for best practices, ensuring our platform remains robust and poised for future advancements. This hands-on builder role is suited for knowledgeable and enthusiastic team players excited about the opportunity to accelerate and expand a greenfield platform.

Key Responsibilities
- Data Pipeline Engineering: Develop, optimize, and sustain comprehensive end-to-end data pipelines for essential business operations. Prioritize low latency, strong observability, and effective alerting for batch and stream processing to guarantee data reliability.
- API Development: Support the design and implementation of new features within the API layer, collaborating with product teams to ensure efficient data access.
- Platform Contribution: Own specific components of the data platform, actively contributing technical enhancements for scalability, future readiness, and alignment with Swap's long-term product strategy.
- Data Quality & Governance: Implement and advocate for best practices in data quality to ensure the data's integrity and reliability.

What We Would Like to See
- A minimum of 3 years of experience in data engineering or related roles, demonstrating a strong understanding of data systems.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Strong problem-solving skills and a passion for data-driven decision-making.
About Tilt
At Tilt, we are on a mission to Make Commerce Alive. Our platform revolutionizes the traditional e-commerce landscape, moving away from outdated website builders and faceless marketplaces to create vibrant, community-driven experiences for the new generation of merchants. With millions of engaged shoppers across the UK, from sneaker enthusiasts to collectors, Tilt empowers sellers to achieve remarkable earnings, often exceeding £1M. We are just getting started!

Your Mission
As our Lead Data Engineer, you will take charge of shaping and advancing our comprehensive data platform. This is a hands-on role with significant ownership where you will oversee the architecture, reliability, and scalability of our data warehouse and analytics stack. Your expertise will help refine our existing systems, including Snowflake, Dagster, dbt, Postgres, Metabase, and ContextFlow, while establishing standards for intelligent growth. We seek an individual who not only writes SQL but deeply understands data, thinks critically, and designs enduring systems.

What You'll Do

0–3 months
- Gain an in-depth understanding of our Snowflake warehouse architecture
- Audit and enhance existing dbt models for clarity and performance
- Stabilize and review Dagster orchestration pipelines
- Identify data quality gaps and implement rigorous testing standards
- Ensure key dashboards in Metabase are based on reliable datasets
- Document current systems and define architectural standards

3+ months
- Redesign and optimize warehouse architecture where necessary
- Implement scalable data modeling patterns across various domains
- Enhance cost efficiency and performance in Snowflake
- Introduce advanced observability, testing, and lineage practices
- Enable AI and product features by structuring high-quality, production-ready datasets
- Establish clear version control and deployment workflows for data changes
- Collaborate with product and engineering teams to define data contracts and ownership
Abound is changing how consumer lending works in the UK and beyond. By combining AI with Open Banking data, the company offers personal finance options that go beyond traditional credit scores. Each applicant is evaluated based on real financial habits and repayment history, aiming for fairer access to credit. Since launch, Abound has issued over £1.3bn in loans and kept default rates well below industry averages. Profitability came within 2.5 years. With more than £2bn in investment from Citi, GSR Ventures, Deutsche Bank, and others, Abound is recognized as one of Europe's fastest-growing fintechs. As the company expands into new markets and products, it seeks people eager to learn, take ownership, and grow alongside the team.

Role overview
The Data Engineer will join the Platform team in London, focusing on the development and improvement of Abound's Data Lake. This position acts as a connector between Platform and Data Science, maintaining the infrastructure and data pipelines that drive decision-making across the business. The role is individual contributor level, offering autonomy and the chance to make a measurable impact.

Main focus for the first 6–12 months
- Upgrade the Data Lake into a scalable, high-performance platform for analytics and data science.
- Migrate production workloads, including model calibration, ad-hoc analytics, and reporting, into the Data Lake.
- Redesign data structures to improve query speed, lower AWS costs, and ensure timely data availability.
- Collaborate closely with the Data Science team to support rapid experimentation and delivery through reliable data systems.

Technology stack
- Cloud & Compute: AWS, ECS Fargate, AWS Lambda
- Databases & Data Lake: Aurora (PostgreSQL, MySQL), Athena, DMS, Glue, Iceberg
- Languages & Infrastructure as Code: Python, Spark, SQL
- Observability & Tooling: Amazon Managed Prometheus (AMP), incident.io, GitLab
About The Dot Collective
At The Dot Collective, we are a forward-thinking consultancy operating across the UK and EU, dedicated to engineering excellence and empowering individuals to create significant impact. Our team utilizes the latest technology stacks and embraces agile scrum methodologies for all our projects.

About You
Are you driven by a passion for data and its transformative potential? Do you thrive on making a substantial difference in a short time frame? If so, we may be the perfect fit for you.
Join Our Team!
Contentful is seeking a Senior Data Engineer to play a pivotal role in designing, developing, and scaling advanced data solutions that drive analytics, operational reporting, and strategic decision-making within our organization. You will collaborate closely with our data, analytics, and business teams to create robust data pipelines, models, and integrations that ensure clean, reliable, and actionable data. You will be an integral part of a small, geographically diverse engineering team that operates with a product mindset, delivering iterative value while partnering closely with business and analytics stakeholders, and taking ownership of both the development and operational aspects of a critical data platform.

Your Responsibilities
- Design, build, and maintain scalable data pipelines and transformations that integrate multiple systems and sources.
- Create high-quality data models that cater to analytics, reporting, and operational use cases.
- Collaborate with analytics, product, and business teams to identify data needs and translate them into effective technical solutions.
- Establish and enforce strong data quality, validation, and monitoring protocols to ensure the reliability and trustworthiness of our data.
- Optimize data storage, processing, and performance within cloud data warehousing environments.
- Contribute to the ongoing improvement of our modern data platform, including tooling, standards, and best practices.
- Support the operations of the data platform by troubleshooting issues, enhancing reliability, and ensuring SLAs are consistently met.
- Engage with cross-functional partners on governance, documentation, definitions, and data stewardship.
- Implement CI/CD practices and Infrastructure-as-Code (IaC) for automated deployment, testing, and environment management.
- Participate in code reviews, design discussions, and operational on-call rotations as required.
- Mentor team members and promote data engineering best practices across the organization.

Your Skills and Experience

Essential Qualifications
- 5+ years of experience in a Data Engineering or similar technical role.
- Expertise in SQL and experience working with cloud data warehouses (e.g., Snowflake, Redshift, BigQuery).
- Experience building and maintaining ETL/ELT pipelines using tools such as dbt, Airflow, or similar frameworks.
- Proficiency in Python or another scripting language.
- Strong understanding of data modeling, data structures, and modern data architecture principles.
Join Eucalyptus as a Data Engineer and play a pivotal role in transforming data into actionable insights. We are looking for innovative thinkers who are passionate about data architecture and analytics. You will work closely with cross-functional teams to design, build, and maintain scalable data pipelines that drive strategic business decisions.
About the Role
Jane Street is seeking a skilled Data Engineer to enhance our understanding, management, and dissemination of the data that informs our trading strategies. At Jane Street, the ability to comprehend and manipulate data accurately is fundamental to our operations. In this role, you will utilize a combination of proprietary and open-source tools to analyze diverse datasets, identifying anomalies, ensuring consistency in formats and symbologies, automating ETL processes, and ultimately simplifying the process for our traders to derive insightful conclusions. We are looking for someone who is passionate about diving deep into data and articulating findings to various stakeholders, collaborating closely with traders and software engineers. While familiarity with financial data is beneficial, we do not require a financial background. We are eager to hire talented engineers and provide the necessary training.
Join Our Journey
Welcome to Zopa! Founded in 2005, Zopa pioneered peer-to-peer lending and has since evolved, launching Zopa Bank in 2020. Our mission is to transform the banking experience by prioritizing customer needs and redefining financial services. We empower our team and customers to challenge the norms and aim high. Discover our innovative offerings at Zopa.com!

We take pride in our accomplishments and our outstanding team, which has propelled us to be recognized as one of the UK's Most Loved Workplaces. At Zopa, we welcome those who thrive on unconventional challenges and are eager to make a significant impact. Follow us on Instagram @zopalife to see our culture in action!

About the Team
The Data Engineering team is essential in designing, constructing, and managing the intricate data pipelines vital to our bank's operations. We leverage a blend of software engineering, data analytics, and operational skills while collaborating with various teams to tackle diverse and complex challenges. Our culture values teamwork, practicality, and continuous learning over individual accomplishments. This role offers you the opportunity to join Zopa's Data Engineering team at a pivotal moment of growth and advancement. You will be instrumental in developing and enhancing the data platforms that drive decision-making throughout the bank, from customer behavior insights to critical applications like fraud detection, where every moment counts.
Full-time | On-site | London, United Kingdom
About the Role
Join Teza Technologies as a Data Engineer in our dynamic data team, where data is the heartbeat of our systematic trading and vital to every facet of our operations. This hands-on role within a growing team of data engineers offers significant potential for career advancement, as we anticipate rapid growth over the coming years. We seek candidates with exceptional technical expertise, meticulous attention to detail, and a proven track record in architecting and developing robust data platforms.

Location
Hybrid role based in London, UK, with a requirement of 3 days in-office each week.

Key Responsibilities
- Collaborate with Portfolio Managers and Quantitative Developers to translate business needs into effective technical solutions while providing insights into dataset intricacies.
- Enhance our data warehouse by designing and integrating new data sources and functionalities; boost system reliability, speed, and scalability while overseeing data access management.
- Contribute innovative data management, analytics, and technological insights to the team and leadership.
- Assess and recommend new tools and technologies for organizing, querying, and streaming extensive datasets.
- Create and implement automated systems for data cleansing, anomaly detection, monitoring, and alerting.
- Provide support for our production data warehouse as needed.
- Cultivate and maintain strong vendor partnerships aligned with our business objectives.

Essential Qualifications
- Proficiency in Python and Unix/Linux for data manipulation, scripting, and automation.
- Deep understanding of SQL and familiarity with NoSQL databases, especially Postgres and MongoDB, with skills in query optimization and performance tuning.
- Strong grasp of data modeling principles, including both normalization and denormalization techniques.
- Experience with cloud platforms such as AWS or GCP.
- Familiarity with Git version control, collaborative workflows (e.g., GitHub), and CI/CD best practices.
- Bachelor's degree in Computer Science, Information Technology, or a related field.

Preferred Qualifications...
Who are we?
Smarkets: Shaping the Future of Betting
At Smarkets, we operate one of the most advanced prediction markets globally, with a staggering £29 billion in volume processed since our inception in 2010. Our platform engages over 200,000 traders worldwide, revolutionizing betting across various sectors, including sports and political markets, by providing the most competitive prices and fairest odds.

Our tech stack is engineered for scalability, reliability, and performance, utilizing Linux, Kafka, Postgres, and Kubernetes, while Python 3, C++, Rust, and React underpin our platform. We construct infrastructure that institutions can rely on while ensuring trading remains accessible to all users. Our resilience is evident as we have thrived through every market trend and competitive landscape.

What sets us apart is our exceptional team. We foster a high-performance culture where talent flourishes, merging extensive business knowledge with a strategic approach to drive growth. If you're eager to help redefine the future of prediction markets with innovative technology and a customer-centric approach, Smarkets is your ideal workplace.

The Team
Our Data Team plays a crucial role in harnessing the vast amount of data generated at Smarkets to derive insights that propel our business forward. Given the extensive range of data we produce, from sports event metrics to payment information and user analytics, there are abundant opportunities for the team to create significant business impact.

Currently, our team's responsibilities encompass three primary domains:
- Data Engineering: developing and maintaining ETL pipelines, APIs, and data infrastructure such as Redshift or BigQuery;
- Data Science and Machine Learning: exploring data, training ML models, and implementing ML Ops to uncover fresh insights;
- Analytics and Reporting: crafting data models and dashboards while automating reporting pipelines for various teams, stakeholders, and third parties.

A typical week for a data engineer within our Data Team might involve:
- Creating a new Python ETL pipeline to segment users based on their sports interests by analyzing behavior, optimizing marketing communications for these users;
- Developing a new endpoint for a Flask API, implementing unit tests, and deploying the updated version into our production Kubernetes cluster;
- Training and assessing an ML model to identify specific user patterns, contributing to our data-driven decision-making.