Data Engineer In Finance jobs in Seoul – Browse 443 openings on RoboApply Jobs

Data Engineer In Finance jobs in Seoul

Open roles matching “Data Engineer In Finance” in Seoul. 443 active listings on RoboApply Jobs.


1 - 20 of 443 Jobs
Toss Bank
Full-time | On-site | Seoul

Join Our Dynamic Team
The Data Engineer (Finance) role is part of the Finance Data Platform team within Toss Bank's Data Division. The Finance Data Platform team is responsible for managing the management accounting and financial ALM simulation systems. We utilize the Hadoop Ecosystem and open-source environments to manage financial data.

Your Responsibilities
Yo…

Mar 9, 2026
Toss Bank
Full-time | On-site | Seoul

Join Our Team!
The Finance Data System Developer at Toss Bank will be part of the Finance Data Platform Team within the Data Division. This team is responsible for managing management accounting and financial ALM simulation systems. We utilize the Hadoop Ecosystem and open-source environments to manage finance data.

Your Responsibilities:
- Collaborate closely with the financial analysis team to design and manage management accounting systems.
- Work in tandem with the financial ALM team to design and develop ALM simulation systems.
- Develop and automate workflows using the Hadoop Ecosystem and open-source solutions.

Ideal Candidate:
- Experience in management accounting or ALM system management within the financial sector is required.
- Proficiency in implementing business logic using Java or Python is essential.
- Strong SQL skills are necessary.
- Experience in designing marts and scheduling for financial analysis or ALM is preferred.
- Experience in developing management accounting or ALM systems utilizing frontend/backend technologies is a plus.
- Familiarity with data processing technologies in the Hadoop ecosystem is advantageous.

Resume Tips:
Please include the following in your application: a description of a problem you solved through automation or with a new approach, and an experience where you deeply learned a new technology.

Join the Toss Bank Journey:
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Discussion > Final Offer and Joining

Important Notes:
- False information in your resume or disciplinary issues in your work history may lead to cancellation of your application.
- Candidates with disqualifying factors under Toss Bank's employment regulations may also have their applications canceled.
- Individuals with disabilities and national veterans are given preference according to relevant laws.

Mar 9, 2026
Toss Insights
Full-time | On-site | Seoul

Join Toss Insights!
Toss Insights serves as the financial management research institute of Toss, acting as a think tank that formulates growth strategies through policy analysis and trend research to enhance understanding of finance and economics. Be a founding member of Korea's first fintech research institute, where you can set new benchmarks in the financial market and lead the industry. Toss possesses a vast array of reliable financial data, including MyData, payment, and app usage statistics, which can be utilized for research purposes within legal boundaries. We cultivate an organizational culture based on "autonomy and responsibility," allowing for a more innovative research environment in the finance sector.

Your Responsibilities:
- Process and analyze diverse data generated by financial institutions, including:
  - MyData: dispersed asset information such as account and card details.
  - Customer behavior data (NLP): digital log data (MAU, retention, etc.) analyzed to understand customer behavior patterns.
  - Financial product usage data: insights into customer interactions with loans, insurance, deposits, etc.
- Visualize structured and unstructured data to develop recommendation and personalization models.
- Generate insights based on banking data analysis and compile research reports.
- Analyze patterns and trends in financial data to support strategic decision-making within the Toss community.

We Are Looking For Candidates Who:
- Possess problem-solving skills utilizing data analysis and machine learning.
- Are proficient in data analysis tools such as Python, R, and SQL.
- Have experience processing and analyzing financial data or large-scale datasets.
- Hold a Master's or PhD in Industrial Engineering, Financial Engineering, or a related field, with relevant experience.
- Have a background in Financial Engineering, Business Administration, Economics, or related disciplines.

Please Ensure the Submission of the Following Documents:
- Resume (free format, in Korean or English): include your academic background, work experience, date of birth, and contact information, and attach a list of major research achievements, including research reports.
- Cover Letter: detail your research and practical experience related to the field.

Steps to Join Toss Insights:
Document submission → Job interview → Cultural fit interview → Reference check → Salary negotiation → Final acceptance
Document submission period: 2/13 (Thursday) ~ 2/26 (Wednesday)

Mar 23, 2026
daangn
Full-time | On-site | Seoul

Welcome to the Journey of Joining the Daangn Team!
At Daangn, we strive to create an environment where individuals can grow alongside the company's growth. The Daangn recruitment team is here to help facilitate those moments of thoughtful collaboration with wonderful colleagues.

Introducing the Data Value Team
The Daangn team is dedicated to uncovering valuable information within local neighborhoods and resolving inconveniences in regional living. To create user value, it's essential to provide trustworthy information that users can easily access and incorporate into their decision-making. While Daangn already utilizes extensive data for decision-making, maximizing the value of our data requires significant change. The vision of the Data Value Team is to make decisions for users through data every day. To realize this vision, we proactively identify challenges in data value realization and lead the way in solving them.

About the Data Software Engineer Role
The Data Software Engineer plays a crucial role in addressing, through software engineering, the challenges that arise in realizing value from data. In alignment with Daangn's rapid growth, you will design data systems that will not become bottlenecks in the future, and ensure data reliability through automated testing and system observability. You will also solve technical problems that arise as Daangn members seek to understand users through data, exponentially enhancing data-informed decision-making through data products (indicator platforms, experiment platforms, etc.). The mission of the Data Value Team's engineers is to facilitate a seamless flow of high-quality data at Daangn, enabling the creation of value without bottlenecks.

- Discover the Journey of the Data Value Team Growing with Daangn (Google Data Webinar)
- Learn about Daangn's Indicator Platform, KarrotMetrics
- Seven Challenges Daangn Faced in Implementing DBT and Airflow
- Tips for Easy Modeling with DBT from Daangn's Data Engineer (2024 Data Conference)
- Creating a Data Map at Daangn: Building Column-Level Lineage
- No Need to Always Fetch Everything? Daangn's MongoDB CDC Build

Mar 16, 2026
Realtime Data Engineer

Toss Securities

Full-time | On-site | Seoul

Join Our Team!
The Realtime Data Engineer will be part of the Realtime Data Team within our Data Division. This team operates a distributed messaging/streaming platform, ensuring the stable transmission of large-scale financial transactions. We manage high-volume data pipelines that deliver data with minimal latency while maintaining integrity. We also integrate real-time data into OLAP environments, enabling immediate business decision-making and service enhancement.

Your Responsibilities:
- Operate and optimize our Kafka cluster to ensure high availability of large-scale data from Toss Securities.
- Use tools like CDC, Kafka Connect, Flink, and ksqlDB to construct real-time data pipelines.
- Manage OLAP systems to efficiently store and query large volumes of incoming real-time data, optimizing query performance.
- Enhance the architecture for greater throughput and lower latency, proactively assessing and adopting next-generation technologies for reliable data services.

Ideal Candidate:
- Experience managing large-scale data platforms, ensuring infrastructure stability and performance.
- Proven experience designing and operating Kafka-based architectures, or a deep understanding of distributed messaging systems.
- Intermediate to advanced proficiency in Java (or Kotlin), capable of implementing complex business logic in real-time streaming frameworks (Flink, ksqlDB).
- Experience building or operating real-time analytics environments using OLAP systems such as ClickHouse, StarRocks, Druid, or Pinot.
- Broad Data Engineering experience or depth in a specific area, with eagerness to expand your role.
- A strong Data Engineering foundation, quick to learn new tech stacks, and adept at finding optimal solutions in diverse situations.
- Excellent communication skills to collaboratively tackle complex problems with the team.

Joining Toss Securities:
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding

Please Note:
- Any inaccuracies found in the resume or disciplinary issues in employment history may lead to cancellation of the application.
- Candidates who are prohibited from hiring or have disqualifying reasons according to Toss Securities' regulations may have their applications canceled.
- Individuals with disabilities or national veterans are given preferential treatment in accordance with relevant laws.

A Note for Future Colleagues:
Processing transaction data generated in the securities domain in real time is both highly important from a business perspective and a significant technical challenge. The Toss Securities Realtime Data Team is at the forefront of this effort, currently maintaining stable securities services. Toss Securities continues to grow, and we hope the process of maintaining the systems of a growing securities firm will be an enjoyable journey.
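For intuition, the core of the keyed windowed aggregation such a Flink or ksqlDB job performs can be sketched in plain Python. The event fields and window size below are illustrative assumptions, not Toss Securities' actual schema:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms=1000):
    """Group trade events into fixed (tumbling) windows per symbol.

    `events` are (timestamp_ms, symbol, quantity) tuples -- hypothetical
    fields chosen for illustration only.
    """
    windows = defaultdict(int)  # (window_start_ms, symbol) -> total quantity
    for ts, symbol, qty in events:
        # Snap each event's timestamp down to the start of its window.
        window_start = ts - (ts % window_ms)
        windows[(window_start, symbol)] += qty
    return dict(windows)

events = [
    (1_000, "AAPL", 10),
    (1_500, "AAPL", 5),
    (2_100, "AAPL", 7),
    (1_200, "TSLA", 3),
]
print(tumbling_window_counts(events))
# {(1000, 'AAPL'): 15, (2000, 'AAPL'): 7, (1000, 'TSLA'): 3}
```

In a real deployment this logic would run inside the streaming framework (with watermarks and state backends handling late and out-of-order events), not in batch Python.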

Mar 10, 2026
Toss Securities
Full-time | On-site | Seoul

Join Our Team!
Toss Securities is rapidly growing under the mission of "Innovating every investment experience for our customers," boasting over 7.4 million registered users and 4 million monthly active users. We currently lead in foreign stock transaction volumes, and our diverse range of investment products, including stocks, bonds, and options, expands our customers' choices. At the core of this growth is a robust and trustworthy data platform. Toss Securities is at a pivotal moment, needing to design a data architecture capable of handling large-scale real-time trading data, user behavior data, and regulatory compliance data. We are seeking a Head of Data Engineering to design and lead this structural transformation. This role is not just about operations; it is key to defining the data future of Toss Securities and enabling our organization to work data-driven.

Your Responsibilities:
- Design and build an on-premise distributed architecture aligned with our mid-to-long-term business strategy, resolving data silos through Hadoop enhancements and transitioning to a Kafka-centered streaming-first approach.
- Construct and operate large-scale batch and streaming pipelines based on Spark/Flink and Kafka, ensuring reliable data processing through high-availability ETL/ELT design and performance optimization.
- Establish and manage data standards (layering, naming, permissions), quality management (DQ rules, SLAs, lineage), and regulatory compliance frameworks based on metadata and personal information (PII).
- Coordinate data interests across services, risk, accounting, AI, and backend teams, establishing and executing a comprehensive data strategy including integration strategies and ownership definitions.
- Design the goals, structure, and processes of the data organization, leading a growing team through coaching and decision-making while addressing technical debt and fostering a trust-based environment.
- Oversee the design and operation of ML platforms and infrastructure for LLM/recommendation services, collaborating with service teams to build model deployment, operation, and monitoring standards as well as automated pipelines.

Ideal Candidate:
- 10+ years of experience in data engineering or platform architecture.
- Experience designing and operating large-scale clusters (Hadoop, Kafka).
- Proficiency in designing real-time streaming and batch data processing architectures.
- Experience building data governance, quality, and permission management systems.
- Leadership experience in engineering organizations (5-50 team members) is preferred.
- Excellent coordination and communication skills across diverse organizational stakeholders.

Additional Preferred Qualifications:
- Experience with financial data in securities, banking, or fintech.
- Experience handling data within regulatory environments (e.g., PIPA, the Financial Transaction Act).
- Experience building semantic layers or data meshes.
- Experience with real-time transaction or advertising/shopping service data.
- Experience designing and operating large-scale model training, serving, and MLOps/LLMOps pipelines (e.g., Kubeflow, Argo, H100/H200 GPU clusters, vLLM/Triton).
- Experience with feature stores for real-time recommendations, model optimization and profiling (e.g., BentoML, ONNX, TorchServe), and LLM fine-tuning/RAG operations.

Mar 9, 2026
Toss Securities
Full-time | On-site | Seoul

Join Our Dynamic Team!
The Data Analytics Engineer (Data Engineer) at Toss Securities is an integral part of the Data Warehouse Team within the Data Division. Your focus will be on the Data Platform and Data Mart, with opportunities to collaborate cross-functionally. The Mart responsibilities include structuring and managing data from the Toss Securities domain to facilitate analysis through data warehouse and aggregation table creation. Our current team of approximately 7 members brings diverse experience ranging from 2 to 14 years, with backgrounds in sectors such as portals, banking, gaming, and startups.

Curious About Our Data Division?
The Data Division at Toss Securities strives to become a world-class securities firm by leveraging data technology, services, and data-driven decision-making. We foster close collaboration among various data roles, creating an enjoyable working environment. Regular Tech Weekly sessions are held to share expertise, allowing you to engage with and learn from other roles according to your interests.

Your Responsibilities Will Include:
- Designing clear and reliable table structures that can be easily understood and utilized, encompassing architecture design, compliance with standards, data processing logic management, data integrity validation, DQ monitoring, security reviews, and documentation using meta management systems.
- Collaborating with data users to design data marts and establish pipelines for key business performance analysis.
- Setting the groundwork for effective data asset utilization through data cataloging and standards management.
- Proactively addressing essential data processing tasks in a rapidly growing service environment with your colleagues.
- Enhancing system efficiency by refactoring and optimizing existing mart tables through data modeling that considers consistency, reusability, and scalability.
- Designing data marts and constructing pipelines for external/public reporting requirements.

We Are Looking For Someone Who:
- Has a deep understanding of the securities domain or has actively engaged in stock trading.
- Can clearly define key concepts of the securities domain as a DW data modeler and take the lead in designing easy-to-understand data structures.
- Has experience simplifying complex data models or automating away repetitive issues.
- Can propose efficient data processing methods while adhering to data standards, communicating smoothly with various stakeholders.
- Has experience structuring enterprise tables by defining data standards and building data catalogs.
- Is capable of independently conducting data warehouse/mart modeling, pipeline construction, and operational tasks.
- Can set standards from the perspective of clear data structure and efficient utilization, rather than just processing simple requests.
- Is proficient in SQL and can write organized queries with readability and efficiency in mind.
- Has experience developing data pipelines based on Hadoop, Airflow, and DBT.
- May need intermediate to advanced PySpark skills, depending on the situation.
- Would benefit from experience with BI tools such as Tableau.

Resume Tips:
- Detail impactful projects you have worked on.
- If you have improved services, quantify the results (omit sensitive external information).
- Elaborate on your work related to data governance.
- Include business analysis or reporting experience.
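As a rough illustration of the mart-building work described above, here is a minimal pure-Python sketch of rolling raw trade rows up into a daily per-account aggregation table. Field names are hypothetical; a production mart would be defined by the team's data standards and built with SQL/DBT or PySpark:

```python
from collections import defaultdict
from datetime import date

def build_daily_trade_mart(trades):
    """Aggregate raw trade rows into one mart row per (day, account).

    `trades` are dicts with hypothetical keys (trade_date, account_id,
    amount) chosen only to illustrate the aggregation shape.
    """
    mart = defaultdict(lambda: {"trade_count": 0, "total_amount": 0})
    for t in trades:
        key = (t["trade_date"], t["account_id"])
        mart[key]["trade_count"] += 1
        mart[key]["total_amount"] += t["amount"]
    return {k: dict(v) for k, v in mart.items()}

trades = [
    {"trade_date": date(2026, 3, 2), "account_id": "A1", "amount": 100},
    {"trade_date": date(2026, 3, 2), "account_id": "A1", "amount": 250},
]
print(build_daily_trade_mart(trades))
# {(datetime.date(2026, 3, 2), 'A1'): {'trade_count': 2, 'total_amount': 350}}
```

The equivalent mart table would typically be expressed as a GROUP BY in SQL; the Python version just makes the grain (one row per day and account) explicit.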

Mar 10, 2026
Toss Securities
Full-time | On-site | Seoul

About the Team You'll Join
The Data Analytics Engineer at Toss Securities is part of the Data Warehouse Team within the Data Division. Your responsibilities will be focused on Data Platform and Data Mart tasks. While your primary focus will vary, you will also engage in cross-functional projects. The Platform tasks involve maintaining and optimizing ETL/pipeline tools to effectively manage the DW Mart tables. You will explore and implement new methods to reduce DW operation time with limited resources. Our goal is to maximize data utilization across the organization using tables managed by the DW team. The current team consists of approximately 7 members with experience ranging from 2 to 14 years, coming from diverse backgrounds including portals, banking, gaming, and startups.

Curious about the Data Division?
The Data Division at Toss Securities aims to become the world's leading securities firm in data handling, contributing through data technology, services, and data-driven decision-making. We foster close collaboration among various data professionals and enjoy our work. Regular Tech Weekly sessions allow us to share expertise, and you can freely engage with different teams to learn from each other.

Your Responsibilities
- Experience and contribute to an efficient DW environment within a rapidly growing agile organization.
- Design data marts and develop and automate DW data workflows based on the Hadoop Ecosystem and open-source solutions.
- Identify and implement methods for structuring and automating numerous DW/Mart tables.
- Process large volumes of data swiftly and effectively to create and manage various features.
- Establish data quality checks and governance within the data marts.
- Experience deriving and establishing system requirements for large-scale data processing and analysis is a plus.

Ideal Candidate
- At least 5 years of experience as a Data Engineer is essential.
- A fundamental understanding of RDBMS, the Hadoop Ecosystem, and data warehousing.
- Proven experience leading the design, construction, and operation of data marts.
- Ability to install, operate, and troubleshoot Airflow, DBT, and Django, and to modify open-source tools to develop features needed for the securities DW.
- Experience simplifying complex problems or automating repetitive tasks using data models is critical.
- Extensive experience efficiently processing big data using Spark is highly desirable.
- Intermediate proficiency in Python and advanced skills in SQL are required.

Resume Tips
- If you have resolved critical issues while operating platforms, or optimized performance and system resource usage, please include those experiences.
- Be specific about impactful projects you have worked on.
- If you have fixed bugs or issues in open-source tools, or developed or enhanced features, please detail those experiences.
- Highlight the results of any improvements made in actual services, quantified if possible (excluding sensitive information where necessary).

Join Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation >...

Mar 10, 2026
daangn
Full-time | On-site | Seoul

Welcome to the daangn Team!
At daangn, we strive to create an environment where individuals can grow alongside the company's success. We are here to assist you in making meaningful connections with fantastic colleagues.

Introducing the Data Valuation Team
The daangn team is dedicated to discovering valuable information that connects neighborhoods and resolving inconveniences in local living. To generate this user value, we must provide easy access to reliable information for decision-making. While we already utilize vast amounts of data for our decisions, maximizing the value of our data requires substantial change. The vision of the Data Valuation Team is to "make user-centric decisions through daily data utilization." We take the lead in identifying and solving the challenges of realizing data value.

Role of the Data Analytics Engineer
The Data Analytics Engineer ensures that data is utilized reliably and consistently, working across data modeling, engineering, and analysis to deliver value to the business and its users. In daangn's diverse service environment, the Data Analytics Engineer designs and improves the overall flow of data from collection to utilization, enabling analysts, engineers, and product teams to leverage data reliably. The role also involves designing data marts, managing quality, operating data governance frameworks, and performing basic analyses or experiments to support data-driven decision-making.

- Discover the Journey of the Data Valuation Team with daangn (Google Data Webinar)
- Learn about daangn's Metric Platform KarrotMetrics
- The 7 Challenges daangn Faced While Implementing DBT and Airflow
- How daangn's Data Engineers Simplified Modeling with DBT (2024 Data Conference)
- Mapping daangn Data: Building Column-Level Lineage
- No Need to Always Pull Everything? Building MongoDB CDC at daangn
- Why daangn Implements User Activation Engagement in 3 Key Areas

Feb 2, 2026
Toss Careers
Full-time | On-site | Seoul

Role overview
Toss Careers is seeking a Data Engineer in Seoul to strengthen its advertising initiatives. The position centers on creating data solutions that guide advertising choices and enhance campaign results. Close collaboration with colleagues from different teams is a key part of the role.

What you will do
- Build and maintain data pipelines and systems tailored for advertising analytics
- Use modern tools to process and analyze large volumes of data
- Work with team members to turn business requirements into technical solutions
- Transform raw data into practical insights that shape advertising strategy

Location
This role is located in Seoul.

Apr 27, 2026
Toss
Full-time | On-site | Seoul

Join Our Data Platform Team!
As a Data Engineer at Toss, you will be part of our Data Platform Team, which consists of Data Engineers and Data Analytics Engineers. We are responsible for building the platforms and data pipelines essential for analyzing the services Toss provides.

Your Responsibilities
- Develop and operate OLAP (Online Analytical Processing) based data pipelines.
- Design and optimize systems for the reliable operation of large-scale data analysis and real-time/batch data pipelines.
- Develop and manage batch and streaming pipelines to load the various types of data generated at Toss.
- Continuously improve data models and processing logic based on service requirements.

We Are Looking For Candidates Who Have
- Experience operating services in a Kubernetes (K8s) based environment.
- Experience designing and operating data streaming pipelines using Kafka and Kafka Connect.
- Experience processing large volumes of data using Apache Spark (batch/Structured Streaming).

Additional Skills That Would Be a Plus
- Experience operating and tuning MPP/OLAP engines like StarRocks or ClickHouse.
- Experience building data lakehouses using open table formats such as Apache Iceberg, Hudi, or Delta Lake.
- Experience in real-time data processing using streaming frameworks like Kafka Streams or Apache Flink.
- Experience designing and operating Airflow-based ETL pipelines.

Your Journey to Join Toss
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer
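To illustrate the streaming-load semantics this role touches, here is a minimal pure-Python sketch of last-write-wins upsert merging, the behavior an Iceberg/Delta merge or a primary-key OLAP table provides when streamed updates arrive. The field names (`id`, `ts`) are assumptions made for the sketch:

```python
def upsert_latest(table, updates, key="id", version="ts"):
    """Merge streamed updates into a table keyed by `key`, keeping the
    newest record per key (last-write-wins by the `version` field).

    A sketch of upsert semantics only -- real lakehouse engines do this
    with transactional metadata, not in-memory dicts.
    """
    merged = {row[key]: row for row in table}
    for row in updates:
        current = merged.get(row[key])
        # Accept the update if the key is new or the update is newer.
        if current is None or row[version] >= current[version]:
            merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])

base = [{"id": 1, "ts": 1, "v": "a"}]
updates = [{"id": 2, "ts": 1, "v": "c"}, {"id": 1, "ts": 2, "v": "b"}]
print(upsert_latest(base, updates))
# [{'id': 1, 'ts': 2, 'v': 'b'}, {'id': 2, 'ts': 1, 'v': 'c'}]
```

Ties go to the incoming update (`>=`), matching the common convention that a replayed record with the same version should win deterministically.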

Mar 9, 2026
Toss
Full-time | On-site | Seoul

Join our dynamic team at Toss as a Data Analytics Engineer, where you will play a crucial role in transforming raw data into actionable insights. You will collaborate with cross-functional teams to identify opportunities for improving our services and enhancing user experiences.

Apr 9, 2026
Toss Payments
Full-time | On-site | Seoul

Join our innovative team at Toss Payments as a Data Engineer, where you'll play a crucial role in building and optimizing our data infrastructure. You will work with cutting-edge technologies, collaborating with cross-functional teams to drive data-driven insights and solutions. Your expertise will help us enhance our payment systems and improve user experiences.

Mar 9, 2026
Toss Bank
Full-time | On-site | Seoul

Join Our Dynamic Team!
The Data Engineer for the Workflow Platform is an integral member of Toss Bank's Data Division, specifically within the Data Platform team. This team comprises three key areas: Data Infrastructure & Hadoop, Streaming Platform, and Workflow Platform. We operate various data platforms, including Hadoop, Kafka, CDC, and Airflow. Our mission is to ensure the reliability and scalability of the enterprise data infrastructure, so that all data is securely collected and processed.

Your Responsibilities:
- Design and operate a large-scale data workflow execution platform in an on-premise Kubernetes environment.
- Optimize resources to ensure the stable execution of large workflows across various data organizations, enhancing platform performance and reliability.
- Collaborate with enterprise data engineers to improve the execution quality of the overall data pipeline and enhance the developer experience.
- Monitor workflow execution status, and design and improve systems for automated fault detection, alerts, and recovery procedures.
- Safely manage workflow executions in accordance with the internal control standards of the financial sector, advancing a systematic history management system.
- Continuously review and adopt new technologies and open-source solutions to enhance the performance and scalability of the workflow platform.

We Are Looking For:
- Experience operating an Airflow-based workflow orchestration system, with proven improvements in stability, scalability, and execution efficiency.
- Background in developing Python-based data workflows and platform services.
- Understanding of container technologies (Docker, Kubernetes, etc.) and experience automating service deployment and configuration using tools like Helm.
- Ability to understand company environments and communicate effectively with various teams during service development.
- A keen interest in improving operational efficiency and optimization in large-scale workflow environments.
- Desire to enhance the platform user experience so that in-house data engineers can develop and operate pipelines more easily and safely.
- A proactive approach to analyzing, modifying, and improving open-source solutions at the code level to solve issues.

Resume Submission Tips:
- Clearly outline impactful projects you have worked on in your career.
- Focus on experiences related to data platforms, particularly with Airflow, Kubernetes, and Python.
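As a sketch of the automated-recovery idea mentioned above (fault detection, alerts, retries), here is a minimal pure-Python retry wrapper with exponential backoff and an alerting hook. It is illustrative only; a real platform would lean on Airflow's own retry and failure-callback machinery rather than hand-rolling this:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=0.0, on_failure=None):
    """Execute a workflow task with retries and exponential backoff.

    `on_failure(attempt, exc)` is a hypothetical alerting hook; names and
    defaults here are illustrative, not a real orchestration API.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            if on_failure:
                on_failure(attempt, exc)  # e.g. send an alert
            if attempt == max_attempts:
                raise  # recovery exhausted; surface the failure
            # Back off exponentially before the next attempt.
            time.sleep(base_delay * 2 ** (attempt - 1))
```

A task that fails twice and then succeeds would be retried transparently, with the hook fired once per failure, which is the shape of "automated fault detection, alerts, and recovery" the listing describes.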

Mar 9, 2026
Toss
Full-time | On-site | Seoul

Join Us and Engage in Exciting Work!
After completing a comprehensive onboarding process to familiarize yourself with the Toss data environment, you will join the Data Warehouse Team and undertake the following responsibilities:
- Develop a data quality platform that enhances table consistency, advances DQ rules, and establishes health-check metrics. We aim to create a reliability management platform that lets every data user work without asking, "Can I trust this data?"
- Enhance the GraphRAG pipeline. Build a knowledge-graph construction pipeline that extracts entities by parsing ontology YAML, SQL, and code, followed by vector embedding for indexing in Elasticsearch, making Toss's data assets easily navigable for everyone.
- Design and operate MSA architectures. Split the services needed for the ontology platform into microservices, ensuring each is designed, implemented, and operated reliably.
- Develop AI agent infrastructure. Create a multi-agent workflow execution environment based on open-source agent frameworks like CrewAI, establish an MCP Tool Registry, and develop integration infrastructure with external MCP servers.
- Build an early-warning platform. Create a monitoring system that detects anomalies in data lineage, code, and trends, automatically sending alerts and running analyses to identify issues before they escalate.
- Develop a lineage tracking engine. Create a system that automatically analyzes the extent of impacts by parsing SQL to extract column-wise influence relationships, determining how far changes propagate.
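The lineage-tracking idea in the last responsibility can be sketched minimally in Python. A real engine would use a full SQL parser to handle joins, subqueries, and expressions; this regex-based sketch only handles one flat `INSERT ... SELECT`, and all table/column names are illustrative:

```python
import re

def column_lineage(sql):
    """Extract target-column -> source-column edges from a simple
    `INSERT INTO t SELECT src AS dst, ... FROM s` statement.

    Toy sketch: one target table, one source table, no expressions.
    """
    m = re.search(
        r"INSERT\s+INTO\s+(\w+)\s+SELECT\s+(.+?)\s+FROM\s+(\w+)",
        sql, re.IGNORECASE | re.DOTALL,
    )
    if not m:
        return {}
    target, select_list, source = m.groups()
    edges = {}
    for item in select_list.split(","):
        # "src AS dst" maps src -> dst; a bare column maps to itself.
        parts = re.split(r"\s+AS\s+", item.strip(), flags=re.IGNORECASE)
        src, dst = parts[0].strip(), parts[-1].strip()
        edges[f"{target}.{dst}"] = f"{source}.{src}"
    return edges

sql = "INSERT INTO dw_user SELECT id AS user_id, name FROM raw_user"
print(column_lineage(sql))
# {'dw_user.user_id': 'raw_user.id', 'dw_user.name': 'raw_user.name'}
```

Impact analysis ("how far does a change propagate?") then becomes graph traversal over these edges collected across all pipeline SQL.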

Apr 1, 2026
Tossbank
Full-time | On-site | Seoul

About the Team
The Data Engineer (CPC) position is part of the CPC Platform team within the Data Division of Tossbank. The CPC Platform team is dedicated to building a reliable data framework to meet various regulatory requirements (CPC, disclosures, regular reports, etc.) from financial authorities, enhancing both Tossbank's external credibility and internal operational efficiency through automation and advancement. The team works in an environment that demands frequent and complex queries, high standards of data integrity, and reproducible data management, allowing members to solve diverse problems while deepening both their understanding of the banking domain and their data engineering skills.

Key Responsibilities
- Respond to external data requests from financial authorities (CPC, disclosures, periodic reports) through system integration.
- Design, develop, and operate the data marts and dashboards needed for the required data.
- Ensure the reliability of reporting systems through data quality (DQ) management.
- Collaborate with departments such as lending, deposits, cards, and delinquency to provide data support for data-driven decision-making.
- Systematically manage reporting tasks based on banking laws and regulations.

Ideal Candidate Profile
- Proficient in SQL, capable of extracting the necessary data accurately and efficiently.
- Interested in exploratory data analysis and deriving meaningful insights.
- Enthusiastic about building and managing information systems, data warehouses, and dashboards.
- Possesses a strong sense of responsibility for data quality and can lead smooth collaboration with related departments.
- Curious about technologies and tools for automation and system optimization beyond simple data tasks.
- Understanding of or interest in open-source and Hadoop-based data environments is a plus.
- Junior candidates are welcome; a willingness to learn and adaptability to new technologies matter more than financial domain expertise.

Resume Submission Tips
- Detail impactful projects you've worked on, highlighting your proactive roles and the outcomes of your contributions.
- If applicable, describe your experience responding to requests from financial authorities or external agencies, including your analysis of requirements and resolution strategies.
- Share experiences where you deeply engaged with your work, demonstrating your analysis of root causes and structural improvements.
- Discuss any instances where you proposed or implemented improvements beyond existing practices, focusing on the process and results.

Application Journey at Tossbank
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Salary Negotiation > Final Acceptance and Onboarding

Important Notes
- Any discrepancies in submitted resumes or disciplinary actions during employment history may lead to cancellation of the hiring process.
- Candidates who meet the disqualification criteria in Article 8 of Tossbank's employment regulations may have their applications canceled.
- Individuals with disabilities or veterans will receive preferential treatment during the application process, as per relevant laws.

A Word for Future Colleagues
"You can grow your career along various paths, such as the Data or Finance tracks, at Tossbank."
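As a sketch of the DQ management this role calls for, here is a minimal pure-Python quality gate combining a row-count reconciliation with not-null checks, the kind of rules a regulatory reporting mart typically enforces. Rule names and fields are illustrative, not Tossbank's actual DQ rules:

```python
def run_dq_checks(rows, required_fields, expected_count=None):
    """Run simple data-quality rules over mart rows.

    Returns a list of human-readable failure strings (empty = all pass).
    `rows` are dicts; `required_fields` may not contain None values;
    `expected_count` reconciles against the source row count.
    """
    failures = []
    if expected_count is not None and len(rows) != expected_count:
        failures.append(f"row_count: got {len(rows)}, expected {expected_count}")
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        if nulls:
            failures.append(f"not_null[{field}]: {nulls} null rows")
    return failures

rows = [{"acct": "A1", "amt": 100}, {"acct": None, "amt": 200}]
print(run_dq_checks(rows, ["acct", "amt"], expected_count=3))
# ['row_count: got 2, expected 3', 'not_null[acct]: 1 null rows']
```

In practice such a gate would run as a pipeline step and block the report from being filed when the failure list is non-empty.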

Mar 9, 2026
Apply
Toss logo
Full-time|On-site|Seoul

About the Data Reliability Team
The Data Reliability Team at Toss, part of the Data Platform Tribe, monitors the company's data assets end-to-end. The team identifies critical data points, manages data quality checks, and oversees the full data lifecycle. It was formed to address the lack of visibility into how backend and frontend code deployments affect data, features, models, and serving processes, and it plays a key role in maintaining trust in Toss's data infrastructure.

What You Will Do
- Design and build pipelines that generate features used across the organization, including inputs for machine learning models, Elasticsearch indices, Redis, and API responses.
- Develop and operate systems for feature quality management, such as retention policies and data quality maintenance.
- Directly develop common features that serve multiple teams within Toss.
- Review existing metadata for features and models, and fill gaps in metadata coverage.
- Create processes that detect ad-hoc features and promote them to Verified Feature status.
- Systematically manage data quality and assess the impact of data as it flows into online serving and machine learning systems.

Who We're Looking For
- Experience in feature engineering or in designing and building a feature store.
- Proven ability to design and build large-scale data pipelines using tools such as Spark, Flink, or Kafka.
- Familiarity with data lineage or metadata management systems is a plus.
- Experience building or operating data quality monitoring systems is also valued.

Application Tips
- If you have designed or operated a feature store, describe its structure and the challenges it addressed.
- Include examples of how you have systematically resolved data quality issues.

Hiring Process
Application > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Acceptance and Onboarding

Why Join Toss as a Data Analytics Engineer
This role offers the chance to set the standard for organization-wide features at Toss: help design processes and management systems, and lead the development and operation of shared data infrastructure that supports the company's growth.

Location
Seoul
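The feature quality checks this role describes (completeness, uniqueness per user and snapshot) can be sketched in a few lines. This is a minimal illustration, not Toss's actual system: the field names, thresholds, and sample rows are all invented for the example.

```python
# Minimal sketch of a feature data-quality check. All field names,
# thresholds, and sample data below are illustrative assumptions.

def check_feature_rows(rows, required=("user_id", "feature_value"), null_rate_limit=0.05):
    """Return a list of human-readable quality violations."""
    violations = []
    # Completeness: required fields must not exceed the allowed null rate.
    for field in required:
        missing = sum(1 for r in rows if r.get(field) is None)
        if rows and missing / len(rows) > null_rate_limit:
            violations.append(
                f"{field}: null rate {missing / len(rows):.0%} exceeds {null_rate_limit:.0%}"
            )
    # Uniqueness: at most one feature row per (user_id, snapshot_date).
    seen, dupes = set(), 0
    for r in rows:
        key = (r.get("user_id"), r.get("snapshot_date"))
        if key in seen:
            dupes += 1
        seen.add(key)
    if dupes:
        violations.append(f"{dupes} duplicate (user_id, snapshot_date) rows")
    return violations

sample = [
    {"user_id": 1, "snapshot_date": "2026-03-01", "feature_value": 0.4},
    {"user_id": 1, "snapshot_date": "2026-03-01", "feature_value": 0.4},  # duplicate
    {"user_id": 2, "snapshot_date": "2026-03-01", "feature_value": None},
]
print(check_feature_rows(sample))
```

In a production pipeline this kind of check would typically run as a validation step before features are published to serving stores such as Redis or Elasticsearch.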

Apr 15, 2026
Apply
Toss logo
Full-time|On-site|Seoul

# Join Us in the Domain

- **Commerce:** Our commerce domain, launched to enhance simple payment solutions, has grown to provide both group buying and seller administration services. You will own the data for the commerce domain and play a crucial role in expanding our services, contributing significantly to Toss's growth.
- **Ads:** You will take ownership of advertising domain data to deliver meaningful ads to Toss users. Toss Ads is the most effective performance advertising platform in Korea, and you will help build a solid foundation for using it efficiently.
- **Pay:** As a key area of finance, the payment domain significantly steers Toss's growth. You will facilitate connections with merchants and external organizations to develop Toss Pay, ensuring smooth analysis and insights from payment data.
- **Growth:** Focused on enhancing user engagement and economic value, you will plan and execute strategies for growing new services and cross-activating Toss users.
- **Business:** Placed within the business organization that drives Toss's revenue growth, you will use data from across domains to provide insights for business analytics and revenue growth strategies.

The interview process will determine the domain with the most synergy for your placement, considering your strengths and the organization's needs.

**Want to learn more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

Mar 9, 2026
Apply
Toss Securities logo
Full-time|On-site|Seoul

Join Our Team
The Data Analytics Engineer at Toss Securities is a key member of the Data Warehouse Team within the Data Division. Your work will focus on our Data Warehouse Platform, Business Mart, and CPC Mart. The CPC (Central Point of Contact) Mart establishes a reliable data infrastructure that meets regulatory demands (CPC, disclosures, periodic reports) while enhancing Toss Securities' external credibility and internal operational efficiency through automation. We are refining both the information shared with the Business Mart and the proprietary data required for CPC. Our team of around 7 members has diverse backgrounds, with 2 to 14 years of experience across portals, finance, gaming, and startups.

Your Responsibilities
- Respond to external requests from regulatory authorities (CPC, disclosures, periodic reporting) through our systems.
- Design, build, and manage the data marts and dashboards needed for required reports.
- Ensure the reliability of reporting systems through data integrity and quality (DQ) management.
- Collaborate with various departments (domestic/international trading ledgers, accounts, compliance, PM, etc.) to support data-driven decision-making.
- Systematically manage reporting tasks based on legal frameworks.
- Establish a foundation for using data assets effectively through data cataloging and standards management.
- Proactively handle essential data processing tasks for our rapidly growing services in collaboration with colleagues.
- Improve system efficiency by refactoring and optimizing existing mart tables, with data modeling that considers consistency, reusability, and scalability.

Who We Are Looking For
- Experience in CPC-related work is preferred.
- Strong knowledge of the securities domain, or active experience in stock trading, is a plus.
- As a DW data modeler, you can clearly define key concepts in the securities domain and lead the design of understandable, clear data structures.
- Experience simplifying complex data models or automating repetitive problems.
- You can propose efficient data processing methods while maintaining data standards, based on smooth communication with stakeholders.
- Experience defining enterprise data standards and structuring tables through data cataloging is beneficial.
- The capability and experience to take the lead in data warehouse/mart modeling, pipeline construction, and operations.
- You set standards for clear data structures and efficient data use, rather than merely processing requests.
- Strong SQL skills, with the ability to write efficient, readable code.
- Experience developing data pipelines with Hadoop, Airflow, and DBT is a plus.
- A basic understanding of PySpark may be required in some cases.
- Experience with BI tools (such as Tableau) is a plus.

Resume Recommendations
- Detail your experience designing and building data warehouses given the requirements and data infrastructure environment, including the problems you set out to solve and how you approached and resolved them.
- Highlight the table design methodologies you relied on while building data warehouses.
- Detail any work related to data governance.
- Include specific experiences where you boosted data utilization with DW tables, such as business analysis or reporting automation.
- Be specific about your experience managing data quality, such as handling duplicates or outliers.
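The "handling duplicates" work mentioned above often comes down to a window-function dedup in SQL: keep only the most recently loaded row per business key. A small self-contained sketch, using Python's bundled SQLite for illustration; the table and column names are invented, and any SQL engine with window functions behaves the same way.

```python
# Illustrative dedup of restated records: keep the latest load per
# (account_id, trade_date). Table/column names are invented examples.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE trades_raw (account_id INT, trade_date TEXT, balance REAL, loaded_at TEXT);
    INSERT INTO trades_raw VALUES
        (1, '2026-03-01', 100.0, '2026-03-01 09:00'),
        (1, '2026-03-01', 120.0, '2026-03-01 18:00'),
        (2, '2026-03-01',  50.0, '2026-03-01 09:00');
""")

# ROW_NUMBER() ranks rows within each (account_id, trade_date) group
# by load time, newest first; rn = 1 is the record to keep.
dedup_sql = """
    SELECT account_id, trade_date, balance
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY account_id, trade_date
                   ORDER BY loaded_at DESC
               ) AS rn
        FROM trades_raw
    )
    WHERE rn = 1
    ORDER BY account_id
"""
rows = con.execute(dedup_sql).fetchall()
print(rows)  # [(1, '2026-03-01', 120.0), (2, '2026-03-01', 50.0)]
```

For account 1 the restated 120.0 balance (loaded at 18:00) wins over the earlier 100.0 row, which is typically the desired behavior for ledger-style marts.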

Mar 10, 2026
Apply
Toss Securities logo
Full-time|On-site|Seoul

Join Our Dynamic Team
The Data Engineer (AI) position is part of the AI Data Platform Team at Toss Securities. The team comprises Data Engineers, Machine Learning Engineers, Server Engineers, and Product Operation Managers, fostering collaboration across roles. Our mission is to build a unique data moat for Toss Securities by combining diverse securities domain data with AI technologies, providing essential insights for investors. We use external LLMs, train and evaluate internally developed models, and leverage a variety of data platform technologies.

Your Responsibilities
- Proactively identify and lead projects that solve business challenges at Toss Securities, owning the entire process from data architecture design to development and operation.
- Build and manage a securities data platform that integrates, processes, and serves global market data.
- Establish and maintain a knowledge graph platform for real-time domain data.
- Create and operate the data pipelines that underpin AI service products.
- Develop and manage a feature store for real-time personalized recommendation services.
- Ensure data integrity by designing, developing, and operating data quality verification and monitoring systems.

We Seek Candidates Who
- Have over 5 years of experience in data engineering.
- Can understand requirements, analyze technical trade-offs, and choose the optimal data architecture for a given environment.
- Have a solid understanding of, and experience with, large-scale distributed processing and data platforms.
- Share knowledge with peers and junior engineers, contributing to the technical growth of the whole team.
- Are interested in AI beyond its use as a tool, understanding its principles to improve engineering productivity.
- Coordinate with colleagues across functions and provide constructive feedback.
- Are eager to take on new challenges and proactively learn and grow.

Preferred Experience
- Kafka-based stream processing and large-scale distributed data processing (Hadoop/ClickHouse/Elasticsearch).
- Building and operating data pipelines using Airflow, Docker, and Kubernetes.
- Monitoring and managing data integrity and quality.
- Staying up to date with the latest AI/data technology trends, with an interest in automation and productivity enhancement.

Mar 10, 2026
