
Data Analytics Engineer Data Warehouse Platform jobs in Seoul

Open roles matching “Data Analytics Engineer Data Warehouse Platform” with location signals for Seoul. 408 active listings on RoboApply Jobs.


1 - 20 of 408 Jobs
Toss Securities
Full-time | On-site | Seoul

About the Team You'll Join
The Data Analytics Engineer at Toss Securities is part of the Data Warehouse Team within the Data Division. Your responsibilities will focus on Data Platform and Data Mart tasks; while your primary focus may vary, you will also engage in cross-functional projects. The Platform tasks involve maintaining and optimizing ETL/pipeline tools to effectively manage the DW Mart tables. You will explore and implement new methods to reduce DW operation time with limited resources. Our goal is to maximize data utilization across the organization using tables managed by the DW team. The current team consists of approximately 7 members with experience ranging from 2 to 14 years, coming from diverse backgrounds including portals, banking, gaming, and startups.

Curious about the Data Division?
The Data Division at Toss Securities aims to become the world's leading securities firm in data handling, contributing through data technology, services, and data-driven decision-making. We foster close collaboration among various data professionals and enjoy our work. Regular Tech Weekly sessions allow us to share expertise, and you can freely engage with different teams to learn from each other.

Your Responsibilities
Experience and contribute to an efficient DW environment within a rapidly growing agile organization.
Design data marts and develop and automate DW data workflows based on the Hadoop ecosystem and open-source solutions.
Identify and implement methods for structuring and automating numerous DW/Mart tables.
Process large volumes of data swiftly and effectively to create and manage various features.
Establish data quality checks and governance within the data marts.
Experience in deriving and establishing system requirements for large-scale data processing and analysis is a plus.

Ideal Candidate
At least 5 years of experience as a Data Engineer is essential.
A fundamental understanding of RDBMS, the Hadoop ecosystem, and data warehousing.
Proven experience leading the design, construction, and operation of data marts.
The ability to install, operate, and troubleshoot Airflow, DBT, and Django, and to modify open-source tools to develop features needed for a securities DW.
Experience simplifying complex problems or automating repetitive tasks using data models is critical.
Extensive experience in efficiently processing big data using Spark is highly desirable.
Intermediate proficiency in Python and advanced skills in SQL are required.

Resume Tips
If you have resolved critical issues while operating platforms, or optimized performance and system resource usage, please include those experiences. Be specific about impactful projects you have worked on. If you have addressed bugs or issues while using open-source tools, or developed or enhanced features, please detail those experiences. Highlight the results of any improvements made in actual services, quantified if possible (excluding sensitive information where necessary).

Join Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > ...

Mar 10, 2026
Toss Securities
Full-time | On-site | Seoul

Join Our Dynamic Team!
The Data Analytics Engineer (Data Engineer) at Toss Securities is an integral part of the Data Warehouse Team within the Data Division. Your focus will be on Data Platform and Data Mart, with opportunities to collaborate cross-functionally. The Mart responsibilities include structuring and managing data from the Toss Securities domain to facilitate analysis through data warehouse and aggregation table creation. Our current team of approximately 7 members brings diverse experience ranging from 2 to 14 years, with backgrounds in sectors such as portals, banking, gaming, and startups.

Curious About Our Data Division?
The Data Division at Toss Securities strives to become a world-class securities firm by leveraging data technology, services, and data-driven decision-making. We foster close collaboration among various data roles, creating an enjoyable working environment. Regular Tech Weekly sessions are held to share expertise, allowing you to engage with and learn from other roles as your interests dictate.

Your Responsibilities Will Include
Designing clear and reliable table structures that can be easily understood and utilized, encompassing architecture design, compliance with standards, data processing logic management, data integrity validation, DQ monitoring, security reviews, and documentation using meta management systems.
Collaborating with data users to design data marts and establish pipelines for key business performance analysis.
Setting the groundwork for effective data asset utilization through data cataloging and standard management.
Proactively addressing essential data processing tasks in a rapidly growing service environment with your colleagues.
Enhancing system efficiency by refactoring and optimizing existing mart tables through data modeling that considers consistency, reusability, and scalability.
Designing data marts and constructing pipelines for external/public reporting requirements.

We Are Looking For Someone Who
Has a deep understanding of the securities domain or has actively engaged in stock trading.
Can clearly define key concepts of the securities domain as a DW data modeler and take the lead in designing easy-to-understand data structures.
Has experience simplifying complex data models or automating repetitive problems.
Can propose efficient data processing methods while adhering to data standards through smooth communication with various stakeholders.
Has experience structuring enterprise tables by defining data standards and building data catalogs.
Is capable of independently conducting data warehouse/mart modeling, pipeline construction, and operational tasks.
Can propose standards from the perspective of clear data structure and efficient utilization, rather than just processing simple requests.
Is proficient in SQL and can write organized queries with readability and efficiency in mind.
Has experience developing data pipelines based on Hadoop, Airflow, and DBT.
May need intermediate to advanced PySpark skills, depending on the situation.
Would benefit from experience with BI tools such as Tableau.

Resume Tips
Detail impactful projects you have worked on.
If you have improved services, quantify the results (omit sensitive external information).
Elaborate on your work related to data governance.
Include business analysis or reporting experience.

Mar 10, 2026
Toss Securities
Full-time | On-site | Seoul

Join Our Team
The Data Analytics Engineer at Toss Securities is a key member of the Data Warehouse Team within the Data Division. Your responsibilities will focus on our Data Warehouse Platform, Business Mart, and CPC Mart. The CPC (Central Point of Contact) Mart is designed to establish a reliable data infrastructure that meets various regulatory demands (CPC, disclosures, periodic reports) while enhancing Toss Securities' external credibility and internal operational efficiency through automation and advancement. We are enhancing and refining the information shared with the Business Mart and the proprietary data required for CPC. Our team consists of around 7 members with diverse backgrounds, ranging from 2 to 14 years of experience in sectors such as portals, finance, gaming, and startups.

Your Responsibilities
Implement and respond to external requests from regulatory authorities (CPC, disclosures, periodic reporting) through our systems.
Design, build, and manage the data marts and dashboards needed for required reports.
Ensure the reliability of reporting systems through data integrity and quality management (DQ).
Collaborate with various departments (domestic/international trading ledgers, accounts, compliance, PM, etc.) to provide data support for data-driven decision-making.
Systematically manage reporting tasks based on legal frameworks.
Establish a foundation for effectively utilizing data assets through data cataloging and standard management.
Proactively resolve essential data processing tasks in our rapidly growing services through collaboration with colleagues.
Enhance system efficiency by refactoring and optimizing existing mart tables through data modeling that considers consistency, reusability, and scalability.

Who We Are Looking For
Experience in CPC-related tasks is preferred.
Strong knowledge of the securities domain or active experience in stock trading is a plus.
You should be able to clearly define key concepts in the securities domain as a DW data modeler and lead the design of understandable, clear data structures.
Experience simplifying complex data models or automating repetitive problems.
The ability to propose efficient data processing methods while maintaining data standards, based on smooth communication with various stakeholders.
Experience defining enterprise data standards and structuring tables through data cataloging would be beneficial.
The capability and experience to take the lead in data warehouse/mart modeling, pipeline construction, and operations.
The ability to propose standards from the perspective of clear data structures and efficient data utilization, rather than merely processing simple requests.
Strong SQL skills, with the ability to write efficient and readable code.
Experience in data pipeline development based on Hadoop, Airflow, and DBT is a plus.
In some cases, a basic understanding of PySpark may be required.
Experience with BI tools (such as Tableau) is a plus.

Resume Recommendations
Detail your experience designing and building data warehouses given the requirements and data infrastructure environment.
Include the problems you set out to solve and how you approached and resolved them.
Highlight the table design methodologies you used while building data warehouses.
Detail any work you have done related to data governance.
Include specific experiences where you boosted data utilization by leveraging DW tables, such as business analysis or reporting automation.
Be specific about your experience managing data quality, such as handling duplicates or outliers.

Mar 10, 2026
Toss
Full-time | On-site | Seoul

# About the Team
- The Data Mart Platform Team is dedicated to building a standardized Data Warehouse for various Toss products, aiming to prevent data silos and enhance overall data maturity across the organization.
- Responsibilities include enhancing centralized DW quality management processes, standard monitoring, integrating product data with the enterprise data mart, designing efficient pipelines, and creating standardized marts.
- **Interested in learning more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

# Responsibilities
- After completing an onboarding process to familiarize yourself with Toss's DW standards, you will work as part of the Data Mart Platform Team.
- Maintain and manage an agile and manageable enterprise DW standard, taking responsibility for DW quality management from an enterprise perspective in collaboration with DAEs (Domain Analytics Engineers) from various product domains (development and execution of standard management monitoring).
- Plan and execute systems and processes to enhance data reliability, improving table consistency, advancing DQ rules, and establishing health check metrics (a minimal sketch of such a check follows this listing).
- Develop enterprise-level marts, managing the integration of standard marts from different domains and ensuring efficient data pipeline improvements.
- Identify and execute tasks to enhance data discoverability across the organization.
- Develop a platform to measure data maturity across various Toss domains and initiate projects to enhance the productivity of DAEs.
- The data development environment is based on Hadoop, Airflow, Python, and SQL (Impala).

# Desired Qualifications
- Understanding of database normalization and the fundamental characteristics of Data Warehouses (Subject-Oriented, Integrated, Non-Volatile, Time-Variant).
- Ability to clearly define key concepts as a DW data modeler and propose efficient data structures based on diverse data perspectives.
- High-level understanding of DW standard management and the capability to propose and lead improvement initiatives at the enterprise level.
- Strong comprehension of data governance aspects, including data quality and compliance, with the ability to suggest actionable plans.
- Proficiency in SQL, with the ability to write efficient and readable queries.
- Basic Python skills (enough to work with Airflow) are acceptable, but the ability to understand modules and PySpark code written by others is preferred.
- Experience with large-scale data processing and designing metrics from an AARRR perspective is a plus.

# Application Tips
- Please specify any relevant experience with DW construction projects and mart design, detailing your contributions.
- Mention specific challenges you have addressed regarding data maturity.
- Outline your contributions and lessons learned while solving data-related issues.

# Joining Toss
- Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Discussion > Final Acceptance and Onboarding

# A Note to Future Colleagues
> "Our team strives for better service every day."

- I was drawn to the thrilling risks associated with financial data and saw that my growth could contribute to the company's success, which is why I joined Toss.
- The most stressful aspect of my previous company was being led by predetermined objectives, but Toss offers more autonomy than I expected, along with a dedicated and ambitious team focused on "better service every day."
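The responsibilities above mention advancing DQ rules and health check metrics in a Python/SQL (Impala) environment. Below is a minimal, hedged sketch of what one such rule-based check might look like; the table, column, and thresholds are hypothetical, and sqlite3 stands in for whatever DB-API connection the real platform would use (Impala is mentioned in the listing, but the check itself is engine-agnostic).

```python
# Hedged sketch: a small rule-based health check of the kind a "DQ rules /
# health check" platform might schedule as an Airflow task. Names are
# hypothetical; sqlite3 is only a stand-in for the warehouse connection.
import sqlite3
from dataclasses import dataclass


@dataclass
class DQResult:
    rule: str
    passed: bool
    detail: str


def check_table_health(cur, table: str, not_null_col: str, max_null_rate: float = 0.01):
    """Run two basic DQ rules: the table is non-empty and a key column is mostly non-null."""
    results = []
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    total = cur.fetchone()[0]
    results.append(DQResult("non_empty", total > 0, f"{total} rows"))

    if total:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {not_null_col} IS NULL")
        nulls = cur.fetchone()[0]
        rate = nulls / total
        results.append(DQResult("null_rate", rate <= max_null_rate,
                                f"{rate:.2%} null in {not_null_col}"))
    return results


if __name__ == "__main__":
    # Tiny in-memory demo so the sketch runs end to end.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE dw_user_mart (user_id INTEGER, signup_date TEXT)")
    cur.executemany("INSERT INTO dw_user_mart VALUES (?, ?)",
                    [(1, "2026-01-01"), (2, None)])
    for r in check_table_health(cur, "dw_user_mart", "signup_date", max_null_rate=0.5):
        print(r)
```

In a real standard-monitoring setup, each failed `DQResult` would feed an alert or a health-metric dashboard rather than being printed.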

Mar 9, 2026
Toss
Full-time | On-site | Seoul

Join our dynamic team at Toss as a Data Analytics Engineer, where you will play a crucial role in transforming raw data into actionable insights. You will collaborate with cross-functional teams to identify opportunities for improving our services and enhancing user experiences.

Apr 9, 2026
daangn
Full-time | On-site | Seoul

Welcome to the daangn Team!
At daangn, we strive to create an environment where individuals can grow alongside the company's success. We are here to assist you in making meaningful connections with fantastic colleagues.

Introducing the Data Valuation Team
The daangn team is dedicated to discovering valuable information that connects neighborhoods and to resolving inconveniences in local living. To generate this user value, we must provide easy access to reliable information for decision-making. While we already utilize vast amounts of data for our decisions, maximizing the value of our data requires substantial change. The vision of the Data Valuation Team is to "make user-centric decisions through daily data utilization." We take the lead in identifying and solving the issues of data valuation.

Role of the Data Analytics Engineer
The Data Analytics Engineer ensures that data is utilized reliably and consistently, working across data modeling, engineering, and analysis to deliver value to the business and its users. In daangn's diverse service environment, the Data Analytics Engineer designs and improves the overall flow from data collection to utilization, enabling analysts, engineers, and product teams to leverage data reliably. The role also involves designing data marts, managing quality, operating data governance frameworks, and performing basic analyses or experiments to support data-driven decision-making.

Further Reading
Discover the Journey of the Data Valuation Team with daangn (Google Data Webinar)
Learn about daangn's Metric Platform KarrotMetrics
The 7 Challenges daangn Faced While Implementing DBT and Airflow
How daangn's Data Engineers Simplified Modeling with DBT (2024 Data Conference)
Mapping daangn Data: Column-Level Lineage Building
No Need to Always Pull Everything? Building MongoDB CDC at daangn
Why daangn Implements User Activation Engagement in 3 Key Areas

Feb 2, 2026
Toss Bank
Full-time | On-site | Seoul

Join Our Dynamic Team!
The Data Engineer for the Workflow Platform is an integral member of Toss Bank's Data Division, specifically within the Data Platform team. This team covers three key areas: Data Infrastructure & Hadoop, Streaming Platform, and Workflow Platform. We operate various data platforms, including Hadoop, Kafka, CDC, and Airflow. Our mission is to ensure the reliability and scalability of the enterprise data infrastructure so that all data is securely collected and processed.

Your Responsibilities
Design and operate a large-scale data workflow execution platform in an on-premise Kubernetes environment.
Optimize resources to ensure the stable execution of large workflows across various data organizations, enhancing platform performance and reliability.
Collaborate with enterprise data engineers to improve the execution quality of the overall data pipeline and enhance developer experience.
Monitor workflow execution status, and design and improve systems for automated fault detection, alerts, and recovery procedures (a minimal sketch follows this listing).
Safely manage workflow executions in accordance with the internal control standards of the financial sector, advancing a systematic history management system.
Continuously review and adopt new technologies and open-source solutions to enhance the performance and scalability of the workflow platform.

We Are Looking For
Experience operating an Airflow-based workflow orchestration system, with proven improvements in stability, scalability, and execution efficiency.
Background in developing Python-based data workflows and platform services.
Understanding of container technologies (Docker, Kubernetes, etc.) and experience automating service deployment and configuration using tools like Helm.
The ability to understand the company environment and communicate effectively with various teams during service development.
A keen interest in improving operational efficiency and optimization in large-scale workflow environments.
A desire to improve the platform user experience so that in-house data engineers can develop and operate pipelines more easily and safely.
A proactive approach to analyzing, modifying, and improving open-source solutions at the code level to solve issues.

Resume Submission Tips
Clearly outline impactful projects you have worked on in your career.
Focus on experiences related to data platforms, particularly with Airflow, Kubernetes, and Python.
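The listing centers on Airflow-based orchestration with automated fault detection, alerts, and recovery. Below is a hedged, minimal sketch of that retry/alert wiring; it assumes Airflow 2.4+ with the TaskFlow API, the DAG and task names are hypothetical, and the alert callback just prints where a real platform would page or message an incident channel.

```python
# Hedged sketch: retries, retry delay, and a failure-alert callback on a tiny
# Airflow 2.x DAG. Everything named here (DAG id, tasks) is hypothetical.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


def alert_on_failure(context):
    # Airflow calls this with the task-instance context once retries are exhausted.
    ti = context["task_instance"]
    print(f"ALERT: {ti.dag_id}.{ti.task_id} failed on {context['ds']}")


default_args = {
    "retries": 3,                           # automatic recovery attempts
    "retry_delay": timedelta(minutes=5),
    "on_failure_callback": alert_on_failure,
}


@dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False,
     default_args=default_args, tags=["sketch"])
def workflow_health_demo():
    @task
    def extract() -> int:
        return 42                           # stand-in for pulling a batch of records

    @task
    def load(n: int) -> None:
        print(f"loaded {n} records")

    load(extract())


workflow_health_demo()
```

On the Kubernetes side described in the listing, the same DAG would typically run tasks via the Kubernetes executor or pod operators; the retry/alert pattern stays the same.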

Mar 9, 2026
Toss
Full-time | On-site | Seoul

Join Us and Engage in Exciting Work!
After completing a comprehensive onboarding process to familiarize yourself with the Toss data environment, you will be part of the Data Warehouse Team, undertaking the following responsibilities:

Develop a data quality platform that enhances table consistency, advances DQ rules, and establishes health check metrics. We aim to create a reliability management platform that lets all data users work without asking, "Can I trust this data?"
Enhance the GraphRAG pipeline. Build a knowledge graph construction pipeline that extracts entities by parsing ontology YAML, SQL, and code, followed by vector embedding for indexing in Elasticsearch, making Toss's data assets easily navigable for everyone.
Design and operate MSA architectures. Split the services needed for the ontology platform into microservices, ensuring each is designed, implemented, and operated reliably.
Develop AI agent infrastructure. Create a multi-agent workflow execution environment based on open-source agent frameworks like CrewAI. Establish an MCP Tool Registry and develop integration infrastructure for external MCP servers.
Build an early warning platform. Create a monitoring system that detects anomalies in data lineage, code, and trends, automatically triggering alerts and analyses to identify issues before they escalate.
Develop a lineage tracking engine. Create a system that automatically analyzes the extent of impact by parsing SQL to extract column-wise influence relationships, determining how far changes propagate (a small parsing sketch follows this listing).
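The lineage responsibility above depends on first parsing SQL to find which tables and columns a query touches. Below is a hedged sketch of that initial step using the open-source sqlglot library (an assumption; the listing does not name a parser), with a hypothetical query. Resolving aliases through CTEs and subqueries into true column-level lineage is the harder part the role describes.

```python
# Hedged sketch: extract table and column references from a SQL statement
# with sqlglot, as a building block for column-level lineage. The query is
# hypothetical.
import sqlglot
from sqlglot import exp

SQL = """
SELECT u.user_id, SUM(o.amount) AS total_amount
FROM orders AS o
JOIN users AS u ON o.user_id = u.user_id
GROUP BY u.user_id
"""

parsed = sqlglot.parse_one(SQL)

# Map alias -> real table name for every table the query reads.
tables = {t.alias_or_name: t.name for t in parsed.find_all(exp.Table)}
print("tables:", tables)   # e.g. {'o': 'orders', 'u': 'users'}

# List every column reference, resolved through the alias map where possible.
for col in parsed.find_all(exp.Column):
    source = tables.get(col.table, col.table or "<unqualified>")
    print(f"reads {source}.{col.name}")
```

A lineage engine would pair these "reads" with the columns the statement writes (e.g., an INSERT or CREATE TABLE AS target) to produce column-to-column edges, then walk the resulting graph to estimate blast radius.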

Apr 1, 2026
Toss
Full-time | On-site | Seoul

Join Our Data Platform Team!
As a Data Engineer at Toss, you will be part of our Data Platform Team. The team consists of Data Engineers and Data Analytics Engineers. We are responsible for building the platforms and data pipelines essential for analyzing the services provided by Toss.

Your Responsibilities
Develop and operate OLAP (Online Analytical Processing) based data pipelines.
Design and optimize systems for the reliable operation of large-scale data analysis and real-time/batch data pipelines.
Develop and manage batch and streaming pipelines to load the various types of data generated at Toss (a minimal streaming-ingest sketch follows this listing).
Continuously improve data models and processing logic based on service requirements.

We Are Looking For Candidates Who Have
Experience operating services in a Kubernetes (K8s) based environment.
Experience designing and operating data streaming pipelines using Kafka and Kafka Connect.
Experience processing large volumes of data using Apache Spark (batch/Structured Streaming).

Additional Skills That Would Be a Plus
Experience operating and tuning MPP/OLAP engines such as StarRocks or ClickHouse.
Experience building data lakehouses using open table formats such as Apache Iceberg, Hudi, or Delta Lake.
Experience in real-time data processing using streaming frameworks such as Kafka Streams or Apache Flink.
Experience designing and operating ETL pipelines based on Airflow.

Your Journey to Join Toss
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer
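The responsibilities combine Kafka ingestion with Spark Structured Streaming, so a hedged sketch of that common pattern follows. The broker address, topic, and checkpoint path are hypothetical, the console sink stands in for the warehouse/OLAP sink the listing implies, and the job assumes the spark-sql-kafka connector is available on the cluster.

```python
# Hedged sketch: a minimal Spark Structured Streaming job reading from Kafka.
# Broker, topic, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("kafka-ingest-sketch")
         .getOrCreate())

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
          .option("subscribe", "user-events")                  # hypothetical topic
          .option("startingOffsets", "latest")
          .load())

# Kafka delivers key/value as binary; decode them and keep the event timestamp.
decoded = events.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp").alias("event_time"),
)

query = (decoded.writeStream
         .format("console")                  # swap for an Iceberg/Hive/OLAP sink in practice
         .outputMode("append")
         .option("checkpointLocation", "/tmp/chk/user-events")
         .start())

query.awaitTermination()
```

In production the payload would be parsed against a schema (e.g., `from_json`) and written to the lakehouse or an OLAP engine, with the checkpoint providing exactly-once-style recovery.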

Mar 9, 2026
Toss
Full-time | On-site | Seoul

# Join Us in the Domain
- Commerce: Our commerce domain, initiated to enhance simple payment solutions, has grown to provide both group buying and administrative services for sellers. You will own the data for the commerce domain and play a crucial role in expanding our services, significantly contributing to Toss's growth.
- Ads: You will take ownership of the advertising domain data to deliver meaningful ads to Toss users. Toss Ads is the most effective performance advertising platform in Korea, and you will help build a solid foundation for its efficient utilization.
- Pay: As a key area of finance, the payment domain significantly steers Toss's growth. You will facilitate connections with various merchants and external organizations to develop Toss Pay, ensuring smooth analysis and insights from payment data.
- Growth: Focused on enhancing user engagement and economic value, you will strategize and execute plans for the growth of new services and cross-activation of Toss users.
- Business: Placed within the business organization that drives revenue growth for Toss, you will utilize comprehensive data across various domains to provide insights for business analytics and revenue growth strategies.
- An interview process will determine the most synergistic domain for your placement, considering your strengths and the organization's needs.
- **Want to learn more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

Mar 9, 2026
Toss
Full-time | On-site | Seoul

About the Role
Toss is looking for a Data Analytics Engineer focused on Financial Operations (FinOps) in Seoul. This position centers on analyzing financial data and supporting decisions that improve financial processes. The work directly impacts how the company manages and optimizes its financial performance.

What You Will Do
Analyze financial data to support operational decisions.
Work closely with teams across the company to identify areas for improvement.
Develop insights that help optimize financial workflows.
Contribute to the development of fintech solutions by providing data-driven recommendations.

Collaboration
This role involves frequent collaboration with cross-functional teams. Expect to share findings, discuss strategies, and help shape financial operations through data.

Apr 16, 2026
Toss
Full-time | On-site | Seoul

About the Data Reliability Team
The Data Reliability Team at Toss, part of the Data Platform Tribe, monitors the company's data assets end to end. The team identifies critical data points, manages data quality checks, and oversees the full data lifecycle. Formed to address the lack of visibility into how backend and frontend code deployments affect data, features, models, and serving processes, the team plays a key role in maintaining trust in Toss's data infrastructure.

What You Will Do
Design and build pipelines to generate features used across the organization, including inputs for machine learning models, Elasticsearch indices, Redis, and API responses (a minimal feature-plus-quality-gate sketch follows this listing).
Develop and operate systems for feature quality management, such as retention policies and data quality maintenance.
Directly develop common features that serve multiple teams within Toss.
Review existing metadata for features and models, and work to fill any gaps in metadata coverage.
Create processes to detect ad-hoc features and promote them to Verified Features status.
Systematically manage data quality and assess the impact of data as it flows into online serving and machine learning systems.

Who We're Looking For
Experience in feature engineering or in designing and building a feature store.
Proven ability to design and build large-scale data pipelines using tools such as Spark, Flink, or Kafka.
Familiarity with data lineage or metadata management systems is a plus.
Experience building or operating data quality monitoring systems is also valued.

Application Tips
If you have designed or operated a feature store, describe its structure and the challenges it addressed in your application.
Include examples of how you have systematically resolved data quality issues.

Hiring Process
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Acceptance and Onboarding

Why Join Toss as a Data Analytics Engineer
This role offers the chance to set the standard for organization-wide features at Toss. Help design processes and management systems, and lead the development and operation of shared data infrastructure that supports the company's growth.

Location
Seoul
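The feature-pipeline and feature-quality responsibilities above pair a computed feature with checks before it is published. A hedged pandas sketch of that pattern follows; the events, the `events_7d` feature, and the gate thresholds are all hypothetical, and a production version would run on Spark/Flink and write to a feature store, Redis, or Elasticsearch rather than printing.

```python
# Hedged sketch: compute a simple shared feature and gate it on quality
# before "publishing". All data and names are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "user_id":  [1, 1, 2, 3, 3, 3],
    "event_ts": pd.to_datetime([
        "2026-03-01", "2026-03-03", "2026-03-02",
        "2026-03-01", "2026-03-05", "2026-03-06",
    ]),
})

# Feature: events per user in the last 7 days, relative to a fixed "as of" date.
as_of = pd.Timestamp("2026-03-07")
recent = events[events["event_ts"] >= as_of - pd.Timedelta(days=7)]
feature = (recent.groupby("user_id").size()
           .rename("events_7d").reset_index())


def quality_gate(df: pd.DataFrame) -> bool:
    """Reject the batch if it is empty or contains nulls/negative counts."""
    return (not df.empty
            and df["events_7d"].notna().all()
            and (df["events_7d"] >= 0).all())


if quality_gate(feature):
    print(feature)          # stand-in for writing the batch to the feature store
else:
    raise ValueError("feature batch failed quality gate; not published")
```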

Apr 15, 2026
TossCareers
Full-time | On-site | Seoul

Join our dynamic team as a Data Engineer specializing in log platforms. In this role, you will work with cutting-edge technologies to design, build, and maintain robust data pipelines. Your expertise will help us manage and analyze extensive log data, enabling improved decision-making and operational efficiency.

Apr 9, 2026
Toss
Full-time | On-site | Seoul

Join Our Dynamic Team
As a Data Engineer at Toss, you will be part of the Data Platform team. The Data Platform team operates within the Data Division, supporting and managing the data and platforms necessary for all Toss services. Our team comprises members with 2 to 18 years of experience from diverse backgrounds, including portals, banking, gaming, and startups. We encourage team members to pursue various interests and collaborate freely on skills and knowledge sharing.

Your Responsibilities
Develop and maintain stable and efficient data pipelines (ingestion, loading, streaming).
Contribute to data-driven Toss services through real-time distributed processing of large datasets.
Create and manage tools to ensure a reliable and efficient data experimentation and analysis environment for your colleagues.
Develop various data service applications for data analysis and platform operation.

Who You Are
You have experience developing data pipelines for large-scale data processing (collection, processing, analysis).
You are familiar with large-scale distributed systems (Hadoop, HBase, Kafka, Spark, Flink, etc.).
You possess software development skills for data application development (Java, Scala, Python, etc.).
You have intermediate or advanced programming skills (web/client/server programming).
Experience developing services related to recommendations, advertising, or machine learning is a plus.

Please Highlight These Experiences in Your Resume
Detail the projects you worked on, the technologies you used, and how you solved challenges, rather than just listing languages or frameworks.
Experience using platforms similar to Toss's is beneficial, but we prioritize growth potential and problem-solving abilities over specific technologies.
Include any experience resolving critical failures while operating platforms or optimizing for performance and resource usage.
Share experiences where you identified and resolved bugs in open-source software or contributed enhancements.

Your Journey to Joining Toss
Application submission > Technical interview > Cultural fit interview > Reference check > Compensation discussion > Final offer

Mar 10, 2026
Coupang
Full-time | On-site | Seoul, South Korea

Coupang seeks a Senior Data Analyst to join the Eats Analytics team in Seoul. This position centers on using data to inform decisions that shape the Coupang Eats customer experience.

Role overview
The Senior Data Analyst will work closely with teams developing new solutions for Coupang Eats. Analysis and insights from this role help guide improvements and respond to evolving user needs.

What you will do
Analyze data to support decisions that enhance the Coupang Eats customer journey.
Collaborate with teams to develop and refine solutions based on user data.
Monitor and interpret trends to help teams adapt to changing customer requirements.

Location
This role is based in Seoul, South Korea.

Apr 23, 2026
Toss Securities
Full-time | On-site | Seoul

Join Our Dynamic Team
The Data Engineer (AI) position is part of the AI Data Platform Team at Toss Securities. The AI Data Platform Team comprises Data Engineers, Machine Learning Engineers, Server Engineers, and Product Operation Managers, fostering collaboration across roles. Our mission is to develop a unique data moat for Toss Securities by integrating diverse securities domain data with AI technologies, providing essential insights for investors. We utilize external LLMs and train and evaluate our internally developed models, while leveraging various data platform technologies.

Your Responsibilities
Proactively identify and lead projects to solve business challenges at Toss Securities, overseeing the entire process from data architecture design to development and operation.
Build and manage a securities data platform that integrates, processes, and serves global market data.
Establish and maintain a knowledge graph platform for real-time domain data.
Create and operate the data pipelines that underpin AI service products.
Develop and manage a feature store for real-time personalized recommendation services.
Ensure data integrity by designing, developing, and operating data quality verification and monitoring systems.

We Seek Candidates Who
Have over 5 years of experience in data engineering.
Can comprehend requirements and analyze technical trade-offs to determine the optimal data architecture for a given environment.
Possess a solid understanding of, and experience with, large-scale distributed processing and data platforms.
Have experience sharing knowledge with peers and junior engineers, contributing to the technical growth of the entire team.
Are interested in leveraging AI beyond a mere tool, understanding its principles to innovate engineering productivity.
Can coordinate with colleagues across various functions and provide constructive feedback.
Are eager to take on new challenges and to proactively learn and grow.

Preferred Experiences
Experience with Kafka-based stream processing and large-scale distributed data processing (Hadoop/ClickHouse/Elasticsearch).
Experience building and operating data pipelines using Airflow, Docker, and Kubernetes.
Experience monitoring and managing data integrity and quality.
Staying up to date with the latest trends in AI/data technologies, with an interest in automation and productivity enhancement.

Mar 10, 2026
Toss Securities
Full-time | On-site | Seoul

About the Team You Will Join
The Product Owner for Toss Securities' AI Data Platform focuses on generating diverse investment information content through AI, enabling customers to make informed investment decisions. Your team is dedicated to gathering all types of data and building a pipeline that allows AI to generate content tailored to customer needs. We collaborate with various internal silos to build the infrastructure needed for seamless data and machine learning service delivery, fundamentally transforming the investment experience through enhanced search and recommendation capabilities. Additionally, you will belong to a PO/PM chapter where you can share and solve product management challenges with other Product Owners and Managers.

Your Responsibilities
Define challenges in the customer investment journey and hypothesize AI-driven solutions.
Plan AI-based content and experiences to help customers become more comfortable with investing.
Refine and structure diverse data sources, including investment data, to make them usable.
Design products that assist customer investment decision-making using the latest AI technologies, including RAG, LLMs, and ML modeling.
Collaborate with ML Engineers and Data Engineers to develop prototypes, enhance model performance, and bring products to market.

What We Are Looking For
No specific years of experience are required; we value depth of experience over years worked.
Experience in developing or planning data- and AI-driven products (search/recommendation/ML/unstructured data/personalization/ETL) is essential.
Experience simplifying and standardizing complex data pipelines to quickly supply necessary data is highly preferred.
The ability to clearly define customer problems and connect them with technical solutions.
Strong communication skills to collaborate effectively with MLEs and DEs.

Resume Tips
Clearly outline the flow of problem definition, solution derivation, collaboration with stakeholders, and the resulting outcomes for each product, service, or project.
Include insights and achievements from the process rather than just listing tasks.

The Journey to Joining Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding

Please Note
If any false information is found in your resume, or if disciplinary actions are confirmed in your employment history, the hiring may be canceled.
Hiring may also be canceled for candidates who fall under Toss Securities' hiring restrictions or disqualification criteria.

Mar 10, 2026
daangn
Full-time | On-site | Seoul

Welcome to the Journey of Joining the Daangn Team!
At Daangn, we strive to create an environment where individuals can grow alongside the company's growth. The Daangn recruitment team is here to help facilitate those moments of thoughtful collaboration with wonderful colleagues.

Introducing the Data Value Team
The Daangn team is dedicated to uncovering valuable information within local neighborhoods and resolving inconveniences in regional living. To create user value, it is essential to provide trustworthy information that users can easily access and incorporate into their decision-making. While Daangn already utilizes extensive data for decision-making, maximizing the value of our data requires significant changes. The vision of the Data Value Team is to make decisions for users through data every day. To realize this vision, we proactively tackle challenges in data value realization and lead the way in solving them.

About the Data Software Engineer Role
The Data Software Engineer plays a crucial role in addressing the challenges that arise in realizing data value through software engineering. In line with Daangn's rapid growth, you will design data systems that will not become bottlenecks in the future. You will ensure data reliability through automated testing and system observability. Additionally, you will solve the technical problems that arise as Daangn members seek to understand users through data, exponentially enhancing data-informed decision-making through data products (metric platforms, experiment platforms, etc.). The mission of the Data Value Team's engineers is to facilitate a seamless flow of high-quality data at Daangn, enabling the creation of value without bottlenecks.

Further Reading
Discover the Journey of the Data Value Team Growing with Daangn (Google Data Webinar)
Learn about Daangn's Metric Platform, KarrotMetrics
Seven Challenges Daangn Faced in Implementing DBT and Airflow
Tips for Easy Modeling with DBT from Daangn's Data Engineer (2024 Data Conference)
Creating a Data Map at Daangn: Building Column-Level Lineage
No Need to Always Fetch Everything? Daangn's MongoDB CDC Build

Mar 16, 2026
Toss Securities
Realtime Data Engineer
Full-time | On-site | Seoul

Join Our Team!
The Realtime Data Engineer will be part of the Realtime Data Team within our Data Division. This team operates a distributed messaging and streaming platform, ensuring the stable transmission of large-scale financial transactions. We manage high-volume data pipelines that deliver data with near-zero latency while maintaining integrity. We also integrate real-time data into OLAP environments, enabling immediate business decision-making and service enhancement.

Your Responsibilities
Operate and optimize our Kafka cluster to ensure the high availability of Toss Securities' large-scale data.
Use tools such as CDC, Kafka Connect, Flink, and ksqlDB to construct real-time data pipelines.
Manage the OLAP layer to efficiently store and query large volumes of incoming real-time data, optimizing query performance.
Enhance the architecture for greater throughput and lower latency, proactively assessing and adopting next-generation technologies for reliable data services.

Ideal Candidate
Experience managing large-scale data platforms, ensuring infrastructure stability and performance.
Proven experience designing and operating Kafka-based architectures, or a deep understanding of distributed messaging systems.
Intermediate to advanced proficiency in Java (or Kotlin), capable of implementing complex business logic in real-time streaming frameworks (Flink, ksqlDB).
Experience building or operating real-time analytics environments using OLAP systems such as ClickHouse, StarRocks, Druid, or Pinot.
Broad experience in data engineering, or depth in a specific area with an eagerness to expand your role.
A strong foundation in data engineering skills, quick learning of new tech stacks, and the ability to find optimal solutions in diverse situations.
Excellent communication skills to tackle complex problems collaboratively with the team.

Joining Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding

Please Note
Any inaccuracies found in the resume, or disciplinary issues discovered in your employment history, may lead to cancellation of the application.
Candidates who are prohibited from hiring or have disqualifying reasons according to Toss Securities' regulations may have their applications canceled.
Individuals with disabilities and national veterans are given preferential treatment in accordance with relevant laws.

A Note for Future Colleagues
Processing transaction data generated in the securities domain in real time is of high importance from a business perspective and poses significant technical challenges. The Toss Securities Realtime Data Team is at the forefront of this effort, currently maintaining stable securities services. Toss Securities continues to grow, and we hope that the entire process of maintaining the systems of a growing securities firm will be an enjoyable journey.

Mar 10, 2026
Toss Securities
Full-time | On-site | Seoul

Join Our Team!
Toss Securities is growing rapidly under the mission of "innovating every investment experience for our customers," with over 7.4 million registered users and 4 million monthly active users, and currently leads in foreign stock transaction volume. Our diverse range of investment products, including stocks, bonds, and options, expands our customers' choices. At the core of this growth is a robust and trustworthy data platform. Toss Securities is at a pivotal moment: we need to design a data architecture capable of handling large-scale real-time trading data, user behavior data, and regulatory compliance data. We are seeking a Head of Data Engineering to design and lead this structural transformation. This role is not just about operations; it is key to defining the data future of Toss Securities and enabling our organization to work data-driven.

Your Responsibilities
Design and build an on-premise distributed architecture aligned with our mid-to-long-term business strategy, resolving data silos through Hadoop enhancements and transitioning to a Kafka-centered streaming-first approach.
Construct and operate large-scale batch and streaming pipelines based on Spark/Flink and Kafka, ensuring reliable data processing through high-availability ETL/ELT design and performance optimization.
Establish and manage data standards (layering, naming, permissions), quality management (DQ rules, SLAs, lineage), and regulatory compliance frameworks based on metadata and personal information (PII).
Coordinate data interests across service, risk, accounting, AI, and backend teams, establishing and executing a comprehensive data strategy, including integration strategies and ownership definitions.
Design the goals, structure, and processes of the data organization, leading a growing team through coaching and decision-making while addressing technical debt and fostering a trust-based environment.
Oversee the design and operation of ML platforms and infrastructure for LLM/recommendation services, collaborating with service teams to build model deployment, operation, and monitoring standards as well as automated pipelines.

Ideal Candidate
10+ years of experience in data engineering or platform architecture.
Experience designing and operating large-scale clusters (Hadoop, Kafka).
Proficiency in designing real-time streaming and batch data processing architectures.
Experience building data governance, quality, and permission management systems.
Leadership experience in engineering organizations (5-50 team members) is preferred.
Excellent coordination and communication skills across diverse organizational stakeholders.

Additional Preferred Qualifications
Experience with financial data in securities, banking, or fintech.
Experience handling data within regulatory environments (e.g., PIPA, Financial Transaction Act).
Experience building semantic layers or data meshes.
Experience with real-time transaction or advertising/shopping service data.
Experience designing and operating large-scale model training, serving, and MLOps/LLMOps pipelines (e.g., Kubeflow, Argo, H100/H200 GPU clusters, vLLM/Triton).
Experience with feature stores for real-time recommendations, model optimization and profiling (e.g., BentoML, ONNX, TorchServe), and LLM fine-tuning/RAG operations.

Mar 9, 2026
