Data Analytics Engineer Platform At Toss Seoul jobs in Seoul – Browse 1,215 openings on RoboApply Jobs


Open roles matching “Data Analytics Engineer Platform At Toss Seoul” in Seoul. 1,215 active listings on RoboApply Jobs.


1 - 20 of 1,215 Jobs
Toss
Full-time|On-site|Seoul

Join Our Data Platform Team!

As a Data Engineer at Toss, you will be part of our Data Platform Team. The team consists of Data Engineers and Data Analytics Engineers. We are responsible for building the platforms and data pipelines essential for analyzing the services provided by Toss.

Your Responsibilities
- Develop and operate OLAP (Online Analytical Processing) based data pipelines.
- Design and optimize systems for the reliable operation of large-scale data analysis and real-time/batch data pipelines.
- Develop and manage batch and streaming pipelines to load the various types of data generated at Toss.
- Continuously improve data models and processing logic based on service requirements.

We Are Looking For Candidates Who Have
- Experience operating services in a Kubernetes (K8s) based environment.
- Experience designing and operating data streaming pipelines using Kafka and Kafka Connect.
- Experience processing large volumes of data using Apache Spark (Batch/Structured Streaming).

Additional Skills That Would Be a Plus
- Experience operating and tuning MPP/OLAP engines such as StarRocks or ClickHouse.
- Experience building Data Lakehouses using Open Table Formats such as Apache Iceberg, Hudi, or Delta Lake.
- Experience with real-time data processing using streaming frameworks such as Kafka Streams or Apache Flink.
- Experience designing and operating Airflow-based ETL pipelines.

Your Journey to Joining Toss
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer
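For candidates new to streaming aggregation, the core idea behind a windowed streaming pipeline (the kind a Spark Structured Streaming `groupBy(window(...))` job expresses) can be sketched in plain Python. This is an illustrative, stdlib-only sketch of event-time tumbling windows, not any actual Toss pipeline code:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed event-time windows.

    Mirrors the shape of a streaming groupBy(window, key).count()
    aggregation, reduced to plain Python for illustration only.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its tumbling window.
        window_start = ts - (ts % window_seconds)
        counts[(window_start, key)] += 1
    return dict(counts)

# Three click events: two fall in the 0-60s window, one in 60-120s.
events = [(0, "click"), (30, "click"), (90, "click")]
print(tumbling_window_counts(events, 60))
# {(0, 'click'): 2, (60, 'click'): 1}
```

In a real engine the same logic additionally handles late data via watermarks; the windowing arithmetic itself is unchanged.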

Mar 9, 2026
Toss
Full-time|On-site|Seoul

# About the Team
- The Data Mart Platform Team is dedicated to building a standardized Data Warehouse for the various Toss products, aiming to prevent data silos and enhance overall data maturity across the organization.
- Responsibilities include enhancing centralized DW quality management processes, standard monitoring, integrating product data with the enterprise data mart, designing efficient pipelines, and creating standardized marts.
- **Interested in learning more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

# Responsibilities
- After completing an onboarding process to familiarize yourself with Toss's DW standards, you will work as part of the Data Mart Platform Team.
- Maintain and manage an agile, manageable enterprise DW standard, taking responsibility for DW quality management from an enterprise perspective in collaboration with DAEs (Domain Analytics Engineers) from the various product domains (development and execution of standard management monitoring).
- Plan and execute systems and processes to enhance data reliability: improving table consistency, advancing DQ rules, and establishing health-check metrics.
- Develop enterprise-level marts, managing the integration of standard marts from different domains and ensuring efficient data pipeline improvements.
- Identify and execute tasks to enhance data discoverability across the organization.
- Develop a platform to measure data maturity across the various Toss domains, and initiate projects to enhance the productivity of DAEs.
- The data development environment is based on Hadoop, Airflow, Python, and SQL (Impala).

# Desired Qualifications
- Understanding of database normalization and the fundamental characteristics of Data Warehouses (Subject-Oriented, Integrated, Non-Volatile, Time-Variant).
- Ability to clearly define key concepts as a DW data modeler and propose efficient data structures based on diverse data perspectives.
- High-level understanding of DW standard management, and the capability to propose and lead improvement initiatives at the enterprise level.
- Strong comprehension of data governance, including data quality and compliance, with the ability to suggest actionable plans.
- Proficiency in SQL, with the ability to write efficient, readable queries.
- Basic Python skills (enough to work with Airflow) are acceptable, but the ability to understand modules and PySpark code written by others is preferred.
- Experience with large-scale data processing and designing metrics from an AARRR perspective is a plus.

# Application Tips
- Please specify any relevant experience with DW construction projects and mart design, detailing your contributions.
- Mention specific challenges you have addressed regarding data maturity.
- Outline your contributions and lessons learned while solving data-related issues.

# Joining Toss
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Discussion > Final Acceptance and Onboarding

# A Note to Future Colleagues
> "Our team strives for better service every day."
- I was drawn to the thrilling risks associated with financial data, and saw that my growth could contribute to the company's success, which is why I joined Toss.
- The most stressful aspect of my previous company was being led by predetermined objectives, but Toss offers more autonomy than I expected, along with a dedicated, ambitious team focused on "better service every day."
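To make the "advancing DQ rules" responsibility concrete, here is a minimal sketch of the kind of rule-based data-quality check a DW quality process runs. The rule names and columns are hypothetical, not Toss's actual DQ framework:

```python
def run_dq_checks(rows, not_null=(), unique=()):
    """Apply two common data-quality rules to a list of dict rows.

    Returns a list of human-readable failure messages (empty when
    all rules pass). Illustrative sketch only.
    """
    failures = []
    for col in not_null:
        # Rule 1: the column must never be null.
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls:
            failures.append(f"not_null failed on {col}: {nulls} null rows")
    for col in unique:
        # Rule 2: the column must be unique across all rows.
        seen = [r.get(col) for r in rows]
        if len(seen) != len(set(seen)):
            failures.append(f"unique failed on {col}")
    return failures

rows = [
    {"user_id": 1, "amount": 100},
    {"user_id": 1, "amount": None},  # duplicate id and null amount
]
print(run_dq_checks(rows, not_null=("amount",), unique=("user_id",)))
```

Real DW monitoring would run such rules on a schedule (e.g. per Airflow task) and feed failures into alerting and health-check metrics.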

Mar 9, 2026
Toss
Full-time|On-site|Seoul

Join Our Dynamic Team

As a Data Engineer at Toss, you will be part of the Data Platform team. The Data Platform team operates within the Data Division, supporting and managing the data and platforms necessary for all Toss services. Our team comprises members with 2 to 18 years of experience from diverse backgrounds, including portals, banking, gaming, and startups. We encourage team members to pursue various interests and collaborate freely on skills and knowledge sharing.

Your Responsibilities
- Develop and maintain stable, efficient data pipelines (ingestion, loading, streaming).
- Contribute to data-driven Toss services through real-time distributed processing of large datasets.
- Create and manage tools that ensure a reliable, efficient data experimentation and analysis environment for your colleagues.
- Develop various data service applications for data analysis and platform operation.

Who You Are
- You have experience developing data pipelines for large-scale data processing (collection, processing, analysis).
- You are familiar with large-scale distributed systems (Hadoop, HBase, Kafka, Spark, Flink, etc.).
- You possess software development skills for data application development (Java, Scala, Python, etc.).
- You have intermediate or advanced programming skills (web/client/server programming).
- Experience developing services related to recommendations/advertising/machine learning is a plus.

Please Highlight These Experiences in Your Resume
- Detail the projects you worked on, the technologies you used, and how you solved challenges, rather than just listing languages or frameworks.
- Experience with platforms similar to Toss's is beneficial, but we prioritize growth potential and problem-solving ability over specific technologies.
- Include any experience resolving critical failures while operating platforms, or optimizing for performance and resource usage.
- Share experiences where you identified and resolved bugs in open-source software or contributed enhancements.

Your Journey to Joining Toss
Application submission > Technical interview > Cultural fit interview > Reference check > Compensation discussion > Final offer

Mar 10, 2026
Toss Securities
Full-time|On-site|Seoul

Join Our Dynamic Team

The Data Engineer (AI) position is part of the AI Data Platform Team at Toss Securities. The AI Data Platform Team comprises Data Engineers, Machine Learning Engineers, Server Engineers, and Product Operation Managers, fostering collaboration across roles. Our mission is to develop a unique data moat for Toss Securities by integrating diverse securities domain data with AI technologies, providing essential insights for investors. We utilize external LLMs and conduct training and evaluation of our internally developed models, while leveraging various data platform technologies.

Your Responsibilities
- Proactively identify and lead projects that solve business challenges at Toss Securities, overseeing the entire process from data architecture design to development and operation.
- Build and manage a securities data platform that integrates, processes, and serves global market data.
- Establish and maintain a knowledge graph platform for real-time domain data.
- Create and operate the data pipelines that underpin AI service products.
- Develop and manage a feature store for real-time personalized recommendation services.
- Ensure data integrity by designing, developing, and operating data quality verification and monitoring systems.

We Seek Candidates Who
- Have over 5 years of experience in data engineering.
- Can comprehend requirements and analyze technical trade-offs to determine the optimal data architecture for a given environment.
- Possess a solid understanding of, and experience with, large-scale distributed processing and data platforms.
- Have experience sharing knowledge with peers and junior engineers, contributing to the technical growth of the entire team.
- Are interested in leveraging AI beyond mere tooling, understanding its principles to innovate on engineering productivity.
- Can coordinate with colleagues across functions and provide constructive feedback.
- Are eager to take on new challenges and to learn and grow proactively.

Preferred Experiences
- Experience with Kafka-based stream processing and large-scale distributed data processing (Hadoop/ClickHouse/Elasticsearch).
- Experience building and operating data pipelines using Airflow, Docker, and Kubernetes.
- Experience monitoring and managing data integrity and quality.
- Staying up to date with the latest AI/data technology trends, with an interest in automation and productivity enhancement.

Mar 10, 2026
Toss Securities
Full-time|On-site|Seoul

About the Team You'll Join
The Data Analytics Engineer at Toss Securities is part of the Data Warehouse Team within the Data Division. Your responsibilities will focus on Data Platform and Data Mart tasks. While your primary focus will vary, you will also engage in cross-functional projects. The Platform tasks involve maintaining and optimizing ETL/Pipeline tools to manage the DW Mart tables effectively. You will explore and implement new methods to reduce DW operation time with limited resources. Our goal is to maximize data utilization across the organization using tables managed by the DW team. The current team consists of approximately 7 members with experience ranging from 2 to 14 years, from diverse backgrounds including portals, banking, gaming, and startups.

Curious about the Data Division?
The Data Division at Toss Securities aims to become the world's leading securities firm in data handling, contributing through data technology, services, and data-driven decision-making. We foster close collaboration among various data professionals and enjoy our work. Regular Tech Weekly sessions allow us to share expertise, and you can freely engage with different teams to learn from each other.

Your Responsibilities
- Experience and contribute to an efficient DW environment within a rapidly growing agile organization.
- Design data marts, and develop and automate DW data workflows based on the Hadoop ecosystem and open-source solutions.
- Identify and implement methods for structuring and automating the numerous DW/Mart tables.
- Process large volumes of data swiftly and effectively to create and manage various features.
- Establish data quality checks and governance within the data marts.
- Experience deriving and establishing system requirements for large-scale data processing and analysis is a plus.

Ideal Candidate
- At least 5 years of experience as a Data Engineer is essential.
- A fundamental understanding of RDBMS, the Hadoop ecosystem, and data warehousing.
- Proven experience leading the design, construction, and operation of data marts.
- The ability to install, operate, and troubleshoot Airflow, DBT, and Django, and to modify open-source tools to develop features needed for a securities DW.
- Experience simplifying complex problems or automating repetitive tasks using data models.
- Extensive experience efficiently processing big data with Spark is highly desirable.
- Intermediate proficiency in Python and advanced skills in SQL.

Resume Tips
- If you have resolved critical issues while operating platforms, or optimized performance and system resource usage, please include those experiences.
- Be specific about impactful projects you have worked on.
- If you have addressed bugs or issues while using open-source tools, or developed or enhanced features, please detail those experiences.
- Highlight the results of any improvements made in actual services, quantified if possible (excluding sensitive information where necessary).

Joining Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation >...
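For readers unfamiliar with mart design, the core pattern is aggregating a raw fact table into a pre-computed summary table. The sketch below uses SQLite purely for illustration (the posting's actual environment is Hadoop/Impala), and the table and column names are hypothetical:

```python
import sqlite3

# Toy mart build: aggregate a raw trade fact table into a daily
# summary mart. Schema is invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_trade (trade_date TEXT, user_id INT, amount INT);
INSERT INTO fact_trade VALUES
  ('2026-03-01', 1, 100),
  ('2026-03-01', 2, 250),
  ('2026-03-02', 1, 50);

-- The "mart": one row per day, pre-aggregated for fast BI queries.
CREATE TABLE mart_daily_trade AS
SELECT trade_date,
       COUNT(DISTINCT user_id) AS traders,
       SUM(amount)             AS total_amount
FROM fact_trade
GROUP BY trade_date;
""")
print(conn.execute(
    "SELECT * FROM mart_daily_trade ORDER BY trade_date").fetchall())
# [('2026-03-01', 2, 350), ('2026-03-02', 1, 50)]
```

In a production DW the same SELECT would be an Impala/Spark job scheduled by Airflow, with DQ checks run on the resulting mart table.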

Mar 10, 2026
Toss Payments
Full-time|On-site|Seoul

Join our innovative team at Toss Payments as a Data Engineer, where you'll play a crucial role in building and optimizing our data infrastructure. You will work with cutting-edge technologies, collaborating with cross-functional teams to drive data-driven insights and solutions. Your expertise will help us enhance our payment systems and improve user experiences.

Mar 9, 2026
Toss Securities
Full-time|On-site|Seoul

About the Team You Will Join
The Data Engineer (Search) at Toss Securities is part of the AI Tribe within the AI Intelligence Silo. This Silo is a collaborative team of Data Engineers, Machine Learning Engineers, Server Engineers, Frontend Engineers, Product Owners, and Product Designers, all working to create AI-driven information services that leverage securities domain data. Our focus is not just on presenting information but on rapidly experimenting with how to process and present data in a way that genuinely assists investors. Search is the primary entry point connecting various securities data and AI services, and the Data Engineer (Search) is responsible for search/indexing functionality that can be used across AI-based data services. The role centers on designing and operating the data and indexing layers behind our search services, concentrating on the stable design and management of the data flow and infrastructure for search indexing, rather than on the algorithm and ranking-model work typical of larger portals or e-commerce.

Key Responsibilities
- Design and manage the indexing pipeline for Toss Securities search services, including stocks, autocomplete, news, and community features.
- Architect and reliably operate real-time/big-data pipelines for search indexing.
- Gain insight into Elasticsearch-based search indexes and enhance the indexing structure and performance from a data perspective.
- Collaborate on data integrity management and re-indexing strategies to ensure stable data delivery for search.
- Gradually expand your responsibilities beyond search, into areas such as graph search and the ingestion of new data sources.

Who We Are Looking For
- A candidate with over 3 years of experience in data engineering.
- Strong programming skills are preferred.
- Experience designing or operating real-time or batch-based data pipelines is a plus.
- Experience collecting and processing diverse data sources for service use is beneficial.
- Familiarity with big-data processing platforms such as Spark, Hadoop, or Impala is an advantage.
- A passion and curiosity for learning new domains and technologies are highly valued.
- A preference for collaborative environments where feedback and growth are encouraged is ideal.

Additional Preferred Experience
- Experience using or managing search infrastructure such as Elasticsearch, Lucene, or Solr is advantageous.
- A genuine interest or experience in search domains such as search engines, recommendations, or ranking systems is a plus.

Resume Tips
- Please detail your experience designing/developing/operating data pipelines, ETL, streaming, etc.
- Highlight your role in projects and what you learned and improved through those experiences.
- If you have experience with search or Elasticsearch, even at a smaller scale, please describe the problems you solved.

Your Journey to Joining Toss Securities
Application submission > Job interview > Cultural fit interview > Reference check > Compensation discussion > Final acceptance and onboarding
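The autocomplete indexing mentioned above typically relies on indexing every prefix of a term (what Elasticsearch calls edge n-grams). A stdlib-only sketch of that idea, with made-up ticker names and no claim to match the actual Toss index schema:

```python
from collections import defaultdict

def build_prefix_index(names):
    """Index names under every prefix, the way an autocomplete
    index (e.g. an edge-ngram field) is built. Illustrative only.
    """
    index = defaultdict(list)
    for name in names:
        lower = name.lower()
        # Store the original name under each lowercase prefix.
        for i in range(1, len(lower) + 1):
            index[lower[:i]].append(name)
    return index

index = build_prefix_index(["Samsung", "Samsung SDI", "SK hynix"])
print(index["sams"])  # ['Samsung', 'Samsung SDI']
print(index["sk"])    # ['SK hynix']
```

Lookup is then a single dictionary access per keystroke; the engineering work in production is keeping such an index fresh and consistent as the underlying data changes, which is exactly the re-indexing concern the posting describes.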

Mar 9, 2026
Toss Securities
Full-time|On-site|Seoul

Join Our Team!

The Data Engineer (DataOps) at Toss Securities is part of the Data Infra Team within the Data Division. The Data Infra Team manages a big data infrastructure based on the Hadoop ecosystem and operates a log/search platform (Elasticsearch). We provide technical support for Kubernetes, which is widely used for data acquisition and various data operations. Additionally, we are developing a data analysis platform aimed at enhancing the company's data literacy.

Curious About the Data Division?
The Toss Securities Data Division strives to become the world's leading securities firm in data management by contributing to data technology, services, and data-driven decision-making. A diverse range of data professionals collaborate closely and enjoy their work. We regularly host Tech Weekly sessions to share expertise; if you are interested, you can learn from various roles and their methodologies.

Your Responsibilities Include:

Operating and Improving a Stable Log Platform
- The log platform (Elasticsearch) at Toss Securities stores logs generated by various services, making it a key tool for securities service operations.
- You will innovate to ensure real-time data acquisition and fast search, handling millions of records per second.

Managing the Toss Securities Search Platform
- We use an Elasticsearch-based search platform to provide a variety of search services for stocks, news, and content.
- You will collaborate with search engineers to create an environment for accurate, rapid searches without delays.

Operating and Supporting Data Kubernetes
- Various engineers in the Data Division (Data Engineers, ML Engineers, DW Engineers, Search Engineers) use dedicated data Kubernetes clusters for data acquisition, serving, and internal operations.
- The Data Engineer (DataOps) will address issues arising in the Data Kubernetes infrastructure and continuously develop the system from a long-term perspective.
- You will also analyze the experience of data engineers, focusing on process development, operation, and enhancement tasks.

We Are Looking For Candidates Who:
- Have over 5 years of experience as a Data Engineer.
- Have experience using or operating search engines such as Elasticsearch or Solr.
- Have worked with or managed distributed databases.
- Have experience operating Kubernetes + Istio-based infrastructure.
- Understand or have experience with the Hadoop ecosystem.
- Have built and managed metric systems in large-scale distributed environments.
- Have experience operating container-orchestration services and CI/CD servers.
- Can automate engineering workflows within their current organization and improve processes or tools.
- Can identify and eliminate friction points in engineering workflows; our organization focuses on collaborative achievement rather than direct supervision.

Technologies We Use at Toss Securities:
- Python, Kotlin, Java, Spring Boot, Spring Cloud Config
- Elasticsearch stack, OpenSearch stack, Vector

Mar 10, 2026
Toss Bank
Full-time|On-site|Seoul

About the Team You'll Join
You will be part of the DataOps Engineering Team under the Data Division at Toss Bank. The team:
- Utilizes the Hadoop ecosystem and various open-source technologies to manage enterprise data reliably and efficiently.
- Establishes standards for data workflows, and develops and optimizes data pipelines.
- Is responsible for the performance and stability optimization of open-source-based analytical platforms.

Your Responsibilities Upon Joining
- Develop, operate, and automate the enterprise data pipelines.
- Efficiently process and optimize large data volumes using open-source tools.
- Manage a large-scale Airflow cluster in an on-premises Kubernetes environment.
- Evaluate and implement new technologies such as Iceberg and Trino to enhance the data platform.

Ideal Candidate Profile
- Proficient in development within a Hadoop ecosystem environment.
- Extensive experience with open-source pipeline tools such as Airflow, NiFi, Kafka, and Flink, along with strong problem-solving skills.
- Experience developing data pipelines for large-scale data collection, processing, and analysis.
- Experience analyzing, tuning, and optimizing execution plans for distributed processing engines such as Spark is a plus.
- Beyond simply using open-source tools, experience analyzing source code and contributing through patches and backporting would be great.

Resume Recommendations
- Please specify impactful projects you have worked on, particularly those involving the design and construction of ETL/streaming pipelines.
- If you have applied improvements in actual services, please quantify the results (omitting sensitive information where necessary).
- Share experiences that go beyond development and construction to include troubleshooting, operations, and the results achieved.

Joining Toss Bank
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Acceptance and Onboarding

Important Notices
- If any false information is found in your resume, or if disciplinary issues are confirmed in your employment history, the hiring decision may be canceled.
- Hiring may also be canceled for applicants who fall under the disqualifications specified in Toss Bank's employment regulations.

Mar 9, 2026
Toss Bank
Full-time|On-site|Seoul

About the Team You'll Join
The Data Analyst at Toss Bank belongs to both a squad and the DA chapter, working within a matrix structure. At Toss Bank, teams are organized into squads of 6-8 professionals, including Product Owners, Designers, Developers, and Analysts, each working autonomously like a startup. Data Analysts across the Toss community belong to the DA chapter, which meets every Wednesday to share insights on data analysis topics. The DA chapter comprises members with 1 to 12 years of experience from diverse backgrounds, including finance, corporate, gaming, and e-commerce.

Key Responsibilities You'll Undertake
As a Data Analyst at Toss Bank, you will define business problems through data and identify the optimal solutions to effect real change.
- Drive data-driven decision-making: Go beyond merely extracting requested data by defining the 'real problems' behind business needs and formulating hypotheses. Validate hypotheses using appropriate methodologies, such as causal inference and A/B testing, to derive actionable insights.
- Create reliable, efficient analytical environments: Focus on developing reusable data structures that team members can use repeatedly, moving beyond one-time analyses. Establish data standards, through data mart design and consistency validation, that everyone can trust.
- Work smart with technology: Use cutting-edge technologies (AI, LLMs, etc.) to handle repetitive tasks efficiently, freeing you to focus on more significant problem-solving. Proactively adopt new analytical methods and tools that raise productivity, without being constrained by past practice.

We Seek a Candidate Who:
- Can discern the essence of a problem and proactively propose solutions that consider the impact on the team, company strategy, and business objectives.
- Understands complex business logic and can translate it into coherent SQL queries over structured data.
- Is adept at deep data analysis using tools like Python/R, and can accurately interpret experimental results based on statistical knowledge.
- Can communicate complex analytical findings in simple terms that persuade non-experts.

Preferred Experience:
- Experience building dashboards and analytical environments with tools such as Tableau or Amplitude is a plus.
- Prior involvement in service planning and log design stages is advantageous.
- Experience leveraging new technologies (such as AI) to improve work processes is preferred.

Resume Tips:
- Detail impactful projects you have worked on.
- Show the process of problem definition, hypothesis formulation, experimental design, validation, and results.
- Mention experiences where data analysis led to deep user understanding and actionable insights.
- Note whether you have made extensive use of mobile-service analysis methods (LTV, AARRR, Cohort, Funnel, etc.).
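As a concrete reference for the funnel analysis mentioned in the tips, step-to-step conversion in an AARRR-style funnel reduces to simple set arithmetic. The step names and user counts below are invented for illustration:

```python
def funnel_conversion(step_events):
    """Compute step-to-step conversion rates for a funnel.

    `step_events` maps funnel step name -> set of user ids that
    reached it; step order follows insertion order. Sketch only.
    """
    steps = list(step_events)
    rates = {}
    for prev, cur in zip(steps, steps[1:]):
        # Users counted at a step must also have passed the prior one.
        reached = step_events[cur] & step_events[prev]
        rates[f"{prev}->{cur}"] = len(reached) / len(step_events[prev])
    return rates

funnel = {
    "visit":   {1, 2, 3, 4},
    "signup":  {1, 2, 3},
    "deposit": {1, 2},
}
print(funnel_conversion(funnel))
# {'visit->signup': 0.75, 'signup->deposit': 0.6666666666666666}
```

In practice the user-id sets would come from event logs via SQL, and the same shape extends to cohort analysis by partitioning users on their first-seen date.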

Mar 9, 2026
Toss Careers
Full-time|On-site|Seoul

Join Our Team!
As a Site Reliability Engineer in the Tech Platform Tribe, you will work as a Server Developer. You'll build and operate effective monitoring systems to proactively detect and rapidly respond to issues. Your role will involve establishing performance-testing environments to support service growth, addressing any challenges that arise, and designing solutions to prevent recurrence.

Your Responsibilities:
- Continuously enhance the stability, scalability, availability, and latency of our systems.
- Improve monitoring systems to enable rapid root-cause analysis in high-traffic environments.
- Resolve issues during outages and develop designs and solutions that prevent future occurrences.
- Establish performance-testing environments and run tests to identify improvement points.
- Identify and fix single points of failure across network, application, open-source, and Kubernetes systems.
- Experiment with and evaluate various open-source products for potential adoption.

We Are Looking For:
- Experience in Java/Kotlin and Spring Framework development, coupled with strong problem-solving skills.
- The ability to verify system availability through performance testing and to resolve bottlenecks.
- A deep understanding of Linux and networking.
- Experience operating infrastructure based on MSA, Kubernetes, Istio, Redis, Kafka, and ELK.
- A proactive approach to operating mission-critical services with a strong sense of responsibility.
- A willingness to embrace change, adapt quickly to new technologies, and seek continuous growth.

Resume Submission Tips:
- Highlight impactful experiences and what you learned, rather than just listing past roles.
- Share instances where you proactively identified and resolved issues to enhance service stability.
- Discuss experiences where you significantly improved operational systems or introduced innovations that enhanced productivity and efficiency.
- Provide examples of deep analytical work to identify and resolve the root causes of issues.
- Consider how you have designed systems for efficiency in environments with high or fluctuating traffic.
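Latency work of the kind described above is usually reported as percentiles (p95/p99) rather than averages, since averages hide tail behavior. A minimal nearest-rank percentile sketch; real monitoring stacks typically use histogram approximations instead of exact sorts:

```python
import math

def percentile(samples, p):
    """Nearest-rank percentile of a non-empty sample list.

    Simple exact definition for illustration; not how a production
    metrics pipeline computes p99 at scale.
    """
    ordered = sorted(samples)
    rank = math.ceil(p / 100 * len(ordered))
    return ordered[rank - 1]

# 100 request latencies in ms: 1..99 plus one 500 ms outlier.
latencies = list(range(1, 100)) + [500]
print(percentile(latencies, 50))   # 50
print(percentile(latencies, 99))   # 99
print(percentile(latencies, 100))  # 500
```

Note how the single outlier leaves p50 and p99 untouched but dominates the maximum, which is why SLOs are stated at a chosen percentile.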

Mar 9, 2026
Toss Insurance
Full-time|On-site|Seoul

About Toss Insurance
Toss Insurance was founded to reshape the insurance sector within the financial ecosystem. The team is building a platform that helps insurance consultants apply their expertise and engage more deeply, so they can address customer needs more effectively. This work aims to transform how people experience insurance consultation, contracts, and management, changing the structure of the insurance business itself.

Role Overview: Platform Product Owner
- Design and improve the core systems and platforms that support Toss Insurance's services.
- Define and refine the foundational tools used by insurance consultants and internal teams, covering consultation, contracts, customer data, and operations.
- Focus on building a strong platform foundation rather than individual features, supporting productivity and scalability across the organization.
- Work as a key member of the product team, shaping products that reflect the structure and flow of the insurance business.

Main Responsibilities
- Platform product strategy: Analyze the full process from consultation to contract maintenance, identifying platform challenges. Set platform roadmaps and priorities aligned with business goals. Balance immediate improvements with long-term strategy.
- Problem definition and structuring: Use user-experience data from consultants, operations, and customers to define issues. Break complex workflows down into actionable product tasks.
- Product discovery and delivery: Lead the process from forming hypotheses through validation, launch, and ongoing improvement. Work closely with designers, front-end and back-end engineers, and infrastructure/security specialists. Oversee timelines and quality to ensure results are delivered.
- Platform improvement and adoption: Continuously enhance shared systems that support daily operations. Promote adoption and drive gains in efficiency, productivity, and stability.
- Stability and compliance: Address privacy, security, and compliance needs through product solutions. Lead efforts to build a stable, scalable system architecture.

What We Look For
- Experience defining and leading problem-solving from both customer and internal perspectives.
- The ability to break down complex challenges using data and turn them into clear plans.
- A history of collaborating with a range of stakeholders to complete projects.
- Experience setting priorities and managing resources across multiple projects.
- An interest in building structures that help products, teams, and businesses grow together.

Preferred Qualifications
- A background in planning or operating platforms, back-office, or B2B systems.
- Familiarity with insurance, finance, or compliance topics.
- An understanding of technology-based systems, including infrastructure, data, and security.
- Experience improving products in organizations undergoing rapid growth.

What to Expect
Direct involvement in designing the core flow of the insurance business, building both domain knowledge and product expertise.

Apr 14, 2026
Toss
Full-time|On-site|Seoul

About the Team You'll Join
* As a Cloud Engineer (Cloud Platform Development) at Toss, you will be part of the Cloud Engineering Team, responsible for addressing and resolving issues within the Toss service infrastructure through a Cloud Native approach.
* Our goal is to optimize the infrastructure for Toss team members, ensuring a resilient system capable of handling large-scale traffic and preparing for potential failures.

Your Responsibilities
* Beyond utilizing cloud services, you will design, operate, and enhance them directly.
* Analyze the infrastructure from a Cloud Engineer's perspective, providing solutions based on Kubernetes and OpenStack.
* Implement features for performance optimization and operational automation of virtual infrastructure.
* Develop monitoring, metering, and alerting systems to support the stability and efficiency of services at the virtual infrastructure level.
* Adapt and improve open-source software for an in-house environment.
* Build technical trust in cloud adoption and provide training and guidance to development and operations teams for seamless cloud utilization.

We Want to Work With Someone Who
* Is proficient in at least one programming language: Python, Golang, or C.
* Understands REST APIs and database transactions.
* Has knowledge and experience with OS, network, and storage technologies.
* Is passionate about learning new technologies and actively applying them in operational environments.
* Has a deep understanding of key OpenStack components such as Nova, Neutron, Cinder, and Keystone, and experience in performance optimization.
* Has experience in service operation and deployment using Kubernetes.

A Note for Potential Colleagues
"You will have the opportunity to experience cloud platform serving based on the latest trends of CI/CD and TDD."
* Our team does not just 'write code.' We lead from problem recognition to infrastructure architecting, allowing you to experience the entire process of feature development, operation, and deployment automation. We constantly experiment with and apply new technologies, aiming for the highest level of operational automation and testing environments.
* If you want to experience everything about cloud-native, discover a new career as an infrastructure developer, or are eager to learn more about infrastructure technology, we welcome you!
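The monitoring-and-alerting responsibility described in this role can be illustrated with a toy threshold check. This is a minimal sketch only, not Toss's actual tooling; the names `AlertRule` and `evaluate` and the specific metrics are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class AlertRule:
    metric: str
    threshold: float  # fire when the observed value exceeds this
    severity: str

def evaluate(rules, observations):
    """Return an alert tuple for every rule whose metric exceeds its threshold."""
    alerts = []
    for rule in rules:
        value = observations.get(rule.metric)
        if value is not None and value > rule.threshold:
            alerts.append((rule.severity, rule.metric, value))
    return alerts

rules = [
    AlertRule("cpu_util", 0.90, "critical"),
    AlertRule("disk_util", 0.80, "warning"),
]
print(evaluate(rules, {"cpu_util": 0.95, "disk_util": 0.40}))
# -> [('critical', 'cpu_util', 0.95)]
```

A production system would of course pull observations from a metrics store and route alerts through paging infrastructure; the sketch only shows the rule-evaluation core.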

Mar 11, 2026
Apply
Toss
Full-time|On-site|Seoul

Join our dynamic team at Toss as a Data Analytics Engineer, where you will play a crucial role in transforming raw data into actionable insights. You will collaborate with cross-functional teams to identify opportunities for improving our services and enhancing user experiences.

Apr 9, 2026
Apply
Toss Bank
Full-time|On-site|Seoul

Join Our Dynamic Team!
The Data Engineer for Workflow Platform is an integral member of Toss Bank's Data Division, specifically within the Data Platform team. This team comprises three key areas: Data Infrastructure & Hadoop, Streaming Platform, and Workflow Platform. We operate various data platforms, including Hadoop, Kafka, CDC, and Airflow. Our mission is to ensure the reliability and scalability of the enterprise data infrastructure so that all data is securely collected and processed.

Your Responsibilities
- Design and operate a large-scale data workflow execution platform in an on-premise Kubernetes environment.
- Optimize resources to ensure the stable execution of large workflows across various data organizations, enhancing platform performance and reliability.
- Collaborate with enterprise data engineers to improve the execution quality of the overall data pipeline and enhance the developer experience.
- Monitor workflow execution status, and design and improve systems for automated fault detection, alerts, and recovery procedures.
- Safely manage workflow executions in accordance with the internal control standards of the financial sector, advancing a systematic history management system.
- Continuously review and adopt new technologies and open-source solutions to enhance the performance and scalability of the workflow platform.

We Are Looking For
- Experience operating an Airflow-based workflow orchestration system, with proven improvements in stability, scalability, and execution efficiency.
- Background in developing Python-based data workflows and platform services.
- Understanding of container technologies (Docker, Kubernetes, etc.) and experience automating service deployment and configuration using tools like Helm.
- Ability to understand company environments and communicate effectively with various teams during service development.
- A keen interest in improving operational efficiency and optimization in large-scale workflow environments.
- Desire to enhance the platform user experience so that in-house data engineers can develop and operate pipelines more easily and safely.
- A proactive approach to analyzing, modifying, and improving open-source solutions at the code level to solve issues.

Resume Submission Tips
- Clearly outline impactful projects you have worked on in your career.
- Focus on experiences related to data platforms, particularly with Airflow, Kubernetes, and Python.
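The automated fault-detection-and-recovery responsibility above usually involves some form of retry policy. As a minimal sketch (not the team's actual system; `run_with_retries` and its parameters are illustrative), a retry-with-exponential-backoff wrapper might look like this:

```python
import time

def run_with_retries(task, max_attempts=3, base_delay=1.0):
    """Run a task callable; on failure, retry with exponential backoff.

    Returns the task's result, or re-raises the last error once
    max_attempts is exhausted (so an operator can be alerted).
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception:
            if attempt == max_attempts:
                raise
            # back off: base_delay, 2*base_delay, 4*base_delay, ...
            time.sleep(base_delay * 2 ** (attempt - 1))

attempts = []
def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky, max_attempts=3, base_delay=0.01))
# -> ok
```

In an Airflow context the equivalent knobs are per-task settings such as `retries` and `retry_delay`; the sketch just makes the control flow explicit.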

Mar 9, 2026
Apply
Toss
Full-time|On-site|Seoul

Join Us and Engage in Exciting Work!
After completing a comprehensive onboarding process to familiarize yourself with the Toss data environment, you will be part of the Data Warehouse Team, undertaking the following responsibilities:
- Develop a data quality platform that enhances table consistency, advances DQ rules, and establishes health check metrics. We aim to create a reliability management platform allowing all data users to work without questioning, 'Can I trust this data?'
- Enhance the GraphRAG pipeline. Build a knowledge graph construction pipeline that extracts entities by parsing ontology YAML, SQL, and code, followed by vector embedding for indexing in Elasticsearch, making Toss's data assets easily navigable for everyone.
- Design and operate MSA architectures. Split the services needed for the ontology platform into microservices, ensuring each is designed, implemented, and operated reliably.
- Develop AI agent infrastructure. Create a multi-agent workflow execution environment based on open-source agent frameworks like CrewAI. Establish an MCP Tool Registry and develop integration infrastructure with external MCP servers.
- Build an early warning platform. Create a monitoring system that detects anomalies in data lineage, code, and trends, automatically performing alerts and analyses to identify issues before they escalate.
- Develop a lineage tracking engine. Create a system that automatically analyzes the extent of impacts by parsing SQL to extract column-wise influence relationships, determining how far changes propagate.
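The lineage-engine idea above, parsing SQL to map column-level influence, can be sketched with a toy extractor. This is a deliberately simplified illustration using a regex over plain `SELECT ... FROM` statements; a real engine would use a full SQL parser (e.g. sqlglot), and `extract_lineage` is a hypothetical name, not the team's actual component.

```python
import re

def extract_lineage(sql):
    """Toy column-lineage extractor for simple `SELECT cols FROM table` SQL.

    Returns {source_table: [columns read]}, enough to ask "which
    queries touch this column?" in a toy setting. Joins, subqueries,
    and expressions are out of scope for this sketch.
    """
    m = re.search(r"select\s+(.+?)\s+from\s+(\w+)", sql, re.IGNORECASE | re.DOTALL)
    if not m:
        return {}
    cols = [c.strip() for c in m.group(1).split(",")]
    return {m.group(2): cols}

print(extract_lineage("SELECT user_id, amount FROM payments"))
# -> {'payments': ['user_id', 'amount']}
```

Impact analysis then becomes graph traversal: run the extractor over every query in the warehouse and follow edges from a changed column to everything that reads it.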

Apr 1, 2026
Apply
daangn
Full-time|On-site|Seoul

Welcome to the daangn Team!
At daangn, we strive to create an environment where individuals can grow alongside the company's success. We are here to assist you in making meaningful connections with fantastic colleagues.

Introducing the Data Valuation Team
The daangn team is dedicated to discovering valuable information that connects neighborhoods and resolving inconveniences in local living. To generate this user value, we must provide easy access to reliable information for decision-making. While we already utilize vast amounts of data for our decisions, maximizing the value of our data requires substantial change. The vision of the Data Valuation Team is to "make user-centric decisions through daily data utilization." We take the lead in addressing and solving the issues of data valuation.

Role of the Data Analytics Engineer
The Data Analytics Engineer ensures that data is utilized reliably and consistently, working across data modeling, engineering, and analysis to deliver value to the business and its users. In the diverse service environment of daangn, the Data Analytics Engineer will design and improve the overall flow from data collection to utilization, enabling analysts, engineers, and product teams to leverage data reliably. Moreover, the role involves designing data marts, managing quality, operating data governance frameworks, and performing basic analyses or experiments to support data-driven decision-making.

Learn more:
- Discover the Journey of the Data Valuation Team with daangn (Google Data Webinar)
- Learn about daangn's Metric Platform KarrotMetrics
- The 7 Challenges daangn Faced While Implementing DBT and Airflow
- How daangn's Data Engineers Simplified Modeling with DBT (2024 Data Conference)
- Mapping daangn Data: Column Level Lineage Building
- No Need to Always Pull Everything? Building MongoDB CDC at daangn
- Why daangn Implements User Activation Engagement in 3 Key Areas

Feb 2, 2026
Apply
Toss Bank
Full-time|On-site|Seoul

About the Team You'll Join
The Business Data Analyst will be part of the Corporate Initiatives Team within the Strategy Tribe of Toss Bank's Strategy Division. This team plays a critical role in management planning by deriving insights for corporate-level decision-making and discovering new initiatives. We develop, measure, and analyze key performance indicators (KPIs) across the organization to manage overall performance. Your role will involve collaborating with various teams to facilitate data-driven decision-making, interpreting the current state of Toss Bank, and envisioning its future.

Your Responsibilities Will Include
- Building data foundations for key management decisions and supporting process establishment.
- Structuring management information to provide accurate and timely insights to Toss Bank team members.
- Designing and enhancing frameworks for establishing enterprise OKRs, analyzing the contributions of product organizations, and conducting cost-benefit analyses to support critical decision-making.
- Structuring and automating performance management systems.
- Creating data foundations that enhance not only short-term performance but also long-term growth and Toss Bank's value as a platform, ensuring sustainable growth.
- Building processes and frameworks to connect user data with financial and operational data for holistic analysis.
- Supporting team members in understanding how their work impacts overall metrics through a common language and framework for data-driven decision-making.

We're Looking For Someone Who
- Can independently carry out the entire data analysis process, from data cleansing to problem definition and insight generation.
- Has over 5 years of practical experience in data analysis using SQL.
- Experience with statistical analysis tools such as Python or R is a plus.
- Is proficient in at least one data visualization tool (Tableau, Redash, Power BI, Google Data Studio, etc.).
- Possesses strong communication skills to convey analytical results clearly.
- Has experience collaborating with teams focused on data analysis, strategic planning, management control, or FP&A from an enterprise perspective.
- Can quickly grasp the company's current situation, identify issues, and propose new alternatives and strategies.

Resume Submission Guidelines
- There are no specific formats or items required. Please present your experiences in a way that best showcases your qualifications.
- It's beneficial if your resume outlines the problem definition, hypothesis setting, experimental design and validation, and outcomes of projects you've undertaken.
- If you have experience proposing new alternatives and strategies based on a solid understanding of the business, please include that.

Journey to Joining Toss Bank
Application submission > Pre-query test > Job interview > Cultural fit interview > Reference check > Salary negotiation > Final acceptance and onboarding.

Mar 9, 2026
Apply
Toss
Full-time|On-site|Seoul

# Join Us in the Domain
- Commerce: Our commerce domain, initiated to enhance simple payment solutions, has grown to provide both group buying and administrative services for sellers. You will own the data for the commerce domain and play a crucial role in expanding our services, significantly contributing to Toss's growth.
- Ads: You will take ownership of the advertising domain data to deliver meaningful ads to Toss users. Toss Ads is the most effective performance advertising platform in Korea, and you will help build a solid foundation for its efficient utilization.
- Pay: As a key area of finance, the payment domain significantly steers Toss's growth. You will facilitate connections with various merchants and external organizations to develop Toss Pay, ensuring smooth analysis and insights from payment data.
- Growth: Focused on enhancing user engagement and economic value, you will strategize and execute plans for the growth of new services and cross-activation of Toss users.
- Business: Placed within the business organization that drives revenue growth for Toss, you will utilize comprehensive data across various domains to provide insights for business analytics and revenue growth strategies.
- An interview process will determine the most synergistic domain for your placement, considering your strengths and the organization's needs.
- **Want to learn more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

Mar 9, 2026
Apply
Toss
Full-time|On-site|Seoul

The Network Engineer role at Toss is part of the Infra Engineering Tribe in Seoul. This team collaborates with System Engineers and InfraOps Engineers to deliver infrastructure that keeps Toss services reliable, fast, and secure. The work supports Toss's mission to advance financial technology and innovation.

Role overview
- Build and manage network infrastructure for a range of Toss services.
- Develop and operate monitoring systems to ensure smooth network operations.
- Optimize network performance, addressing issues to maintain high service quality.

Requirements
- Less than 5 years of experience in network construction.
- Strong understanding and hands-on experience with L2/L3 data center networks.
- Demonstrated initiative to improve networks and expand technical skills.
- Familiarity with TCP/IP and troubleshooting based on it is preferred.
- Experience working with others to identify and resolve network issues is valued.
- Experience with IPSEC VPN and L4 (SLB) is a plus.
- Knowledge of traffic or log monitoring using open source tools is advantageous.

Resume tips
- Describe two memorable cases where technical issues were resolved, such as optimizations or bug fixes.
- Highlight project contributions, specifying your role, the technology stack, and improvements made before and after your involvement.

Hiring process
Application submission > Job interview > Cultural fit interview > Reference check > Salary negotiation > Final acceptance and onboarding

Apr 28, 2026
