Data Engineer Log Platform jobs in Seoul – Browse 407 openings on RoboApply Jobs

Data Engineer Log Platform jobs in Seoul

Open roles matching “Data Engineer Log Platform” with location signals for Seoul. 407 active listings on RoboApply Jobs.

407 jobs found

1 - 20 of 407 Jobs
Apply
TossCareers logo
Full-time|On-site|Seoul

Join our dynamic team as a Data Engineer specializing in log platforms. In this role, you will work with cutting-edge technologies to design, build, and maintain robust data pipelines. Your expertise will help us manage and analyze extensive log data, enabling improved decision-making and operational efficiency.

Apr 9, 2026
Apply
Toss Bank logo
Full-time|On-site|Seoul

# About the Team/Position
- The Data Product Manager (Log) is part of the Real-Time Data team.
- This team ensures that all Toss Bank products operate on reliable data by designing and managing the entire log data lifecycle: definition, collection, validation, and utilization.
- The App Log Manager plays a crucial role in maintaining the integrity of log data over time, creating standards and quality criteria that ensure data reliability.

# Responsibilities
- Define and manage the log structure and standards across apps and servers (JSON schema, key/value rules).
- Build and manage the Log Specification with a traceable history of key/value creation, modification, and deletion.
- Minimize structural differences between FE, BE, and Native logs while enforcing a unified logging policy.
- Design and enhance automated validation systems to ensure log data quality, including:
  - Syntax validation: checking fields, types, and structures
  - Semantic validation: ensuring consistency and meaning in logs for the same actions
- Balance team resources and efficiency during standardization and automation, optimizing overall organizational log operations.
- Collaborate with data engineers, data analysts, and developers to institutionalize log standardization and QA processes across the organization.
- Analyze and address the impact of UI/UX changes, such as design modifications or multilingual (i18n) expansions, on log meaning.
- Design a balance between standardization, validation, and automation while taking responsibility for overall data quality.

# Ideal Candidate
- A self-starter who can structure the entire lifecycle from log definition to validation, with experience writing Log Specifications, designing schemas, and managing keys/values.
- Strong problem awareness regarding log quality assurance and validation automation, with practical experience designing and improving validation/QA systems.
- Experience integrating logs from various platforms (FE, BE, Native) into a cohesive system, or the ability to approach complex environments structurally.
- Familiarity with dynamic parameters and unstructured log formats.
- Proven experience creating a governance structure centered on systems rather than individuals, while collaborating with diverse roles (DA, DE, FE, BE, QA, Design) to improve data quality.
- Understanding of how design changes and multilingual expansion affect log meaning, with the ability to design realistic workflows with designers and developers.
- Capable of not just identifying complex issues but also designing and implementing practical processes to enhance data quality.

# Resume Recommendations
- Experience defining, designing, and managing app logs or service logs.
- Experience with log standardization, key/value rules, and schema management.
- Experience designing validation and QA automation (Python, SQL, Airflow, CI/CD, etc.).
- Experience integrating and managing FE/Native/BE logs for quality assurance.
- Experience addressing data quality issues such as log error detection, backfill, and reprocessing.
- Experience collaborating with various roles (FE, DA, DE, QA) to enhance data quality.

# Journey to Joining Toss
- Application submission > Job interview > Cultural fit interview > Reference check > Compensation negotiation > Final acceptance and onboarding

# Important Notes
- Any false information found in resumes or documents may lead to cancellation of the offer.
- Applicants who fall under the disqualification criteria in Toss Bank's employment rules may also face cancellation.
- Individuals with disabilities and veterans receive preferential treatment in accordance with relevant laws.

# A Colleague's Insight
"We manage the log structure and standards across applications and servers."
- The Log DPM at Toss Bank plays a pivotal role in ensuring that logs maintain their meaning over time and are trusted as reliable data.
- This role offers a great sense of purpose, as it is responsible for overall data quality!
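The syntax-validation responsibility described in this posting (checking fields, types, and structures against a log spec) can be sketched roughly as follows. All field names and the spec itself are illustrative, not Toss Bank's actual schema:

```python
# Minimal syntax validator for log events: checks required fields,
# value types, and unknown keys against a hypothetical log spec.
LOG_SPEC = {
    "event_name": str,   # e.g. "click_home_banner" (hypothetical)
    "screen": str,
    "timestamp": int,    # epoch milliseconds
    "params": dict,
}

def validate_log(event: dict) -> list[str]:
    """Return a list of syntax errors; an empty list means the event passes."""
    errors = []
    for key, expected_type in LOG_SPEC.items():
        if key not in event:
            errors.append(f"missing field: {key}")
        elif not isinstance(event[key], expected_type):
            errors.append(f"wrong type for {key}: {type(event[key]).__name__}")
    for key in event:
        if key not in LOG_SPEC:
            errors.append(f"unknown field: {key}")
    return errors

ok = {"event_name": "click_home_banner", "screen": "home",
      "timestamp": 1700000000000, "params": {"position": 2}}
bad = {"event_name": "click_home_banner", "timestamp": "now"}
```

Semantic validation (the same action always producing logs with the same meaning) would sit on top of checks like this, comparing events across FE/BE/Native sources.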

Mar 9, 2026
Apply
Toss Bank logo
Full-time|On-site|Seoul

Join Our Dynamic Team!
The Data Engineer for Workflow Platform is an integral member of Toss Bank's Data Division, specifically within the Data Platform team. This team comprises three key areas: Data Infrastructure & Hadoop, Streaming Platform, and Workflow Platform. We operate various data platforms, including Hadoop, Kafka, CDC, and Airflow. Our mission is to ensure the reliability and scalability of the enterprise data infrastructure so that all data is securely collected and processed.

Your Responsibilities:
- Design and operate a large-scale data workflow execution platform in an on-premise Kubernetes environment.
- Optimize resources to ensure the stable execution of large workflows across various data organizations, enhancing platform performance and reliability.
- Collaborate with enterprise data engineers to improve the execution quality of the overall data pipeline and enhance developer experience.
- Monitor workflow execution status, and design and improve systems for automated fault detection, alerts, and recovery procedures.
- Safely manage workflow executions in accordance with the internal control standards of the financial sector, advancing a systematic history management system.
- Continuously review and adopt new technologies and open-source solutions to enhance the performance and scalability of the workflow platform.

We Are Looking For:
- Experience operating an Airflow-based workflow orchestration system, with proven improvements in stability, scalability, and execution efficiency.
- Background in developing Python-based data workflows and platform services.
- Understanding of container technologies (Docker, Kubernetes, etc.) and experience automating service deployment and configuration with tools like Helm.
- Ability to understand company environments and communicate effectively with various teams during service development.
- A keen interest in improving operational efficiency and optimization in large-scale workflow environments.
- A desire to enhance the platform user experience so that in-house data engineers can develop and operate pipelines more easily and safely.
- A proactive approach to analyzing, modifying, and improving open-source solutions at the code level to solve issues.

Resume Submission Tips:
- Clearly outline impactful projects you have worked on in your career.
- Focus on experiences related to data platforms, particularly Airflow, Kubernetes, and Python.
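The automated fault detection and recovery this posting mentions can be sketched, in a library-free form, as a retry wrapper that alerts on failures and escalates when all attempts are exhausted. The retry count, backoff, and alert hook here are illustrative, not the team's actual mechanism:

```python
import time

def run_with_recovery(task, retries: int = 3, backoff_s: float = 0.0, alert=print):
    """Run a workflow task; retry on failure, alerting on each failed attempt.
    Raises the last exception if every attempt fails."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception as exc:
            alert(f"attempt {attempt}/{retries} failed: {exc}")
            if attempt == retries:
                raise
            time.sleep(backoff_s)

# A task that fails twice, then succeeds (simulating a transient fault).
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"
```

In a real Airflow deployment this role is played by task-level `retries`/`retry_delay` settings plus external alerting, but the control flow is the same.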

Mar 9, 2026
Apply
Toss logo
Full-time|On-site|Seoul

Join Us and Engage in Exciting Work!
After completing a comprehensive onboarding process to familiarize yourself with the Toss data environment, you will be part of the Data Warehouse Team, undertaking the following responsibilities:
- Develop a data quality platform that enhances table consistency, advances DQ rules, and establishes health check metrics. We aim to create a reliability management platform that lets all data users work without asking, "Can I trust this data?"
- Enhance the GraphRAG pipeline. Build a knowledge graph construction pipeline that extracts entities by parsing ontology YAML, SQL, and code, followed by vector embedding for indexing in Elasticsearch, making Toss's data assets easily navigable for everyone.
- Design and operate MSA architectures. Split the services needed for the ontology platform into microservices, ensuring each is designed, implemented, and operated reliably.
- Develop AI agent infrastructure. Create a multi-agent workflow execution environment based on open-source agent frameworks like CrewAI. Establish an MCP Tool Registry and develop integration infrastructure with external MCP servers.
- Build an early warning platform. Create a monitoring system that detects anomalies in data lineage, code, and trends, automatically issuing alerts and analyses to identify issues before they escalate.
- Develop a lineage tracking engine. Create a system that automatically analyzes the extent of impacts by parsing SQL to extract column-wise influence relationships, determining how far changes propagate.
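A lineage tracking engine like the one described above starts by extracting which upstream objects a query depends on. The sketch below is a deliberately toy version at table granularity; a production engine would parse the full SQL AST (e.g., with a parser library) to get column-level relationships, and the table names are illustrative:

```python
import re

def table_deps(sql: str) -> set[str]:
    """Toy lineage extractor: collect table names referenced in
    FROM/JOIN clauses. Handles only simple queries; real lineage
    engines walk the SQL AST to resolve aliases, subqueries, and
    column-level dependencies."""
    pattern = re.compile(r"\b(?:FROM|JOIN)\s+([A-Za-z_][\w.]*)", re.IGNORECASE)
    return set(pattern.findall(sql))

sql = """
SELECT u.id, o.amount
FROM dw.users u
JOIN dw.orders o ON o.user_id = u.id
"""
```

Given dependencies like these for every query in the warehouse, impact analysis becomes a graph traversal: a change to `dw.users` propagates to every query whose dependency set contains it.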

Apr 1, 2026
Apply
Toss Securities logo
Full-time|On-site|Seoul

About the Team You'll Join
The Data Analytics Engineer at Toss Securities is part of the Data Warehouse Team within the Data Division. Your responsibilities will focus on Data Platform and Data Mart tasks. While your primary focus will vary, you will also engage in cross-functional projects. The Platform tasks involve maintaining and optimizing ETL/pipeline tools to effectively manage the DW Mart tables. You will explore and implement new methods to reduce DW operation time with limited resources. Our goal is to maximize data utilization across the organization using tables managed by the DW team. The current team consists of approximately 7 members with experience ranging from 2 to 14 years, coming from diverse backgrounds including portals, banking, gaming, and startups.

Curious about the Data Division?
The Data Division at Toss Securities aims to become the world's leading securities firm in data handling, contributing through data technology, services, and data-driven decision-making. We foster close collaboration among various data professionals and enjoy our work. Regular Tech Weekly sessions allow us to share expertise, and you can freely engage with different teams to learn from each other.

Your Responsibilities
- Experience and contribute to an efficient DW environment within a rapidly growing agile organization.
- Design data marts, and develop and automate DW data workflows based on the Hadoop ecosystem and open-source solutions.
- Identify and implement methods for structuring and automating numerous DW/Mart tables.
- Process large volumes of data swiftly and effectively to create and manage various features.
- Establish data quality checks and governance within the data marts.
- Experience deriving and establishing system requirements for large-scale data processing and analysis is a plus.

Ideal Candidate
- At least 5 years of experience as a Data Engineer is essential.
- A fundamental understanding of RDBMS, the Hadoop ecosystem, and data warehousing.
- Proven experience leading the design, construction, and operation of data marts is required.
- Capable of installing, operating, and troubleshooting Airflow, DBT, and Django, with the ability to modify open-source tools to develop features needed for the securities DW.
- Experience simplifying complex problems or automating repetitive tasks using data models is critical.
- Extensive experience efficiently processing big data using Spark is highly desirable.
- Intermediate proficiency in Python and advanced skills in SQL are required.

Resume Tips
- If you have resolved critical issues while operating platforms, or optimized performance and system resource usage, please include those experiences.
- Be specific about impactful projects you have worked on.
- If you have addressed bugs or issues while using open-source tools, or developed or enhanced features, please detail those experiences.
- Highlight the results of any improvements made in actual services, quantified if possible (please exclude sensitive information if necessary).

Join Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation >...

Mar 10, 2026
Apply
Toss logo
Full-time|On-site|Seoul

Join Our Dynamic Team
As a Data Engineer at Toss, you will be part of the Data Platform team. The Data Platform team operates within the Data Division, supporting and managing the data and platforms necessary for all Toss services. Our team comprises members with 2 to 18 years of experience from diverse backgrounds, including portals, banking, gaming, and startups. We encourage team members to pursue various interests and collaborate freely on skills and knowledge sharing.

Your Responsibilities
- Develop and maintain stable and efficient data pipelines (ingestion, loading, streaming).
- Contribute to data-driven Toss services through real-time distributed processing of large datasets.
- Create and manage tools that give your colleagues a reliable and efficient data experimentation and analysis environment.
- Develop various data service applications for data analysis and platform operation.

Who You Are
- You have experience developing data pipelines for large-scale data processing (collection, processing, analysis).
- You are familiar with large-scale distributed systems (Hadoop, HBase, Kafka, Spark, Flink, etc.).
- You possess software development skills for data application development (Java, Scala, Python, etc.).
- You have intermediate or advanced programming skills (web/client/server programming).
- Experience developing services related to recommendations, advertising, or machine learning is a plus.

Please Highlight These Experiences in Your Resume
- Detail the projects you worked on, the technologies you used, and how you solved challenges, rather than just listing languages or frameworks.
- Experience using platforms similar to Toss's is beneficial, but we prioritize growth potential and problem-solving abilities over specific technologies.
- Include any experience resolving critical failures while operating platforms or optimizing for performance and resource usage.
- Share experiences where you identified and resolved bugs in open-source software or contributed enhancements.

Your Journey to Joining Toss
Application submission > Technical interview > Cultural fit interview > Reference check > Compensation discussion > Final offer

Mar 10, 2026
Apply
Toss logo
Full-time|On-site|Seoul

Join Our Data Platform Team!
As a Data Engineer at Toss, you will be part of our Data Platform Team. The team consists of Data Engineers and Data Analytics Engineers. We are responsible for building the platforms and data pipelines essential for analyzing the services Toss provides.

Your Responsibilities
- Develop and operate OLAP (Online Analytical Processing) based data pipelines.
- Design and optimize systems for the reliable operation of large-scale data analysis and real-time/batch data pipelines.
- Develop and manage batch and streaming pipelines to load the various types of data generated at Toss.
- Continuously improve data models and processing logic based on service requirements.

We Are Looking For Candidates Who Have
- Experience operating services in a Kubernetes (K8s) based environment.
- Experience designing and operating data streaming pipelines using Kafka and Kafka Connect.
- Experience processing large volumes of data using Apache Spark (Batch/Structured Streaming).

Additional Skills That Would Be a Plus
- Experience operating and tuning MPP/OLAP engines like StarRocks or ClickHouse.
- Experience building Data Lakehouses using Open Table Formats such as Apache Iceberg, Hudi, or Delta Lake.
- Experience with real-time data processing using streaming frameworks like Kafka Streams or Apache Flink.
- Experience designing and operating ETL pipelines based on Airflow.

Your Journey to Join Toss
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer

Mar 9, 2026
Apply
Toss Securities logo
Full-time|On-site|Seoul

Join Our Dynamic Team
The Data Engineer (AI) position is part of the AI Data Platform Team at Toss Securities. The AI Data Platform Team comprises Data Engineers, Machine Learning Engineers, Server Engineers, and Product Operation Managers, fostering collaboration across various roles. Our mission is to develop a unique data moat for Toss Securities by integrating diverse securities domain data with AI technologies, providing essential insights for investors. We utilize external LLMs and train and evaluate our internally developed models while leveraging various data platform technologies.

Your Responsibilities
- Proactively identify and lead projects that solve business challenges at Toss Securities, overseeing the entire process from data architecture design to development and operation.
- Build and manage a securities data platform that integrates, processes, and serves global market data.
- Establish and maintain a knowledge graph platform for real-time domain data.
- Create and operate the data pipelines that underpin AI service products.
- Develop and manage a feature store for real-time personalized recommendation services.
- Ensure data integrity by designing, developing, and operating data quality verification and monitoring systems.

We Seek Candidates Who
- Have over 5 years of experience in data engineering.
- Can comprehend requirements and analyze technical trade-offs to determine the optimal data architecture for a given environment.
- Possess a solid understanding of, and experience with, large-scale distributed processing and data platforms.
- Have experience sharing knowledge with peers and junior engineers, contributing to the technical growth of the entire team.
- Are interested in leveraging AI beyond mere tooling, understanding its principles to innovate engineering productivity.
- Can coordinate with colleagues across various functions and provide constructive feedback.
- Are eager to take on new challenges and proactively learn and grow.

Preferred Experiences
- Experience with Kafka-based stream processing and large-scale distributed data processing (Hadoop/ClickHouse/ElasticSearch).
- Experience building and operating data pipelines using Airflow, Docker, and Kubernetes.
- Experience monitoring and managing data integrity and quality.
- Staying up to date with the latest trends in AI/data technologies, with an interest in automation and productivity enhancement.

Mar 10, 2026
Apply
Toss Securities logo
Full-time|On-site|Seoul

About the Team You Will Join
The Product Owner for Toss Securities' AI Data Platform focuses on generating diverse investment information content through AI, enabling customers to make informed investment decisions. Your team is dedicated to gathering all types of data and creating a pipeline that allows AI to generate content tailored to customer needs. We collaborate with various internal silos to build the infrastructure needed for seamless data and machine learning service delivery, fundamentally transforming the investment experience through enhanced search and recommendation capabilities. Additionally, you will belong to a PO/PM chapter where you can share and solve product management challenges with other Product Owners and Managers.

Your Responsibilities
- Define challenges in the customer investment journey and hypothesize AI-driven solutions.
- Plan AI-based content and experiences that help customers become more comfortable with investing.
- Refine and structure diverse data sources, including investment data, to make them usable.
- Design products that assist customer investment decision-making using the latest AI technologies, including RAG, LLMs, and ML modeling.
- Collaborate with ML Engineers and Data Engineers to develop prototypes, enhance model performance, and bring products to market.

What We Are Looking For
- No specific years of experience required; we value depth of experience over years worked.
- Experience developing or planning data and AI-driven products (search/recommendation/ML/unstructured data/personalization/ETL) is essential.
- Experience simplifying and standardizing complex data pipelines to quickly supply necessary data is highly preferred.
- Ability to clearly define customer problems and connect them with technical solutions.
- Strong communication skills to collaborate effectively with MLEs and DEs.

Resume Tips
- Clearly outline the flow of problem definition, solution derivation, collaboration with stakeholders, and the resulting outcomes for each product, service, or project.
- Include insights and achievements from the process rather than just listing tasks.

The Journey to Joining Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding

Please Note
- If any false information is found in your resume, or if disciplinary actions are confirmed in your employment history, hiring may be canceled.
- Hiring may be canceled for candidates who fall under Toss Securities' hiring restrictions or disqualification criteria.

Mar 10, 2026
Apply
Tosscareers logo
Full-time|On-site|Seoul

Role Overview
Tosscareers is hiring an AIOps Platform Engineer in Seoul. This role focuses on building and improving AIOps platforms that support IT operations. The work centers on integrating AI with operations to improve performance and reliability.

What You Will Do
- Work with teams from different disciplines to design and develop AIOps solutions
- Implement platforms that help automate and optimize IT operations
- Contribute technical expertise to projects that connect AI tools with operational workflows
- Support the stability and efficiency of IT systems through AI-driven approaches

Location
This position is based in Seoul.

Apr 16, 2026
Apply
Toss Careers logo
Full-time|On-site|Seoul

# Join Our Team
- The AI Platform team is on a mission to create a platform that enables everyone to use AI technology quickly and reliably. We technically support the use of AI across Toss.
- We are developing the tools and platforms needed for new AI systems, such as Retrieval-Augmented Generation, Agents, and Assistants, to be rapidly experimented with and reliably operated.
- The platform we create is not a simple toolkit; it is designed with scalability in mind, allowing more teams to adopt AI technology effectively.
- Because many of the problems we tackle are unresolved, collaboratively defining and structuring technical directions is crucial.
- **Want to learn more about Toss's data organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

# Responsibilities
- Integrate LLM-based components such as Retrieval, Generation, and Vector Search into a platform that various teams can reuse.
- Provide features that integrate both SaaS and self-hosted LLMs, and ensure stable operations.
- Design the foundation for creating and experimenting with Agent systems more easily, including prompts, tools, and context configurations.
- After experiments, ensure stable operation in production by systematizing and tooling the serving and operational flow of RAG and Agents.
- Develop a foundation to quantitatively evaluate the performance and quality of Agents, and provide it as a platform.
- Design a common environment and experience that enables rapid experimentation and application of AI systems, not only within the team but across other teams.
- Structure unformatted technical elements and set directions that can expand into broader problems.

# We Are Looking For
- Individuals with experience applying technologies such as LLMs, RAG, and Agents to real-world problems.
- Those who can technically define unstructured problems and solve them systematically.
- Candidates who have collaborated with multiple teams to develop and operate technology as if it were a product.
- Those who can quickly adapt to new technological trends and integrate them naturally within the team.
- Individuals interested in simplifying complex AI systems into a consistent and straightforward user experience.

# Preferred Qualifications
- Experience designing RAG components such as Retrieval, Generation, and Vector Search independently and integrating them at the system level.
- Experience selecting and operating various LLM serving structures (OpenAI API, HuggingFace, vLLM) tailored to service situations.
- Experience structuring various purpose-driven Agents and applying them in service operations is highly welcome.
- Experience proactively designing experimental environments or tools based on the requirements of platform users (internal developers, model engineers, etc.).
- Experience creating and operating common platforms or RAG-based systems that can be extended across multiple projects or domains is especially desirable.

# Resume Tips
- If your past projects had significant impacts on the organization, please detail them.
- Rather than just listing languages, platforms, frameworks, or technologies, provide context about the project's objectives, the methods you employed, and how you solved the problems.
- Include experiences where you resolved critical issues during platform operations or optimized performance and resources.
- If you have contributed to open source by fixing bugs or enhancing functionality, please share those experiences.

# The Journey to Join Toss
- Application submission > Job interview > Cultural fit interview > Reference check > Compensation negotiation > Final acceptance
- The job interview will feature in-depth technical interviews focused on ML system design.

# A Message for Future Colleagues
"We don't just serve rapidly evolving AI models; we build systems that ensure these models operate stably and are continuously improved."
- The AI Platform team is responsible for the serving, experimentation, and operating infrastructure of various AI technologies, such as LLM-based services, RAG systems, and search infrastructure, ensuring smooth operation in production environments.
- We efficiently manage GPU resources and clusters, using vLLM, Triton, Model Registry, etc., to automate experiments and deployments.
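The Vector Search component mentioned in this posting reduces, at its core, to ranking documents by embedding similarity. The sketch below is a minimal, library-free illustration of that idea; the document ids and three-dimensional embeddings are made up for the example, and real systems use high-dimensional embeddings with an index such as Elasticsearch or a vector database:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, docs, k=2):
    """Return the ids of the k documents most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["id"] for d in ranked[:k]]

# Toy corpus with illustrative 3-d embeddings.
docs = [
    {"id": "faq",    "vec": [0.9, 0.1, 0.0]},
    {"id": "policy", "vec": [0.0, 1.0, 0.0]},
    {"id": "guide",  "vec": [0.7, 0.3, 0.1]},
]
```

In a RAG pipeline, the retrieved documents are then passed to the Generation step as context for the LLM.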

Mar 9, 2026
Apply
Toss Careers logo
Full-time|On-site|Seoul

Join our innovative team as a Frontend Platform Engineer at Toss, where we revolutionize the way people manage their finances through cutting-edge technology and seamless user experiences.

Mar 9, 2026
Apply
Toss Care logo
Full-time|On-site|Seoul

Join our innovative team as a Frontend Platform Engineer, where you will play a pivotal role in developing and enhancing our cutting-edge frontend technologies. You'll collaborate closely with cross-functional teams to design, implement, and maintain high-quality user interfaces that elevate our platform's performance and user experience. If you're passionate about creating seamless web applications and have a knack for problem-solving, we want to hear from you!

Mar 9, 2026
Apply
Toss logo
Full-time|On-site|Seoul

# About the Team
- The Data Mart Platform Team is dedicated to building a standardized Data Warehouse for various Toss products, aiming to prevent data silos and raise overall data maturity across the organization.
- Responsibilities include enhancing centralized DW quality management processes, standard monitoring, integrating product data with the enterprise data mart, designing efficient pipelines, and creating standardized marts.
- **Interested in learning more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

# Responsibilities
- After completing an onboarding process to familiarize yourself with Toss's DW standards, you will work as part of the Data Mart Platform Team.
- Maintain and manage an agile, manageable enterprise DW standard, taking responsibility for DW quality management from an enterprise perspective in collaboration with DAEs (Domain Analytics Engineers) from various product domains (development and execution of standard management monitoring).
- Plan and execute systems and processes to enhance data reliability: improving table consistency, advancing DQ rules, and establishing health check metrics.
- Develop enterprise-level marts, managing the integration of standard marts from different domains and driving efficient data pipeline improvements.
- Identify and execute tasks that enhance data discoverability across the organization.
- Develop a platform to measure data maturity across various Toss domains and initiate projects to enhance the productivity of DAEs.
- The data development environment is based on Hadoop, Airflow, Python, and SQL (Impala).

# Desired Qualifications
- Understanding of database normalization and the fundamental characteristics of Data Warehouses (Subject-Oriented, Integrated, Non-Volatile, Time-Variant).
- Ability to clearly define key concepts as a DW data modeler and propose efficient data structures based on diverse data perspectives.
- High-level understanding of DW standard management and the ability to propose and lead improvement initiatives at the enterprise level.
- Strong comprehension of data governance, including data quality and compliance, with the ability to suggest actionable plans.
- Proficiency in SQL, writing efficient and readable queries.
- Basic Python skills (enough to work with Airflow) are acceptable, but the ability to understand modules and PySpark code written by others is preferred.
- Experience with large-scale data processing and designing metrics from an AARRR perspective is a plus.

# Application Tips
- Please specify any relevant experience with DW construction projects and mart design, detailing your contributions.
- Mention specific challenges you have addressed regarding data maturity.
- Outline your contributions and lessons learned while solving data-related issues.

# Joining Toss
- Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Discussion > Final Acceptance and Onboarding

# A Note to Future Colleagues
> "Our team strives for better service every day."
- I was drawn to the thrilling risks of working with financial data and saw that my growth could contribute to the company's success, which is why I joined Toss.
- The most stressful aspect of my previous company was being led by predetermined objectives; Toss offers more autonomy than I expected, along with a dedicated and ambitious team focused on "better service every day."
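The DQ rules and health check metrics this posting refers to are, in the simplest case, threshold checks over per-column statistics such as null rates. The sketch below illustrates the idea in plain Python; the column names, sample rows, and thresholds are all hypothetical:

```python
def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def health_check(rows, rules):
    """Return the columns whose null rate exceeds its limit.
    `rules` maps column name -> maximum allowed null rate."""
    return [col for col, limit in rules.items() if null_rate(rows, col) > limit]

# Hypothetical mart sample: one null user_id and one null amount.
rows = [
    {"user_id": 1,    "amount": 10},
    {"user_id": 2,    "amount": None},
    {"user_id": None, "amount": 30},
    {"user_id": 4,    "amount": 40},
]
```

At warehouse scale the same checks would run as SQL (e.g. Impala) aggregations scheduled by Airflow, with failing rules feeding alerting and monitoring.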

Mar 9, 2026
Apply
Airwallex logo
Full-time|On-site|KR - Seoul

Role Overview
Airwallex is seeking an Engineering Manager for the GTPN Platform in Seoul. This position leads a group of engineers dedicated to improving the platform's features and reliability. The role involves close collaboration with teams throughout the company to achieve technical goals and support ongoing enhancements.

What You Will Do
- Guide and mentor engineers working on the GTPN Platform
- Oversee projects that expand and strengthen the platform's capabilities
- Collaborate with cross-functional teams to meet project objectives
- Promote a culture that prioritizes technical quality and creative problem-solving

Location
This role is based in Seoul, South Korea.

Apr 24, 2026
Apply
Airwallex logo
Full-time|On-site|KR - Seoul

Join Our Team at Airwallex
Airwallex is pioneering a unified payments and financial platform designed for global businesses. Our infrastructure and software empower over 200,000 businesses, including industry leaders like Brex, Rippling, Navan, Qantas, and SHEIN, to seamlessly manage everything from business accounts and payments to spend management and treasury services. Founded in Melbourne, our team of more than 2,000 bright minds operates across 26 global offices. With a valuation of US$8 billion and support from top-tier investors such as T. Rowe Price, Visa, and Mastercard, Airwallex is at the forefront of revolutionizing the global payments and financial landscape. If you're ready to undertake the most ambitious project of your career, we invite you to join us.

What We Value
We seek builders with an entrepreneurial spirit who are driven to make a genuine impact. You will bring your expertise and sharp insights, motivated by our mission and operating principles. You are quick to act with sound judgment, curious, and capable of making informed decisions that balance speed with thoroughness. Collaboration and humility are key traits; you will transform innovative concepts into tangible products and see projects through from start to finish. By leveraging AI, you will optimize processes and enhance problem-solving efficiency. Here, you will tackle complex challenges alongside exceptional colleagues and advance your career as we redefine the future of global banking.

About the GTPN Team
The Global Treasury and Payment Network (GTPN) team is integral to Airwallex's payment infrastructure. As our first product and primary revenue driver, GTPN powers many critical customer interactions. We operate at the intersection of traditional and modern finance, expanding our global treasury network while strengthening its capabilities and reliability through innovative payment products. Our team is a truly global, cross-site engineering organization, with talented engineers based in established and emerging hubs including the US, Singapore, Hong Kong, and Shanghai. As we continue to grow globally, we are looking for passionate Senior Software Engineers to join our ranks. You'll work collaboratively in a dynamic environment focused on innovation.

Apr 30, 2026
Apply
Toss Securities logo
Full-time|On-site|Seoul

# About the Team You'll Join
The ML Engineer (Platform) at Toss Securities is part of the ML Platform Team within the Product Division. The mission of the ML Platform Team is to create an optimal machine learning platform that enables efficient and stable development and operation of various AI/ML services at Toss Securities.
# Your Responsibilities Upon Joining
- Develop and enhance the Gateway system, the entry point for ML services:
 - Develop and operate a FastAPI-based Gateway system that handles enterprise-level LLM API requests.
 - Design and implement authentication, routing, traffic control, fault isolation (Circuit Breaker, Fallback), large-scale TPS processing, and load-balancing strategies from both application and infrastructure perspectives.
- Manage and serve ML services:
 - Directly operate a machine learning model serving system in a Kubernetes environment.
 - Design and improve the LLM serving architecture to operate stably under heavy traffic.
 - Monitor latency, error rates, and resource usage, and analyze and resolve operational issues for the models in service.
 - Identify root causes of failures and implement structural improvements, including operational policies and architecture.
- Develop and manage a common ML platform for the company:
 - Develop and manage a Kubeflow-based common platform for efficiently operating the training and serving of internal ML/LLM models.
 - Continuously monitor and optimize the performance and resource usage of workloads executed on the platform.
- Build infrastructure for LLM-based services:
 - Operate LLM services using serving frameworks such as vLLM, SGLang, and Triton.
 - Keep training and serving workloads running stably on high-performance GPU clusters such as H100/B300.
 - Build and operate a large-scale data training environment for finance domain-specific LLMs.
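The fault-isolation duties named above (Circuit Breaker, Fallback) follow a well-known pattern that can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of the pattern only, not Toss Securities' gateway code; the class and parameter names are invented for the example:

```python
import time


class CircuitBreaker:
    """Minimal circuit breaker: opens after `max_failures` consecutive
    errors and half-opens again once `reset_timeout` seconds pass."""

    def __init__(self, max_failures=3, reset_timeout=30.0):
        self.max_failures = max_failures
        self.reset_timeout = reset_timeout
        self.failures = 0
        self.opened_at = None  # None means the circuit is closed

    def call(self, fn, fallback):
        # While open, short-circuit straight to the fallback
        # until the reset timeout elapses.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_timeout:
                return fallback()
            self.opened_at = None  # half-open: allow one trial request
        try:
            result = fn()
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip the breaker
            return fallback()
        self.failures = 0  # success closes the circuit again
        return result
```

In a FastAPI gateway, an instance like this would typically wrap each upstream model endpoint, so that a failing model is answered from a cached or degraded fallback instead of letting timeouts pile up.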
# We Are Looking for Candidates Who:
- Are proficient in one or more programming languages such as Python, Go, Java, or Kotlin, and have experience designing and developing API servers in production environments.
- Have experience developing or operating API Gateways (Nginx, Kong, etc.) or LLM Routers (LiteLLM, Envoy AI Gateway, etc.), with a background in handling high-volume traffic and incident response.
- Have experience operating serving logs and event pipelines integrated with Kafka, Elasticsearch, and Kibana.
- Have experience defining monitoring metrics for model serving and building and operating dashboards with Prometheus and Grafana.
- Have experience operating ML/LLM model serving using KServe, BentoML, vLLM, SGLang, etc.
- Have experience directly managing MLOps components (Kubeflow, KServe, Airflow, Argo CD, MLflow, etc.) in Kubernetes environments, including debugging and resolving issues.
- Can design and apply long-term improvement plans through root-cause analysis, beyond short-term responses to issues that arise during service operation.
# Additional Preferred Experience
- Experience in related fields or technologies will be a plus.

Mar 10, 2026
Apply
daangn logo
Full-time|On-site|Seoul

# Welcome to the Journey of Joining the Daangn Team!
At Daangn, we strive to create an environment where individuals can grow alongside the company's growth. The Daangn recruitment team is here to help facilitate those moments of thoughtful collaboration with wonderful colleagues.
# Introducing the Data Value Team
The Daangn team is dedicated to uncovering valuable information within local neighborhoods and resolving inconveniences in regional living. To create user value, it's essential to provide trustworthy information that users can easily access and incorporate into their decision-making. While Daangn already uses extensive data for decision-making, maximizing the value of our data requires significant changes. The vision of the Data Value Team is to make decisions for users through data every day. To realize this vision, we proactively identify challenges in data value realization and lead the way in solving them.
# About the Data Software Engineer Role
The Data Software Engineer plays a crucial role in addressing, through software engineering, the challenges that arise in realizing data value. In line with Daangn's rapid growth, you will design data systems that will not become bottlenecks in the future, and you will ensure data reliability through automated testing and system observability. You will also solve the technical problems that arise as Daangn members seek to understand users through data, exponentially enhancing data-informed decision-making through data products (indicator platforms, experiment platforms, etc.). The mission of the Data Value Team's engineers is to keep high-quality data flowing seamlessly at Daangn, enabling value creation without bottlenecks.
- Discover the Journey of the Data Value Team Growing with Daangn (Google Data Webinar)
- Learn about Daangn's Indicator Platform, KarrotMetrics
- Seven Challenges Daangn Faced in Implementing DBT and Airflow
- Tips for Easy Modeling with DBT from Daangn's Data Engineer (2024 Data Conference)
- Creating a Data Map at Daangn: Building Column Level Lineage
- No Need to Always Fetch Everything? Daangn's MongoDB CDC Build

Mar 16, 2026
Apply
Airwallex logo
Full-time|On-site|KR - Seoul

# About Airwallex
Airwallex is a pioneering unified payments and financial platform designed for global enterprises. With our unique blend of proprietary infrastructure and software, we empower more than 200,000 businesses globally, including industry leaders such as Brex, Rippling, Navan, Qantas, SHEIN, and others, with fully integrated solutions for managing business accounts, payments, spend management, treasury, and embedded finance on a global scale. Founded in Melbourne, we take pride in our diverse team of over 2,000 innovative professionals across 26 offices worldwide. Valued at $8 billion and backed by renowned investors like T. Rowe Price, Visa, Mastercard, Robinhood Ventures, Sequoia, Salesforce Ventures, DST Global, and Lone Pine Capital, Airwallex is at the forefront of developing the future of global payments and financial services. If you're ready to embark on the most ambitious journey of your career, we invite you to join us.

Apr 30, 2026
Apply
Toss Securities logo
Realtime Data Engineer

Toss Securities

Full-time|On-site|Seoul

# Join Our Team!
The Realtime Data Engineer will be part of the Realtime Data Team within our Data Division. This team operates a distributed messaging and streaming platform, ensuring the stable transmission of large-scale financial transactions. We manage high-volume data pipelines that deliver data with minimal latency while maintaining integrity. We also integrate real-time data into OLAP environments, enabling immediate business decision-making and service enhancement.
# Your Responsibilities:
- Operate and optimize our Kafka cluster to ensure high availability of large-scale data from Toss Securities.
- Build real-time data pipelines using tools like CDC, Kafka Connect, Flink, and ksqlDB.
- Manage OLAP systems to efficiently store and query large volumes of incoming real-time data, optimizing query performance.
- Enhance the architecture for greater throughput and lower latency, proactively assessing and adopting next-generation technologies for reliable data services.
# Ideal Candidate:
- Experience managing large-scale data platforms, ensuring infrastructure stability and performance.
- Proven experience designing and operating Kafka-based architectures, or a deep understanding of distributed messaging systems.
- Intermediate-to-advanced proficiency in Java (or Kotlin), capable of implementing complex business logic in real-time streaming frameworks (Flink, ksqlDB).
- Experience building or operating real-time analytics environments using OLAP systems like ClickHouse, StarRocks, Druid, or Pinot.
- Broad experience in Data Engineering or depth in a specific area, with eagerness to expand your role.
- A strong foundation in Data Engineering skills, quick to learn new tech stacks, and adept at finding optimal solutions in diverse situations.
- Excellent communication skills to collaboratively tackle complex problems with the team.
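The windowed aggregations that streaming frameworks like Flink and ksqlDB provide can be illustrated with a small, self-contained Python sketch. This is a toy model of tumbling-window semantics only, not the team's actual pipeline; the function name and sample tickers are invented for the example:

```python
from collections import defaultdict


def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed-size windows and count
    occurrences per key -- the semantics of a tumbling-window
    aggregation in Flink or ksqlDB, computed here in one batch."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one non-overlapping window.
        window_start = (ts // window_size) * window_size
        windows[window_start][key] += 1
    return {start: dict(counts) for start, counts in sorted(windows.items())}


# Hypothetical trade events as (timestamp_seconds, ticker) pairs.
events = [(0, "AAPL"), (3, "AAPL"), (5, "TSLA"), (12, "AAPL")]
print(tumbling_window_counts(events, window_size=10))
```

In a production pipeline the same grouping runs continuously over a Kafka topic, with the framework handling event-time watermarks, late-arriving records, and state checkpointing.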
# Joining Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding
# Please Note:
- Any inaccuracies found in the resume or disciplinary issues in your employment history may lead to cancellation of the application.
- Candidates who are prohibited from hiring or have disqualifying reasons under Toss Securities' regulations may have their applications canceled.
- Individuals with disabilities and national veterans are given preferential treatment in accordance with relevant laws.
# A Note for Future Colleagues
Processing transaction data generated in the securities domain in real time is highly important from a business perspective and poses significant technical challenges. The Toss Securities Realtime Data Team is at the forefront of this effort, currently keeping our securities services stable. Toss Securities continues to grow, and we hope that maintaining the systems of a growing securities firm will be an enjoyable journey for you.

Mar 10, 2026
