Data Analytics Engineer Product jobs in Seoul – Browse 497 openings on RoboApply Jobs

Data Analytics Engineer Product jobs in Seoul

Open roles matching “Data Analytics Engineer Product” with location signals for Seoul. 497 active listings on RoboApply Jobs.

497 jobs found

Toss
Full-time|On-site|Seoul

# Join Us in the Domain
- Commerce: Our commerce domain, initiated to enhance simple payment solutions, has grown to provide both group buying and administrative services for sellers. You will own the data for the commerce domain and play a crucial role in expanding our services, contributing significantly to Toss's growth.
- Ads: You will take ownership of the advertising domain data to deliver meaningful ads to Toss users. Toss Ads is the most effective performance advertising platform in Korea, and you will help build a solid foundation for its efficient use.
- Pay: As a core area of finance, the payment domain significantly steers Toss's growth. You will facilitate connections with various merchants and external organizations to develop Toss Pay, ensuring smooth analysis and insights from payment data.
- Growth: Focused on enhancing user engagement and economic value, you will plan and execute strategies for the growth of new services and the cross-activation of Toss users.
- Business: Placed within the business organization that drives revenue growth for Toss, you will use comprehensive data across domains to provide insights for business analytics and revenue-growth strategy.
- An interview process will determine the most synergistic domain for your placement, considering your strengths and the organization's needs.
- **Want to learn more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

Mar 9, 2026
Toss
Full-time|On-site|Seoul

Join our dynamic team at Toss as a Data Analytics Engineer, where you will play a crucial role in transforming raw data into actionable insights. You will collaborate with cross-functional teams to identify opportunities for improving our services and enhancing user experiences.

Apr 9, 2026
daangn
Full-time|On-site|Seoul

Welcome to the daangn Team!
At daangn, we strive to create an environment where individuals can grow alongside the company's success. We are here to help you make meaningful connections with fantastic colleagues.

Introducing the Data Valuation Team
The daangn team is dedicated to discovering valuable information that connects neighborhoods and to resolving inconveniences in local living. To generate this user value, we must provide easy access to reliable information for decision-making. While we already use vast amounts of data in our decisions, maximizing the value of our data requires substantial change. The vision of the Data Valuation Team is to "make user-centric decisions through daily data utilization." We take the lead in identifying and solving the problems of data valuation.

Role of the Data Analytics Engineer
The Data Analytics Engineer ensures that data is used reliably and consistently, working across data modeling, engineering, and analysis to deliver value to the business and its users. In daangn's diverse service environment, the Data Analytics Engineer will design and improve the end-to-end flow from data collection to utilization, enabling analysts, engineers, and product teams to rely on data with confidence. The role also involves designing data marts, managing quality, operating data governance frameworks, and performing basic analyses or experiments to support data-driven decision-making.

Learn more about the Data Valuation Team:
- Discover the Journey of the Data Valuation Team with daangn (Google Data Webinar)
- Learn about daangn's Metric Platform KarrotMetrics
- The 7 Challenges daangn Faced While Implementing DBT and Airflow
- How daangn's Data Engineers Simplified Modeling with DBT (2024 Data Conference)
- Mapping daangn Data: Column-Level Lineage Building
- No Need to Always Pull Everything? Building MongoDB CDC at daangn
- Why daangn Implements User Activation Engagement in 3 Key Areas

Feb 2, 2026
Toss
Full-time|On-site|Seoul

About the Role
Toss is looking for a Data Analytics Engineer focused on Financial Operations (FinOps) in Seoul. The position centers on analyzing financial data and supporting decisions that improve financial processes. The work directly shapes how the company manages and optimizes its financial performance.

What You Will Do
- Analyze financial data to support operational decisions
- Work closely with teams across the company to identify areas for improvement
- Develop insights that help optimize financial workflows
- Contribute to the development of fintech solutions by providing data-driven recommendations

Collaboration
This role involves frequent collaboration with cross-functional teams. Expect to share findings, discuss strategies, and help shape financial operations through data.

Apr 16, 2026
Toss Securities
Full-time|On-site|Seoul

Join Our Dynamic Team!
The Data Analytics Engineer (Data Engineer) at Toss Securities is an integral part of the Data Warehouse Team within the Data Division. Your focus will be on the Data Platform and Data Mart, with opportunities to collaborate cross-functionally. Mart responsibilities include structuring and managing data from the Toss Securities domain to facilitate analysis through data warehouse and aggregation table creation. Our current team of approximately 7 members brings experience ranging from 2 to 14 years, with backgrounds in sectors such as portals, banking, gaming, and startups.

Curious About Our Data Division?
The Data Division at Toss Securities strives to become a world-class securities firm by leveraging data technology, services, and data-driven decision-making. We foster close collaboration among various data roles in an enjoyable working environment. Regular Tech Weekly sessions are held to share expertise, letting you engage with and learn from other roles as your interests dictate.

Your Responsibilities Will Include:
- Designing clear, reliable table structures that are easy to understand and use, covering architecture design, compliance with standards, data processing logic management, data integrity validation, DQ monitoring, security reviews, and documentation in meta-management systems.
- Collaborating with data users to design data marts and build pipelines for key business performance analysis.
- Laying the groundwork for effective data asset utilization through data cataloging and standards management.
- Proactively handling essential data processing tasks with your colleagues in a rapidly growing service environment.
- Improving system efficiency by refactoring and optimizing existing mart tables through data modeling that considers consistency, reusability, and scalability.
- Designing data marts and constructing pipelines for external/public reporting requirements.

We Are Looking For Someone Who:
- Has a deep understanding of the securities domain or has actively engaged in stock trading.
- Can clearly define key concepts of the securities domain as a DW data modeler and take the lead in designing easy-to-understand data structures.
- Has experience simplifying complex data models or automating repetitive tasks.
- Can propose efficient data processing methods while adhering to data standards, through smooth communication with various stakeholders.
- Has experience structuring enterprise tables by defining data standards and building data catalogs.
- Can independently conduct data warehouse/mart modeling, pipeline construction, and operational tasks.
- Can set standards from the perspective of clear data structures and efficient utilization, rather than just processing simple requests.
- Is proficient in SQL and writes well-organized queries with readability and efficiency in mind.
- Has experience developing data pipelines based on Hadoop, Airflow, and DBT.
- May need intermediate to advanced PySpark skills, depending on the situation.
- Would benefit from experience with BI tools such as Tableau.

Resume Tips:
- Detail impactful projects you have worked on.
- If you have improved services, quantify the results (omit sensitive external information).
- Elaborate on your work related to data governance.
- Include business analysis or reporting experience.

Mar 10, 2026
Toss Securities
Full-time|On-site|Seoul

About the Team You'll Join
The Data Analytics Engineer at Toss Securities is part of the Data Warehouse Team within the Data Division. Your responsibilities will focus on Data Platform and Data Mart tasks. While your primary focus will vary, you will also engage in cross-functional projects. Platform tasks involve maintaining and optimizing ETL/pipeline tools to manage the DW mart tables effectively. You will explore and implement new methods to reduce DW operation time with limited resources. Our goal is to maximize data utilization across the organization using tables managed by the DW team. The current team consists of approximately 7 members with experience ranging from 2 to 14 years, from backgrounds including portals, banking, gaming, and startups.

Curious about the Data Division?
The Data Division at Toss Securities aims to become the world's leading securities firm in data handling, contributing through data technology, services, and data-driven decision-making. We foster close collaboration among data professionals and enjoy our work. Regular Tech Weekly sessions let us share expertise, and you can freely engage with different teams to learn from each other.

Your Responsibilities
- Experience and contribute to an efficient DW environment within a rapidly growing agile organization.
- Design data marts and develop and automate DW data workflows based on the Hadoop ecosystem and open-source solutions.
- Identify and implement methods for structuring and automating numerous DW/mart tables.
- Process large volumes of data swiftly and effectively to create and manage various features.
- Establish data quality checks and governance within the data marts.
- Experience in deriving and establishing system requirements for large-scale data processing and analysis is a plus.

Ideal Candidate
- At least 5 years of experience as a Data Engineer is essential.
- A fundamental understanding of RDBMS, the Hadoop ecosystem, and data warehousing.
- Proven experience leading the design, construction, and operation of data marts.
- The ability to install, operate, and troubleshoot Airflow, DBT, and Django, and to modify open-source tools to build features needed for a securities DW.
- Experience simplifying complex problems or automating repetitive tasks using data models.
- Extensive experience efficiently processing big data with Spark is highly desirable.
- Intermediate proficiency in Python and advanced SQL skills.

Resume Tips
- If you have resolved critical issues while operating platforms, or optimized performance and system resource usage, include those experiences.
- Be specific about impactful projects you have worked on.
- If you have fixed bugs or issues in open-source tools, or developed or enhanced features, detail those experiences.
- Highlight the results of improvements made in production services, quantified where possible (exclude sensitive information as needed).

Join Toss Securities
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation >...

Mar 10, 2026
Toss
Full-time|On-site|Seoul

About the Data Reliability Team
The Data Reliability Team at Toss, part of the Data Platform Tribe, monitors the company's data assets end-to-end. The team identifies critical data points, manages data quality checks, and oversees the full data lifecycle. Formed to address the lack of visibility into how backend and frontend code deployments affect data, features, models, and serving processes, the team plays a key role in maintaining trust in Toss's data infrastructure.

What You Will Do
- Design and build pipelines to generate features used across the organization, including inputs for machine learning models, Elasticsearch indices, Redis, and API responses.
- Develop and operate systems for feature quality management, such as retention policies and data quality maintenance.
- Directly develop common features that serve multiple teams within Toss.
- Review existing metadata for features and models, and fill any gaps in metadata coverage.
- Create processes to detect ad-hoc features and promote them to Verified Feature status.
- Systematically manage data quality and assess the impact of data as it flows into online serving and machine learning systems.

Who We're Looking For
- Experience in feature engineering or in designing and building a feature store.
- Proven ability to design and build large-scale data pipelines using tools such as Spark, Flink, or Kafka.
- Familiarity with data lineage or metadata management systems is a plus.
- Experience building or operating data quality monitoring systems is also valued.

Application Tips
- If you have designed or operated a feature store, describe its structure and the challenges it addressed in your application.
- Include examples of how you have systematically resolved data quality issues.

Hiring Process
Application > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Acceptance and Onboarding

Why Join Toss as a Data Analytics Engineer
This role offers the chance to set the standard for organization-wide features at Toss. Help design processes and management systems, and lead the development and operation of shared data infrastructure that supports the company's growth.

Location: Seoul
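For readers unfamiliar with the term, the "feature store with retention policies" mentioned in the posting above can be illustrated with a deliberately tiny sketch. Everything here is an assumption for illustration only, not Toss's actual design: real systems add persistence (e.g. Redis-backed storage), versioning, and serving APIs.

```python
import time


class ToyFeatureStore:
    """Minimal in-memory feature store sketch (illustration only).

    Core idea: feature values keyed by (entity, feature name), with a
    retention (TTL) policy that expires stale values on read.
    """

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._data = {}  # (entity_id, feature) -> (value, written_at)

    @staticmethod
    def _clock(now):
        # Allow tests to inject a timestamp; fall back to wall-clock time.
        return time.time() if now is None else now

    def put(self, entity_id, feature, value, now=None):
        self._data[(entity_id, feature)] = (value, self._clock(now))

    def get(self, entity_id, feature, now=None):
        item = self._data.get((entity_id, feature))
        if item is None:
            return None
        value, written_at = item
        if self._clock(now) - written_at > self.ttl:
            return None  # expired under the retention policy
        return value
```

A production store would also track metadata (owner, freshness SLA, lineage), which is exactly the "metadata coverage" work the posting describes.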

Apr 15, 2026
Toss Securities
Full-time|On-site|Seoul

Join Our Team
The Data Analytics Engineer at Toss Securities is a key member of the Data Warehouse Team within the Data Division. Your responsibilities will focus on our Data Warehouse Platform, Business Mart, and CPC Mart. The CPC (Central Point of Contact) Mart is designed to establish a reliable data infrastructure that meets various regulatory demands (CPC, disclosures, periodic reports) while enhancing Toss Securities' external credibility and internal operational efficiency through automation and advancement. We are refining both the information shared with the Business Mart and the proprietary data required for CPC. Our team consists of around 7 members with 2 to 14 years of experience across sectors such as portals, finance, gaming, and startups.

Your Responsibilities
- Implement and respond to external requests from regulatory authorities (CPC, disclosures, periodic reporting) through our systems.
- Design, build, and manage the data marts and dashboards needed for required reports.
- Ensure the reliability of reporting systems through data integrity and quality management (DQ).
- Collaborate with various departments (domestic/international trading ledgers, accounts, compliance, PM, etc.) to provide data support for data-driven decision-making.
- Systematically manage reporting tasks based on legal frameworks.
- Establish a foundation for effectively utilizing data assets through data cataloging and standards management.
- Proactively resolve essential data processing tasks in our rapidly growing services through collaboration with colleagues.
- Enhance system efficiency by refactoring and optimizing existing mart tables through data modeling that considers consistency, reusability, and scalability.

Who We Are Looking For
- Experience with CPC-related tasks is preferred.
- Strong knowledge of the securities domain, or active stock-trading experience, is a plus.
- The ability to clearly define key concepts in the securities domain as a DW data modeler and lead the design of clear, understandable data structures.
- Experience simplifying complex data models or automating repetitive problems.
- The ability to propose efficient data processing methods while maintaining data standards, based on smooth communication with various stakeholders.
- Experience defining enterprise data standards and structuring tables through data cataloging is beneficial.
- The capability and experience to take the lead in data warehouse/mart modeling, pipeline construction, and operations.
- The ability to set standards from the perspective of clear data structures and efficient data utilization, rather than merely processing simple requests.
- Strong SQL skills, with the ability to write efficient, readable code.
- Experience in data pipeline development based on Hadoop, Airflow, and DBT is a plus.
- In some cases, a basic understanding of PySpark may be required.
- Experience with BI tools (such as Tableau) is a plus.

Resume Recommendations
- Detail your experience designing and building data warehouses, considering requirements and the data infrastructure environment.
- Describe the problems you set out to solve and how you approached and resolved them.
- Highlight important table design methodologies you used while building data warehouses.
- Detail any work related to data governance.
- Include specific experiences where you boosted data utilization by leveraging DW tables, such as business analysis or reporting automation.
- Be specific about your experience managing data quality, such as handling duplicates or outliers.
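Several postings on this page ask for data quality (DQ) work such as catching duplicates and outliers. As a rough, generic illustration only (not any company's actual tooling; row and column names are invented), a minimal duplicate/outlier check might look like:

```python
from statistics import mean, stdev


def dq_check(rows, key_field, value_field, z_threshold=3.0):
    """Minimal data-quality check: report duplicate keys and z-score outliers.

    `rows` is a list of dicts; `key_field` should be unique per row and
    `value_field` numeric. Returns (duplicate_keys, outlier_keys), sorted.
    """
    seen, duplicates = set(), set()
    for row in rows:
        key = row[key_field]
        if key in seen:
            duplicates.add(key)
        seen.add(key)

    values = [row[value_field] for row in rows]
    mu, sigma = mean(values), stdev(values)  # sample stdev; needs >= 2 rows
    outliers = {
        row[key_field]
        for row in rows
        if sigma > 0 and abs(row[value_field] - mu) / sigma > z_threshold
    }
    return sorted(duplicates), sorted(outliers)
```

In a real warehouse these checks would typically run as SQL assertions or DBT tests scheduled by Airflow rather than in-process Python; the sketch only shows the logic.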

Mar 10, 2026
Toss
Full-time|On-site|Seoul

# About the Team
- The Data Mart Platform Team builds a standardized Data Warehouse for various Toss products, aiming to prevent data silos and raise overall data maturity across the organization.
- Responsibilities include enhancing centralized DW quality-management processes, standards monitoring, integrating product data with the enterprise data mart, designing efficient pipelines, and creating standardized marts.
- **Interested in learning more about Toss's Data Organization?** [→ *Toss Data Division Wiki*](https://recruit-data-division.oopy.io/)

# Responsibilities
- After an onboarding process to familiarize yourself with Toss's DW standards, you will work as part of the Data Mart Platform Team.
- Maintain and manage an agile, manageable enterprise DW standard, taking responsibility for DW quality management from an enterprise perspective in collaboration with DAEs (Domain Analytics Engineers) from various product domains (development and execution of standards-management monitoring).
- Plan and execute systems and processes that enhance data reliability: improving table consistency, advancing DQ rules, and establishing health-check metrics.
- Develop enterprise-level marts, managing the integration of standard marts from different domains and ensuring efficient data pipeline improvements.
- Identify and execute tasks that enhance data discoverability across the organization.
- Develop a platform to measure data maturity across Toss domains and initiate projects to enhance DAE productivity.
- The data development environment is based on Hadoop, Airflow, Python, and SQL (Impala).

# Desired Qualifications
- Understanding of database normalization and the fundamental characteristics of data warehouses (Subject-Oriented, Integrated, Non-Volatile, Time-Variant).
- Ability to clearly define key concepts as a DW data modeler and propose efficient data structures from diverse data perspectives.
- High-level understanding of DW standards management and the capability to propose and lead improvement initiatives at the enterprise level.
- Strong comprehension of data governance, including data quality and compliance, with the ability to suggest actionable plans.
- Proficiency in SQL, with the ability to write efficient and readable queries.
- Basic Python skills (enough to work with Airflow) are acceptable, but the ability to understand modules and PySpark code written by others is preferred.
- Experience with large-scale data processing and designing metrics from an AARRR perspective is a plus.

# Application Tips
- Specify any relevant experience with DW construction projects and mart design, detailing your contributions.
- Mention specific challenges you have addressed regarding data maturity.
- Outline your contributions and lessons learned while solving data-related issues.

# Joining Toss
- Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Discussion > Final Acceptance and Onboarding

# A Note to Future Colleagues
> "Our team strives for better service every day."
- I was drawn to the thrilling risks associated with financial data and saw that my growth could contribute to the company's success, which is why I joined Toss.
- The most stressful part of my previous company was being driven by predetermined objectives; Toss offers more autonomy than I expected, along with a dedicated, ambitious team focused on "better service every day."

Mar 9, 2026
Toss
Full-time|On-site|Seoul

Join Our Data Platform Team!
As a Data Engineer at Toss, you will be part of our Data Platform Team, which consists of Data Engineers and Data Analytics Engineers. We build the platforms and data pipelines essential for analyzing the services Toss provides.

Your Responsibilities
- Develop and operate OLAP (Online Analytical Processing) based data pipelines.
- Design and optimize systems for the reliable operation of large-scale data analysis and real-time/batch data pipelines.
- Develop and manage batch and streaming pipelines to load the various types of data generated at Toss.
- Continuously improve data models and processing logic based on service requirements.

We Are Looking For Candidates Who Have
- Experience operating services in a Kubernetes (K8s) based environment.
- Experience designing and operating data streaming pipelines using Kafka and Kafka Connect.
- Experience processing large volumes of data using Apache Spark (batch/Structured Streaming).

Additional Skills That Would Be a Plus
- Experience operating and tuning MPP/OLAP engines such as StarRocks or ClickHouse.
- Experience building data lakehouses using open table formats such as Apache Iceberg, Hudi, or Delta Lake.
- Experience with real-time data processing using streaming frameworks such as Kafka Streams or Apache Flink.
- Experience designing and operating ETL pipelines based on Airflow.

Your Journey to Join Toss
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer

Mar 9, 2026
Coupang
Full-time|On-site|Seoul, South Korea

Coupang seeks a Senior Data Analyst to join the Eats Analytics team in Seoul. The position centers on using data to inform decisions that shape the Coupang Eats customer experience.

Role overview
The Senior Data Analyst will work closely with teams developing new solutions for Coupang Eats. Analysis and insights from this role help guide improvements and respond to evolving user needs.

What you will do
- Analyze data to support decisions that enhance the Coupang Eats customer journey
- Collaborate with teams to develop and refine solutions based on user data
- Monitor and interpret trends to help teams adapt to changing customer requirements

Location
This role is based in Seoul, South Korea.

Apr 23, 2026
Toss Bank
Full-time|On-site|Seoul

# Data Product at Toss Bank
- The Data Product team treats data as something valuable, creating user-friendly, data-centric products.
- Data Products are built on data pipelines, analytics systems, and machine learning models, making them easily accessible to individuals and systems.
- Our Data Products must deliver value clearly, with attributes such as consistency, reliability, regulatory compliance, security, usability, scalability, and reusability.
- We offer complete service-oriented Data Products such as recommendation systems, fraud detection systems (FDS), optical character recognition (OCR), speech-to-text (STT), and dashboards, as well as reusable, combinable Data Mart/API products.

# Role Overview
- As a Data Product Manager (Mart, API), you will define and manage the company's data as a "product". Your responsibility is to provide data in a timely, accurate, and user-friendly manner to internal analysts, engineers, and planners, accelerating data-driven decision-making and driving business growth.
- Upon joining, you will join the Data Division, developing and executing the roadmap for data products centered on enterprise data marts and APIs.

# Responsibilities
- Define and execute the roadmap for data products based on Data Marts and APIs.
- Identify data-consumer requirements (VOC) to define data structures, schemas, and access methods (APIs), designing the best developer experience.
- Collaborate closely with data engineers, analysts, and operational departments to bridge the gap between business requirements and technical implementation.
- Lead the design of a scalable, stable data architecture, understanding complex requirements such as data governance, security, and privacy.
- Set key performance indicators (KPIs, OKRs) to measure the success of data products, and continuously monitor and improve data quality, API response times, and usage.
- Shape the organization's overall data strategy, designing and executing a long-term vision of "creating business value through data".

# Ideal Candidate
- Experience with, or a deep understanding of, planning and launching data products such as data marts, data warehouses, API platforms, and BI tools.
- Preferred candidates will have experience leading technically challenging or ambiguous projects to completion.
- Someone who can create incremental outcomes while considering both business impact and technical feasibility.
- Ideal candidates will want to go beyond project management to transform the organizational culture and processes around data utilization, collaborating with diverse stakeholders (analysts, engineers, operational departments) to build stronger teams.
- We favor those who approach problems holistically and redefine them when necessary to guide projects in better directions.
- Experience designing developer APIs or leading large-scale data pipeline projects is a plus.

# Resume Tips
- Clearly articulate your contributions to data-related projects.
- Present a clear narrative of problem definition, solution derivation, the collaboration process, and final outcomes.
- Focus on achievements (e.g., increased API usage, improved data quality, contributions to automation) rather than simply listing roles.
- Highlight experiences solving complex data/technical problems while collaborating with multiple stakeholders.

# Journey to Join Toss
- Application submission > Job interview > Cultural fit interview > Reference check > Compensation negotiation > Final acceptance and onboarding.

# Important Notes
- If any falsehoods are discovered in your resume, or disciplinary issues are found in your work history, your application may be canceled.
- Candidates who fall under disqualifications per Toss Bank's internal regulations, or under hiring disqualifications for financial institutions, may have their applications canceled.
- Individuals with disabilities and those eligible for national veteran status will receive preferential treatment in accordance with relevant laws.

# Colleague's Insight
> "You must be able to view Toss Bank's data as a product."
- You will help shape the roadmap for data products based on Data Marts and APIs, accelerating data-driven decision-making and driving business growth, which offers significant career-growth opportunities.

Apr 2, 2026
Toss Securities
Full-time|On-site|Seoul

Join Our Dynamic Team
As a Data Analyst at Toss Securities, you will be part of our data analytics team. The Data Analytics (DA) team comprises two main segments, the enterprise analytics unit and the product analytics unit; you will contribute to product analysis. Our team is diverse, with members from finance, gaming, e-commerce, O2O, and portal backgrounds bringing a wealth of experience.

Your Responsibilities
- Support data-driven decision-making for our product development teams.
- Quantify team goals and identify the key levers to achieve them, providing direction for prioritization.
- Contribute to product enhancements through A/B test design and result analysis, leveraging various analytical methodologies for insights.

Ideal Candidate Profile
- Proficient in SQL for data extraction and cleansing.
- High-level understanding of mobile-service data analysis methods (customer lifetime value, retention, cohort analysis, etc.).
- Experience setting hypotheses and running experiments to improve services.
- Demonstrated ability to design A/B tests and derive meaningful results.
- Familiarity with data visualization tools (such as Tableau) is a plus.
- No strict experience level is required; we value insights gained from diverse experiences.

Resume Recommendations
- Detail impactful projects from your experience.
- Clearly outline your problem definition, hypothesis setting, experimental design, and result validation processes.
- Share experiences where you deepened user understanding through data analysis and proposed actionable strategies.
- Demonstrate proficiency in mobile-service data analysis methods (LTV, AARRR, cohort, funnel, etc.).
- If objectives were met, quantify improvements in metrics such as churn rate, conversion rate, and revenue. (Omit sensitive information as necessary.)
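The A/B testing responsibilities that recur in these postings boil down to comparing conversion rates with statistical rigor. As a generic textbook illustration (not any company's methodology), a two-proportion z-test can be computed in plain Python:

```python
import math


def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B experiment.

    conv_*: conversion counts, n_*: sample sizes.
    Returns the z statistic; under the usual normal approximation,
    |z| > 1.96 is significant at the 5% level (two-sided).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

For example, 100/1000 conversions in control versus 150/1000 in treatment yields z ≈ 3.4, well past the 1.96 threshold; production experiment platforms layer sequential testing and multiple-comparison corrections on top of this basic statistic.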

Mar 11, 2026
Coupang, Inc.
Full-time|On-site

Coupang exists to wow its customers. We know we are fulfilling our mission when customers say, "How did I ever live without Coupang?" Driven by a singular obsession with making customers' shopping, dining, and everyday lives easier, Coupang is leading innovation across the multi-hundred-million-dollar e-commerce industry. As one of the fastest-growing e-commerce companies, Coupang has built an unrivaled position and customer trust in the Korean commerce market. We consider ourselves a large, global public company with a startup culture. This is the engine of our growth: retaining the agility of our founding days while continuously launching new services and expanding the business. Every Coupang employee is given the opportunity to act with entrepreneurial spirit and drive new innovations and initiatives. The boldness to dive into the work without hesitation and deliver results is the essence of how Coupang works. At Coupang, you will watch yourself, your colleagues, your team, and the whole company grow every day. Every Coupang employee is committed to our mission of building the future of commerce. We solve customers' problems, challenge conventional wisdom, and push the limits of what is possible. If you want a remarkable work experience built on always-on availability, cutting-edge technology, and a hyper-connected world, join Coupang now.

Role Overview
Coupang Ads is the organization responsible for Coupang's advertising platform. Distributed around the world, Coupang Ads focuses on providing advertising solutions for sellers on the Coupang platform, including display, search, performance, branding, and video advertising. The Ads Analytics team helps maximize the revenue of Coupang's ad solutions, ensures users get value from ads, and supports product strategy across the entire ads business. We integrate data to deliver actionable insights, answer complex business questions, and build a deep understanding of our customers. We focus on Coupang's advertising business in the Eats app as well as e-commerce, analyzing KPIs and measuring the success of A/B and multivariate testing initiatives.

Responsibilities
- Product and monetization strategy: Develop and test various bidding and monetization models, and simulate the revenue impact of changes.
- Metrics: Deliver the set of key performance indicators (KPIs) used to report on product adoption, usage, performance, and churn, providing visibility into whether the group's monetization efforts are working.
- Forecasting: Design, maintain, and manage inventory and revenue forecasting models as a critical component of the revenue optimization process; analyze sales patterns to identify trends and support financial forecasting.
- A/B testing: Design, run, and analyze A/B tests and surface insights to inform and prioritize product iterations; do so with statistical rigor, and run cohort studies to answer the "why" behind the drivers of test results.
- Concise storytelling: When tests complete, compile timely, compelling, fact-based summaries that powerfully communicate the business "story" behind the data and test results.

Jan 9, 2026
daangn
Full-time|On-site|Seoul

Welcome to the Journey of Joining the Daangn Team!At Daangn, we strive to create an environment where individuals can grow alongside the company's growth.The Daangn recruitment team is here to help facilitate those moments of thoughtful collaboration with wonderful colleagues. Introducing the Data Value TeamThe Daangn team is dedicated to uncovering valuable information within local neighborhoods and resolving inconveniences in regional living. To create user value, it's essential to provide trustworthy information that users can easily access and incorporate into their decision-making. While Daangn already utilizes extensive data for decision-making, maximizing the value of our data requires significant changes.The vision of the Data Value Team is to make decisions for users through data every day. To realize this vision, we proactively tackle challenges in data value realization and lead the way in solving them.About the Data Software Engineer RoleThe Data Software Engineer plays a crucial role in addressing the challenges that arise during the process of data value realization through software engineering.In alignment with Daangn's rapid growth, you will design data systems that will not become bottlenecks in the future. You will ensure data reliability through automated testing and system observability. Additionally, you will solve technical problems that arise as Daangn members seek to understand users through data, thereby exponentially enhancing data-informed decision-making through data products (indicator platforms, experiment platforms, etc.).The mission of the Data Value Team's engineers is to facilitate a seamless flow of high-quality data at Daangn, enabling the creation of value without bottlenecks. 
Learn more about the Data Value Team:
- Discover the Journey of the Data Value Team Growing with Daangn (Google Data Webinar)
- Learn about Daangn's Indicator Platform, KarrotMetrics
- Seven Challenges Daangn Faced in Implementing DBT and Airflow
- Tips for Easy Modeling with DBT from Daangn's Data Engineer (2024 Data Conference)
- Creating a Data Map at Daangn: Building Column-Level Lineage
- No Need to Always Fetch Everything? Daangn's MongoDB CDC Build

Mar 16, 2026
Apply
companyJellyfish logo
Full-time|On-site|Seoul

At Jellyfish, we harness the power of diverse perspectives and foster inclusive collaboration. We invite individuals who thrive in collaborative, multidisciplinary teams and appreciate the unique contributions every team member offers.

Jellyfish is a global digital marketing agency that seamlessly blends technology enthusiasts, creative thinkers, and data and media specialists. Our mission is to empower clients on their digital journey through innovative strategies that rethink media activation and create compelling narratives for our global clientele and their audiences. Join us in crafting a future where business growth and personal fulfillment are intertwined.

Job Opportunity

We are searching for an experienced Analytics Manager to lead our Analytics team in Korea. This pivotal role requires a data-savvy professional who understands the local market and is eager to provide exceptional analytics services to our clients in Seoul. Reporting directly to the APAC Analytics Lead, your primary objective will be to bridge the gap between technical data initiatives and our clients' strategic business goals. You will serve as a key advisor on client projects, enhancing their use of data, while mentoring and training our team of junior analysts. You will also collaborate with the senior leadership team on sales proposals and ensure the consistent quality of our analytics deliverables.

Key Responsibilities:
- **Technical Expert**: Demonstrate expertise in the Google Marketing Platform (GA4, GTM) and visualization tools (Looker Studio) to guide the team on web and app projects.
- **Sales and Proposals Support**: Assist the senior leadership team in articulating our services to clients in an accessible manner. Participate in crafting proposals, estimating project timelines, and calculating costs.
- **Team Mentorship**: Mentor junior team members by guiding their daily tasks, managing their workload to preserve work-life balance, and training them to hone their technical skills.
- **Client Lead**: Act as a principal advisor on client projects, developing long-term measurement plans, facilitating client workshops, and helping clients launch testing programs (A/B testing).
- **Quality Control & Documentation**: Ensure the quality of deliverables produced by junior team members, including reviewing tracking plans and troubleshooting complex technical issues.
- **Process Improvement**: Proactively identify opportunities to enhance analytics processes.

Mar 24, 2026
Apply
companyToss Securities logo
Realtime Data Engineer

Toss Securities

Full-time|On-site|Seoul

Join Our Team!

The Realtime Data Engineer will be part of the Realtime Data Team within our Data Division. This team operates a distributed messaging and streaming platform, ensuring the stable transmission of large-scale financial transactions. We manage high-volume data pipelines that deliver data with minimal latency while maintaining integrity. We also integrate real-time data into OLAP environments, enabling immediate business decision-making and service enhancement.

Your Responsibilities:
- Operate and optimize our Kafka cluster to ensure high availability of large-scale data from Toss Securities.
- Use tools such as CDC, Kafka Connect, Flink, and ksqlDB to build real-time data pipelines.
- Manage OLAP systems to efficiently store and query large volumes of incoming real-time data, optimizing query performance.
- Enhance the architecture for greater throughput and lower latency, proactively assessing and adopting next-generation technologies for reliable data services.

Ideal Candidate:
- Experience managing large-scale data platforms, ensuring infrastructure stability and performance.
- Proven experience designing and operating Kafka-based architectures, or a deep understanding of distributed messaging systems.
- Intermediate to advanced proficiency in Java (or Kotlin), capable of implementing complex business logic in real-time streaming frameworks (Flink, ksqlDB).
- Experience building or operating real-time analytics environments using OLAP systems such as ClickHouse, StarRocks, Druid, or Pinot.
- Broad experience across data engineering, or depth in a specific area with eagerness to expand your role.
- A strong data engineering foundation, quick to learn new tech stacks and adept at finding optimal solutions in diverse situations.
- Excellent communication skills for collaboratively tackling complex problems with the team.

Joining Toss Securities:
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding

Please Note:
- Any inaccuracies found in the resume, or disciplinary issues confirmed in employment history, may lead to cancellation of the application.
- Candidates who are subject to hiring restrictions or disqualifying criteria under Toss Securities' regulations may have their applications canceled.
- Individuals with disabilities and national veterans receive preferential treatment in accordance with relevant laws.

A Note for Future Colleagues:
Processing transaction data generated in the securities domain in real time is both highly important from a business perspective and a significant technical challenge. The Toss Securities Realtime Data Team is at the forefront of this effort and currently keeps our securities services running stably. Toss Securities continues to grow, and we hope the entire process of maintaining the systems of a growing securities firm will be an enjoyable journey for you.
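The pipeline work this role describes centers on continuous windowed aggregation over a stream of trade events. As a minimal, self-contained sketch (plain Python standing in for what a Flink or ksqlDB job would compute continuously over a Kafka topic; the event data is hypothetical):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms=1_000):
    """Group (timestamp_ms, symbol) trade events into fixed tumbling windows,
    counting trades per symbol per window. This mirrors the kind of windowed
    aggregation a streaming framework would emit incrementally."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, symbol in events:
        window_start = ts - (ts % window_ms)   # align timestamp to window boundary
        windows[window_start][symbol] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

# Hypothetical trade events: (epoch millis, ticker)
events = [(1_000, "TSLA"), (1_250, "AAPL"), (1_900, "TSLA"), (2_100, "TSLA")]
print(tumbling_window_counts(events))
# window 1000 holds {TSLA: 2, AAPL: 1}; window 2000 holds {TSLA: 1}
```

A production job would additionally handle out-of-order events with watermarks and sink the per-window results into the OLAP store for querying.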

Mar 10, 2026
Apply
companyToss Securities logo
Full-time|On-site|Seoul

Join Our Team!

Toss Securities is growing rapidly under the mission of "innovating every investment experience for our customers", with over 7.4 million registered users and 4 million monthly active users, and currently leads in foreign stock transaction volume. Our diverse range of investment products, including stocks, bonds, and options, expands our customers' choices. At the core of this growth is a robust and trustworthy data platform. Toss Securities is at a pivotal moment: we need to design a data architecture capable of handling large-scale real-time trading data, user behavior data, and regulatory compliance data. We are seeking a Head of Data Engineering to design and lead this structural transformation. This role goes beyond operations; it is key to defining the data future of Toss Securities and enabling the organization to work data-driven.

Your Responsibilities:
- Design and build an on-premise distributed architecture aligned with our mid-to-long-term business strategy, resolving data silos through Hadoop enhancements and transitioning to a Kafka-centered, streaming-first approach.
- Build and operate large-scale batch and streaming pipelines based on Spark/Flink and Kafka, ensuring reliable data processing through high-availability ETL/ELT design and performance optimization.
- Establish and manage data standards (layering, naming, permissions), quality management (DQ rules, SLAs, lineage), and regulatory compliance frameworks based on metadata and personal information (PII).
- Coordinate data interests across services, risk, accounting, AI, and backend teams, establishing and executing a comprehensive data strategy, including integration strategies and ownership definitions.
- Design the goals, structure, and processes of the data organization, leading a growing team through coaching and decision-making while addressing technical debt and fostering a trust-based environment.
- Oversee the design and operation of ML platforms and infrastructure for LLM/recommendation services, collaborating with service teams to build model deployment, operation, and monitoring standards along with automated pipelines.

Ideal Candidate:
- 10+ years of experience in data engineering or platform architecture.
- Experience designing and operating large-scale clusters (Hadoop, Kafka).
- Proficiency in designing real-time streaming and batch data processing architectures.
- Experience building data governance, quality, and permission management systems.
- Leadership experience in engineering organizations (5-50 team members) is preferred.
- Excellent coordination and communication skills across diverse organizational stakeholders.

Additional Preferred Qualifications:
- Experience with financial data in securities, banking, or fintech.
- Experience handling data within regulatory environments (e.g., PIPA, the Financial Transaction Act).
- Experience building semantic layers or data meshes.
- Experience with real-time transaction data or advertising/shopping service data.
- Experience designing and operating large-scale model training, serving, and MLOps/LLMOps pipelines (e.g., Kubeflow, Argo, H100/H200 GPU clusters, vLLM/Triton).
- Experience with feature stores for real-time recommendations, model optimization and profiling (e.g., BentoML, ONNX, TorchServe), and LLM fine-tuning/RAG operations.
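The quality-management responsibility mentioned in this listing (DQ rules, SLAs) usually reduces to threshold checks evaluated on each batch or window of data. A minimal sketch, assuming hypothetical rule names, field names, and thresholds rather than any actual Toss Securities rules:

```python
from datetime import datetime, timedelta, timezone

def check_quality(rows, max_null_rate=0.01, freshness_sla=timedelta(hours=1), now=None):
    """Evaluate two illustrative DQ rules on a batch of rows:
    (1) the null rate of 'amount' stays below a threshold;
    (2) the newest 'event_time' falls within the freshness SLA."""
    now = now or datetime.now(timezone.utc)
    nulls = sum(1 for r in rows if r["amount"] is None)
    null_rate = nulls / len(rows)
    newest = max(r["event_time"] for r in rows)
    return {
        "null_rate": null_rate,
        "null_rate_ok": null_rate <= max_null_rate,
        "freshness_ok": (now - newest) <= freshness_sla,
    }

# Hypothetical batch: one null 'amount' out of three rows fails the null-rate rule
now = datetime(2026, 1, 1, tzinfo=timezone.utc)
rows = [
    {"amount": 100, "event_time": now - timedelta(minutes=30)},
    {"amount": None, "event_time": now - timedelta(minutes=10)},
    {"amount": 250, "event_time": now - timedelta(minutes=5)},
]
print(check_quality(rows, now=now))
```

In a real platform such rules would be generated from metadata, scheduled per dataset, and wired to alerting and lineage rather than hard-coded.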

Mar 9, 2026
Apply
companyToss Securities logo
Full-time|On-site|Seoul

About the Team You Will Join

As Product Owner for Toss Securities' AI Data Platform, you will focus on generating diverse investment-information content through AI, enabling customers to make informed investment decisions. Your team gathers all kinds of data and builds a pipeline that allows AI to generate content tailored to customer needs. We collaborate across internal silos to build the infrastructure needed for seamless data and machine learning service delivery, fundamentally transforming the investment experience through enhanced search and recommendation capabilities. You will also belong to a PO/PM chapter, where you can share and solve product management challenges with other Product Owners and Managers.

Your Responsibilities:
- Define challenges in the customer investment journey and hypothesize AI-driven solutions.
- Plan AI-based content and experiences that help customers become more comfortable with investing.
- Refine and structure diverse data sources, including investment data, to make them usable.
- Design products that assist customer investment decision-making using the latest AI technologies, including RAG, LLMs, and ML modeling.
- Collaborate with ML engineers and data engineers to develop prototypes, enhance model performance, and bring products to market.

What We Are Looking For:
- No specific years of experience required; we value depth of experience over years worked.
- Experience developing or planning data- and AI-driven products (search/recommendation/ML/unstructured data/personalization/ETL) is essential.
- Experience simplifying and standardizing complex data pipelines to quickly supply the necessary data is highly preferred.
- Ability to clearly define customer problems and connect them with technical solutions.
- Strong communication skills for collaborating effectively with MLEs and DEs.

Resume Tips:
- For each product, service, or project, clearly outline the flow from problem definition through solution derivation and stakeholder collaboration to the resulting outcomes.
- Include insights and achievements from the process rather than just listing tasks.

The Journey to Joining Toss Securities:
Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > Final Offer and Onboarding

Please Note:
- If any false information is found in your resume, or if disciplinary actions are confirmed in your employment history, hiring may be canceled.
- Hiring may be canceled for candidates who fall under Toss Securities' hiring restrictions or disqualification criteria.

Mar 10, 2026
Apply
companyToss Bank logo
Full-time|On-site|Seoul

# About the Team/Position
- The Data Product Manager (Log) is part of the Real-Time Data team.
- This team ensures that all Toss Bank products operate on reliable data by designing and managing the entire log-data process, from definition through collection and validation to utilization.
- The App Log Manager plays a crucial role in keeping log data meaningful over time, creating the standards and quality criteria that ensure data reliability.

# Responsibilities
- Define and manage log structures and standards across apps and servers (JSON schema, key/value rules).
- Build and manage the Log Specification with a traceable history of key/value creation, modification, and deletion.
- Minimize structural differences between FE, BE, and Native logs while enforcing a unified logging policy.
- Design and enhance automated validation systems to ensure log data quality, including:
  - Syntax validation: checking fields, types, and structures.
  - Semantic validation: ensuring that logs for the same actions remain consistent and meaningful.
- Balance team resources and efficiency during standardization and automation, optimizing log operations across the organization.
- Collaborate with data engineers, data analysts, and developers to institutionalize log standardization and QA processes across the organization.
- Analyze and address the impact of UI/UX changes, such as design modifications or multilingual (i18n) expansions, on log meaning.
- Design the balance between standardization, validation, and automation while taking responsibility for overall data quality.

# Ideal Candidate
- A self-starter who can structure the entire lifecycle from log definition to validation, with experience writing Log Specifications, designing schemas, and managing key/value rules.
- Strong problem awareness regarding log quality assurance and validation automation, with practical experience designing and improving validation/QA systems.
- Experience integrating logs from multiple platforms (FE, BE, Native) into a cohesive system, or the ability to approach complex environments structurally.
- Familiarity with dynamic parameters and unstructured log formats.
- Proven experience creating a governance structure centered on systems rather than individuals, while collaborating with diverse roles (DA, DE, FE, BE, QA, Design) to improve data quality.
- Understanding of how design changes and multilingual expansion affect log meaning, with the ability to design realistic workflows with designers and developers.
- Capable not only of identifying complex issues but also of designing and implementing practical processes to enhance data quality.

# Resume Recommendations
- Experience defining, designing, and managing app or service logs.
- Experience with log standardization, key/value rules, and schema management.
- Experience designing validation and QA automation (Python, SQL, Airflow, CI/CD, etc.).
- Experience integrating and managing FE/Native/BE logs for quality assurance.
- Experience addressing data quality issues such as log error detection, backfill, and reprocessing.
- Experience collaborating with various roles (FE, DA, DE, QA) to enhance data quality.

# Journey to Joining Toss
- Application submission > Job interview > Cultural fit interview > Reference check > Compensation negotiation > Final acceptance and onboarding.

# Important Notes
- Any false information found in resumes or documents may lead to cancellation of the offer.
- Applicants who fall under the disqualification criteria in Toss Bank's employment rules may also face cancellation.
- Individuals with disabilities and veterans receive preferential treatment in accordance with relevant laws.

# A Colleague's Insight
"We manage the log structure and standards across applications and servers."
- The Log DPM at Toss Bank plays a pivotal role in ensuring that logs keep their meaning over time and are trusted as reliable data.
- This role is deeply rewarding because it owns overall data quality!
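The syntax-validation step this listing describes (checking fields, types, and structures against a Log Specification) can be illustrated with a small spec check. The field names and the spec below are hypothetical, not Toss Bank's actual Log Specification:

```python
# Hypothetical spec: required keys of an app log event and their expected types.
LOG_SPEC = {
    "event_name": str,
    "screen_name": str,
    "user_id": int,
    "timestamp_ms": int,
}

def validate_log(event: dict) -> list[str]:
    """Return a list of violations; an empty list means the event passes
    the syntax checks (all required fields present with the right types)."""
    errors = []
    for key, expected_type in LOG_SPEC.items():
        if key not in event:
            errors.append(f"missing field: {key}")
        elif not isinstance(event[key], expected_type):
            errors.append(f"wrong type for {key}: got {type(event[key]).__name__}")
    return errors

good = {"event_name": "click_buy", "screen_name": "home",
        "user_id": 7, "timestamp_ms": 1_710_000_000_000}
bad = {"event_name": "click_buy", "user_id": "7",
       "timestamp_ms": 1_710_000_000_000}
print(validate_log(good))  # []
print(validate_log(bad))   # missing screen_name; wrong type for user_id
```

Semantic validation (the same action always producing consistent key/values across FE, BE, and Native) would sit on top of this, typically by comparing events against each other rather than against a static schema.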

Mar 9, 2026
