
Data Engineer (AI) at Toss Securities | Seoul

On-site | Full-time




Experience Level

Mid to Senior

Qualifications

  • Minimum of 5 years of data engineering experience.
  • Strong analytical skills to assess requirements and technical trade-offs for optimal data architecture.
  • Experience with large-scale distributed processing and data platforms.
  • Ability to mentor and support technical growth within a team.
  • Understanding of AI principles and their practical application in engineering.
  • Strong communication skills for collaboration across various functions.
  • Proactive learner with a passion for growth and new challenges.

About the job

Join Our Dynamic Team

The Data Engineer (AI) position is part of the AI Data Platform Team at Toss Securities.

  • The AI Data Platform Team comprises Data Engineers, Machine Learning Engineers, Server Engineers, and Product Operation Managers, fostering collaboration across various roles.
  • Our mission is to develop a unique data moat for Toss Securities through the integration of diverse securities domain data and AI technologies, providing essential insights for investors.
  • We utilize external LLMs and conduct training and evaluation of our internally developed models while leveraging various data platform technologies.

Your Responsibilities

  • Proactively identify and lead projects to solve business challenges at Toss Securities, overseeing the entire process from data architecture design to development and operation.
  • Build and manage a securities data platform that integrates, processes, and serves global market data.
  • Establish and maintain a knowledge graph platform for real-time domain data.
  • Create and operate data pipelines that underpin AI service products.
  • Develop and manage a real-time feature store for personalized recommendation services.
  • Ensure data integrity by designing, developing, and operating data quality verification and monitoring systems.

We Seek Candidates Who

  • Have over 5 years of experience in data engineering.
  • Can comprehend requirements and analyze technical trade-offs to determine the optimal data architecture in a given environment.
  • Possess a solid understanding and experience in large-scale distributed processing and data platforms.
  • Have experience sharing knowledge with peers and junior engineers, contributing to the technical growth of the entire team.
  • Are interested in AI beyond its use as a mere tool, understanding its principles to improve engineering productivity.
  • Can coordinate with colleagues across various functions and provide constructive feedback.
  • Are eager to take on new challenges and proactively learn and grow.

Preferred Experiences

  • Experience with Kafka-based stream processing and large-scale distributed data processing (Hadoop/ClickHouse/Elasticsearch).
  • Experience building and operating data pipelines using Airflow, Docker, and Kubernetes.
  • Experience in monitoring and managing data integrity and quality.
  • Familiarity with the latest trends in AI/data technologies and an interest in automation and productivity enhancement.

About Toss Securities

Toss Securities is a forward-thinking financial technology company operating in the securities domain, committed to leveraging data and AI to provide unparalleled services to investors.
