About the job
About the Team You'll Join
- The Data Analytics Engineer at Toss Securities is part of the Data Warehouse Team within the Data Division.
- Your responsibilities will be focused on Data Platform and Data Mart tasks.
- While these areas will be your primary focus, you will also take part in cross-functional projects.
- The Platform tasks involve maintaining and optimizing ETL/Pipeline Tools to effectively manage the DW Mart tables.
- You will explore and implement new methods to reduce DW operation time with limited resources.
- Our goal is to maximize data utilization across the organization using tables managed by the DW team.
- The team currently consists of about seven members with 2 to 14 years of experience, drawn from diverse backgrounds including portals, banking, gaming, and startups.
Curious about the Data Division?
- The Data Division at Toss Securities aims to become the world's leading securities firm in data handling, contributing through data technology, services, and data-driven decision-making.
- We foster close collaboration among various data professionals and enjoy our work.
- Regular Tech Weekly sessions allow us to share expertise, and you can freely engage with different teams to learn from each other.
Your Responsibilities
- Help build, and contribute to, an efficient DW environment within a rapidly growing agile organization.
- Design data marts and develop and automate DW Data Workflows based on the Hadoop Ecosystem and open-source solutions.
- Identify and implement methods for structuring and automating numerous DW/Mart tables.
- Process large volumes of data swiftly and effectively to create and manage various features.
- Establish Data Quality Checks and Governance within the data marts.
- Experience in deriving and establishing system requirements for large-scale data processing and analysis is a plus.
Ideal Candidate
- At least 5 years of experience as a Data Engineer is essential.
- You should possess a fundamental understanding of RDBMS, Hadoop Ecosystem, and Data Warehousing.
- Proven experience in leading the design, construction, and operation of data marts is required.
- You should be capable of installing, operating, and troubleshooting Airflow, DBT, and Django, with the ability to modify open-source tools to develop features needed for securities DW.
- Experience in simplifying complex problems or automating repetitive tasks using data models is critical.
- Extensive experience in efficiently processing big data using Spark is highly desirable.
- Intermediate proficiency in Python and advanced skills in SQL are required.
Resume Tips
- If you have resolved critical issues while operating platforms or optimized performance and system resource usage, please include those experiences. Be specific about impactful projects you have worked on.
- If you have addressed bugs or issues while using open-source tools, or developed or enhanced features, please detail those experiences.
- Highlight the results of any improvements made in actual services, quantified where possible (please exclude sensitive information as needed).
Join Toss Securities
- Application Submission > Job Interview > Cultural Fit Interview > Reference Check > Compensation Negotiation > ...

