About the job
Join Our Team
- The Data Analytics Engineer at Toss Securities is a key member of the Data Warehouse Team within the Data Division.
- Your responsibilities will focus on our Data Warehouse Platform, Business Mart, and CPC Mart.
- The CPC (Central Point of Contact) Mart provides a reliable data infrastructure for meeting regulatory demands (CPC requests, disclosures, periodic reports) while strengthening Toss Securities' external credibility and internal operational efficiency through automation and continuous improvement.
- We continuously enhance both the information shared with the Business Mart and the proprietary data required for CPC.
- Our team has around seven members with diverse backgrounds and 2 to 14 years of experience across sectors such as portals, finance, gaming, and startups.
Your Responsibilities
- Respond to external requests from regulatory authorities (CPC, disclosures, periodic reporting) and implement them in our systems.
- Design, build, and operate the data marts and dashboards needed for required reports.
- Ensure the reliability of reporting systems through data integrity and quality (DQ) management (a reconciliation sketch follows this list).
- Collaborate with various departments (domestic/international trading ledgers, accounts, compliance, PM, etc.) to provide data support for data-driven decision-making.
- Systematically manage reporting tasks based on legal frameworks.
- Establish a foundation for effectively utilizing data assets through data cataloging and standard management.
- Proactively handle the essential data processing work of our rapidly growing services in collaboration with colleagues.
- Improve system efficiency by refactoring and optimizing existing mart tables, applying data modeling that balances consistency, reusability, and scalability.
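To make the DQ responsibility above concrete, here is a minimal PySpark sketch of a source-to-mart reconciliation check. The table names (ledger.trades, mart.cpc_trade_report) and the trade_date column are illustrative assumptions, not our actual schema.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mart_integrity_check").getOrCreate()

# Hypothetical tables: a reconciled source ledger and the derived reporting mart.
source = spark.table("ledger.trades")
mart = spark.table("mart.cpc_trade_report")

# Reconcile row counts per business date between source and mart.
src_counts = source.groupBy("trade_date").count().withColumnRenamed("count", "src_cnt")
mart_counts = mart.groupBy("trade_date").count().withColumnRenamed("count", "mart_cnt")

mismatches = (
    src_counts.join(mart_counts, "trade_date", "full_outer")
              .where("src_cnt IS NULL OR mart_cnt IS NULL OR src_cnt <> mart_cnt")
)

# Fail loudly so the orchestrator marks the run unhealthy instead of
# publishing a report built on incomplete data.
if mismatches.count() > 0:
    raise ValueError("Mart row counts diverge from the source ledger")
```

In practice a check like this runs as a task inside the reporting pipeline, so any divergence blocks the downstream report rather than surfacing after submission.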
Who We Are Looking For
- Experience in CPC-related tasks is preferred.
- Strong knowledge of the securities domain or hands-on experience in stock trading is a plus.
- As a DW data modeler, you should be able to define key concepts in the securities domain precisely and lead the design of clear, understandable data structures.
- We need someone with experience simplifying complex data models or automating repetitive work.
- Through clear communication with diverse stakeholders, you should be able to propose efficient data processing methods while upholding data standards.
- Experience in defining enterprise data standards and structuring tables through data cataloging would be beneficial.
- You must possess the capability and experience to take the lead in data warehouse/mart modeling, pipeline construction, and operations.
- You should be able to set standards for clear data structures and efficient data utilization, rather than merely processing requests as they arrive.
- Strong SQL skills are required, with the ability to write efficient and readable code.
- Experience building data pipelines with Hadoop, Airflow, and dbt is a plus (a minimal sketch follows this list).
- In some cases, a basic understanding of PySpark may be required.
- Experience with BI tools (such as Tableau) is a plus.
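As a rough illustration of the Airflow-plus-dbt pattern mentioned above, here is a minimal DAG sketch. The DAG id, schedule, and dbt selectors are hypothetical, and it assumes Airflow 2.4+ (for the schedule argument) with the dbt CLI available on the worker.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="cpc_mart_daily",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",             # e.g. each morning after ledger close
    catchup=False,
) as dag:
    # Build staging models, then the mart, then test the mart.
    run_staging = BashOperator(
        task_id="dbt_run_staging",
        bash_command="dbt run --select staging",
    )
    run_mart = BashOperator(
        task_id="dbt_run_mart",
        bash_command="dbt run --select mart",
    )
    test_mart = BashOperator(
        task_id="dbt_test_mart",
        bash_command="dbt test --select mart",
    )

    run_staging >> run_mart >> test_mart
```

Placing dbt test after the mart build mirrors the DQ emphasis above: a failed test fails the run instead of publishing questionable data.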
Resume Recommendations
- Detail your experience designing and building Data Warehouses, including the requirements and data infrastructure environments you worked with.
- Describe the problems you set out to solve and how you approached and resolved them.
- Highlight the table design methodologies that mattered most in the Data Warehouses you built.
- Detail any work you have done related to data governance.
- Include specific experiences where you boosted data utilization by leveraging DW tables, such as business analysis or reporting automation.
- Be specific about your experience managing data quality, such as handling duplicates or outliers (see the sketch after this list).
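For reference, this is the kind of duplicate and outlier handling we mean, sketched in PySpark. The mart.orders table, its columns, and the three-sigma threshold are all illustrative assumptions.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("dq_dedup_outliers").getOrCreate()

# Hypothetical mart table and columns.
orders = spark.table("mart.orders")

# Deduplicate: keep only the most recent record per order_id.
latest_first = Window.partitionBy("order_id").orderBy(F.col("updated_at").desc())
deduped = (
    orders.withColumn("rn", F.row_number().over(latest_first))
          .filter(F.col("rn") == 1)
          .drop("rn")
)

# Flag outliers: amounts more than three standard deviations from the mean.
stats = deduped.agg(
    F.mean("amount").alias("mu"),
    F.stddev("amount").alias("sigma"),
).first()
flagged = deduped.withColumn(
    "is_outlier",
    F.abs(F.col("amount") - F.lit(stats["mu"])) > F.lit(3 * stats["sigma"]),
)

# Surface flagged rows for review rather than silently dropping them.
flagged.filter("is_outlier").show()
```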

