About the job
About Canals
Canals is a pioneering remote startup revolutionizing the industrial supply chain, a staggering $10 trillion industry, through cutting-edge AI technology. Our innovative platform integrates seamlessly with existing distributor systems, automating monotonous tasks and minimizing failure points in the global movement of physical goods.
Our team of 70 talented people, roughly 45 of them engineers, is distributed across North and South America, fostering a collaborative and dynamic work environment.
The Role
We are looking for a skilled Data Analyst to join our team. This is a platform-first role that goes beyond basic team reporting: you will turn ad-hoc business questions into scalable, production-ready data models that serve as foundational assets for the entire organization.
You will collaborate closely with business stakeholders, but your primary responsibility is the data warehouse: ensuring data is accurate, visible, and usable for all teams.
What You’ll Do
Take ownership of key business metrics throughout their lifecycle: design, implement, test, document, and maintain them using dbt and Snowflake.
Transform ad-hoc analyses into durable, reusable data warehouse artifacts instead of temporary dashboard fixes.
Set up and maintain automated data tests, monitors, and lineage; ensure integration into CI processes.
Diagnose ingestion and transformation challenges, providing upstream solutions (dbt/Snowflake) instead of implementing stopgap measures.
Work closely with other teams to ensure upstream data is stable, performant, and production-ready for modeling.
Assist product and commercial teams by delivering insightful analysis and recommendations.
Enhance self-service documentation, tools, and onboarding processes to enable others to utilize the warehouse independently.
Shape the platform roadmap: identify recurring needs that should evolve into new marts, schemas, or pipelines.
What You'll Bring
Demonstrated experience in production SQL (Snowflake, BigQuery, Redshift, or similar).
Robust experience with dbt: models, tests, macros, CI practices, and modular modeling techniques.
A proven history of converting one-off analyses into reusable data models.
Familiarity with Python for data-related tasks, Git workflows, and CI pipelines.
Strong grasp of data lineage, monitoring, and testing best practices.

