About the job
TL;DR - Why This Role Matters
Join Scribe as a Senior Analytics Engineer and take on a pivotal role in shaping our data infrastructure to drive insights for over 5 million users and Fortune 500 clients. This hands-on position involves architecting scalable data solutions using Snowflake and dbt, enhancing our orchestration capabilities with Dagster or Airflow, and enabling AI-driven self-serve analytics through Sigma. Your expertise will be instrumental in scaling our data foundation from 5 million to 50 million users.
About the Role
As a Senior Analytics Engineer at Scribe, you will:
Collaborate with the Data Engineering team and report directly to Fardad Golshany, working closely with Engineering, Product, GTM, and Finance teams.
Design and construct scalable analytics data marts in Snowflake using dbt to facilitate data science and insights across various business domains.
Maintain and enhance existing data pipelines, developing new ETL and reverse ETL workflows for reliable data movement.
Implement and expand orchestration infrastructure utilizing Dagster or Airflow, along with monitoring and alerting via Metaplane.
Create dynamic executive dashboards in Sigma to accelerate insights and promote AI-driven self-service analytics.
Establish data governance standards that include automated testing, thorough documentation, and semantic definitions to reduce reliance on ad-hoc queries.
Optimize dbt pipeline performance through incremental models, improved materialization strategies, and adherence to engineering best practices.
Location
While we hire talent from around the globe, our headquarters is in the vibrant city of San Francisco, California. For this position, we anticipate that you will reside in or near San Francisco and work on-site three days a week.
What Makes You a Great Fit
A minimum of 4 years of experience in developing production data pipelines and analytics infrastructure in cloud-native environments, with strong expertise in SQL and large-scale data transformation.
Advanced production experience with dbt (2+ years), including writing macros and packages and implementing best practices, plus hands-on experience with performance tuning and cost optimization in Snowflake.

