Craft Ventures Portfolio Job Board

Data Engineer

Route

Software Engineering, Data Science
Lehi, UT, USA
USD 138k-146k / year + Equity
Posted on Apr 10, 2026

We are Route

Buying stuff online can get messy once you hit that “order” button. Managing dozens of carrier tracking links, dealing with lost or damaged packages, and resolving issues with customer support can feel like a wild goose chase. That’s why we created Route — to make the post-purchase experience as seamless as possible for consumers like you and the brands you love.

Route is on a mission to connect the world’s commerce. Through our network of millions of Route App users and thousands of merchants, we’re making it easier than ever for consumers to track, insure, and discover their favorite products in one place — which connects the world’s best direct-to-consumer brands to happy, repeat customers.

Since Route launched in 2018, we’ve been on a journey to build innovative products that empower our customers, all while fostering a people-first, values-driven company culture. We’re looking for talented people across the ecommerce space to join us on the next steps of this adventure.

Don’t just take our word for it! Discover what life at Route has to offer.

The team

The Data Engineering team (DAENGR) is the backbone of Route's data ecosystem. We are primarily responsible for the data infrastructure, quality, standards, frameworks, and architecture that power enterprise data, reporting, and analytics across the entire organization. We solve problems of poor data quality and limited accessibility, simplify reporting for teams outside of data engineering, and monitor the uptime, security, and consistency of the majority of Route's data lifecycle. We collaborate closely with every corner of the business to make data a first-class citizen at Route.

The opportunity

This is a rare chance to shape the data destiny of a young, fast-growing company at an inflection point. As Route moves into the AI era of data infrastructure, you'll have the opportunity to leave a lasting legacy — exploring new systems and designs, and building tools that genuinely change how Route operates day to day. We are actively migrating from legacy Snowflake pipelines into a modern Databricks-first architecture, building out an Enterprise Data Warehouse (EDW) with a normalized 3NF core, and laying the groundwork for AI-ready data systems. You'll be co-authoring the next chapter of data at Route.

The ideal candidate is organized and articulate in their thinking, able to adapt and compromise to keep pace with business needs, but firm enough to push back when security and data integrity are at stake. We aim to help Route improve profitability, cash flow, and return on investment by providing accurate data to make informed decisions and process improvements.

What you’ll do

  • Build and maintain ELT pipelines that ingest data from source systems.
  • Co-own the mapping and migration of source data into the new 3NF EDW, ensuring data integrity, reducing redundancy, and maintaining automated unit and data tests.
  • Develop data observability processes and monitoring dashboards to track pipeline health, freshness, and data quality across Databricks.
  • Build new data and AI-powered tooling to improve the productivity of the data engineering team and broaden self-service data access for Route employees and external partners.
  • Help harden the Integration Pipeline by automating deployment of shared staging and production infrastructure for new pipelines and managing dependency updates for dbt and CI templates.
  • Support the full migration from Snowflake to Databricks, targeting completion by end of Q2 2027, including reporting services and ingest/egress jobs.
  • Coordinate with engineering, analytics, product, and business teams to define and prioritize data requirements and ensure end-to-end data lifecycle coverage for existing and new products.
  • Champion data democratization, help establish a company-wide data retention policy, and expand the foundation for a self-service Silver layer (EDW) that serves as a single source of truth.
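The observability work described above can be sketched as a minimal pipeline-health check. This is an illustrative example only — the table rows, column names, and thresholds are hypothetical, not Route's actual configuration:

```python
from datetime import datetime, timedelta, timezone

def check_freshness(last_loaded_at, max_age_hours=24, now=None):
    """Return True if the most recent load is within the allowed window."""
    now = now or datetime.now(timezone.utc)
    return (now - last_loaded_at) <= timedelta(hours=max_age_hours)

def check_null_rate(rows, column, max_null_rate=0.01):
    """Return True if the fraction of NULLs in `column` is acceptable."""
    if not rows:
        return False  # an empty table fails the check outright
    nulls = sum(1 for r in rows if r.get(column) is None)
    return nulls / len(rows) <= max_null_rate

# Hypothetical batch of "orders" rows with one missing merchant_id.
rows = [
    {"order_id": 1, "merchant_id": "m-1"},
    {"order_id": 2, "merchant_id": None},
    {"order_id": 3, "merchant_id": "m-2"},
]
now = datetime(2026, 4, 10, tzinfo=timezone.utc)
fresh = check_freshness(now - timedelta(hours=3), now=now)           # loaded 3h ago
clean = check_null_rate(rows, "merchant_id", max_null_rate=0.5)     # 1/3 nulls allowed
```

In practice, checks like these run as scheduled jobs against the warehouse and feed monitoring dashboards and alerting; the same assertions can also live as automated data tests in the pipeline itself.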

What we’re looking for

  • 4+ years of formal, professional data engineering experience
  • 3+ years of SQL, with fluency in complex transformations, window functions, and query optimization
  • 2+ years of Python, including data pipeline development, scripting, testing, and package management (Poetry)
  • 2+ years of experience with AWS data-related services (e.g., S3, RDS, DMS, DynamoDB)
  • 1+ years of experience using Databricks, our primary development platform for this role
  • Experience using Terraform and Go
  • Experience with PagerDuty, Grafana, or Tableau preferred
  • Understanding of third normal form (3NF) data modeling and when to apply it
  • Knowledge and application of data theory
  • Working knowledge of data security practices and least-privilege access standards
  • Experience with data access controls in cloud environments (IAM roles, catalog permissions, etc.)
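The SQL requirement above calls out window functions and complex transformations; a small example of the kind of query implied, run here against Python's built-in SQLite engine (which supports window functions), with made-up table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, merchant_id TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'm-1', 10.0), (2, 'm-1', 30.0), (3, 'm-2', 20.0), (4, 'm-2', 5.0);
""")

# Rank each merchant's orders by amount, then keep only the largest per merchant.
rows = conn.execute("""
SELECT merchant_id, order_id, amount
FROM (
  SELECT *,
         ROW_NUMBER() OVER (PARTITION BY merchant_id ORDER BY amount DESC) AS rn
  FROM orders
)
WHERE rn = 1
ORDER BY merchant_id
""").fetchall()
# rows -> [('m-1', 2, 30.0), ('m-2', 3, 20.0)]
```

The same pattern (partitioned ranking in a subquery, filtered in the outer query) carries over directly to Snowflake and Databricks SQL.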

Equal opportunity for all

Route is an Equal Opportunity Employer. We take diversity and equal opportunity seriously, and we are committed to building a team that represents a variety of backgrounds, perspectives, and skills. The more inclusive we are, the better our work will be.

Total Rewards

We know our team works best when everyone feels happy, healthy, and supported. We cover 95%-100% of health insurance premiums for you and your family and offer remote or hybrid work arrangements, unlimited PTO, 401(k) matching, formalized growth opportunities, learning & development, DEI programs & events, and so much more.

Pay Transparency

Salary for this role: $138,000 - $146,000

The cash compensation above includes base salary and does not reflect potential commission for employees in eligible roles or annual bonus targets under Route’s bonus plan for eligible roles. In addition to cash compensation, all Route employees are eligible to participate in Route’s equity incentive plan and receive stock options per the terms of the agreement. Some roles may also be eligible for overtime pay. Individual compensation packages are based on factors unique to each candidate, including career level, skills, experience, specific geographic location, qualifications, and other job-related reasons.