
COVU

Data Engineer

Posted 16 Days Ago
Remote or Hybrid
Hiring Remotely in California, USA
Mid level

About COVU
COVU is a venture-backed technology startup transforming the insurance industry. We empower independent agencies with AI-driven insights and digitized operations, enabling them to manage risk more effectively. Our team is building an AI-first company set to redefine the future of insurance distribution.

Location:

This role can be hybrid or remote. If the candidate is based in the Los Angeles (LA) area, it will be a hybrid role working from our office in West Hollywood. For candidates based anywhere else in the US, this will be a fully remote role.

The Role

We are looking for a crafty, execution-focused Data Engineer to join our Platform team. We’ve spent the last year building the foundation of COVU Connect - our proprietary data warehouse. Now, we are moving into a high-velocity phase: scaling the Golden Policy and Account records to power our AI-native operational platform, COVU OS.

This is not a role for someone who wants to spend months in "discovery." We need a builder who thrives in a defined architectural landscape, leverages AI tools (like Gemini-CLI) to ship code faster, and understands that speed-to-market is our most important KPI. You will work directly with our Lead Data Engineer to transform complex insurance logic into performant, automated pipelines.

What You’ll Do (The Mission)

  • Execute the "Golden Records": Be the primary builder of the Golden Policy Journal and Golden Account Cluster. You will implement the harmonization and arbitration logic that turns messy carrier data into our single source of truth.
  • AI-Augmented Development: Proactively use AI tooling (Gemini, Copilot, etc.) to accelerate ETL development, unit testing, and documentation. We value "smart speed."
  • Build & Optimize Pipelines: Develop and maintain robust DAGs in Airflow and models in dbt to ensure our data is processed with high integrity and point-in-time accuracy.
  • Operational Excellence: Implement "quarantine" logic for bad data and build reconciliation triggers to ensure our internal AMS matches our "Golden" state.
  • Modernize & Refactor: Work within our Python-based framework to systematically replace legacy processes, ensuring every line of code is modular and SOC2 compliant.
  • Collaborate via Agile: Participate in tight feedback loops with Product and Tech Leads to deliver comprehensive data integration.
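The "quarantine" responsibility above can be sketched in miniature. This is an illustrative example only, not COVU's actual implementation; the required fields (`policy_number`, `carrier`, `effective_date`) are assumed placeholders, not the real schema.

```python
REQUIRED_FIELDS = ("policy_number", "carrier", "effective_date")

def quarantine_split(records):
    """Partition raw carrier records into clean rows and quarantined rows.

    A record is quarantined when any required field is missing or empty;
    quarantined rows carry the list of failed checks for later triage,
    rather than silently dropping or loading bad data.
    """
    clean, quarantined = [], []
    for rec in records:
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            quarantined.append({"record": rec, "failed_checks": missing})
        else:
            clean.append(rec)
    return clean, quarantined
```

In a real pipeline the quarantined rows would land in a separate table or S3 prefix and feed the reconciliation triggers mentioned above.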

What We’re Looking For

  • 3–5 years of experience in data engineering: you've moved past the "learning" phase and are focused on high-quality delivery, with demonstrated ownership of production pipelines, not just contributions to them.
  • SQL & Python Fluency: You can write complex analytical SQL and clean, modular Python in your sleep.
  • Modern Data Stack Experience: Hands-on experience with Snowflake and dbt.
    • You understand how to build dimensional models that don’t just store data but solve business problems. 
    • You've optimized queries and managed costs, not just run SELECT statements. 
    • You understand project structure, testing strategies, incremental models, and materializations.
  • Orchestration Skills: You've built and debugged DAGs in production, understand task dependencies, retries, and have dealt with scheduling or executor issues firsthand.
  • AI-Native Mindset: You are comfortable using (and want to use) AI tools to handle boilerplate code, debug complex queries, and speed up your workflow.
  • Communication: You can explain why a pipeline failed and how you’re fixing it without needing jargon gymnastics.
  • Pragmatism: You know when to build a "perfect" solution and when to build a "tactical" solution that works today and scales tomorrow.
  • Ownership and Trust: You are able to power ahead on your own and seek guidance when you need it, not when you're stuck for a week.
  • Operational instincts: You monitor what you build. You've set up alerts, investigated pipeline failures, and implemented fixes that prevent recurrence.
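The "retries" the orchestration bullet refers to can be illustrated with a minimal stdlib sketch. This is not Airflow's API, just a simplified picture of the retry-with-backoff semantics an orchestrator applies to a failed task; the function name and parameters are invented for illustration.

```python
import time

def run_with_retries(task, retries=3, backoff_s=0.0):
    """Run a callable, retrying on failure up to `retries` extra attempts.

    Mirrors, in miniature, the retry/backoff behavior an orchestrator
    such as Airflow applies to a failed task instance before marking
    it failed for good.
    """
    attempt = 0
    while True:
        try:
            return task()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # exhausted retries: surface the failure to the scheduler
            time.sleep(backoff_s * attempt)  # linear backoff between attempts
```

In Airflow itself this behavior is configured declaratively (per-task retry count and delay) rather than written by hand.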

Bonus Points For

  • Insurtech Background: If you know the difference between an Endorsement and a Reinstatement, or if you’ve ever wrestled with AMS/AL3 data, we want to talk to you.
  • AWS Familiarity: Experience with S3, Lambda, ECS, RDS.
  • Legacy Parsing: Ability to read/understand Java or SQL-based ETLs to help migrate them into our new Python/Snowflake environment.

Why COVU?

We are past the "experimental" stage. We have a clear vision, a stabilized architecture, and a market that is hungry for our platform. You’ll be joining a team where your work directly impacts the operational success of insurance agencies across the country.

Application Process:

  1. Intro call with People team
  2. Technical interviews
  3. Final interview with leaders

Please be advised that the use of any real-time AI assistance, screen-reading software, or external aids during the application process is strictly prohibited. We employ active detection methods to ensure the integrity of our hiring process. Any violation of this policy will result in immediate termination of the interview and permanent disqualification of your candidacy.

HQ

Redwood, California, United States


