
COVU

Senior Data Engineer

Reposted 17 Hours Ago
Remote
Hiring Remotely in United States
Senior level

About COVU
COVU is a venture-backed technology startup transforming the insurance industry. We empower independent agencies with AI-driven insights and digitized operations, enabling them to manage risk more effectively. Our team is building an AI-first company set to redefine the future of insurance distribution.

Location:

This role can be hybrid or remote. Candidates based in the Los Angeles area will work hybrid from our office in West Hollywood; candidates based anywhere else in the US will be fully remote.

Role Overview

 We are seeking an experienced and product-focused Senior Data Engineer to be a core member of our Platform product team. This is a high-impact role where you will play a pivotal part in evolving our core data infrastructure.

Your primary mission will be to develop key components of our "Policy Journal", the foundational data asset that will serve as the single source of truth for all policy, commission, and client accounting information. You will work closely with the Lead Data Engineer and business stakeholders to translate requirements into robust data models and scalable pipelines that drive analytics and operational efficiency for our agents, managers, and leadership.

This role requires a blend of greenfield development, strategic refactoring of existing systems, and a deep understanding of how to create trusted, high-quality data products.

What You’ll Do:

  • Develop the Policy Journal: Be a primary builder of our master data solution that unifies policy, commission, and accounting data from sources like IVANS and Applied EPIC. You will implement the data models and pipelines that create the "gold record" powering our platform.
  • Ensure Data Quality and Reliability: Implement robust data quality checks, monitoring, and alerting to ensure the accuracy and timeliness of all data pipelines. You will champion and contribute to best practices in data governance and engineering.
  • Build the Foundational Analytics Platform: Implement and enhance our new analytics framework using modern tooling (e.g., Snowflake, dbt, Airflow). You will build and optimize critical data pipelines, transforming raw data into clean, reliable, and performant dimensional models for business intelligence.
  • Modernize Core ETL Processes: Systematically refactor our existing Java & SQL (PostgreSQL) based ETL system. You will identify and resolve core issues (e.g., data duplication, performance bottlenecks), strategically rewriting critical components in Python and migrating orchestration to Airflow.
  • Implement Data Quality Frameworks: Working within our company's QA strategy, you will build and execute automated data validation frameworks. You will be responsible for writing tests that ensure the accuracy, completeness, and integrity of our data pipelines and the Policy Journal.
  • Collaborate and Contribute to Design: Partner with product managers, the Lead Data Engineer, and business stakeholders to understand complex business requirements. You will be a key technical contributor, translating business needs into well-designed and maintainable solutions.
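
To make the data-quality responsibilities above concrete, here is a minimal sketch of the kind of validation a pipeline task might run over policy records. This is purely illustrative: the record shape and field names (policy_number, carrier, premium) are assumptions, not COVU's actual Policy Journal schema.

```python
# Illustrative data-quality checks for a batch of policy records.
# Field names below are assumptions for the sake of example, not
# COVU's actual schema.
from collections import Counter

REQUIRED_FIELDS = ("policy_number", "carrier", "premium")

def validate_policies(records):
    """Return a list of human-readable issues found in a batch of records."""
    issues = []
    # Completeness: every record must carry the required fields.
    for i, rec in enumerate(records):
        missing = [f for f in REQUIRED_FIELDS if not rec.get(f)]
        if missing:
            issues.append(f"record {i}: missing {', '.join(missing)}")
    # Uniqueness: policy numbers should not repeat -- a common failure
    # mode when merging feeds from multiple sources.
    counts = Counter(r.get("policy_number") for r in records if r.get("policy_number"))
    for pol, n in counts.items():
        if n > 1:
            issues.append(f"duplicate policy_number {pol} ({n} rows)")
    return issues
```

In a production stack like the one described here, checks of this kind would typically run as dbt tests or dedicated Airflow tasks, with monitoring and alerting on failure.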

What We're Looking For:

  • 5+ years of experience in data engineering, with a proven track record of building and maintaining scalable data pipelines in production.
  • Expert-level proficiency in Python and SQL.
  • Deep Airflow & DAG expertise: proven hands-on experience designing, building, and independently managing multiple complex DAGs (Directed Acyclic Graphs) in Airflow to robustly orchestrate interdependent, enterprise-scale data pipelines.
  • Strong experience with modern data stack technologies, including a cloud data warehouse (Snowflake or Redshift), a workflow orchestrator (Airflow is highly preferred), and data transformation tools.
  • Hands-on experience with AWS data services (e.g., S3, Glue, Lambda, RDS).
  • Experience in the insurance technology (insurtech) industry and familiarity with insurance data concepts (e.g., policies, commissions, claims).
  • Demonstrated ability to contribute to the design and implementation of robust data models (e.g., dimensional modeling) for analytics and reporting.
  • A pragmatic problem-solver who can analyze and refactor complex legacy systems. While you won't be writing new Java code, the ability to read and understand existing Java/Hibernate logic is a strong plus.
  • Excellent communication skills and the ability to collaborate effectively with both technical and non-technical stakeholders.
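
The dimensional-modeling requirement above can be illustrated with a minimal star-schema build: splitting raw policy rows into a dimension table with surrogate keys and a fact table that references it. This is a hedged sketch under assumed field names, not a prescribed design.

```python
# Minimal dimensional-modeling sketch: derive a carrier dimension
# (with surrogate keys) and a policy fact table from raw rows.
# Field names here are illustrative assumptions.
def build_star(raw_rows):
    carrier_keys = {}          # carrier name -> surrogate key
    dim_rows, fact_rows = [], []
    for row in raw_rows:
        name = row["carrier"]
        if name not in carrier_keys:
            carrier_keys[name] = len(carrier_keys) + 1   # assign surrogate key
            dim_rows.append({"carrier_key": carrier_keys[name],
                             "carrier_name": name})
        fact_rows.append({
            "policy_number": row["policy_number"],
            "carrier_key": carrier_keys[name],   # foreign key into the dimension
            "premium": row["premium"],
        })
    return dim_rows, fact_rows
```

In the stack named in this posting, logic like this would typically be expressed as dbt models over Snowflake rather than hand-written Python; the sketch only shows the shape of the transformation.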

Bonus Points For:

  • Direct experience working with data from Agency Management Systems like Applied EPIC, NowCerts, EZLynx, etc.
  • Direct experience working with carrier data (ACORD XML, IVANS AL3)
  • Experience with business intelligence tools like Tableau, Looker, or Power BI.
  • Prior experience in a startup or fast-paced agile environment.

Application Process:

  1. Intro call with People team
  2. Technical interviews
  3. Final interview with leaders

Please be advised that the use of any real-time AI assistance, screen-reading software, or external aids during the application process is strictly prohibited. We employ active detection methods to ensure the integrity of our hiring process. Any violation of this policy will result in immediate termination of the interview and permanent disqualification of your candidacy.

Top Skills

Airflow
AWS
dbt
Glue
Lambda
Postgres
Python
RDS
S3
Snowflake
SQL
HQ

COVU Redwood, California, USA Office


