
PURE Insurance

Senior Data Engineer

Reposted 23 Days Ago
Remote
Hiring Remotely in United States
120K-145K Annually
Senior level
The Senior Data Engineer will design and maintain data pipelines, ensure data quality, and collaborate closely with analytics teams using modern technologies like Databricks and AWS services.

About PURE Insurance

PURE Insurance is actively investing in our data, analytics, data science, machine learning (ML), and artificial intelligence (AI) capabilities. We are building a centralized Data & ML/AI department that brings together specialists across data architecture, engineering, analytics, governance, and advanced modeling. This unique structure provides an extraordinary opportunity for growth, collaboration, and innovation (within your craft and across related disciplines).

As part of this team, you will engage with every function across the insurance ecosystem: Claims, Underwriting, Member Experience, Actuarial/Pricing, Product, Sales & Distribution, Marketing, HR, Finance, and more. We work with a modern data technology stack that includes platforms such as Databricks, dbt, GitHub, Hex, and Arize, while also developing in-house production-grade software in Python and other languages when it provides a competitive advantage.

At PURE, we embrace curiosity, innovation, and a relentless pursuit of improvement. We prioritize doing the right thing, even when it requires taking the more challenging path. Our culture values growth, mentorship, transparency, and ownership, and we know that building a strong foundation takes time and a willingness to experiment, learn, and adapt.

The Role

We are looking for a Senior Data Engineer to help design, build, and maintain the upstream components of our data platform (everything from ingestion and pipelines to real-time streaming and data quality frameworks). This is a hands-on, highly collaborative role: you will be coding daily, mentoring peers, and working closely with analytics engineers, data scientists, and business partners to deliver reliable, high-quality data that powers our silver and gold layers.

As part of a team building something from the ground up, you will have the freedom to bring fresh ideas, use the right tools for the job, and help shape the technical foundation of our future. We welcome engineers with strong expertise in Python and Apache Spark, along with deep proficiency in Databricks and Unity Catalog.

What You’ll Do

  • Data Pipelines: Design, build, and maintain scalable data pipelines for batch and real-time processing, ensuring reliability, efficiency, and maintainability.
  • Ingestion and Integration: Implement data ingestion frameworks, including change data capture (CDC) from core systems, APIs, and third-party platforms (Salesforce, Workday, Duck Creek, etc.).
  • Spark and Databricks: Build and optimize distributed data processing jobs using Apache Spark on Databricks. Implement Delta Lake, DLT pipelines, and medallion and lakehouse architectures.
  • Quality, Governance and Security: Ensure data quality, lineage, and governance using Unity Catalog, testing frameworks, and CI/CD practices. Implement role-based access control, encryption, and auditing.
  • Collaboration and Leadership: Partner closely with Analytics Engineers (who own modeling with dbt) to ensure upstream data pipelines deliver clean, well-structured inputs. Familiarity with dbt is a plus. Mentor peers, contribute to architectural decisions, and encourage craftsmanship and pragmatic problem-solving.
  • Cloud and DevOps: Leverage AWS cloud services (such as S3, Glue, Lambda) along with GitHub and CI/CD pipelines to enable scalable, production-grade deployments. Experience with other major cloud platforms (Azure, GCP) is also valued.
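To illustrate the kind of work described above, here is a minimal plain-Python sketch of two of those responsibilities — a data-quality gate and a CDC-style upsert that keeps the latest version of each record. Everything here (field names, rule choices, the functions themselves) is hypothetical; in practice this logic would run at scale as Delta Live Tables expectations and a Delta Lake MERGE on Databricks, not as standalone Python:

```python
def validate(records, required_fields):
    """Split records into (passed, failed) based on simple expectations:
    every required field must be present and non-null."""
    passed, failed = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            passed.append(rec)
        else:
            failed.append(rec)
    return passed, failed


def apply_cdc(current, changes, key="id", ts="updated_at"):
    """Apply change-data-capture events to the current state, keeping the
    latest version of each key -- mimicking what a Delta Lake MERGE does."""
    state = {r[key]: r for r in current}
    # Apply changes in timestamp order so the newest version wins.
    for change in sorted(changes, key=lambda r: r[ts]):
        state[change[key]] = change
    return list(state.values())
```

The point of the sketch is the shape of the problem: upstream pipelines emit both clean and rejected records (the failed list typically lands in a quarantine table), and incremental changes are merged by key rather than reloaded wholesale.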

What We’re Looking For

  • Experience: 5+ years of professional experience in data engineering, ideally with exposure to P&C insurance or broader insurance or financial services.
  • Core Skills: Proficiency in Python, SQL, and Spark for building and optimizing data pipelines.
  • Databricks & Azure Expertise: Hands-on experience with Databricks (Unity Catalog, Delta Lake, DLT pipelines) and/or Azure Data Services (ADF, ADLS, Synapse) is preferred.
  • Cloud Platforms: Strong experience with AWS data services, with flexibility to apply knowledge from Azure or GCP.
  • Architecture Knowledge: Familiarity with modern data architectures (medallion, lakehouse, streaming).
  • Engineering Practices: Comfortable with GitHub, CI/CD pipelines, and testing frameworks for production-grade engineering.
  • Mindset: Strong problem-solving skills and a pragmatic approach. You know when to build for scale and when to ship quickly. You are passionate about craftsmanship, curious about new approaches, and believe that great work is best done as a team.

Why Join Us?

  • Be part of the ground floor of a major technological reinvention at a highly rated P&C insurance company.
  • Work with modern tools and architectures without being locked into a single way of doing things.
  • Collaborate with smart, passionate colleagues who value ownership, transparency, and innovation.
  • Enjoy flexibility with a remote-first setup, plus the option to connect in person at offices across major U.S. cities.
  • We pay competitive market value for top talent and provide benefits that support your growth and well-being.
The base salary for this role ranges from $120,000 to $145,000 based on a full-time work schedule. An individual's ultimate compensation will vary depending on job-related skills and experience, geographic location, alignment with market data, and equity among other team members with comparable experience.

Want to Learn More?

  • [Our Values]
  • [Our Benefits]  
  • [Our Community Impact]
  • [Our Leadership]

Top Skills

AWS
Azure
CI/CD
Databricks
Delta Lake
DLT
GCP
Git
Python
Spark
SQL
Unity Catalog

PURE Insurance San Francisco Office

1 Post Street, Suite 1025, San Francisco, CA 94104, United States


