
Exactera

Lead Data Platform Engineer

Posted 6 Days Ago
In-Office or Remote
Hiring Remotely in San Diego, CA
Senior level

Exactera has offices in New York City; Tarrytown, NY; San Diego, CA; London; and Argentina.

The Role

As Lead Data Platform Engineer, you'll architect and implement our centralized data platform on Databricks. You'll establish governance patterns using Unity Catalog, optimize for cost and performance at scale, and enable our existing Data Engineers to build confidently on the platform. This is a data infrastructure role—focused on pipelines, storage, governance, and platform operations. 


The Business Challenge

We operate multiple product lines (Transfer Pricing, R&D Services, RoyaltyStat, Provisioning), each with distinct databases containing enterprise financial data—journal entries, general ledgers, and financial statements. Our immediate challenge is migrating multi-terabyte datasets from legacy systems to a unified Databricks lakehouse while establishing governance patterns that enable multi-product operations at scale.


What You'll Build
  • Data Structuring: Design data models and implement unified schemas across multiple disparate product lines.
  • Unity Catalog Architecture: Design and implement a multi-catalog governance strategy supporting data isolation, cross-product data sharing, and comprehensive lineage tracking across our product portfolio.
  • Delta Lake Optimization: Establish patterns for Z-ordering, compaction, and liquid clustering at multi-TB scale. Define table structures, partitioning strategies, and retention policies that balance query performance with storage costs.
  • ETL Pipeline Framework: Build declarative pipeline patterns using Delta Live Tables. Create orchestration workflows for ingesting data from internal sources such as SQL databases and S3.
  • Third-Party Integrations: Integrate third-party data sources such as ERP systems (e.g., NetSuite) and external data providers (e.g., S&P) with automated ingestion, robust error handling, and monitoring.
  • Platform Operations: Implement cost monitoring and optimization strategies, establish data quality frameworks, and create self-service patterns that enable Data Engineers to work independently while maintaining governance standards.
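
To give a flavor of the Delta Lake optimization work above: Z-ordering, compaction, and retention boil down to a few recurring maintenance statements. As a minimal sketch (table, column names, and the retention window are hypothetical, not Exactera's actual schema), a helper that emits them per table might look like:

```python
def maintenance_statements(table, zorder_cols):
    """Build the recurring Delta Lake maintenance statements for one table.

    OPTIMIZE ... ZORDER BY co-locates rows on the chosen columns to speed
    selective queries; VACUUM reclaims data files outside the retention
    window to keep storage costs in check.
    """
    return [
        f"OPTIMIZE {table} ZORDER BY ({', '.join(zorder_cols)})",
        f"VACUUM {table} RETAIN 168 HOURS",  # 7-day retention window
    ]

# Hypothetical journal-entry table, clustered by entity and posting date.
stmts = maintenance_statements("finance.journal_entries",
                               ["entity_id", "posting_date"])
```

In practice these statements would be scheduled per table via Databricks Workflows, with the Z-order columns chosen from each table's most selective query predicates.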

Business Problems You'll Solve
  • Key Legacy Product Migrations: Lead the architecture for migrating multi-terabyte datasets from legacy systems to Databricks, establishing patterns that will be reused across multiple product lines.
  • Multi-Product Data Architecture: Design Unity Catalog structures that enable secure data separation between product lines while allowing controlled cross-product analytics where appropriate.
  • Cost-Efficient Scale: Build infrastructure that scales efficiently through intelligent caching, query optimization, and compute management strategies that avoid linear cost growth.
  • Platform Reliability: Establish monitoring, alerting, and data quality validation to ensure the platform operates reliably as the foundation for both analytics and AI workloads.
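
Unity Catalog expresses the multi-product isolation described above as plain SQL GRANTs on catalogs and schemas. As an illustrative sketch only (the catalog, group, and schema names are hypothetical), per-product isolation with an optional shared schema for cross-product analytics could be generated like this:

```python
def isolation_grants(catalog, product_group, shared_schema=None):
    """Sketch the Unity Catalog GRANTs for per-product isolation.

    Each product line's group gets access only to its own catalog; an
    optional shared schema is exposed for controlled cross-product
    analytics.
    """
    stmts = [
        f"GRANT USE CATALOG ON CATALOG {catalog} TO `{product_group}`",
        f"GRANT SELECT ON CATALOG {catalog} TO `{product_group}`",
    ]
    if shared_schema:
        stmts.append(
            f"GRANT SELECT ON SCHEMA {shared_schema} TO `{product_group}`"
        )
    return stmts

# Hypothetical catalog and group names for one product line.
grants = isolation_grants("transfer_pricing", "tp_engineers",
                          "shared.reference_data")
```

One catalog per product line keeps the default posture "deny across products", with cross-product access granted explicitly and visible in the audit log.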

Required Experience

Databricks Expertise (Required)

  • Unity Catalog: Production experience with multi-catalog governance, metastore design, and lineage tracking.
  • Data Structuring: Experience designing and building unified schemas across multiple disparate product lines.
  • Delta Lake: Expert-level experience with Z-ordering, compaction, liquid clustering, and performance tuning at multi-TB scale.
  • Delta Live Tables: Strong hands-on experience building declarative ETL pipelines, including change data capture and expectations/constraints.
  • Databricks Workflows: Experience with job orchestration, scheduling, and operational monitoring.
  • Business Intelligence: Experience enabling company-wide analytics and reporting with modern business intelligence tools, and maintaining source-of-truth data and metrics.
  • PySpark & Databricks SQL: Strong proficiency for code review, performance tuning, and query optimization.
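
The Delta Live Tables expectations mentioned above declare row-level quality rules (keep, drop, or fail on violation). The following is a plain-Python analogue of the expect-or-drop semantics, not actual DLT code, with hypothetical field names, just to illustrate the idea:

```python
def expect_or_drop(rows, rule_name, predicate):
    """Plain-Python analogue of a DLT expect_or_drop expectation:
    keep only rows that satisfy the rule, and count violations so
    they can be surfaced in pipeline monitoring."""
    kept = [row for row in rows if predicate(row)]
    return kept, {rule_name: len(rows) - len(kept)}

# Hypothetical journal-entry rows; the second violates the quality rule.
rows = [
    {"entry_id": 1, "amount": 120.0},
    {"entry_id": 2, "amount": None},
]
clean, metrics = expect_or_drop(rows, "amount_not_null",
                                lambda r: r["amount"] is not None)
```

In real DLT the same rule is attached declaratively to a table definition, and the violation counts flow into the pipeline's event log for monitoring.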

Core Platform Engineering

  • 5-8 years in data engineering or data platform roles, with 3+ years hands-on Databricks experience
  • Track record leading at least one significant platform build or migration project
  • AWS experience (S3, IAM, VPC) with ability to collaborate on infrastructure decisions
  • Infrastructure-as-code experience (Terraform preferred)

Technical Leadership

  • Demonstrated ability to architect data platforms from first principles and defend technical decisions.
  • Strong written and verbal communication: able to document architecture decisions and present them to both technical and business stakeholders.

Preferred But Not Required

  • Experience with financial data, accounting systems (NetSuite), or enterprise ERP platforms.
  • Background building platforms that serve AI/ML workloads (e.g., preparing data for downstream ML consumption, RAG and retrieval, and LLMs).
  • Understanding of advanced intelligence concepts such as relationship surfacing with knowledge graphs.
  • Familiarity with data governance frameworks and compliance requirements for regulated industries.



What We Offer:
(The following only applies to US-based positions)
  • A collaborative team culture with opportunities for career development. 
  • Ample opportunities to be recognized, build valuable skills, and grow your career. 
  • Generous vacation policy, including paid parental leave. 
  • Comprehensive health plans with FSA and HSA options. 
  • 401(k) retirement plan. 
  • Life and disability insurance coverage. 
  • Supplemental benefits like a dependent care savings plan, pet insurance, will preparation, and an employee assistance program. 

About Us:
At Exactera, a FinTech SaaS start-up founded in 2016, we stand at the intersection of human and machine intelligence. Our corporate tax solutions are powered by AI and cloud-based technologies, serving customers worldwide. With over $100 million in funding from Savant Venture Fund and Insight Partners, we are poised for growth. We are committed to diversity, inclusion, and equal opportunities for all. 

Top Skills

AWS
Databricks
Databricks SQL
Delta Lake
PySpark
Terraform
Unity Catalog
