
MeridianLink

Principal Data Architect - AI

Posted Yesterday
Remote
Hiring Remotely in US
126K-215K Annually
Expert/Leader

Principal Data Architect

About the Role

Reporting to the Vice President of Data, the Principal Data Architect is the senior technical authority for how data is modeled, integrated, governed, and consumed across MeridianLink. You will design our enterprise data architecture end-to-end — from source-system ingestion through our Azure Databricks lakehouse and into the analytical, operational, and customer-facing data products our business depends on.

This is a hands-on role. You will not only define the meta-models, conceptual models, logical models, and physical schemas that govern our data — you will build them, prove them out in code, partner closely with data engineers as they implement them, and evolve them as the business grows. The right candidate has done this before in a FinTech SaaS environment and understands the trade-offs that come with multi-tenant data, regulated workloads, and customer-facing analytics.

What You Will Do

• Define the enterprise data architecture: Own the conceptual, logical, and physical data models for MeridianLink's analytical and operational data platform, including source-aligned, integrated, and consumption-ready layers.

• Build the meta-model: Design and maintain a meta-model that captures entities, relationships, business definitions, ownership, lineage, sensitivity classifications, and SLAs — and make sure it is wired into our tooling, not stuck in a slide deck.
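A meta-model entry of the kind described above can live in code rather than a slide deck. The sketch below is illustrative only; the field names (`owner`, `sensitivity`, `upstream`, `freshness_sla_hours`) and example values are assumptions, not MeridianLink's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class MetaModelEntry:
    """One entity in an illustrative meta-model (hypothetical fields)."""
    entity: str                    # business entity name
    owner: str                     # accountable data owner
    definition: str                # agreed business definition
    sensitivity: str               # e.g. "public", "internal", "pii"
    upstream: list[str] = field(default_factory=list)  # lineage: source entities
    freshness_sla_hours: int = 24  # delivery SLA for consumers

# Hypothetical entry for a consumer-lending domain
loan_app = MetaModelEntry(
    entity="loan_application",
    owner="consumer-lending-data",
    definition="A borrower's request for credit, from submission to decision.",
    sensitivity="pii",
    upstream=["los_raw.applications", "credit_bureau.pulls"],
)
```

Keeping entries like this in code (or in YAML that generates them) is what "wired into our tooling" implies in practice: catalog registrations, lineage graphs, and access policies can all be derived from the same source of truth.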

• Drive the lakehouse strategy: Architect our medallion (bronze / silver / gold) Delta Lake patterns on Databricks; define standards for partitioning, clustering, schema evolution, slowly changing dimensions, and historical reproducibility.
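The slowly-changing-dimension handling mentioned above is typically implemented in Delta Lake with `MERGE INTO`; the pure-Python sketch below shows only the Type 2 logic (close the current row, append a new version) so the pattern is concrete. All names are illustrative:

```python
from datetime import date

def scd2_upsert(dim_rows, key, new_row, today):
    """Type 2 upsert: expire the current row for `key`, append the new version."""
    for row in dim_rows:
        if row["key"] == key and row["end_date"] is None:
            if all(row.get(k) == v for k, v in new_row.items()):
                return dim_rows          # no attribute change: nothing to do
            row["end_date"] = today      # close out the old version
    dim_rows.append({"key": key, **new_row,
                     "start_date": today, "end_date": None})
    return dim_rows

dim = [{"key": "cust-1", "tier": "silver",
        "start_date": date(2024, 1, 1), "end_date": None}]
dim = scd2_upsert(dim, "cust-1", {"tier": "gold"}, date(2024, 6, 1))
# dim now holds two versions: silver (closed) and gold (current)
```

In Delta Lake the same effect comes from a single `MERGE` that updates the matched row's end date and inserts the new version; historical reproducibility then follows from the `start_date`/`end_date` pair or from Delta time travel.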

• Be hands-on: Write PySpark, SQL, and Delta Lake code. Build reference implementations, prototype patterns, review pull requests, and personally model critical domains rather than delegating every detail.

• Lead data integration design: Set patterns for ingestion through Informatica Data Management Cloud (IDMC) and direct Databricks pipelines, including CDC, batch, streaming, and API-based sourcing from our SaaS products and third-party systems.

• Champion data governance and lineage: Partner with data governance, security, and compliance leaders to operationalize cataloging, lineage, classification, masking, and access controls across the platform (Unity Catalog, IDMC, and adjacent tools).
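One way the classification-to-masking link described above is often operationalized is a policy function keyed on a column's sensitivity tag. This is a minimal sketch, not how Unity Catalog or IDMC express it (both enforce equivalent rules declaratively); the tag values and rule are assumptions:

```python
import hashlib

def mask(value: str, sensitivity: str, can_see_pii: bool) -> str:
    """Apply a masking policy based on a column's sensitivity classification."""
    if sensitivity != "pii" or can_see_pii:
        return value                      # no masking needed
    digest = hashlib.sha256(value.encode()).hexdigest()
    return digest[:8]                     # stable token: join-safe but unreadable

masked = mask("555-12-3456", "pii", can_see_pii=False)  # 8-char token
clear = mask("555-12-3456", "pii", can_see_pii=True)    # raw value passes through
```

Hashing rather than redacting keeps the column usable as a join key across datasets, which matters when customer-facing analytics and internal reporting must reconcile without exposing raw PII.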

• Standardize data modeling practices: Establish the standards, naming conventions, and review processes used by the Data Engineering team. Coach engineers on dimensional modeling, Data Vault, and other techniques where they best fit the use case.

• Partner across the business: Work closely with Product, Engineering, Analytics, ML, Finance, Risk, and customer-facing teams to translate business needs into durable data designs.

• Influence the roadmap: Identify gaps in tooling, capability, and skill; propose investments; and drive multi-quarter initiatives that materially improve how MeridianLink uses its data.

Required Qualifications

• 12–15+ years of progressive experience in data engineering, data warehousing, and data architecture roles, with the most recent several years at the architect level.

• Demonstrated experience as a Data Architect at a SaaS company in the FinTech or financial services software space (lending, banking, payments, capital markets, insurance, or a closely related domain).

• Deep, hands-on expertise with Databricks and PySpark on Azure, including Delta Lake, Unity Catalog, structured streaming, and performance tuning at scale.

• Production experience with Informatica Data Management Cloud (IDMC) — or comparable enterprise integration platforms — for ingestion, transformation, and metadata-driven pipelines.

• Proven track record of designing and implementing detailed meta-models and end-to-end data models (conceptual, logical, and physical) that have shipped to production and stood up over time.

• Strong command of dimensional modeling (Kimball), Data Vault 2.0, and modern lakehouse patterns, including the ability to choose the right approach for the right use case.

• Expert SQL skills and strong proficiency in Python/PySpark; comfortable writing the code, not just the diagrams.

• Demonstrated experience implementing data governance, lineage, and metadata management programs (e.g., Unity Catalog, IDMC Data Governance, Collibra, Atlan, or similar).

• Working knowledge of FinTech-relevant regulatory and compliance considerations (e.g., GLBA, SOC 2, PCI, NIST, state lending regulations) and how they shape data design.

• Excellent written and verbal communication skills; able to explain complex data concepts to engineers, executives, customers, and auditors.

Preferred Qualifications

• Prior experience designing data architectures for multi-tenant SaaS platforms with customer-facing analytics or embedded reporting.

• Experience supporting Loan Origination, deposit account opening, or other consumer lending workflows and the underlying data domains (applicants, applications, decisions, funding, servicing, credit data, fraud, KYC/AML).

• Experience building feature stores or curated data products that serve both ML/AI workloads and BI consumers.

• Familiarity with Azure data services (ADLS Gen2, Azure Data Factory, Event Hubs, Synapse, Purview) and their interplay with Databricks.

• Experience with dbt, Great Expectations, or other modern data quality and transformation tooling layered on top of Databricks.
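As a rough sketch of the kind of check that data quality tooling layered on Databricks performs (Great Expectations' real API differs; this only shows the shape of an expectation, with hypothetical names and data):

```python
def expect_values_between(rows, column, low, high):
    """Minimal expectation: every value of `column` falls in [low, high]."""
    bad = [r for r in rows if not (low <= r[column] <= high)]
    return {"success": not bad, "unexpected_count": len(bad)}

# Hypothetical lending data: flag APRs outside a plausible range
loans = [{"apr": 6.5}, {"apr": 7.2}, {"apr": 41.0}]
result = expect_values_between(loans, "apr", 0.0, 36.0)
# result -> {"success": False, "unexpected_count": 1}
```

Running checks like this at the silver layer, before data reaches gold or customer-facing products, is the usual way such tooling fits a medallion architecture.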

• Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent professional experience.

Our Data Stack

• Lakehouse: Azure Databricks, Delta Lake, Unity Catalog, PySpark, SQL

• Integration: Informatica Data Management Cloud (IDMC)

• Cloud: Microsoft Azure (ADLS Gen2, Azure Data Factory, Event Hubs, Key Vault)

• BI & Consumption: Modern BI tooling, embedded analytics, ML feature delivery

• Governance: Unity Catalog, IDMC governance, lineage, and data quality controls


