Bridgeway is seeking a Senior Data Engineer to design, develop, and maintain our data warehouse infrastructure. This role involves working closely with analysts, engineers, and other stakeholders to shape our data architecture, ensuring secure and efficient data pipelines, and enabling advanced analytics across the organization. The ideal candidate will have a strong background in data engineering, data warehousing, and ELT processes, along with a passion for optimizing data systems.
This is a remote position, with preference given to East Coast candidates.
Key Responsibilities:
- Design, develop, and maintain a scalable data warehouse/lakehouse environment.
- Design and implement ELT pipelines to ingest, transform, and deliver high-quality data for analytics and reporting, incorporating current best practices, such as “pipelines as code”.
- Ensure data security and compliance through role-based access controls, encryption, masking, and governance best practices for handling sensitive information.
- Optimize performance of data workflows and storage for cost efficiency and speed.
- Partner with engineers, analysts, and stakeholders to meet data needs; balance cost, performance, simplicity, and time-to-value while mentoring teams and documenting standards.
- Define and implement robust testing frameworks, enforce data contracts, and establish observability practices including lineage tracking, SLAs/SLOs, and incident response runbooks to maintain data integrity and trustworthiness.
- Monitor, troubleshoot, and resolve data and automation issues.
- Collaborate within an Agile-Scrum framework and develop comprehensive technical design documentation to ensure efficient and successful delivery.
- Serve as a trusted expert on organizational data domains, processes, and best practices.
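To make the "pipelines as code" and data-contract responsibilities above concrete, here is a minimal illustrative sketch in plain Python. All names and the schema are hypothetical; in practice this role would implement such checks on Databricks (e.g. with Delta Live Tables expectations) rather than hand-rolled code.

```python
# Illustrative "pipeline as code" step with a simple data contract.
# Schema and names are hypothetical examples, not Bridgeway's actual pipeline.

REQUIRED_COLUMNS = {"order_id", "customer_id", "amount"}

def validate_contract(rows):
    """Enforce a minimal data contract: required columns present, amounts non-negative."""
    for row in rows:
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            raise ValueError(f"Contract violation: missing columns {missing}")
        if row["amount"] < 0:
            raise ValueError(f"Contract violation: negative amount in order {row['order_id']}")
    return rows

def transform(rows):
    """Toy transformation step: derive an integer cents column."""
    return [{**row, "amount_cents": int(round(row["amount"] * 100))} for row in rows]

raw = [{"order_id": 1, "customer_id": "c-9", "amount": 12.50}]
clean = transform(validate_contract(raw))
```

Because the validation and transformation live in version-controlled code, they can be tested in CI/CD and reused across pipelines, which is the core idea behind the "pipelines as code" practice named above.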
Requirements:
- 5+ years of experience in data engineering and ELT with a focus on large-scale data platforms
- 3+ years of experience with Databricks
- Advanced proficiency in analytical SQL, including ANSI SQL, T-SQL, and Spark SQL
- Strong Python skills for data engineering
- Expertise in data modeling
- Hands-on experience with data quality and observability practices (tests, contracts, lineage tracking, alerts)
- Practical knowledge of orchestration tools and CI/CD concepts for data workflows
- Excellent communication and a track record of technical leadership and mentoring
- Strong understanding of integrating data solutions with AI and machine learning models
- Strong problem-solving skills and attention to detail
- Experience with version control systems like Git preferred
- Strong understanding of data governance and best practices in data management, with hands-on experience using Unity Catalog
- Hands-on experience in designing and managing data pipelines using Delta Live Tables (DLT) on Databricks
- Experience with streaming and ingestion tools such as Kafka, Kinesis, Event Hubs, Debezium, or Fivetran
- Knowledge of DAX, LookML, or dbt; orchestration and infrastructure-as-code tools (Airflow, Dagster, Prefect, Terraform); Azure DevOps; BI tools (Power BI, Looker, Tableau); and GitHub Copilot is a plus
- Bachelor’s degree in Computer Science, Information Technology, or a related field. Master’s degree preferred
Top Skills
Airflow
Azure DevOps
Dagster
Databricks
DAX
dbt
Debezium
Delta Live Tables
Event Hubs
Fivetran
Git
Kafka
Kinesis
Looker
LookML
Power BI
Prefect
Python
Spark SQL
SQL
T-SQL
Tableau
Terraform

