Role: Azure Databricks / PySpark Developer with Business Analysis (Banking Domain)
Location: Mount Laurel, NJ
Mode of Work: Onsite
Role Summary
We are seeking a mid‑to‑senior‑level Data Engineer with strong hands‑on expertise in Azure Databricks, PySpark, Azure Data Factory, and Synapse Analytics, combined with solid Business Analysis skills.
This is a hybrid techno‑functional role where you will develop scalable data pipelines and also translate banking business requirements into technical designs.
You will collaborate with business users, data architects, and cross‑functional technology teams to deliver high‑quality data solutions supporting regulatory reporting, risk analytics, fraud detection, customer insights, and core banking operations.
Key Responsibilities
Technical Responsibilities: Data Engineering & Development
- Design, build, and optimize ETL/ELT data pipelines using Azure Databricks (PySpark), ADF, and Synapse.
- Develop scalable data ingestion frameworks for batch and near‑real‑time data.
- Implement Delta Lake for ACID‑compliant, performant data workflows.
- Build data transformation logic using PySpark for cleansing, validation, and enrichment.
- Optimize Spark jobs for performance, reliability, cost, and parallel execution.
- Integrate data across multiple banking systems (Core Banking System (CBS), Loans, Cards, Payments, AML, Risk, Regulatory platforms).
- Develop reusable modules, notebooks, parameterized pipelines, and CI/CD‑ready components.
- Ensure data quality, profiling, governance, lineage, and audit requirements are met.
- Work with Azure DevOps (ADO) and Git for version control, branching, and deployment.
Business Analysis Responsibilities
- Interact with business stakeholders: Risk, Finance, Payments, Operations, Compliance, Treasury, etc.
- Gather, analyze, and document business/stakeholder requirements.
- Translate banking use cases into technical specifications and data transformation rules.
- Perform data mapping, source‑to‑target documentation, and business glossary creation.
- Conduct gap analysis, feasibility studies, and propose data solutions.
- Support UAT, perform data validation, and work closely with end‑users for sign‑off.
- Identify data issues, troubleshoot root causes, and propose process improvements.
- Communicate insights and updates to both technical and non‑technical teams.

