
MOHELA

Data Engineer

Posted 9 Days Ago
Remote
Hiring Remotely in USA
Mid level
The Data Engineer will enhance the Enterprise Data Warehouse, modernizing ETL processes, optimizing data transformations, ensuring data quality, and providing production support while collaborating with cross-functional teams.
Job Summary & Responsibilities

POSITION OVERVIEW:

We are seeking a technically skilled and experienced Data Engineer to support and enhance our Enterprise Data Warehouse. The role focuses on modernizing ETL processes within an on-premises Cloudera Data Platform (CDP) environment, leveraging technologies such as Apache Spark, Apache Iceberg, and Apache Airflow for scalable, efficient, and reliable data transformation and management. The ideal candidate will have strong ETL development and troubleshooting skills, along with experience supporting production environments.
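For illustration only, the kind of Airflow orchestration this role would maintain might look like the minimal sketch below. The DAG id, schedule, Spark application path, and connection id are hypothetical placeholders, not details of MOHELA's environment; Airflow 2.x with the apache-spark provider is assumed.

```python
# Illustrative sketch only -- all names and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="edw_nightly_transform",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ style; older versions use schedule_interval
    catchup=False,
) as dag:
    # Submit a PySpark transformation job to the cluster (placeholder application path).
    transform = SparkSubmitOperator(
        task_id="run_edw_transform",
        application="/opt/jobs/transform_edw.py",
        conn_id="spark_default",
    )
```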

 


Essential job functions:

  1. Development
    • Contribute development efforts for ETL pipelines in the Enterprise Data Warehouse (EDW).
    • Support and rebuild legacy ETL jobs (which do not currently use ACID transactions) with modern solutions built on Apache Spark and Apache Iceberg that support ACID transactions (a minimal sketch follows this list).
    • Transform and integrate EBCDIC mainframe data into Hive and Impala tables using Precisely Connect for Big Data.
    • Optimize data transformation processes for performance, scalability, and reliability.
    • Ensure data consistency, accuracy, and quality across the ETL pipelines.
    • Utilize best practices for ETL code development, version control, and deployment using Azure DevOps.
  2. Production Support
    • Share weekly 24/7 production support with the managed service vendor on a 4-week rotation.
    • Monitor ETL workflows and troubleshoot issues to ensure smooth production operations.
    • Research and resolve user requests and issues.
  3. Collaboration and Stakeholder Engagement
    • Collaborate with cross-functional teams, including data engineers, business analysts, administrators, and quality analyst engineers to ensure alignment on requirements and deliverables.
    • Engage with business stakeholders to understand data requirements and translate them into scalable technical solutions.
  4. Technical Governance
    • Contribute to process documentation and follow best practices within the Enterprise Data Warehouse.
    • Follow proper SDLC protocols within the Azure DevOps code repository.
    • Stay updated on emerging technologies and trends to continuously improve data platform capabilities.
    • Perform other tasks as assigned by management.
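As a rough illustration of the Spark and Iceberg modernization described in item 1 above, an ACID upsert into an Iceberg table might look like the sketch below. The catalog name, database, table, key column, and staging path are hypothetical placeholders rather than MOHELA specifics, and the Iceberg Spark runtime jar is assumed to be on the cluster classpath.

```python
# Minimal, illustrative PySpark + Apache Iceberg upsert; all names are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("edw_iceberg_upsert_example")
    # Register an Iceberg catalog (Hive metastore assumed here; adjust for your CDP setup).
    .config("spark.sql.catalog.edw", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.edw.type", "hive")
    .config(
        "spark.sql.extensions",
        "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions",
    )
    .getOrCreate()
)

# Incoming change records staged by an upstream extract (placeholder source path).
updates = spark.read.parquet("/staging/loan_updates/")
updates.createOrReplaceTempView("loan_updates")

# MERGE INTO provides the ACID upsert semantics that the legacy jobs lacked.
spark.sql("""
    MERGE INTO edw.finance.loans AS t
    USING loan_updates AS s
    ON t.loan_id = s.loan_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```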

 

MINIMUM REQUIREMENTS:

  • Bachelor’s degree in IT or similar field. (Additional equivalent experience above the required minimum may be substituted for the degree requirement.)
  • 3+ years of experience in ETL development and data engineering roles
  • 3+ years of advanced SQL experience
  • 3+ years in Python and Linux for Spark-based development.
  • Proven experience using Apache Spark, Apache Iceberg, or Apache Airflow for ETL pipelines.
  • Strong familiarity with version control systems, especially Azure DevOps.
  • Knowledge of data governance and security best practices in a distributed data environment.
  • Familiarity with data modeling, schema design, and building data models for reporting needs.
  • In-depth understanding of ETL frameworks, ACID transactions, change data capture, and distributed computing.
  • Experience in designing and managing large-scale data pipelines and workflows.
  • Excellent problem-solving and troubleshooting skills.
  • Effective communication and collaboration skills for working with diverse teams and stakeholders.
  • Timeline-centric mindset
  • Awareness of enterprise applications and adherence to technical alignment standards
  • This position requires (6C) personnel security screening in accordance with the U.S. Department of Education’s (ED) policy regarding the personnel security screening requirements for all contractor and subcontractor employees. A qualified applicant must successfully submit for personnel security screening within 14 calendar days from employment offer. Some travel may be required for PIV support.

 

PREFERRED QUALIFICATIONS:

  • Experience with Cloudera Data Platform (CDP), including Hive and Impala
  • Knowledge of Precisely Connect for Big Data or similar tools for mainframe data transformation

Top Skills

Apache Airflow
Apache Iceberg
Apache Spark
Azure DevOps
Linux
Python
SQL
