
Mitiga Security Inc.

Data Engineer

Reposted 6 Days Ago
In-Office or Remote
Hiring Remotely in Tel Aviv
50K-120K Annually
Senior level
Design and implement data processing architectures for cloud security analysis, optimize PySpark workflows, and develop solutions for analyzing security datasets.

We're seeking a Data Engineer to architect and develop sophisticated data solutions using advanced Spark, PySpark, Databricks, and EMR implementations in our mission to transform the cybersecurity breach readiness and response market.

Why Mitiga?

Mitiga preemptively detects and stops attacks before damage is done. Mitiga moves your security beyond configuration-focused prevention. In today’s cloud-first, AI-driven world, attackers inevitably get in. Mitiga promptly stops them.

Our platform connects Cloud, SaaS, AI, and Identity into one panoramic forensic system that gives SecOps total awareness, attack decoding, and autonomous containment. The result: attacks stop mid-flight, investigations are instant, and impact disappears. We replace the false promise of “zero breach” with a promise we can keep - Zero Impact.

When attackers get in, Mitiga ensures they get nothing.

Zero Impact Breach Mitigation. Mitiga is used by many well-known brands to reduce risk, enhance their SecOps, and improve business resilience.

What You’ll Do:  

Join us in crafting cutting-edge solutions for the cyber world using Spark/PySpark ETLs and data flow processes. Dive into the realm of multi-Cloud environments while collaborating closely with investigators to fine-tune PySpark performance. Harness the power of top-notch technologies like Databricks to elevate our technical projects, scaling them for efficiency. Embrace innovation as you research and implement new techniques. Evolve with us as a key member of the Mitiga R&D team. 
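
For a concrete flavor of this work, here is a minimal PySpark ETL sketch of the kind of security-event processing described above. It is purely illustrative: the bucket paths, schema fields, and job name are hypothetical, not a description of Mitiga's actual pipelines.

    # Minimal, illustrative PySpark ETL for cloud security event logs.
    # All paths, schema fields, and names below are hypothetical.
    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.types import StructType, StructField, StringType, TimestampType

    spark = SparkSession.builder.appName("security-event-etl").getOrCreate()

    # Hypothetical schema for normalized cloud audit events.
    event_schema = StructType([
        StructField("event_time", TimestampType()),
        StructField("event_name", StringType()),
        StructField("source_ip", StringType()),
        StructField("principal", StringType()),
    ])

    # Read raw JSON logs, normalize, and de-duplicate.
    raw = spark.read.schema(event_schema).json("s3://example-raw-logs/audit/")
    normalized = (
        raw.withColumn("event_date", F.to_date("event_time"))
           .dropDuplicates(["event_time", "event_name", "principal"])
    )

    # Write partitioned Parquet for downstream investigation queries.
    (normalized.write
        .mode("append")
        .partitionBy("event_date")
        .parquet("s3://example-curated/security-events/"))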

Technical Impact:

  • Design and implement complex data processing architectures for cloud security analysis
  • Optimize and scale critical PySpark workflows across multi-cloud environments
  • Develop innovative solutions for processing and analyzing massive security datasets
  • Drive technical excellence through sophisticated ETL implementations
  • Contribute to architectural decisions and technical direction

Core Responsibilities:

  • Build robust, scalable data pipelines for security event processing
  • Optimize performance of large-scale PySpark operations (see the tuning sketch after this list)
  • Implement advanced data solutions using Databricks and cloud-native technologies
  • Research and prototype new data processing methodologies
  • Provide technical guidance and best practices for data engineering initiatives
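
As an illustration of the PySpark optimization work above, one common tuning pattern is to broadcast a small enrichment table so a large join avoids a shuffle, then bound the number of output partitions. The table names, columns, and paths below are hypothetical.

    # Illustrative PySpark tuning pattern: broadcast a small threat-intel table
    # to avoid a shuffle-heavy join, then coalesce before writing.
    # All table names, columns, and paths are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("ioc-enrichment").getOrCreate()

    events = spark.read.parquet("s3://example-curated/security-events/")  # large
    iocs = spark.read.parquet("s3://example-intel/ioc-list/")             # small

    # Broadcasting the small indicator table turns the join into a map-side join.
    enriched = events.join(F.broadcast(iocs), events.source_ip == iocs.ip, "left")

    # Coalesce to a bounded number of partitions to avoid many tiny output files.
    enriched.coalesce(64).write.mode("overwrite").parquet(
        "s3://example-curated/enriched-events/"
    )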

Preferred Qualifications:

  • Experience with security-focused data solutions
  • Deep expertise with Splunk and AWS services (S3, SQS, SNS, Stream)
  • Advanced understanding of distributed systems
  • Strong Linux systems knowledge
  • Experience with real-time data processing architectures
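
On the real-time AWS side, a hedged sketch of one common trigger pattern: S3 event notifications delivered through SQS, which a downstream Spark job can then pick up. The queue URL and message handoff are hypothetical, not a description of Mitiga's architecture.

    # Illustrative consumer of S3 event notifications via SQS (long polling).
    # Queue URL and downstream handoff are hypothetical.
    import json
    import boto3

    sqs = boto3.client("sqs")
    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-security-events"

    def poll_once() -> None:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,
            WaitTimeSeconds=20,  # long polling
        )
        for msg in resp.get("Messages", []):
            body = json.loads(msg["Body"])
            # S3 event notifications list new objects under "Records".
            for record in body.get("Records", []):
                bucket = record["s3"]["bucket"]["name"]
                key = record["s3"]["object"]["key"]
                print(f"new object: s3://{bucket}/{key}")  # hand off to a Spark job here
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

    if __name__ == "__main__":
        poll_once()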

Who You Are: 

  • 4+ years of hands-on data engineering experience in cloud-based SaaS environments
  • Deep expertise in PySpark, Python, and SQL optimization
  • Advanced knowledge of AWS, Azure, and GCP cloud architectures
  • Proven track record implementing production-scale data systems
  • Extensive experience with distributed computing and big data processing
  • Strong collaboration skills and technical communication abilities

Top Skills

AWS
Azure
Databricks
EMR
GCP
PySpark
Spark
Splunk
SQL


