Senior Big Data Administrator

South Bay | Hybrid
Sorry, this job was removed at 5:43 a.m. (PST) on Thursday, June 17, 2021

For over 10 years, Zscaler has been disrupting and transforming the security industry. Our 100% purpose-built cloud platform delivers the entire gateway security stack as a service through 150 global data centers, securely connecting users to their applications regardless of device, location, or network. Operating in over 185 countries, the platform protects over 4,500 companies and detects over 100 million threats a day.

We work in a fast-paced, dynamic, make-it-happen culture. Our people are some of the brightest and most passionate in the industry and thrive on being the first to solve problems. We are always looking to hire highly passionate, collaborative, and humble people who want to make a difference.

Zscaler's data engineering team is seeking a Senior Big Data Administrator. As a member of our group, you'll have the rewarding opportunity to work on cutting-edge technologies to deliver and manage a platform foundational to next-generation security analytics.

Responsibilities/What You’ll Do:

  • Automate, deploy and operate data pipelines
  • Implement facilities to monitor all aspects of the data pipelines
  • Administer and manage data in Spark and large-scale Hadoop environments with an emphasis on automation
  • Troubleshoot and address operational issues as they come up
  • Develop tools to monitor workload on the Hadoop cluster and tune the cluster to improve data processing throughput
  • Support and improve the build, delivery, and deployment pipeline for software developed in Java, Scala, and Python

Qualifications/Your Background:

  • 5+ years of Big Data Platform Administration experience
  • Proficiency in data management and automation on Spark, Hadoop, and HDFS environments
  • Proficiency in understanding various log files emitted by Hadoop and troubleshooting performance bottlenecks in the cluster
  • Strong scripting skills for automating tasks (Python/Shell)
  • Experience developing ETL pipelines
  • Experience using Spark SQL 
  • Experience implementing and administering logging and monitoring tools such as Nagios and ELK
  • BS in Computer Science or related field

Desirable:

  • Experience developing build and deployment automation
  • Experience managing source code in Git (GitHub operations, branching, merging, etc.) a big plus
  • Experience with orchestration tools such as Ansible
  • Experience with ETL tools such as Airflow
  • Experience with SaaS operations
  • Experience with CI/build tools such as Gradle and Jenkins

Why Zscaler?

People who excel at Zscaler are smart, motivated, and share our values. Ask yourself: Do you want to team with the best talent in the industry? Do you want to work on disruptive technology? Do you thrive in a fluid work environment? Do you appreciate a company culture that enables individual and group success and celebrates achievement? If you said yes, we'd love to talk to you about joining our award-winning team.

Learn more at zscaler.com or follow us on Twitter @zscaler. Additional information about Zscaler (NASDAQ: ZS) is available at http://www.zscaler.com. All qualified applicants will receive consideration for employment without regard to race, sex, color, religion, sexual orientation, gender identity, national origin, protected veteran status, or on the basis of disability.



Location

120 Holger Way, San Jose, CA 95134
