MURAL is a digital workspace for visual collaboration that connects over 75 percent of the Fortune 100. Teams at global enterprises including IBM, USAA, E-Trade, Intuit, SAP, Atlassian, Autodesk, and GitHub embrace visual collaboration to run more productive meetings and workshops. This leads to a more creative, engaging, and fun way of working together, all in a welcoming, simple-to-use online space.
Headquartered in San Francisco, California, MURAL employs over 700 people around the world. MURAL is on a mission to level up teamwork with imagination so that working together is more fun and innovation happens faster. The MURAL® platform transforms teamwork by making meetings and workshops interactive experiences designed for problem solving, play, and imagination. MURAL has raised $200M in financing to date and is growing rapidly to fulfill its mission.
We are looking for a Data Engineer capable of helping create and maintain MURAL's data pipeline architecture, as well as writing APIs and tools that help other teams work with data. You will work on the BizOps Engineering team and collaborate closely with the Product, Analytics, and Data Science teams to help them achieve their goals.
The Data Engineer role is a software development role with knowledge of data architectures, APIs, and the delivery and transformation of data in a reliable way.
The ideal candidate is passionate about both developing software and working with data, and is capable of challenging and redesigning existing solutions. They must be a team player, always willing to collaborate with others.
As a member of the Data Platform team, you will:
- Develop private APIs and messaging endpoints that provide rich functionality and administrative control within the platform.
- Take a central role in the running and development of Apache Kafka and/or other messaging infrastructure.
- Improve the existing data platform and propose alternative solutions.
- Build data pipelines and ETL workflows using cloud platforms such as Microsoft Azure or Amazon Web Services.
- Design and maintain ETL/ELT pipelines and SQL queries to meet business needs.
- Design, build, and evolve durable, highly scalable Kafka infrastructure.
- Work with stakeholders including the Executive, Engineering, and Operations teams to assist with data-related technical issues and support their data infrastructure needs.
- Work day-to-day in SQL with a popular data warehouse such as Redshift, Vertica, BigQuery, or Snowflake.
- Own the development, implementation, assessment, and support of the Kafka streaming platform.
- Work closely with Product teams to help them explore the feasibility of experimental data-driven features, helping them narrow down preliminary or unclear requirements, and building the tools and APIs necessary to support those features. A strong analytical mindset is a must.
- Design, develop, and deploy backend services with a focus on high availability, low latency, and scalability.
- Follow modern development best practices such as code reviews, unit testing, and continuous integration.
- Work as part of a team. We value team players who share their knowledge and like collaborating with others.
- Show initiative, completing your tasks and providing timely status updates to both the rest of your team and all stakeholders.
- Take full ownership of the solutions you build. This means analyzing requirements, building the solutions, monitoring them in production, and troubleshooting them if problems arise.
We are looking for a Software Engineer with 5+ years of experience in a development role and with Data Engineering experience, who holds a degree in Computer Science, Software Engineering, or a related field, or who has equivalent relevant experience.
- An MS/BS degree in Computer Science or Software Engineering, or 5+ years of proven experience in a similar position.
- Strong understanding of relational database management systems with experience in Snowflake, Redshift, SQL Server, Oracle, or similar systems.
- 5+ years of experience in Data or BI Engineering, Data Warehousing/ETL, or Software Engineering.
- 5+ years of experience with big data solutions using one or more of the following technologies: Hadoop, Hive, HBase, MapReduce, Spark, Sqoop, Oozie, Java.
- Experience with Apache Kafka in a high-throughput production environment.
- Experience in designing and developing web services and REST APIs.
- Advanced knowledge of relational databases such as PostgreSQL, and the ability to write non-trivial SQL.
- Experience designing data models and data warehouses, and with non-relational data storage systems (NoSQL and distributed database management systems).
In addition to being part of our quest to help people empower their imagination, we offer:
- Competitive salary and benefits
- Flexible working hours
- Ability to work remotely
- Flexible time off
- Professional development opportunities
- Learning stipend
- Wellness stipend
- MURAL free forever plan
- Design Thinking + Facilitation trainings
We bring people to our team who care about our mission to inspire and connect creative people globally, and who feel aligned with our values:
- Make others successful
- Adapt to thrive
- Play to wow
- Think global
- Experiment like an owner
MURAL is committed to creating diverse and inclusive workspaces where people can make a positive impact on the world and share their vision of how they achieve it. We are dedicated to working alongside multiple communities to help build this dream and bring it to life.