The AI Engineer is responsible for designing, implementing, and deploying AI systems and applications, focusing on machine learning models and large language models (LLMs).
Description
We are seeking an AI Engineer to design, build, and deploy AI-powered capabilities within our product.
This role focuses on integrating machine learning models and large language models (LLMs) into scalable software systems and delivering reliable AI-driven features to production.
The AI Engineer works at the intersection of software engineering, AI systems, and infrastructure, transforming AI technologies into practical applications.
Responsibilities:
- Build applications powered by machine learning and large language models (LLMs).
- Implement capabilities such as intelligent assistants, semantic search, automation, and recommendation systems.
- Integrate AI functionality into backend services and product workflows.
- Design and implement retrieval pipelines, embedding pipelines, and inference workflows.
- Build Retrieval-Augmented Generation (RAG) systems and AI-driven services.
- Create scalable AI architectures capable of handling production workloads.
- Package and deploy AI models as production services.
- Optimize inference performance, scalability, and latency.
- Monitor AI services to ensure reliability and performance.
- Develop backend services and APIs that expose AI capabilities.
- Integrate AI systems with databases, internal services, and external APIs.
- Contribute to system architecture and microservices design.
- Implement logging, metrics, and observability for AI systems.
- Track model performance and system reliability in production environments.
- Work closely with product managers, engineers, and data scientists.
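To make the retrieval and RAG responsibilities above concrete, here is a minimal, dependency-free sketch of the core retrieve-then-prompt loop. It is illustrative only: the toy bag-of-words "embedding" stands in for a real embedding model, and the function names (`embed`, `retrieve`, `build_prompt`) are hypothetical, not part of any framework named in this posting.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector.
    A production system would use a learned embedding model instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by similarity to the query and return the top k."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Stuff the retrieved context into the prompt sent to the LLM."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

In a real service, `retrieve` would query a vector database and the resulting prompt would be passed to an LLM inference endpoint; the control flow, however, is the same.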
Requirements:
- 5+ years of programming experience in one or more modern languages (such as Python, Java, Go, or similar).
- Experience building backend services and APIs.
- Experience integrating machine learning models or LLMs into applications.
- Understanding of microservices architecture and distributed systems.
- Experience with Docker and containerized applications.
- Familiarity with Kubernetes or cloud infrastructure.
- Experience working with databases and data processing pipelines.
Preferred Qualifications:
- Experience building LLM-based applications.
- Experience with RAG architectures and embeddings.
- Experience with vector databases or semantic search systems.
- Familiarity with model serving frameworks or inference platforms.
- Experience working in production AI environments.
Strong Advantage:
- Experience working with local or self-hosted AI models (e.g., Llama, Mistral, or similar).
- Experience deploying AI models in on-premise or private cloud environments.
- Familiarity with running LLM inference locally using frameworks such as Ollama, vLLM, or Hugging Face Transformers.
- Experience optimizing models for GPU/CPU inference and resource-constrained environments.
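One common optimization technique behind the last bullet is weight quantization, which shrinks model memory footprint for CPU or resource-constrained inference. The sketch below shows the idea with symmetric int8 quantization in plain Python; in practice this is handled by the serving frameworks mentioned above, and these helper names are hypothetical.

```python
def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    """Symmetric int8 quantization: map floats to [-127, 127] with one scale.
    Each weight is stored as a small integer plus one shared float scale."""
    scale = max(abs(w) for w in weights) / 127 or 1.0
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q: list[int], scale: float) -> list[float]:
    """Recover approximate float weights from the int8 values."""
    return [v * scale for v in q]
```

The reconstruction error per weight is bounded by half the scale, which is the basic trade-off between memory savings and accuracy that inference optimization work navigates.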
Top Skills
AI
Docker
Go
Java
Kubernetes
Large Language Models
Machine Learning
Microservices
Python
Vector Databases

