We are looking for a Machine Learning Infrastructure Engineer to help build cutting-edge ML infrastructure at Moveworks. This role will be critical in building, optimizing, and scaling end-to-end machine learning systems. The ML infra team covers a variety of responsibilities, including distributed model training pipelines, model evaluation and monitoring frameworks, model inference and ML micro-service optimization, and data annotation infrastructure. These frameworks serve as a strong foundation for the hundreds of ML and NLP models we run in production, serving hundreds of millions of enterprise employees. We are solving challenges in both the scalability of our services and the optimization of our core algorithms.
In this role you will work closely with our machine learning and data infrastructure teams, as well as the other core engineering teams. Above all, your work will impact the way our customers experience AI. Put another way, this role is absolutely critical to the long-term scalability of our core AI product and, ultimately, the company. If you are looking for a high-impact, fast-moving role to take your work to the next level, we should have a conversation.
Moveworks is revolutionizing how companies support their employees with the first AI platform that makes getting help at work effortless. Using advanced conversational AI built for the enterprise, Moveworks gives employees exactly what they need, from IT support to HR help to policy information. Our platform allows customers like Snowflake, Slack, DocuSign, LinkedIn, Instacart, Illumina, Epic Games, and Hearst Media to move forward on what matters.
Founded in 2016, Moveworks has raised $315 million in funding at a valuation of $2.1 billion. We have been named to the Forbes AI 50 list for three consecutive years and earned recognition as the Best Chatbot Solution at the 2021 AI Breakthrough Awards. Above all, we have built an AI company that puts people first, which is why Built In named Moveworks the #1 Best Place to Work in the Bay Area.
What will you do?
What do you bring to the table?
Nice to have: