At Serve Robotics, we’re reimagining how things move in cities. Our personable sidewalk robot is our vision for the future. It’s designed to take deliveries away from congested streets, make deliveries available to more people, and benefit local businesses.
The Serve fleet has been making commercial deliveries in Los Angeles, delighting merchants, customers, and pedestrians along the way. We're looking for talented individuals who will grow robotic deliveries from surprising novelty to efficient ubiquity.
We are tech industry veterans in software, hardware, and design who are pooling our skills to build the future we want to live in. We are solving real-world problems leveraging robotics, machine learning and computer vision, among other disciplines, with a mindful eye towards the end-to-end user experience. Our team is agile, diverse, and driven. We believe that the best way to solve complicated dynamic problems is collaboratively and respectfully.
Serve Robotics aims to develop dependable and proficient sidewalk autonomy software. Our Autonomy team is looking for a highly skilled Sensor Fusion Engineer. In this role, you will be responsible for developing and maintaining our onboard sensor-to-obstacle pipeline, ensuring robust and reliable perception for our sidewalk delivery robots. You will work with a diverse set of sensor modalities, including mono and stereo cameras and LiDAR, and dive deep into the mechanics of sensor optics, firmware, and calibration. This role demands strong production-level C++ expertise; additional skills in CUDA, edge-device perception, machine learning, and sensor calibration are considered a plus.
Responsibilities
Lead the design, implementation, and optimization of sensor fusion algorithms to reliably detect and track obstacles.
Develop robust pipelines that integrate data from multiple sensor modalities (cameras, LiDAR, RealSense) in real time.
Develop geometric perception pipelines, such as point cloud labeling, occupancy grid map generation, and obstacle detection, tailored for resource-constrained, real-time edge devices.
Develop ML-based sensor fusion and obstacle detection pipelines; fine-tune and deploy learning-based perception models using data-centric techniques.
Validate sensor fusion outputs against ground-truth data and refine models as necessary.
Write, maintain, and optimize production-level C++ code for real-time sensor data processing while maintaining strict latency requirements.
Implement parallel computing solutions using CUDA where applicable to enhance performance on edge devices.
Analyze and understand sensor specifications, operating principles, and firmware intricacies, working closely with systems and hardware teams.
Collaborate with the systems and hardware teams, firmware developers, and ML specialists to ensure seamless integration of sensors with the overall perception system.
Work closely with sensor calibration teams to design strategies that ensure high precision and reliability.
Qualifications
3+ years of demonstrated industry experience working on sensor fusion applications for robotic or autonomous systems in a fast-paced, innovative environment.
Deep understanding of sensor technologies (cameras, LiDAR, RealSense), including optics, firmware, and data characteristics.
Experience in developing real-time applications and low-latency systems.
Production-quality programming experience in C++ and Python.
Demonstrated skill in designing large systems, gathering system requirements, and identifying key stakeholders.
Strong communication and collaboration skills.
What Makes You Stand Out
5+ years of demonstrated industry experience working on sensor fusion applications for robotic or autonomous systems in a fast-paced, innovative environment.
Hands-on experience with CUDA for parallel processing and performance optimization.
Experience with state-of-the-art ML approaches for obstacle detection and tracking.
Exposure to integrating ML models within sensor fusion pipelines.
At Serve Robotics in Los Angeles, we're at the forefront of revolutionizing urban mobility with our incredible sidewalk robots. As a Sensor Fusion Engineer, you'll be a pivotal part of this journey, enhancing our robot's ability to interpret its environment by developing the onboard sensor-to-obstacle pipeline. You will dive into interesting challenges, working closely with a broad spectrum of sensors like cameras and LiDAR. Your expertise in C++, coupled with knowledge in machine learning and sensor calibration, will enable our robots to seamlessly navigate the complexities of city environments. Our collaboration-driven team is filled with industry veterans who are passionate about leveraging cutting-edge technology to create user-friendly and efficient robotic solutions. You'll get to lead your own projects, implement enhanced sensor fusion algorithms, and collaborate with various teams to ensure that our robots operate safely and accurately. Join us and contribute to making robotic deliveries a norm in every neighborhood, while enjoying a friendly work atmosphere that values creativity, innovation, and teamwork.