Senior Perception Engineer – Autonomous Systems

Company: Mach Industries
Location: Huntington Beach, CA, USA
Salary: $180,000 – $270,000
Type: Full-Time
Degrees: Bachelor’s, Master’s
Experience Level: Senior

Requirements

  • 5+ years of experience developing perception systems for autonomous platforms in automotive, robotics, or aerospace applications.
  • Expertise in computer vision, including object detection, tracking, semantic segmentation, and 3D scene understanding.
  • Proficiency in sensor fusion with cameras, LiDAR, radar, and IMUs for real-time perception.
  • Experience developing perception algorithms in Python, C++, or similar languages for embedded systems.
  • Strong understanding of machine learning frameworks (e.g., TensorFlow, PyTorch) for perception tasks.
  • Hands-on experience with sensor hardware, including camera calibration, LiDAR integration, and radar processing.
  • Bachelor’s or Master’s degree in Computer Science, Electrical Engineering, Robotics, or a related field; a PhD is a plus.

Responsibilities

  • Develop perception algorithms for UAVs, focusing on real-time object detection, classification, tracking, and environmental mapping.
  • Design and optimize sensor fusion pipelines integrating cameras, LiDAR, radar, and other sensors to achieve robust scene understanding.
  • Implement computer vision techniques, including semantic segmentation, instance segmentation, and 3D reconstruction, to support autonomous navigation and obstacle avoidance.
  • Collaborate with guidance, navigation, and control (GNC) and autonomy teams to integrate perception outputs into navigation, planning, and control systems.
  • Build and deploy real-time perception software for embedded systems, validated through software-in-the-loop (SITL), hardware-in-the-loop (HITL), and flight tests.
  • Optimize algorithms to handle challenges such as sensor noise, dynamic lighting, occlusion, and high-speed flight dynamics.
  • Develop testing and validation frameworks using simulations and real-world flight testing to ensure perception system reliability.
  • Leverage AI/ML models (e.g., deep learning for object detection and tracking) to enhance perception performance.
  • Ensure perception systems meet defense standards for resilience against environmental challenges and adversarial threats.

Preferred Qualifications

  • Experience adapting perception systems for aerospace applications, including high-speed or high-altitude environments.
  • Familiarity with defense standards for autonomous systems and resilient perception.
  • Expertise in optimizing deep learning models for real-time performance on resource-constrained platforms.
  • Hands-on experience with flight testing or real-world validation of perception systems.
  • Knowledge of perception error modeling and mitigation for dynamic environments.
  • Proficiency with tools like OpenCV, ROS, PCL, or Gazebo for perception development.