Job description

As the team lead for this exciting project, you will work with a group of researchers to investigate the most efficient methods for sensor data fusion, training new algorithms that improve state-of-the-art scene understanding and tracking for accurate navigation in complex scenarios. You will investigate scene understanding for shared-control navigation with a novel personal mobility device developed specifically for close interaction with pedestrians and for narrow spaces. Advanced scene understanding from onboard sensing should allow real-time evaluation of the environment for shared control with dynamic obstacle avoidance and planning. If you are a highly motivated and creative individual with a passion for innovation, we want to hear from you. Join us and help shape the future of shared-control navigation and personal mobility devices.

Profile

- Outstanding experience in mechatronic systems and control, with a PhD in Computer Science, Robotics, Mechanical Engineering, or a related field, and a proven track record in robot navigation control or deep learning optimization for sensor fusion.
- Highly motivated and self-driven, with a record of excellent performance.
- First-rate oral and written English skills and an interest in writing are required.
- Adaptable and flexible to the continuous changes associated with research demands.
- Strong robotics experience (ROS) with mobile systems, in navigation, localization, or control (preferred).
- Confirmed publications in some of the methodological areas of interest: mobile robots, multi-sensor fusion, time-series modelling, or computer vision.
- Strong practical skills in field robotic systems.
- Strong experience with good coding practices in C/C++/C# and Python.

Workplace