AR Management of Mobile Robot
For my Master's capstone project, I worked with three teammates to develop an intuitive AR interface for managing mobile robots as collaborative assistants for healthcare workers. We used the Fetch100 OEM Base as the mobile robot and the Microsoft HoloLens 2 as the mixed-reality headset.
The robot's primary task is to autonomously navigate and deliver medications to preset targets. When the robot runs into a problem, it tells the worker what it needs help with through AR, for example by turning the affected target red on the task list and updating its status when it gets stuck. Workers can also manually move and stop the robot with simple command inputs in AR, and an AR minimap shows them where the robot is headed.
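To illustrate the status handoff between the robot and the AR layer, here is a minimal Python sketch of the robot side publishing its state for the headset to consume. The topic name /robot_status and the status strings are hypothetical placeholders, not our actual message design.

```python
#!/usr/bin/env python
# Minimal sketch: publish the robot's task status for the AR layer.
# Topic name and status strings are hypothetical.
import rospy
from std_msgs.msg import String

STATUSES = ("navigating", "stuck", "delivered", "awaiting_help")

def report(pub, status):
    # The AR app would subscribe to this topic and recolor the
    # matching task-list entry (e.g., red when the robot is stuck).
    pub.publish(String(data=status))

if __name__ == "__main__":
    rospy.init_node("status_reporter")
    pub = rospy.Publisher("/robot_status", String, queue_size=1)
    rospy.sleep(1.0)  # give the publisher time to connect
    report(pub, "stuck")
```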
I was in charge of the navigation system. Using the onboard lidar, the robot builds a map of the environment with the designated targets marked on it. Within that map, the robot localizes itself with AMCL (Adaptive Monte Carlo Localization) and navigates to its targets using the ROS move_base package.
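For reference, sending a delivery target to move_base boils down to an actionlib goal expressed in the map frame. This is a minimal sketch; the node name and goal coordinates are placeholders rather than our actual waypoints.

```python
#!/usr/bin/env python
# Minimal sketch: send one navigation goal to move_base via actionlib.
# Coordinates are placeholders for a real delivery target.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("delivery_goal_sender")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"   # goal lives in the AMCL map frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 2.0     # placeholder target position
goal.target_pose.pose.position.y = 1.5
goal.target_pose.pose.orientation.w = 1.0  # identity orientation

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo("Goal finished with state %d", client.get_state())
```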
For the minimap, I imported the ROS map into Unity and generated a terrain with high "walls" along the boundaries and obstacle edges. Over that grid, I run A* to simulate the path the robot is taking.
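The minimap pathfinding itself is C# inside Unity, but the algorithm is plain grid A*. Below is a language-agnostic sketch in Python over a 4-connected occupancy grid with a Manhattan heuristic; the grid, start, and goal are illustrative only.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = wall)."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan distance
    open_set = [(h(start), start)]      # (f = g + h, cell)
    g_cost = {start: 0}
    came_from = {start: None}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = []
            while cur is not None:      # walk parent links back to start
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                ng = g_cost[cur] + 1
                if ng < g_cost.get(nxt, float("inf")):
                    g_cost[nxt] = ng
                    came_from[nxt] = cur
                    heapq.heappush(open_set, (ng + h(nxt), nxt))
    return None  # goal unreachable

# Illustrative grid: walls block the direct route, forcing a detour.
grid = [[0, 0, 0, 1],
        [1, 1, 0, 1],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
print(astar(grid, (0, 0), (3, 3)))
```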
I also helped develop the task list using C# in Unity. A predetermined CSV list of tasks, each consisting of the item being delivered, the robot's origin, and its destination, is loaded into a UI that workers can display at any time. A task is marked green when it is completed and red when it is deemed invalid.
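The task list is implemented in C# in Unity; the sketch below uses Python only to show the assumed CSV layout and the status-to-color rule. The column order and file name are hypothetical, so our actual schema may have differed.

```python
import csv
from enum import Enum

class Status(Enum):
    PENDING = "white"
    COMPLETED = "green"   # task finished successfully
    INVALID = "red"       # task deemed invalid / robot needs help

def load_tasks(path):
    """Read tasks assumed to be rows of: item, origin, destination."""
    with open(path, newline="") as f:
        return [{"item": item, "origin": origin, "destination": dest,
                 "status": Status.PENDING}
                for item, origin, dest in csv.reader(f)]

tasks = load_tasks("tasks.csv")          # hypothetical file name
tasks[0]["status"] = Status.COMPLETED    # mark the first delivery done
for t in tasks:
    print(t["item"], "->", t["destination"], "|", t["status"].value)
```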
A video demo of this project can be seen below.
Video Demo