Augmented Reality for training, troubleshooting, manufacturing, service, and support.
Problem/opportunity trying to be solved | As we scale, we will have to train more and more people, faster and faster, to build, ship, deploy, service, support, and maintain our robots. We will also need to train more customers, faster. Building these tools will get great people up to speed faster, and it will expand the pool of candidates to those with less expertise than we currently require. |
Quantify the problem/opportunity | Huge |
What is the root cause of the problem/opportunity | Good problems like exponential growth! |
Customer need date | Apr 8, 2022 |
Urgent request | No |
Another great AR diagnostic tool, which could also be a powerful customer-facing tool, would be one that visualizes the cost maps, robot footprints, and point clouds (from ROS), tied to a marker mounted on the robot.
This is extremely helpful in identifying artifacts in maps, or problematic items in the environment that might cause transient ghost obstacles in the costmap due to spurious lidar returns.
It can also help identify when debris of some kind (like a cobweb) is either on the lidar lenses or hanging from the robot and occasionally falling into the FOV of the lidars.
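In all of these diagnostic cases, the overlay would be fed by data the robot already publishes over ROS. A minimal sketch of tapping those topics follows; the topic names are assumptions based on common move_base defaults, not necessarily our actual stack:

```python
# Minimal sketch: tapping the ROS data an AR costmap overlay would need.
# Topic names are assumed move_base defaults; the real stack may differ.
import rospy
from nav_msgs.msg import OccupancyGrid
from sensor_msgs.msg import PointCloud2

def on_costmap(grid):
    # grid.data is a row-major int8 array: 0 = free, 100 = occupied, -1 = unknown
    rospy.loginfo("costmap %dx%d at %.2f m/cell",
                  grid.info.width, grid.info.height, grid.info.resolution)

def on_cloud(cloud):
    rospy.loginfo("lidar cloud with %d points", cloud.width * cloud.height)

rospy.init_node("ar_overlay_feed")
rospy.Subscriber("/move_base/local_costmap/costmap", OccupancyGrid, on_costmap)
rospy.Subscriber("/scan_cloud", PointCloud2, on_cloud)  # hypothetical topic name
rospy.spin()
```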
It can also show when other robots are being recognized (position being shared) through the global state system, and whether or not that information is consistent. For example, at Modex an Origin collided with a Vector, and our hypothesis is that the horrible Wi-Fi connectivity that is always present at tradeshows caused a loss of real-time connectivity, or at least latency bad enough, that the Origin did not receive the global state information it needed to avoid the Vector. With this costmap visualization tool, any layperson could have conclusively identified the global state system (more specifically, its having been failed by the Wi-Fi) as the root cause of that collision. More importantly, the tool could potentially have been used to identify the problem before the collision happened.
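To make that failure mode visible before it causes a collision, the tool could flag stale peer state. This is a hedged sketch only: the topic, message type, and 0.5 s threshold below are all illustrative assumptions, not our real global state interface:

```python
# Hypothetical watchdog that flags stale shared-robot state, e.g. when bad
# tradeshow Wi-Fi delays global state updates enough to matter.
import rospy
from geometry_msgs.msg import PoseStamped

STALE_AFTER = rospy.Duration(0.5)  # assumed tolerance; tune to the fleet
last_update = {}

def on_peer_pose(msg):
    # this sketch assumes frame_id carries the peer robot's name
    last_update[msg.header.frame_id] = rospy.Time.now()

def check(_event):
    now = rospy.Time.now()
    for robot, stamp in last_update.items():
        if now - stamp > STALE_AFTER:
            rospy.logwarn("global state for %s is %.2f s stale",
                          robot, (now - stamp).to_sec())

rospy.init_node("global_state_monitor")
rospy.Subscriber("/global_state/peer_poses", PoseStamped, on_peer_pose)
rospy.Timer(rospy.Duration(0.1), check)
rospy.spin()
```

An AR overlay could render this staleness as a warning halo around the affected robot, so the Wi-Fi failure is visible to a layperson on the floor.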
Nick Varas created a proof of concept for this using the Vector A-Eye camera, and it was really cool and really powerful. Not sure how viable it is to port that capability into an AR application, but it could be a separate, different app that is accessible through a browser.
Good first use case: visualization of the robot footprint, safety field sets, and costmap inflation / footprint padding.
This would be helpful for salespeople to visualize what aisle widths are acceptable.
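For a rough sense of what the overlay would communicate to a salesperson, the acceptable aisle width falls out of the footprint plus its padding and inflation. Every constant in this sketch is a placeholder, not a real robot spec:

```python
# Back-of-envelope aisle-width check for the sales use case.
# All constants are assumed placeholders, not real specs.
ROBOT_WIDTH_M = 0.9         # assumed chassis width
FOOTPRINT_PADDING_M = 0.05  # assumed footprint padding
COSTMAP_INFLATION_M = 0.25  # assumed inflation radius

def min_aisle_width(robots_abreast=1):
    """Width each robot needs: chassis plus padding and inflation on both sides."""
    per_robot = ROBOT_WIDTH_M + 2 * (FOOTPRINT_PADDING_M + COSTMAP_INFLATION_M)
    return robots_abreast * per_robot

print("single robot: %.2f m" % min_aisle_width())   # 1.50 m
print("two passing:  %.2f m" % min_aisle_width(2))  # 3.00 m
```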
It is also a helpful training tool for the entire organization to understand what the safety and navigation systems "look" like and how they operate. It is sometimes difficult for people to understand that the "empty" space around a robot is actually occupied by infrared light that controls the speed and movement of the robots.
This application could be one that uses a marker that is mounted on a robot, so that the visualization of the invisible information could be tied to the physical robot.
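One plausible way to anchor the overlay to a robot-mounted marker is OpenCV's contrib ArUco module; a sketch follows. The camera intrinsics and marker size are placeholders, and the ArUco API names here match opencv-contrib-python 4.6 and earlier (the API changed in later versions):

```python
# Sketch: detect a robot-mounted ArUco marker and estimate its pose so the
# costmap/safety-field overlay can be drawn relative to the physical robot.
import cv2
import numpy as np

camera_matrix = np.array([[900.0, 0.0, 640.0],
                          [0.0, 900.0, 360.0],
                          [0.0, 0.0, 1.0]])  # assumed intrinsics
dist_coeffs = np.zeros(5)                    # assumed: negligible distortion
MARKER_SIZE_M = 0.15                         # assumed printed marker size

aruco_dict = cv2.aruco.Dictionary_get(cv2.aruco.DICT_4X4_50)
params = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = cv2.aruco.detectMarkers(frame, aruco_dict, parameters=params)
    if ids is not None:
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, MARKER_SIZE_M, camera_matrix, dist_coeffs)
        # tvecs[0] is the marker position in the camera frame (metres);
        # overlays would be transformed from robot frame through this pose.
        print("marker %d at %s" % (ids[0][0], tvecs[0].ravel()))
```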
Another mode of operation for the same application would use a markerless AR approach, placing a virtual robot with virtual safety and navigation visualizations into the scene. This would allow people visiting customer sites to see for themselves, and to show the customers, what space the robots and their safety and navigation systems occupy.
Jason, do you have a specific use case to start with and any companies in mind?