Connect

There’s a new assistant in town: meet Polaris.

Year Two Demonstration

Another year has passed in the Polaris project, and we are now several steps closer to our goal of creating the next level of indoor wayfinding through a network of autonomous robotic wayfinding assistants. Just before we all isolated ourselves in our home offices because of the coronavirus, we held our yearly demonstration with all of the project collaborators and a representative from the Innovation Fund present. The demonstration was the culmination of another year of hard work, especially during the months leading up to it.

During the second year of the project, we have built a new and improved version of the robot with more intelligence, better sensors and motors, and a more refined look compared to the first prototype. We have conducted field studies at Aarhus University Hospital to understand the needs and pains of the users we are designing for. We have developed an iOS app that lets users communicate with the robot, and have run usability tests with potential end-users to make sure the interface is easy to use. We have also improved the intelligence incorporated in the map the robot uses to navigate, allowing it to make more informed decisions in its route planning. Behind the scenes, we have been investigating the business potential, other potential use cases, and possible collaborations.

Before the demonstration, we had an integration workshop where we integrated all the systems and tested their ability to communicate with each other.

Polaris contributors testing integrations before the demo

The overall goal this year was to demonstrate the hospital use case and how the robot might help patients and visitors find their way around. We used the Combine office in Aalborg to simulate a hospital, assigning the names of medical wards to some of the conference rooms the robot would navigate to.

To demonstrate the improvements and new functionality, we set up a few tasks for the user to complete with the robot. During these tasks, we showed that the robot can now:

  • Greet patients in a predefined entrance area
  • Scan social security cards to identify a patient and show relevant information from stored appointment data
  • Calculate a route to a desired destination and show it on a map along with the estimated time to destination
  • Lead the visitor to their destination
  • Avoid obstacles along the way
  • Pause and wait for the user if a break is requested along the way
  • Calculate a new route if the path is blocked
  • Recognize the destination and inform users that they have reached their goal
  • Return to its starting point to help the next visitor
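The guidance flow above can be sketched as a small state machine. The state and event names below are our own illustrative labels, not the actual Polaris code, but they capture the sequence the robot stepped through in the demo:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()         # waiting in the entrance area
    GREETING = auto()     # greeting an approaching patient
    IDENTIFYING = auto()  # scanning the card, showing appointment data
    PLANNING = auto()     # calculating a route and estimated time
    LEADING = auto()      # guiding the visitor, avoiding obstacles
    PAUSED = auto()       # waiting while the visitor takes a break
    REPLANNING = auto()   # path blocked, computing an alternative
    ARRIVED = auto()      # destination reached, informing the visitor
    RETURNING = auto()    # driving back to the starting point

# Allowed transitions between states; event names are hypothetical.
TRANSITIONS = {
    (State.IDLE, "patient_detected"): State.GREETING,
    (State.GREETING, "card_scanned"): State.IDENTIFYING,
    (State.IDENTIFYING, "destination_chosen"): State.PLANNING,
    (State.PLANNING, "route_ready"): State.LEADING,
    (State.LEADING, "break_requested"): State.PAUSED,
    (State.PAUSED, "resume"): State.LEADING,
    (State.LEADING, "path_blocked"): State.REPLANNING,
    (State.REPLANNING, "route_ready"): State.LEADING,
    (State.LEADING, "destination_reached"): State.ARRIVED,
    (State.ARRIVED, "visitor_dismissed"): State.RETURNING,
    (State.RETURNING, "home_reached"): State.IDLE,
}

def step(state, event):
    """Advance the state machine; unknown events keep the current state."""
    return TRANSITIONS.get((state, event), state)
```

Modeling the flow this way makes the pause-and-resume and replanning behavior explicit: both are loops back into the leading state rather than separate modes.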

Furthermore, an admin can add new points of interest to the map from the admin system, as well as temporary roadblocks or zones the robot should avoid. The ability to block or limit traffic on certain routes is especially useful for supporting social distancing during the coronavirus outbreak.
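To illustrate how admin-defined roadblocks can feed into route planning, here is a minimal sketch of grid-based planning that simply removes blocked cells from the walkable map before searching. This is our own simplified example, not the robot's actual navigation stack:

```python
from collections import deque

def plan_route(grid, start, goal, blocked=frozenset()):
    """Breadth-first search on a 4-connected grid.

    `grid` is a set of walkable (x, y) cells; `blocked` holds cells an
    admin has temporarily closed off (e.g. for social distancing).
    Returns the shortest path as a list of cells, or None if unreachable.
    """
    free = set(grid) - set(blocked)
    if start not in free or goal not in free:
        return None
    queue = deque([start])
    came_from = {start: None}  # maps each visited cell to its predecessor
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk predecessors back to the start to recover the path.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in free and nxt not in came_from:
                came_from[nxt] = cell
                queue.append(nxt)
    return None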

The robot is still quite slow and not as stable as we would like, but overall the demonstration went well, and we are optimistic about the future of the project.

As we move into the third year of the project, the focus is now on building a more robust and stable version of the robot that is technologically ready and safe to deploy in real use cases, and on exploring new commercial opportunities through collaborations and partnerships.