Thursday, August 27, 2020

Oh, patents! Starship robots (3) Low-light navigation

Copyright © Françoise Herrmann

One of the main advantages of using delivery robots to solve last-mile logistics is that robots can operate 24/7, without mandatory rest periods or extra pay. Last-mile robotic deliveries thus already appeared to be quite an attractive solution, considering the surge of e-commerce and online deliveries, even before the pandemic, a fortiori during the pandemic and the gradual re-opening of economies.

However, for robots to moonlight (pun intended) at no extra cost, they also have to be able to navigate in low-light conditions. Easier said than done. How does a camera sensor capture the image of an object that is no longer visible? How can terrain be mapped accurately and efficiently at night? The Starship Technologies patent WO2019086465A1, titled Visual localization and mapping in low light conditions, precisely addresses this issue.

The patent discloses a SLAM (Simultaneous Localization and Mapping) method whereby the robot estimates its own position on a map while simultaneously creating the map, enabling it to continue its route autonomously. The method relies on data collected by a wide variety of sensors, such as GPS, Lidar, gyroscope, cameras, odometer, accelerometer and magnetometer. At sundown, when the sun is astronomically positioned between 0 and 6 degrees below the horizon, a twilight map is created. The twilight map has the advantage of containing both daytime features (e.g., straight lines) and nighttime features (e.g., urban lights). Thus, the twilight map in fact bridges the visibility gap by mapping the position of nighttime features onto a daytime model. In turn, position relative to visibility is triangulated with data incoming from other sensors, and the mapping is adjusted accordingly. Captured images might also be downsized to bring blurry or jagged lines into sharper focus during mapping. Likewise, roads and buildings might be tagged relative to daytime and nighttime features to facilitate localization.
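The twilight-map idea can be sketched in a few lines: daytime features visible in a twilight image anchor the camera frame to the existing daytime map, and nighttime features detected in the same image are then registered with the same transform. The snippet below is only an illustrative sketch, not the patented method; all feature names and coordinates are invented for the example, and a simple rigid translation stands in for the full alignment a real SLAM pipeline would estimate.

```python
import numpy as np

# Hypothetical feature sets (coordinates are illustrative, not from the patent):
# daytime map features (e.g., straight-line endpoints) in map coordinates
day_map = np.array([[2.0, 1.0], [5.0, 3.0], [8.0, 2.0]])
# the same daytime features as detected in a twilight image (camera frame)
day_twilight = np.array([[1.0, 0.5], [4.0, 2.5], [7.0, 1.5]])
# nighttime features (e.g., urban lights) detected in the same twilight image
night_twilight = np.array([[3.0, 4.0], [6.0, 0.5]])

# Estimate a translation aligning the twilight frame to the map frame,
# using the daytime features visible in both.
offset = (day_map - day_twilight).mean(axis=0)

# Register the nighttime features into map coordinates with the same transform,
# yielding a "twilight map" that holds both feature types in one frame.
night_map = night_twilight + offset
print(night_map)
```

Because both feature types are expressed in the same map frame, the robot can later localize at night against the urban-light features while reusing the daytime map geometry.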

The abstract of the invention is included below, together with patent Figure 4, showing a twilight map with nighttime visual features (e.g., urban lights) 2T, and daytime visual features (e.g., lines) T1, extracted during twilight, when the sun was positioned astronomically between 3 and 8 degrees below the horizon.

The present invention relates to a method comprising generating a map comprising day-time features and night-time features, wherein the position of night-time features relative to the day-time features is determined by at least one image captured during twilight. The present invention also relates to a corresponding processing unit configured to execute such a method. [Abstract WO2019086465A1]

Most of the time, all goes well. The 99% autonomous Starship robots fulfill their missions, delivering goods at extended hours, seven days a week, to happy customers. For example, according to the YouTube video included below, Starship robots fulfilled 2500 deliveries during their first week of operation at the University of Houston, TX, in 2019. However, on occasion the robots get stuck. The following YouTube video shows how a Starship robot was rescued by a University of Houston student in the middle of the night. Equipped with voice interaction routines, the Starship robot even gratefully thanked the student after being rescued.


Reference
Starship Technologies
