

4 Path Planning

With localization and surroundings perception in place, the mobile robot has all the information it needs to determine a path and navigate safely to the destination. A wide variety of path-planning algorithms are available; among them, path planning on an occupancy grid map is simple yet effective. Because obstacles are detected while the occupancy grid map is updated, the same map is widely used for both obstacle detection and path planning. By distinguishing occupied cells from unoccupied ones, the mobile robot can find an efficient and safe path to the destination.
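As a concrete illustration of grid-based planning, the sketch below runs A* search over a small occupancy grid. The grid contents, the 4-connected neighborhood, and the unit step cost are illustrative assumptions, not details taken from the Robotics SDK.

```python
import heapq
import itertools

import numpy as np

def astar(grid, start, goal):
    """A* search on a 2D occupancy grid.

    grid  : 2D numpy array, 0 = free cell, 1 = occupied cell
    start : (row, col) start cell
    goal  : (row, col) goal cell
    Returns the path as a list of (row, col) cells, or None if no path exists.
    """
    rows, cols = grid.shape
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # 4-connected neighborhood

    def heuristic(cell):
        # Manhattan distance: admissible for a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    tie = itertools.count()  # tie-breaker so heap entries never compare cells
    open_set = [(heuristic(start), next(tie), 0, start, None)]
    came_from = {}
    best_g = {start: 0}

    while open_set:
        _, _, g, cell, parent = heapq.heappop(open_set)
        if cell in came_from:
            continue  # already expanded with a lower cost
        came_from[cell] = parent
        if cell == goal:
            path = []  # reconstruct the path by walking back through parents
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        for dr, dc in moves:
            nxt = (cell[0] + dr, cell[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt] == 0 and nxt not in came_from):
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(
                        open_set, (ng + heuristic(nxt), next(tie), ng, nxt, cell))
    return None

# A toy 10x10 map with a wall: the planner routes around the occupied cells.
grid = np.zeros((10, 10), dtype=int)
grid[4, 1:9] = 1
path = astar(grid, start=(0, 0), goal=(9, 9))
```

A* is a natural fit here because the occupancy grid already provides the discrete free/occupied structure the search operates on; more elaborate planners can still use the same map representation.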

Figure 4-1 shows an example data flow of an autonomous navigation system on the AM69A using an image sensor mounted on the mobile robot. The raw image is processed and demosaiced by the VPAC3 VISS, and lens distortion is removed by the LDC. For surroundings perception, the undistorted image is resized by the MSC and converted to the RGB format that the DL networks operate on. DL inference is accelerated by the MMA to detect objects and determine their poses relative to the image sensor. For localization, feature points are extracted from the undistorted image and matched against the corresponding feature points in the map to determine the pose of the mobile robot on the map. The C7x DSP can accelerate this localization process, and since many feature extraction algorithms operate on an image pyramid, the MSC can accelerate the pyramid generation as well. The pose of the mobile robot and the detected objects are the inputs to path planning. Because the detected objects can also be projected onto the map using the pose of the mobile robot, the free space around the robot is identified, and from it the path to the destination and the control command for the mobile robot are determined. This example data flow can be extended to multiple image sensors, given proper extrinsic calibration between the sensors.

Figure 4-1 Example Data Flow of AM69A Autonomous Navigation System
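The projection of detected objects onto the map amounts to a simple rigid-body transform. The following is a minimal sketch assuming a planar map, a 2D robot pose (x, y, yaw), and detections expressed in the robot frame; the frame conventions, map origin, and resolution parameters are assumptions for illustration, not AM69A-specific details.

```python
import numpy as np

def project_objects_to_map(grid, resolution, origin, robot_pose, objects_robot):
    """Mark detected objects as occupied cells on an occupancy grid map.

    grid          : 2D numpy array, 0 = free, 1 = occupied (updated in place)
    resolution    : map resolution in meters per cell
    origin        : (x, y) position of grid cell (0, 0) in the map frame [m]
    robot_pose    : (x, y, yaw) of the robot in the map frame [m, m, rad]
    objects_robot : Nx2 array of object positions in the robot frame [m]
    """
    x, y, yaw = robot_pose
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s],
                  [s,  c]])  # rotation from the robot frame into the map frame
    # Rigid-body transform: rotate into the map frame, then translate by the pose
    objects_map = objects_robot @ R.T + np.array([x, y])
    # Convert metric map coordinates to grid indices
    cells = np.floor((objects_map - np.asarray(origin)) / resolution).astype(int)
    for cx, cy in cells:
        if 0 <= cy < grid.shape[0] and 0 <= cx < grid.shape[1]:
            grid[cy, cx] = 1  # occupied; the remaining cells are free space
    return grid
```

Once the detections are on the map, the planner from the previous section can treat them exactly like mapped obstacles.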

An autonomous navigation system using a 2D LiDAR and an IMU was implemented on top of the Robotics SDK and demonstrated on the SCUTTLE robot. For this system, the occupancy grid map is built offline using 2D LiDAR SLAM, and the map is then used for real-time navigation with the same 2D LiDAR sensor. During autonomous navigation, the three tasks in Figure 1-1, that is, localization, surroundings perception, and path planning, are performed on every LiDAR scan, and the resulting control command is converted to a Pulse Width Modulation (PWM) signal to drive the motors. The data flow is described in the project page, and this system can be duplicated on the AM69A processor.
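As a rough sketch of that final step, the function below converts a (linear, angular) velocity command into left and right PWM duty cycles using standard differential-drive kinematics. The wheel radius, track width, and speed-to-duty mapping are placeholder values, not the SCUTTLE robot's actual parameters.

```python
def twist_to_pwm(v, w, wheel_radius=0.04, track_width=0.35, max_wheel_speed=12.0):
    """Convert a velocity command into left/right PWM duty cycles for a
    differential-drive robot.

    v : commanded linear velocity [m/s]
    w : commanded angular velocity [rad/s]
    wheel_radius, track_width : robot geometry [m] (placeholder values)
    max_wheel_speed : wheel speed [rad/s] mapped to 100% duty (placeholder)

    Returns (left, right) duty cycles in [-1, 1]; the sign selects the motor
    direction and the magnitude is the PWM duty cycle.
    """
    # Differential-drive inverse kinematics: per-wheel angular speeds
    w_left = (v - w * track_width / 2.0) / wheel_radius
    w_right = (v + w * track_width / 2.0) / wheel_radius
    # Normalize to duty cycles and clamp to the achievable range
    left = max(-1.0, min(1.0, w_left / max_wheel_speed))
    right = max(-1.0, min(1.0, w_right / max_wheel_speed))
    return left, right

# Example: move forward at 0.3 m/s while turning left at 0.5 rad/s
left_duty, right_duty = twist_to_pwm(0.3, 0.5)
```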