Autonomous Navigation in Dynamic Environments with Multi-Modal Perception Uncertainties

Abstract

This paper addresses safe path planning for autonomous mobility under multi-modal perception uncertainties. Specifically, we assume that different sensor inputs yield different Gaussian-process-regulated perception uncertainties, which we term multi-modal perception uncertainties. We implement a Bayesian inference algorithm that merges the multi-modal GP-regulated uncertainties into a unified one and translates the unified uncertainty into a dynamic risk map. Feeding this risk map to a safe path planner, we plan a safe path for the autonomous vehicle to follow. Experimental results on an autonomous golf-cart testbed validate the applicability and efficiency of the proposed algorithm.
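The Bayesian fusion step described above can be illustrated with a minimal sketch. The snippet below is not the paper's implementation; it merely shows the standard product-of-Gaussians rule (precision-weighted averaging) for merging independent per-modality Gaussian uncertainty estimates, with the sensor names and values being illustrative assumptions:

```python
import numpy as np

def fuse_gaussians(means, variances):
    """Fuse independent Gaussian estimates of the same quantity by
    precision weighting (the standard product-of-Gaussians rule)."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    fused_var = 1.0 / precisions.sum(axis=0)          # combined precision
    fused_mean = fused_var * (means * precisions).sum(axis=0)
    return fused_mean, fused_var

# Hypothetical example: two modalities (e.g. lidar and camera) each give a
# mean/variance estimate of obstacle distance at two grid cells.
lidar_mean, lidar_var = np.array([2.0, 5.0]), np.array([0.1, 0.4])
cam_mean,   cam_var   = np.array([2.2, 4.6]), np.array([0.3, 0.2])

mean, var = fuse_gaussians([lidar_mean, cam_mean], [lidar_var, cam_var])
# The fused variance is always smaller than either input variance, so the
# resulting risk map reflects the reduced uncertainty after fusion.
```

The fused variance field could then be thresholded or mapped through a tail probability per grid cell to obtain a risk map, under the assumption of independent sensor noise.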