Abstract:
This paper considers the problem of state estimation in autonomous navigation from a theoretical perspective. In particular, it investigates problems in which the information provided by the sensor data is not sufficient to carry out the state estimation (i.e., the state is not observable). For these systems, the concept of continuous symmetry is introduced. Detecting the continuous symmetries of a given system is of practical importance: it allows the identification of an observable state whose components are nonlinear functions of the original, nonobservable state. So far, this theoretical and very general concept has been applied to two distinct fundamental estimation problems in mobile robotics: self-calibration and the fusion of the data provided by inertial and vision sensors. For reasons of length, only the former is discussed here. In particular, the theoretical machinery is used to address a specific calibration problem, whose solution constrains the robot to move along specific trajectories in order to apply the calibration algorithm. This paper provides two distinct contributions. The first is the introduction of the concept of continuous symmetry. The second is a simple and efficient strategy to extrinsically calibrate a bearing sensor (e.g., a vision sensor) mounted on a vehicle and, simultaneously, estimate the parameters describing the systematic error of its odometry system. Extensive simulations and real experiments demonstrate the robustness, efficiency, and accuracy of the proposed strategy.
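As a minimal illustration of the idea of continuous symmetry (a toy example for intuition only, not a system taken from the paper), consider a state $x = (x_1, x_2)$ with trivial dynamics and a single scalar output:
\[
\dot{x}_1 = 0, \qquad \dot{x}_2 = 0, \qquad y = x_1 x_2 .
\]
The output is invariant under the one-parameter family of transformations $(x_1, x_2) \mapsto (\lambda x_1, \lambda^{-1} x_2)$, $\lambda > 0$, which is a continuous symmetry of the system. Hence neither $x_1$ nor $x_2$ is observable on its own, whereas the nonlinear function $x_1 x_2$ is an observable state of reduced dimension.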
Keywords:
image sensors; mobile robots; nonlinear functions; path planning; state estimation; autonomous navigation; calibration algorithm; continuous symmetry concept; inertial sensors; mobile robotics; observability analysis; odometry system; vision sensors; Calibration; Estimation; Observability; Robot kinematics; Robot vision systems; sensor calibration; sensor fusion; state estimation and navigation