Feature-based adaptive motion model for better localization
In the 21st century, robots are becoming a ubiquitous part of everyday life, and the need for a robot to interact with its environment has become a necessity. Interacting with the world requires a sense of the robot's pose. Humans are clearly very good at sensing their location in the world around them; the same task is very difficult for robots due to uncertainty in movement, limitations in sensing, and the complexity of the environment itself. When we close our eyes and walk, we retain a good estimate of our location, but the same cannot be said for robots: without the help of external sensors, localization becomes difficult. Humans use their vestibular system to generate cues about their movement and update their position. Robots can do the same by feeding acceleration, velocity, or odometry as cues to a motion model. The motion model can be represented as a probability distribution to account for uncertainty in the environment. In current implementations, the parameters of this model are typically held static throughout an experiment. Previous work has shown that calibrating the model online improves localization. That work provided a framework for building adaptive motion models, targeting land-based robots and sensors. The work presented here builds on the same framework to adapt motion models for an Autonomous Underwater Vehicle (AUV). We present detailed results of the framework in a simulator. We also propose a method for motion estimation using side-scan sonar images, which is used as feedback to the motion model, and we validate this motion estimation approach on real-world datasets.
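To illustrate the kind of motion model the abstract refers to, the sketch below implements the standard sample-based odometry motion model (in the style of Thrun et al.'s Probabilistic Robotics), where odometry increments are perturbed by zero-mean Gaussian noise. The function and parameter names (`sample_motion_model`, `alphas`) are illustrative, not taken from the thesis; the noise parameters `alphas` are exactly the quantities that a static implementation fixes for the whole experiment and that an adaptive framework would re-estimate online.

```python
import math
import random

def sample_motion_model(pose, odom, alphas, rng=random):
    """Sample a successor pose from a probabilistic odometry motion model.

    pose   -- (x, y, theta): current pose estimate
    odom   -- (d_rot1, d_trans, d_rot2): odometry increments
              (initial rotation, translation, final rotation)
    alphas -- (a1, a2, a3, a4): noise parameters; static in typical
              implementations, re-estimated online in an adaptive model
    """
    d_rot1, d_trans, d_rot2 = odom
    a1, a2, a3, a4 = alphas
    # Perturb each increment with zero-mean Gaussian noise whose
    # variance grows with the magnitude of the commanded motion.
    r1 = d_rot1 - rng.gauss(0.0, math.sqrt(a1 * d_rot1**2 + a2 * d_trans**2))
    tr = d_trans - rng.gauss(0.0, math.sqrt(a3 * d_trans**2
                                            + a4 * (d_rot1**2 + d_rot2**2)))
    r2 = d_rot2 - rng.gauss(0.0, math.sqrt(a1 * d_rot2**2 + a2 * d_trans**2))
    x, y, theta = pose
    # Apply the (noisy) increments to obtain the sampled successor pose.
    return (x + tr * math.cos(theta + r1),
            y + tr * math.sin(theta + r1),
            theta + r1 + r2)
```

With all noise parameters set to zero the model is deterministic; for example, a pure one-meter translation from the origin yields the pose `(1.0, 0.0, 0.0)`. Drawing many samples with nonzero `alphas` produces the familiar "banana-shaped" pose distribution used by particle-filter localization.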