Partially Observable Markov Decision Processes for Fault Management in Autonomous Underwater Vehicles
This thesis proposes a partially observable Markov decision process (POMDP) model-based schema as the basis of a fault-management system for autonomous underwater vehicles (AUVs) on long-endurance missions, where the operator is far from the vehicle. The thesis explains the reasoning behind using a POMDP rather than traditional static look-up tables to achieve a more autonomous system. The objective was to develop POMDP models for two illustrative AUV sub-systems: depth and power management. These models were used as fault managers in a series of simulations for each sub-system, first individually and then with interactions between the two. This novel solution demonstrated the validity of a POMDP as the basis for a fault manager by accounting for the inherent partial observability of AUV states and their environments. Future work aims to extend the approach to more AUV sub-systems and to test it on hardware-in-the-loop simulators.
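The core idea of the partial observability the abstract mentions can be sketched with a minimal belief update: the fault manager never observes the true health of a sub-system directly, so it maintains a probability distribution over states and updates it from noisy sensor readings. The states, probabilities, and observation model below are hypothetical illustrations, not values from the thesis.

```python
import numpy as np

# States: 0 = sub-system healthy, 1 = sub-system faulty
# (hypothetical two-state model for illustration)
T = np.array([[0.98, 0.02],   # transition model P(s' | s): small chance of a new fault
              [0.00, 1.00]])  # a fault, once present, persists

# Observations: 0 = nominal reading, 1 = anomalous reading
O = np.array([[0.95, 0.05],   # observation model P(o | s'): healthy mostly reads nominal
              [0.30, 0.70]])  # a faulty sub-system can still look nominal

def belief_update(b, obs):
    """One Bayes-filter step: predict with T, correct with O, renormalize."""
    predicted = T.T @ b                 # prior over the next state
    corrected = O[:, obs] * predicted   # weight by likelihood of the observation
    return corrected / corrected.sum()

b = np.array([0.99, 0.01])  # initially almost certain the sub-system is healthy
for obs in [1, 1, 1]:       # three anomalous readings in a row
    b = belief_update(b, obs)
print(b)  # belief mass has shifted strongly toward the faulty state
```

A POMDP fault manager would choose recovery actions against this belief rather than against a single assumed state, which is what distinguishes it from a static look-up table keyed on raw sensor values.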