Prediction-Based Haptic Interfaces to Improve Transparency for Complex Virtual Environments

Date

2017-06-27T12:56:26Z

Authors

Forbrigger, Shane

Abstract

Virtual surgical training simulators offer an exciting opportunity to improve surgical training, especially in challenging fields such as minimally invasive surgery. Effective virtual surgical training requires highly accurate haptic feedback; however, accurate human tissue models are computationally intensive and update too slowly. This work investigates a method of improving the realism (or transparency) of haptic feedback from slowly updating virtual environments by designing a control structure that takes advantage of the higher update rates available outside the virtual environment. The current state of the art in surgical training and controls research is identified through a review of the relevant literature. The contributions of this work are as follows. A predictor is designed for an unknown linear time-invariant system using Lyapunov-based methods to provide an estimate of the ideal virtual environment output at a higher update rate. The predictor design is then extended to a gain-scheduled predictor using linear parameter-varying systems analysis. The resulting haptic system is tested both in simulation and in experiment. For linear time-invariant systems the predictor performs well, yielding experimental improvements in transparency of up to 40%. For nonlinear systems the results are mixed, ranging from negligible improvement to improvements of approximately 20%.
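
To make the rate-mismatch problem concrete, the sketch below illustrates the general idea of filling in haptic force commands between slow virtual-environment updates with a locally identified model. This is not the Lyapunov-based or gain-scheduled predictor developed in the thesis; it is a minimal illustration, and all names, rates, and numerical values (LinearForcePredictor, HAPTIC_RATE_HZ, VE_RATE_HZ, the sample positions and forces) are assumptions introduced for this example.

import numpy as np

# Hypothetical rates: the haptic loop runs faster than the virtual environment (VE).
HAPTIC_RATE_HZ = 1000
VE_RATE_HZ = 100
STEPS_PER_VE_UPDATE = HAPTIC_RATE_HZ // VE_RATE_HZ


class LinearForcePredictor:
    """Fits a local linear map f ~ k*x + b to recent VE samples and uses it
    to predict forces between slow VE updates (illustration only)."""

    def __init__(self):
        self.k = 0.0       # estimated stiffness-like gain
        self.b = 0.0       # estimated offset
        self.samples = []  # recent (position, force) pairs from the VE

    def update_from_ve(self, x_ve, f_ve, window=10):
        """Called at the slow VE rate with the latest exact VE output."""
        self.samples.append((x_ve, f_ve))
        self.samples = self.samples[-window:]
        if len(self.samples) >= 2:
            xs = np.array([s[0] for s in self.samples])
            fs = np.array([s[1] for s in self.samples])
            # Least-squares fit of f = k*x + b over the recent window.
            A = np.column_stack([xs, np.ones_like(xs)])
            (self.k, self.b), *_ = np.linalg.lstsq(A, fs, rcond=None)

    def predict(self, x_now):
        """Called at the fast haptic rate with the current tool position."""
        return self.k * x_now + self.b


# One VE period: the predictor supplies force commands at the fast rate
# until the next exact VE output arrives.
predictor = LinearForcePredictor()
predictor.update_from_ve(x_ve=0.010, f_ve=1.5)
predictor.update_from_ve(x_ve=0.012, f_ve=1.8)
for step in range(STEPS_PER_VE_UPDATE):
    x_tool = 0.012 + 1e-5 * step        # fast position measurement (made up)
    f_cmd = predictor.predict(x_tool)   # force sent to the haptic device

The thesis replaces this naive local fit with a predictor whose convergence is established through Lyapunov analysis and, for nonlinear environments, with a gain-scheduled version based on linear parameter-varying systems; the sketch only conveys why a higher-rate estimate of the environment output can improve transparency.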

Keywords

Haptics, Prediction methods, Linear parameter-varying systems, Virtual reality, Nonlinear control, Transparency, Surgical training
