Haptic Rendering Methods for Full-Hand Interaction with Complex Environments
The hand concentrates a large share of the mechanoreceptors in the human body and serves as our major means of bidirectional interaction with the world. Humans execute common operations such as palpation or grasping with fine dexterity, thanks in part to the sensitivity of the mechanoreceptors and actuators in the hand. Since hands are our main touch sensors, the simulation of direct touch with our bare hands can be considered one of the ultimate goals of haptic technology. In this PhD thesis, we present a set of methods for haptic rendering of full-hand interaction with complex virtual environments. Our key ingredient is an interactive physical model of the human hand, including the articulated skeleton, the deformable flesh with frictional contact, and the coupling between skeleton and flesh. We introduce an algorithm for the efficient computation of coupled skeleton and flesh dynamics, and a general approach for linking the simulated hand to glove-like haptic devices for bidirectional haptic interaction. For the hand skeleton, we present an algorithm for the simulation of articulated bodies under implicit integration with joint limits and stiff joint forces. We also present an approach for haptic rendering of hand interaction using 6-degree-of-freedom devices that takes flesh deformation into account. This approach can be applied to the simulation of interaction with any type of deformable tool. The key ingredients are the modeling of the coupling between the handle and the deformable tool, and a linear model to approximate contact and coupling forces at haptic rates. We present examples of interaction with environments composed of static, rigid, and deformable objects, and we also introduce two application examples of more challenging environments. The first is an environment for surgery training on the human shoulder.
The intricate anatomy of the shoulder, with continuous contact and couplings between parts of different physical nature, requires the development of methods to simplify and accelerate the simulation up to interactive rates. We achieve this through the use of intermediate representations and efficient handling of multiple couplings and contact. The second is a mixed-reality environment for virtual prototyping, where the user can see and touch both virtual and real objects integrated in a single scene. Incorporating the haptic device into the mixed-reality scenario imposes several challenges from the visual point of view, but also challenges in integrating the physical simulation and haptic rendering with the visual representation of a real environment.
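The abstract mentions two ingredients of the deformable-tool rendering approach: a handle-to-tool coupling and a linear model that approximates contact and coupling forces at haptic rates. A minimal sketch of what such a scheme could look like follows; the function names, gains, and Jacobian values are illustrative assumptions, not the thesis's actual implementation.

```python
# Sketch: viscoelastic virtual coupling plus a linearized force model.
# The full simulation refits the linear model at a low rate, while the
# haptic loop (~1 kHz) evaluates it cheaply. All parameters are made up.
import numpy as np

def coupling_force(x_device, x_tool, v_device, v_tool, k=500.0, b=5.0):
    """Spring-damper coupling between the device handle and the
    simulated tool handle (translational part only, for brevity)."""
    return k * (x_device - x_tool) + b * (v_device - v_tool)

class LinearForceModel:
    """First-order model F(x) ~ F0 + J (x - x0), refit each simulation
    step and evaluated at every high-rate haptic step."""
    def __init__(self, F0, J, x0):
        self.F0 = np.asarray(F0)
        self.J = np.asarray(J)
        self.x0 = np.asarray(x0)

    def force(self, x):
        return self.F0 + self.J @ (np.asarray(x) - self.x0)

# Simulation step fits the model around the current configuration...
model = LinearForceModel(F0=[0.0, -1.0, 0.0],
                         J=-200.0 * np.eye(3),   # contact stiffness
                         x0=np.zeros(3))
# ...and the haptic thread evaluates it at the current device position.
F = model.force([0.001, 0.0, 0.0])  # small displacement along x
```

The point of the linear model is that the haptic loop only needs a matrix-vector product per step, decoupling the force update rate from the (much slower) deformable simulation.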
Doctoral thesis defended at the Universidad Rey Juan Carlos de Madrid in 2014. Thesis advisor: Miguel Ángel Otaduy Tristán