
© 2018 Taro Narahara


Haptic Collaboration:

Biomedical Engineering Meets Digital Design

 

Taro Narahara, Kevin Abbruzzese*, and Richard A. Foulds*

 

This project is based on an ongoing research collaboration with engineers at the *Department of Biomedical Engineering (BME) at the New Jersey Institute of Technology. Our approach couples the BME-developed exoskeleton with my capacity to employ the 3-D graphics, game mechanics, and physics engines of game-development systems, so that the therapeutic effects of gaming (delivered through stereoscopic glasses and motion-tracking systems) complement the sense of touch provided by the exoskeleton. The BME researchers had developed a novel admittance-controlled haptic robotic exoskeleton for assisting the upper-extremity motions of people with stroke and cerebral palsy, and were seeking to integrate it with an engaging, challenging virtual environment that could retain a user's interest. Admittance control translates a force applied by the user into motion, so the user feels resistance in the robot when the virtual arm makes contact with virtual objects. The result is a user-controlled haptic manipulator that allows individuals with neurological impairment to be therapeutically assisted by the exoskeleton while haptically interacting with virtual objects in a 3-D animated environment.
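The force-to-motion mapping of admittance control can be illustrated with a minimal discrete-time sketch. The virtual mass m, damping b, and the forward-Euler update below are illustrative assumptions, not the BME group's actual controller:

```python
# Minimal admittance-controller sketch: simulate virtual dynamics
# m*dv/dt + b*v = F, so a measured user force F becomes a commanded
# velocity v. (m, b, dt and the update law are assumptions for
# illustration only.)
def admittance_step(force, velocity, m=2.0, b=5.0, dt=0.001):
    """One forward-Euler step of the virtual mass-damper dynamics."""
    accel = (force - b * velocity) / m
    return velocity + accel * dt

# Free motion: a steady 10 N push converges toward v = F/b = 2 m/s.
v = 0.0
for _ in range(10000):
    v = admittance_step(10.0, v)
print(round(v, 3))  # approaches 2.0
```

Raising the virtual damping b makes the robot feel "heavier"; when the virtual arm contacts an object, the controller can add an opposing contact force so the user feels resistance.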


The proposed game, Target Toss, encourages the user to coordinate wrist and arm movement to throw a ball farther and reach targets (the motions of each joint are acquired from the exoskeleton before the trajectory is calculated in the virtual space), and to apply a controlled gripping force so as not to crush the ball. These features, which are known to enhance therapeutic effects, were implemented as a C# script in Unity3D. We have developed interfaces between Unity3D and MATLAB, the current primary means for BME researchers to control their haptic robots, using the Transmission Control Protocol (TCP). This interface allows GPU-accelerated computational physics, which is fast enough to simulate user interaction with game objects, to control the physical prototype and provide a sense of virtual touch. While Unity's animation runs at the screen refresh rate (~100 fps), its physics loop can run at the higher frequency required for proper haptic perception. Unity3D also serves as a collaboration hub between the BME group and me, owing to the speed and accuracy of its computational physics and its compatibility with the CAD model formats common in digital design. Designers can thus provide original game logic and animated high-quality models, using Unity's real-time rendering capabilities, without compromising the accuracy and speed that the engineers require. Unity's graphics are also compatible with commercial immersive stereo-vision glasses and projection systems that can further enhance the therapeutic effects.
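The shape of the Unity3D-to-MATLAB TCP exchange can be sketched as a small loopback example. The packet layout (four little-endian floats of joint state one way, three floats of haptic force back) and the toy grip-resistance law are hypothetical, chosen only to illustrate the idea of the bridge:

```python
# Sketch of a Unity3D <-> MATLAB style TCP exchange over loopback.
# Message layout and the response law are assumptions for illustration;
# the project's actual protocol may differ.
import socket
import struct
import threading

HOST, PORT = "127.0.0.1", 9050  # port number is arbitrary

def robot_side(server_sock):
    """Stand-in for the MATLAB/robot side: read joint angles,
    reply with a haptic force vector."""
    conn, _ = server_sock.accept()
    with conn:
        data = conn.recv(16)  # 4 little-endian float32s = 16 bytes
        shoulder, elbow, wrist, grip = struct.unpack("<4f", data)
        # Toy response: resist the grip proportionally (hypothetical law).
        fx, fy, fz = 0.0, 0.0, -2.0 * grip
        conn.sendall(struct.pack("<3f", fx, fy, fz))

server = socket.socket()
server.bind((HOST, PORT))
server.listen(1)
t = threading.Thread(target=robot_side, args=(server,))
t.start()

# "Unity" side: send one joint-state packet, read back the force.
with socket.create_connection((HOST, PORT)) as game:
    game.sendall(struct.pack("<4f", 0.1, 0.5, 0.2, 3.0))
    fx, fy, fz = struct.unpack("<3f", game.recv(12))
t.join()
server.close()
print(fz)  # -6.0
```

In the real system, an exchange like this would run every physics tick, which is why the physics loop must outpace the ~100 fps render rate to keep the haptic feedback stable.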

 

 

Related Publications:

 

Haptic Collaboration: Biomedical Engineering Meets Digital Design

Narahara, T., Abbruzzese, K., and Foulds, R., SIGGRAPH 2015 Talks (The 42nd International Conference and Exhibition on Computer Graphics and Interactive Techniques), Los Angeles, California, August 9-13, 2015.