This work was presented as:
N. Tanabe, Y. Sato, K. Morita, M. Inagaki, Y. Fujino,
and K. Sato.
fARFEEL: Providing Haptic Sensation of Touched Objects using Visuo-Haptic Feedback.
IEEE VR ’19 (Research Demonstration).
Providing Haptic Sensation of Touched Objects
using Visuo-Haptic Feedback
We present fARFEEL, an interactive system that provides haptic feedback matched to the position of a projected virtual hand (VH).
The system allows a local user and a remote user to communicate by using the projected VH as a proxy for the local user's own hands.
Haptic information from the real object touched by the projected VH is conveyed to the local user, so that they can feel as if they were touching a distant object.
We propose a projection-mapping interface that enables communication with a person at a remote site through overlaid graphical hands.
We also introduce a set of visual stimuli that can potentially induce a sense of body ownership over the projected VH.
The local user manipulates their own projected virtual hand at the remote site.
At the local site, the user is captured by a camera embedded in the display.
The local user interacts with the remote site through a touch panel whose input is synchronized in real time with the projected VH at the remote site.
When the projected VH touches an object, haptic information is delivered to the local user through a haptic device that simulates the feel of the remote object.
We deliver the haptic feedback to the non-manipulating hand (i.e., the hand not used to control the VH) so that it does not interfere with manipulation and interaction during remote communication.
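The local-site loop described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the coordinate mapping, workspace dimensions, contact threshold, and all function names are assumptions introduced for clarity.

```python
# Hypothetical sketch of the local-site loop: touch-panel input drives the
# projected virtual hand (VH), and contact with a remote object triggers
# haptic output on the non-manipulating hand. All names, sizes, and
# thresholds are illustrative, not from the fARFEEL paper.

def panel_to_remote(x, y, panel_size, workspace_size):
    """Map touch-panel pixel coordinates to the remote projection workspace (mm)."""
    px, py = panel_size
    wx, wy = workspace_size
    return (x / px * wx, y / py * wy)

def update(touch, object_height_at, haptic_out,
           panel_size=(1920, 1080), workspace_size=(600.0, 340.0),
           contact_threshold_mm=5.0):
    """One loop step: move the VH, and fire haptics if it rests on an object."""
    vh_pos = panel_to_remote(touch[0], touch[1], panel_size, workspace_size)
    height = object_height_at(vh_pos)       # mm, from the RGB-D pipeline
    if height > contact_threshold_mm:       # VH overlaps a raised object
        # Scale stimulation with object height, capped at full intensity.
        haptic_out(intensity=min(height / 50.0, 1.0))
    return vh_pos
```

In this sketch the haptic intensity simply grows with object height; the actual system derives the stimulus from the depth information captured at the remote site.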
At the remote site, the remote user is likewise captured by a camera embedded in the display.
A projector and an RGB-D camera are installed above the remote user's head to project the VH and capture object information, respectively.
The captured image and depth information from the RGB-D camera are processed to determine the height information used to drive the haptic stimulation device.
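Since the RGB-D camera looks down at the tabletop, a per-pixel object height can be recovered as the difference between the known table depth and the measured depth. The sketch below illustrates this step under that assumption; the table depth, noise floor, and function names are hypothetical, not taken from the paper.

```python
# Hypothetical sketch of the remote-site depth processing: with a top-down
# RGB-D camera, an object's height is the table-plane depth minus the
# measured depth at that pixel. All values and names are illustrative.

def depth_to_height(depth_map, table_depth_mm, noise_floor_mm=3.0):
    """Convert a top-down depth map (mm per pixel) into object heights (mm)."""
    heights = []
    for row in depth_map:
        out = []
        for d in row:
            h = table_depth_mm - d   # closer to the camera => taller object
            # Suppress small differences caused by sensor noise.
            out.append(h if h > noise_floor_mm else 0.0)
        heights.append(out)
    return heights

def height_at(heights, x, y):
    """Sample the height under the projected VH's fingertip position."""
    return heights[y][x]
```

The sampled height can then be fed to the haptic device on the local user's non-manipulating hand.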