Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition
The range in which humans can feel and examine the world is often limited by body size and movement capabilities, especially for aging populations or people with disabilities. When an individual wants to examine the surrounding environment, the most intuitive way is to walk around, using the eyes to view and the hands to feel the objects in the environment. However, there are countless scenarios where users either cannot move conveniently or the environment itself is not suitable or reachable for exploration.
Virtual Reality (VR) has enabled virtual body extension applications for reaching into and exploring the virtual world.
By mapping the user's actual hand to a hand model in virtual reality, users can explore the environment through a virtually extended arm.
However, this virtual body extension technique is still limited in its exploration range and sense of body ownership.
Thus, it is only applicable to exploring the immediate surroundings, and users treat the virtual hand as an auxiliary device.
Users cannot take advantage of the technique when they want to reach a more distant location or obtain finer texture details from the object they interact with.
We propose a system that allows the user to employ a drone for the environment exploration task. (a) The control room, where the user issues commands to the drone. (b) The drone flies to the target object and captures its image. (c) The user uses the virtual hand to explore the remote environment captured by the drone's front camera, while the captured image is used to simulate tactile feedback through an ultrasound array.
To overcome this limitation, in this paper we propose a Head-Mounted Display (HMD) integrated system that allows users to reach an otherwise unreachable world without physically moving.
A drone and a virtual hand act as extra eyes and hands for the user: the drone sees the object, and the virtual hand helps the user examine and feel the target object.
The design mainly aims to make drone control as intuitive as possible while improving the body ownership of the virtual hand.
The system consists of three parts.
First, the user uses hand gestures to control the drone and capture images of the target object (Figure 1(a)).
In parallel, the drone's location is tracked and synchronized into the HMD (Figure 1(b)).
Second, the captured images are sent back to the system and classified with a trained neural network.
Third, the user can virtually manipulate the target object with the virtual hand while perceiving the corresponding haptic feedback (Figure 1(c)).
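The last two steps, turning a classified image patch into ultrasound-array output, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the texture class names, the variance-based roughness proxy, and the amplitude mapping are all assumptions, standing in for the trained network's output and the actual ultrasound driver.

```python
import numpy as np

# Hypothetical texture classes a trained classifier might output,
# each with an assumed gain for the ultrasound array (illustrative).
TEXTURE_GAIN = {"smooth": 0.2, "fabric": 0.5, "rough": 0.9}

def haptic_amplitudes(image, label, grid=(4, 4)):
    """Map a grayscale image patch and its predicted texture class to
    per-cell ultrasound emitter amplitudes in [0, 1] (sketch only)."""
    h, w = image.shape
    gh, gw = grid
    amps = np.zeros(grid)
    for i in range(gh):
        for j in range(gw):
            cell = image[i * h // gh:(i + 1) * h // gh,
                         j * w // gw:(j + 1) * w // gw]
            # Local contrast as a crude proxy for surface roughness.
            amps[i, j] = cell.std() / 128.0
    # Scale by the class-dependent gain and clamp to the array's range.
    return np.clip(amps * TEXTURE_GAIN[label], 0.0, 1.0)

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(64, 64)).astype(float)
amps = haptic_amplitudes(patch, "rough")
print(amps.shape)  # (4, 4)
```

A real pipeline would replace the hand-coded gain table with the classifier's output distribution and drive each transducer's focal-point intensity from the resulting grid.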
Proposed concept: the system can erase the distance limit between visitors and museums.
To summarize, this paper makes the following contributions:
A system that integrates drone control and tactile feedback with a virtual hand, allowing users to manipulate remote objects and perceive both visual and haptic feedback.
A user study on drone control and the body ownership of the virtual hand.