
FlyingHand: Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition

Tinglin Duan
University of Toronto
Toronto, Ontario

[email protected]

Parinya Punpongsanon
Daisuke Iwai
Kosuke Sato

{parinya,daisuke.iwai,sato}@sys.es.osaka-u.ac.jp
Osaka University
Toyonaka, Osaka

Figure 1: We propose a system that allows a user to employ a drone for an environment exploration task. (a) The control room, where the user issues commands to the drone. (b) The drone flies and captures an image of the target object. Then, (c) the user uses a virtual hand to explore the remote environment captured by the drone's front camera. At the same time, the image captured by the drone is used to simulate tactile feedback through an ultrasound array.

ABSTRACT
This paper presents a Head Mounted Display (HMD) integrated system that uses a drone and a virtual hand to help users explore a remote environment. The system allows users to use hand gestures to control the drone and to identify Objects of Interest (OOI) through tactile feedback. The system uses a Convolutional Neural Network to classify objects in the drone-captured image and provides a virtual hand to realize interaction with the object. Accordingly, tactile feedback is also provided to the users' hands to enhance virtual hand body ownership. The system aims to help users assess spaces and objects regardless of body limitations, which could not only benefit elderly or handicapped people but also make potential contributions to environment measurement and daily life.

CCS CONCEPTS
• Human-centered computing → Mixed / augmented reality; Interaction devices;

KEYWORDS
Virtual hand illusion, body ownership, haptic perception, human-drone interaction

SA '18 Technical Briefs, December 4–7, 2018, Tokyo, Japan
2018. ACM ISBN 978-1-4503-6062-3/18/12...$15.00
https://doi.org/10.1145/3283254.3283258

ACM Reference Format:
Tinglin Duan, Parinya Punpongsanon, Daisuke Iwai, and Kosuke Sato. 2018. FlyingHand: Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition. In SIGGRAPH Asia 2018 Technical Briefs (SA '18 Technical Briefs), December 4–7, 2018, Tokyo, Japan. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3283254.3283258

1 INTRODUCTION
The range in which humans can feel and examine the world is sometimes limited by body size and movement capabilities, especially for aging or disabled groups. When an individual wants to examine the surrounding environment, the most intuitive way is to walk around, use the eyes to view, and use the hands to feel the objects in the environment. However, there are countless scenarios where users either cannot move conveniently or the environment itself is not suitable or reachable for users to explore. For example, it would be hard for a user to examine an object located on a high shelf without a ladder.

Virtual Reality (VR) has brought about virtual body extension applications for reaching into and exploring the virtual world. By fitting a user's actual hand to an existing hand model in virtual reality, users can explore the environment by virtually extending the length of the arm [9, 10]. However, this virtual body extension technique is still limited in terms of exploration range and body ownership. Thus, it is only applicable to exploring the surrounding environment, and users treat the virtual hand as an auxiliary device. Users cannot take advantage of the technique when they want to reach


Figure 2: Proposed concept: the system could erase the distance limit between visitors and museums.

a farther location, or when they want to get more texture details from the object they interact with.

To overcome this limitation, in this paper we propose a Head-Mounted Display (HMD) integrated system that allows users to reach an otherwise unreachable world without physically moving. A drone and a virtual hand act as extra eyes and an extra hand for users: the drone sees the object, and the virtual hand helps users examine and feel the target objects. The design mainly aims at making the drone control as intuitive as possible while improving the body ownership of the virtual hand. The system consists of three separate parts. First, the user uses hand gestures to control the drone and capture images of the target object (Figure 1(a)). In parallel, the drone location is tracked and synchronized into the HMD (Figure 1(b)). Second, the captured images are sent back to the system and classified with a trained neural network. Third, the user can virtually manipulate the target object with the virtual hand while perceiving the corresponding haptic feedback (Figure 1(c)).

To summarize, our paper makes the following contributions.

• A system that introduces drone control and tactile feedback to the virtual hand, allowing users to manipulate remote objects and perceive both virtual and haptic feedback.

• A user study on the drone control as well as the virtual hand body ownership.

2 RELATED WORKS
2.1 Drone for Environment Monitoring
Intelligent and interactive robots can play a great role in environment monitoring and measurement. Teymourzadeh et al. equipped a spider robot with multiple sensors [8]. The legged robot proved highly efficient in path planning and exploration tasks. However, ground robots' exploration range and accuracy are still limited because their sensors are mounted at a low viewpoint. Compared with ground robots, drones' viewpoints are less restricted, as they can fly over obstacles and provide real-time data from high above. Drones also have capabilities for environment monitoring and provide relatively high flexibility and adaptivity [7]. Delmerico et al. proposed using a drone to help ground robots perform real-time path planning and localization [2, 6]. Karjalainen et al. showed that participants preferred a drone companion in a home environment [4].

During the experiment, participants treated the drones as life assistants that could help them accomplish certain tasks. In addition, Cauchard et al. suggested that hand gesture commands are intuitive when interacting with drones [1].

2.2 Virtual Hand and Body Ownership
Multiple attempts have been made to help users reach the unreachable world with the help of a virtual hand. Ueda et al. proposed an extended hand interaction that enables users to manipulate home appliances and engage in human-human communication with a projected virtual hand [9]. The model allows users to use a touch panel to control the virtual hand movement while preserving some hand gestures on the virtual model. Instead of using projection-based visualization, Feuchtner and Müller combine the real-world environment with the virtual one [3]. The design records real-world object locations in the virtual world while fitting a user's hand with a virtual hand model. Users can manipulate the virtual hand in the virtual environments (VEs), and the results are in turn updated back to the real world to make the physical object move accordingly. However, these systems are still restrained by space, and body ownership remains limited. This extension span only suits the need for nearby object manipulation, and extending the virtual hand by more than 150 cm may lose the sense of body ownership [5]. Our proposed system introduces a drone and tactile feedback to the extendable virtual hand design, which not only improves the application range but also preserves perceived body ownership. Drone exploration makes it possible for users to interact with objects more than 150 cm away from them, while the haptic feedback intensifies the users' feeling that the hand extension is actually part of their body.

3 SYSTEM DESIGN
3.1 Assumption
We assume a remote task scenario in which the target object is difficult for the user to physically reach, such as inside a cave or above a shelf. Thus, it requires a remote interaction technique such as a drone to explore the scene.

3.2 System Architecture
3.2.1 Hardware Setup. We separate the implemented system into two interaction spaces: the experiment room and the control room. The experiment room consists of a tracking system (OptiTrack Flex 13 cameras) and a drone (Tello drone by Ryze Tech Inc.). In order to track it in the system, the drone carries retro-reflective markers. We implement our virtual hand with the Unity3D software and feed the camera image as the background, which the user observes through an HMD. The target objects used in this study are a sphere and a cube. The sphere has a radius of 1.86 cm, and the cube has an edge of 3 cm; both target objects have similar volumes, 26.95 cm³ and 27 cm³, respectively. Figure 3 shows the implementation of the experiment room.
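As a quick arithmetic check (added here for illustration, not part of the original text), the two volumes do agree closely:

V_{\text{sphere}} = \tfrac{4}{3}\pi r^{3} = \tfrac{4}{3}\pi (1.86\,\text{cm})^{3} \approx 26.95\,\text{cm}^{3}, \qquad V_{\text{cube}} = (3\,\text{cm})^{3} = 27\,\text{cm}^{3}.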

The control room consists of a force feedback device (Ultrahaptics Array Development Kit) embedded with a hand tracking system (Leap Motion) and an HMD (Oculus Rift DK2). All equipment is connected to a PC (Intel Core i7, 64 GB RAM, Nvidia GeForce GTX 970 with 4 GB RAM) and communicates through a Python script.


Figure 3: System implementation. (a) A tracking setup with a drone, and (b) a Tello drone equipped with retro-reflective markers for tracking capability.

Users view all the objects through the drone's front camera via the HMD and use their real hand to control the drone. The hand gesture is tracked with the Leap Motion and directly mapped to the drone movement. At the same time, tactile feedback is generated by the Ultrahaptics array according to the virtual hand location and the target object. The control room issues commands to the drone (in the experiment room) through a wireless network, and the drone transmits its location information back via the same network.
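A minimal sketch of how such commands could be sent to the drone over the wireless network, assuming the Tello's plain-text UDP command interface (the address, port, and command strings below follow the public Tello SDK and are our assumption; the authors' actual Python script is not described in detail):

```python
import socket

# Assumed Tello SDK endpoint (see above): plain-text commands over UDP.
TELLO_ADDR = ("192.168.10.1", 8889)

def send_command(sock: socket.socket, cmd: str, timeout: float = 5.0) -> str:
    """Send one text command and wait for the drone's textual reply."""
    sock.settimeout(timeout)
    sock.sendto(cmd.encode("ascii"), TELLO_ADDR)
    try:
        reply, _ = sock.recvfrom(1024)
        return reply.decode("ascii", errors="replace")
    except socket.timeout:
        return "timeout"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 8889))                     # replies arrive on the same port
print(send_command(sock, "command"))      # enter SDK mode
print(send_command(sock, "takeoff"))
print(send_command(sock, "forward 50"))   # move forward 50 cm
print(send_command(sock, "land"))
sock.close()
```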

3.2.2 Controller. In order to make the best use of the human body and treat the drone as an extension of the user's hand, we use the Leap Motion to issue control commands to the drone. The movement controls are set by the relative position of the user's hand and the Leap Motion. For instance, when the user places their hand to the left of the Leap Motion, the system sends a command to the drone to move left. Since the drone moves more slowly than the human gesture changes, we implement a transfer function f that defines the relative threshold between the command input x and the physical movement f(x) of the drone as follows:

f(x) =
\begin{cases}
0 & \text{if } x \in [0, \alpha] \\
\left( \dfrac{x - \alpha}{1 - \alpha} \right)^{\beta} & \text{if } x \in [\alpha, 1]
\end{cases}

where α controls the beginning of the drone motion and β controlsthe slope of the function.
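The transfer function can be sketched in a few lines of Python; the values of α and β below are illustrative choices, not the ones used in the system:

```python
def drone_transfer(x: float, alpha: float = 0.2, beta: float = 1.5) -> float:
    """Dead-zone transfer function: maps a normalized hand displacement x in [0, 1]
    to a normalized drone movement command f(x).

    Displacements below alpha are ignored; above alpha the response follows a
    power curve whose steepness is controlled by beta.
    """
    x = max(0.0, min(1.0, x))      # clamp the input to [0, 1]
    if x <= alpha:
        return 0.0
    return ((x - alpha) / (1.0 - alpha)) ** beta

print(drone_transfer(0.1))   # 0.0   -> inside the dead zone, drone does not move
print(drone_transfer(0.8))   # ~0.65 -> strong movement command
```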

To separate control of the drone from control of the virtual hand, we implement an angle threshold that determines whether the drone or the virtual hand is moved. Figure 4 shows the diagram of the separate controllers for the drone and the virtual hand. When the user reaches their hand within 45°, the system sends the command to the drone; when the user reaches their hand beyond 45°, the system sends the command to the virtual hand. This interaction metaphor allows the user to perceive force feedback from the haptic feedback device located in front of the hand tracker.
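A sketch of this separation in Python; measuring the reach angle as the elevation of the hand above the sensor's horizontal plane is our assumption, since the paper does not state the exact reference axes:

```python
import numpy as np

ANGLE_THRESHOLD_DEG = 45.0   # below: command the drone; at or above: command the virtual hand

def route_command(hand_pos, sensor_origin=(0.0, 0.0, 0.0)):
    """Decide whether a tracked hand position controls the drone or the virtual hand."""
    v = np.asarray(hand_pos, dtype=float) - np.asarray(sensor_origin, dtype=float)
    horizontal = np.linalg.norm(v[[0, 2]])                 # x/z distance from the sensor
    elevation = np.degrees(np.arctan2(v[1], horizontal))   # reach angle of the hand
    return "drone" if elevation < ANGLE_THRESHOLD_DEG else "virtual_hand"

print(route_command((0.10, 0.05, 0.20)))  # shallow reach -> "drone"
print(route_command((0.05, 0.30, 0.05)))  # steep reach   -> "virtual_hand"
```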

3.2.3 Estimation of Remote Environments. Since the camera only feeds captured images back to the system, it is not possible to estimate the distance between the target object and the drone from the images alone. To solve this problem, we estimate the distance from the drone's position to the target object using our external tracking system. Thus, we can accurately calculate the drone position and the range of the virtual hand. As shown in Figure 1(c), we realize the

Figure 4: The diagram indicates the separate controller for the drone and the virtual hand.

extension of a hand by adopting a relative linear model on the hand model. Taking the actual hand position a at each frame, the virtual hand position v, and the gain g, we update the virtual hand position as follows:

v_t = v_{t-1} + g \cdot (a_t - a_{t-1})

where t and t − 1 denote the current and previous frames, respectively. The gain value can be tuned to meet different needs.
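A minimal per-frame sketch of this update; the gain value below is only an illustrative choice:

```python
import numpy as np

def update_virtual_hand(v_prev, a_curr, a_prev, gain=3.0):
    """Relative linear mapping v_t = v_{t-1} + g * (a_t - a_{t-1})."""
    return np.asarray(v_prev, dtype=float) + gain * (
        np.asarray(a_curr, dtype=float) - np.asarray(a_prev, dtype=float))

# Per frame, the virtual hand moves g times farther than the real hand.
v = np.zeros(3)
a_prev = np.array([0.00, 0.00, 0.00])
a_curr = np.array([0.01, 0.00, 0.02])    # real hand moved 1 cm right, 2 cm forward
v = update_virtual_hand(v, a_curr, a_prev, gain=3.0)
print(v)                                 # [0.03 0.   0.06]
```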

3.2.4 Image Classification for Haptic Mapping. Once the drone finds the target object, it takes an image and transmits it back to the PC through the network. Then, we apply color filtering to remove background noise and feed the resulting sequences to a Convolutional Neural Network (CNN) to derive a classification of the target object. We use a CNN with a 5-layer architecture, each layer consisting of a 2D convolution layer and a 2D max pooling layer. We sampled a total of 12,400 images for the training set and 2,000 images for the validation set. We obtained accuracies of 98.64% and 96.5% on the training and validation sets, respectively.
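A sketch of such a 5-block network in Keras; the paper specifies only the convolution/max-pooling structure, so the input resolution, filter counts, dense layer, and the two output classes (sphere vs. cube) are assumptions on our side:

```python
from tensorflow.keras import layers, models

def build_classifier(input_shape=(128, 128, 3), num_classes=2):
    model = models.Sequential()
    model.add(layers.Input(shape=input_shape))
    # Five blocks, each a 2D convolution followed by 2D max pooling.
    for filters in (16, 32, 64, 64, 128):
        model.add(layers.Conv2D(filters, (3, 3), activation="relu", padding="same"))
        model.add(layers.MaxPooling2D((2, 2)))
    model.add(layers.Flatten())
    model.add(layers.Dense(64, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_classifier()
model.summary()
# model.fit(train_images, train_labels, validation_data=(val_images, val_labels), epochs=10)
```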

4 EVALUATION
4.1 Procedure
We conducted a user experiment to evaluate the effectiveness of the proposed method for perceiving body ownership and the usefulness of the proposed drone control method. The evaluation is separated into two parts. First, the participants were asked to control the drone with their hand gestures to move it forward and backward. The participants observed the drone movement via the HMD. Second, the participants were asked to use the virtual hand to touch the target object.

We conducted a qualitative evaluation with 11 participants (10 males and 1 female, aged 20 to 29). Only one participant had experience with a force feedback device, and two participants had experience with an HMD. All participants had normal or corrected-to-normal vision.

4.2 Questionnaire
Once the participants finished the experiment, they were asked to fill in a questionnaire to give us general feedback on their experience with the proposed system. The questionnaire was designed as follows:

Q1: I felt my hand movement was causing the movement of the drone


Figure 5: Boxplots of user feedback for each question on a five-point Likert scale.

Q2: I felt my hand gesture could make the drone move constantly
Q3: I felt my hand movement was causing the movement of the virtual hand
Q4: I felt the virtual hand was part of my body

The participants rated each question from 1 (strongly disagree) to 5 (strongly agree).

4.3 Result
Figure 5 shows the subjective scores obtained for each question. We confirm that our subjective questionnaire has good reliability using Cronbach's alpha test (α = 0.702). Overall, the questions were ranked higher for Q1 "I felt my hand movement was causing the movement of the drone" (avg. 4.0), Q3 "I felt my hand movement was causing the movement of the virtual hand" (avg. 4.36), and Q4 "I felt the virtual hand was part of my body" (avg. 4.09). We analyzed the user preference using the Friedman test (the non-parametric counterpart of a one-way repeated-measures ANOVA), but could not find a significant difference among the questions.
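The reliability and preference analysis can be reproduced along the following lines (an illustrative sketch, not the authors' analysis script; the ratings below are placeholder data with the study's shape of 11 participants × 4 questions):

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(0)
ratings = rng.integers(3, 6, size=(11, 4))   # placeholder Likert scores, 11 participants x 4 items

def cronbach_alpha(items):
    """Cronbach's alpha for a participants-by-questions score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

alpha = cronbach_alpha(ratings)
stat, p = friedmanchisquare(*[ratings[:, i] for i in range(ratings.shape[1])])
print(f"Cronbach's alpha = {alpha:.3f}, Friedman chi^2 = {stat:.2f}, p = {p:.3f}")
```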

4.4 Discussion
The experiment results show that the proposed method makes it easy to control the movement of the drone (Q1). However, the movement direction control is realized by exerting forces on the drone. For example, when a user issues the command "backward" to a drone that is moving "forward", the flight mechanism needs to slow down the forward speed before moving in the other direction. Thus, users feel a latency when manipulating the drone, which is reflected in the lower score for Q2.

Regarding the improvement of body ownership using the proposed method, we found that the tactile feedback greatly enhances the experience when users interact with the target object, which is reflected in the higher score for Q3. The body ownership of the virtual hand can thus be concluded to have improved with the introduction of tactile feedback. However, given the limited degrees of freedom of the device's tactile representation, we noticed that the user might lose body ownership once the distance between the actual hand and the tactile feedback differs.

5 APPLICATIONS
From the experimental results, we confirm that the proposed system improves the experience of body ownership with the virtual hand. We envision an application of this system as a virtual museum experience, as shown in Figure 2. This application could erase the distance limit between visitors and museums. The

visitors could rent a museum drone remotely, which has the classification ability to identify the unique codes of different paintings and sculptures. The users could use their hands to control the drone to fly around the museum and stop at the sculpture of their interest. Then, the visitors could manipulate and examine the details of the artworks with their virtual hands while receiving tactile feedback based on the artworks' details. The design also allows anyone to access any museum anywhere, without any capacity issues. A new operating style for museums could be opening in regular hours for physical visitors and in closed hours for drone visitors.

6 CONCLUSION
An HMD integrated system has been proposed to help users explore and feel the surrounding and remote environment. The system allows users to treat a drone as extra eyes and an extra hand by controlling it remotely with hand gestures. A neural network technique has been applied to analyze the drone-captured information. Users can get haptic feedback on their palms based on the drone-captured data, which helps them get a sense of the remote objects.

7 ACKNOWLEDGMENTS
This work has been supported by FrontierLab@OsakaU and the JSPS KAKENHI under grant numbers JP15H05925 and JP16H02859.

REFERENCES
[1] Jessica R. Cauchard, Jane L. E, Kevin Y. Zhai, and James A. Landay. 2015. Drone & Me: An Exploration into Natural Human-drone Interaction. In 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp '15). ACM, 361–365. https://doi.org/10.1145/2750858.2805823
[2] J. Delmerico, E. Mueggler, J. Nitsch, and D. Scaramuzza. 2017. Active Autonomous Aerial Exploration for Ground Robot Path Planning. IEEE Robotics and Automation Letters 2, 2 (April 2017), 664–671. https://doi.org/10.1109/LRA.2017.2651163
[3] T. Feuchtner and J. Müller. 2017. Extending the Body for Interaction with Reality. In 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, 5145–5157. https://doi.org/10.1145/3025453.3025689
[4] Kari D. Karjalainen, Anna Elisabeth S. Romell, P. Ratsamee, Asim E. Yantac, M. Fjeld, and M. Obaid. 2017. Social Drone Companion for the Home Environment: A User-Centric Exploration. In 5th International Conference on Human Agent Interaction (HAI '17). ACM, 89–96. https://doi.org/10.1145/3125739.3125774
[5] K. Kilteni, Jean M. Normand, Maria V. Sanchez-Vives, and M. Slater. 2012. Extending Body Space in Immersive Virtual Reality: A Very Long Arm Illusion. PLoS ONE 7, 7 (July 2012). https://doi.org/10.1371/journal.pone.0040867
[6] R. Käslin, P. Fankhauser, E. Stumm, Z. Taylor, E. Mueggler, J. Delmerico, D. Scaramuzza, R. Siegwart, and M. Hutter. 2016. Collaborative localization of aerial and ground robots through elevation maps. In 2016 IEEE International Symposium on Safety, Security, and Rescue Robotics (SSRR). 284–290. https://doi.org/10.1109/SSRR.2016.7784317
[7] S. Manfreda, Matthew F. McCabe, Pauline E. Miller, R. Lucas, V. Pajuelo Madrigal, G. Mallinis, E. Ben Dor, D. Helman, L. Estes, G. Ciraolo, J. Müllerová, F. Tauro, M. Isabel de Lima, João L. M. P. de Lima, A. Maltese, F. Frances, K. Caylor, M. Kohv, M. Perks, G. Ruiz-Pérez, Z. Su, G. Vico, and B. Toth. 2018. On the Use of Unmanned Aerial Systems for Environmental Monitoring. Remote Sensing 10, 4 (2018). https://doi.org/10.3390/rs10040641
[8] R. Teymourzadeh, Rahim Nakhli Mahal, Ng Keng Shen, and Kok Wai Chan. 2013. Adaptive intelligent spider robot. In 2013 IEEE Conference on Systems, Process Control (ICSPC). 310–315. https://doi.org/10.1109/SPC.2013.6735153
[9] Y. Ueda, Y. Asai, R. Enomoto, K. Wang, D. Iwai, and K. Sato. 2017. Body Cyberization by Spatial Augmented Reality for Reaching Unreachable World. In 8th Augmented Human International Conference (AH '17). ACM, Article 19, 9 pages. https://doi.org/10.1145/3041164.3041188
[10] S. Yamashita, R. Ishida, A. Takahashi, H. Wu, H. Mitake, and S. Hasegawa. 2018. Gum-gum Shooting: Inducing a Sense of Arm Elongation via Forearm Skin-stretch and the Change in the Center of Gravity. In ACM SIGGRAPH 2018 Emerging Technologies (SIGGRAPH '18). ACM, Article 8, 2 pages. https://doi.org/10.1145/3214907.3214909