

Page 1: [IEEE 2014 International Symposium on Technology Management and Emerging Technologies (ISTMET) - Bandung, Indonesia (2014.5.27-2014.5.29)] 2014 International Symposium on Technology

Wireless Gesture Recognition System using MEMS Accelerometer

Othman Sidek School of Electrical & Electronic Engineering

Universiti Sains Malaysia Engineering Campus 14300 Nibong Tebal, Pulau Pinang, Malaysia

[email protected]

Munajat Abdul Hadi School of Electrical & Electronic Engineering

Universiti Sains Malaysia Engineering Campus 14300 Nibong Tebal, Pulau Pinang, Malaysia

[email protected]

Abstract— A gesture recognition system interprets movements of the hand or head via algorithms. These algorithms take the form of software, hardware, or a combination of both. The main goal of a gesture recognition system is to enable humans to communicate with machines. This paper presents the development of a wireless Bluetooth hand gesture recognition system using six 3-axis accelerometers embedded in a glove and a database system on a computer. The system can recognize any sampled gesture saved in the database while offering maximum portability and mobility to the user via wireless Bluetooth technology. Analyses of static data, dynamic data, and the relationship between the number of samples and the average recognition rate are also discussed.

Keywords— Accelerometer; Computing; Gesture recognition; Hardware; Software

I. INTRODUCTION

Gestures are physical movements of the hand or head with expressive and meaningful motions used to convey information or interact with the surroundings [1]. A gesture recognition system interprets these movements via an algorithm to enable human communication with machines.

Gesture recognition systems have many types of applications, such as sign language recognition, socially assistive robotics, alternative computer interfaces, immersive game technology, virtual controllers, and remote controls [2].

There have been many approaches to gesture recognition [3], from software to hardware or a combination of both. Many researchers have developed gesture recognition systems using image processing, accelerometers, gyroscopes, or mathematical models for various applications [4-9].

A practical implementation of a gesture recognition system based on the possibility distribution of movement can utilize motion signals derived from acceleration measured with a Micro-Electro-Mechanical-System (MEMS) accelerometer [10].

An accelerometer behaves as a damped mass on a spring. When the accelerometer experiences an acceleration, the mass is displaced until the spring accelerates the mass at the same rate as the casing. The displacement is then measured to give the acceleration. In addition, gravitational acceleration, measured in g, causes the accelerometer to produce an offset value when the sensor is in a static state.
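The damped mass-spring behavior described above follows the standard textbook force balance (the paper does not state it explicitly); at steady state the spring force balances the inertial force, so the measured displacement is proportional to the acceleration:

```latex
% Steady-state force balance for a mass-spring accelerometer
% (k: spring constant, m: proof mass, x: displacement, a: acceleration):
k x = m a \quad\Longrightarrow\quad a = \frac{k}{m}\,x
```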

Ji-Hwan Kim et al. developed a system using three 3-axis accelerometers placed on the thumb, index, and middle fingers. The limitation of that system is that it cannot recognize or differentiate more complex gestures [11].

In this paper, we present a hand gesture recognition system composed of six 3-axis accelerometers, a microcontroller, a Bluetooth module, and a database system.

II. DESIGN

A. Hardware and Software Platform

Figure 1 shows the hardware platform of the proposed system.

The system has six 3-axis accelerometers, one on each finger and one on the back of the palm, integrated into a glove to detect hand positions and motions. The output of each 3-axis accelerometer is an 8-bit digital signal with a decimal value of 0-255.

All the accelerometers are connected to a microcontroller; the raw data received are mapped and arranged in an array before being transferred serially to a Bluetooth module.

Fig. 1. System overview: six 3-axis accelerometers → microcontroller → Bluetooth module → computer (database), with a battery pack powering the glove.

2014 International Symposium on Technology Management and Emerging Technologies (ISTMET 2014), May 27 - 29, 2014, Bandung, Indonesia

978-1-4799-3704-2/14/$31.00 ©2014 IEEE 444


A Bluetooth module with the Serial Port Profile (SPP) is used to transfer the data wirelessly to a computer. The module supports baud rates of up to 38400. The Bluetooth module is also connected to a battery pack to give the wearer maximum mobility. The operating voltage of the system is 5 V for the Bluetooth module and 3.3 V for the accelerometers and microcontroller; a voltage regulator and a bridge circuit are used to generate these voltages.

Data are acquired by the computer via the Bluetooth channel and saved in a database, called the library, by means of a graphical user interface (GUI) on the computer. The GUI was created to ease the collection of sample data and, at the same time, can be used to recognize gestures from the glove. Each gesture must have more than one sample of data to be recognized correctly by the system. The data in the library are then used to recognize gestures by comparing probabilities across all the sampled data. The recognition system returns the recognized gesture with the highest probability score.
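The paper does not give the exact scoring formula for the library-matching step, so the sketch below assumes a simple per-axis tolerance match: a stored sample's probability is the fraction of the 18 sensor values that fall within a tolerance of the live reading, and the gesture whose best sample scores highest wins. The function names and the tolerance value are illustrative assumptions.

```python
TOLERANCE = 10  # assumed tolerance in raw 8-bit units (not from the paper)

def sample_probability(reading, sample, tol=TOLERANCE):
    """Fraction of the 18 axis values that match the stored sample."""
    matches = sum(1 for r, s in zip(reading, sample) if abs(r - s) <= tol)
    return matches / len(sample)

def recognize(reading, library):
    """Return (gesture, score) for the best-matching sample in the library."""
    best_gesture, best_score = None, 0.0
    for gesture, samples in library.items():
        score = max(sample_probability(reading, s) for s in samples)
        if score > best_score:
            best_gesture, best_score = gesture, score
    return best_gesture, best_score
```

With only two very different gestures in the library, such a scheme naturally produces the near-binary 0%/100% probabilities reported later in the static-data analysis.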

Figure 1 also shows the location of the accelerometers, one on each finger and one on the back of the palm, together with the microcontroller circuit.

Each accelerometer has three sensing elements, corresponding to the X, Y, and Z output signals from the glove. The Z-sensing element is oriented along the gravity vector, perpendicular to the Earth's surface. The X- and Y-sensing elements both lie perpendicular to the Z-axis. This is shown in Figure 2.
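A small sketch of how an 8-bit axis reading might be interpreted relative to gravity. The paper only states the 0-255 output range; the zero-g midpoint of 128 and the sensitivity of 64 counts per g are assumptions for illustration, not values from the paper or a specific datasheet.

```python
ZERO_G = 128       # assumed digital value at 0 g (midpoint of 0-255)
COUNTS_PER_G = 64  # assumed sensitivity, counts per g

def raw_to_g(raw):
    """Convert an 8-bit axis reading to acceleration in g (assumed scale)."""
    return (raw - ZERO_G) / COUNTS_PER_G
```

Under these assumptions, a Z-axis at rest pointing along the gravity vector would read about ZERO_G + COUNTS_PER_G, i.e. 1 g.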

B. Mapping the Sensors

Table 1 shows the sensor mappings and the corresponding positions of the returned sensor values in the raw data string. Each sensor has three axes, and each returned value is an 8-bit datum ranging from 0 to 255. Eighteen values (three for each accelerometer) are arranged in the array X(0) Y(1) Z(2) X(3) Y(4) Z(5) X(6) Y(7) Z(8) X(9) Y(10) Z(11) X(12) Y(13) Z(14) X(15) Y(16) Z(17). This array is then converted to a string, retaining its structure and order. The string is used as the serial data for the Bluetooth module and the database system.
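The 18-value frame layout above can be sketched as follows. The helper names and the comma-separated string form are assumptions; the paper specifies only the array order and that the array is serialized as a string.

```python
# Finger order matching Table 1: thumb, index, middle, ring, pinky, palm.
SENSOR_ORDER = ["thumb", "index", "middle", "ring", "pinky", "palm"]

def build_frame(readings):
    """Flatten {sensor: (x, y, z)} into the 18-element array X(0)..Z(17)."""
    frame = []
    for sensor in SENSOR_ORDER:
        x, y, z = readings[sensor]
        frame.extend([x, y, z])
    return frame

def frame_to_string(frame):
    """Serialize the array for the Bluetooth serial link (assumed CSV form)."""
    return ",".join(str(v) for v in frame)
```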

The only acceleration acting on the accelerometer in the static state is gravity. When the hand is in the "Rest" position shown in Figure 3, the gravitational force acts on all three axes of each accelerometer.

Table 1: Sensor Mapping

Sensor   Array
Thumb    X(0)  Y(1)  Z(2)
Index    X(3)  Y(4)  Z(5)
Middle   X(6)  Y(7)  Z(8)
Ring     X(9)  Y(10) Z(11)
Pinky    X(12) Y(13) Z(14)
Palm     X(15) Y(16) Z(17)

III. EVALUATION AND ANALYSIS

A. Static Data Analysis

Fig. 3. "Rest" and "Thumb-up" position

The readings from all of the accelerometers are shown in Figure 4. It can be seen that, when all the accelerometers are in the static state, the only force acting on them is gravity.

These values are called offset values or initial values. They are used to calibrate the system and serve as the initial values for recognizing a motion.

In this test of 1300 data samples, the "Thumb-up" position and the "Rest" position shown in Figure 3 were each taken five times. The values of all accelerometers are plotted in a single graph, shown in Figure 5. Based on the graph, a pattern can be seen for each gesture performed. The gravitational force acting on each accelerometer in the "Thumb-up" position differs from that in the "Rest" position, as the plane of each accelerometer changes with position.

Figure 6 shows the probability graph obtained from the same test. The "Thumb-up" probability rises to 100% when the hand is in the "Thumb-up" position, and vice versa. There are only two values (0% and 100%) because there are only two gestures in the library.

Fig. 2. Accelerometer axes (X, Y, Z) relative to the gravity vector (1 g along Z)



The two gestures are also completely different from one another. This analysis can be used in the gesture recognition system to distinguish between sampled gestures: the gesture with the highest probability, compared against the other sampled gestures, is deemed the correct one.

Fig. 4. Graph of "Rest" samples test

Fig. 5. Graph of "Rest" and "Thumb-up" samples test

Fig. 6. Probability graph of the samples

B. Dynamic Data Analysis

When the accelerometer is in motion, there is an acceleration due to inertia. These values can also be obtained from the 3-axis accelerometer.

In the next experiment, only the palm accelerometer data are shown for clarity of presentation. The axes affected by the inertia depend on the direction of the inertia vector.

Figure 7 shows the graph of the palm accelerometer in the "Rest" position. The values obtained are the offset values, which can be used as initial values. The response of each axis to a movement depends on the direction of the movement relative to that axis.

A simple test was done to check the dependency of the axes on the inertia from a simple movement. The glove was shaken in the X-direction and the results were plotted in a graph for easy viewing. The test then continued by shaking in the Y- and Z-directions, respectively.

Fig. 7. Palm accelerometer at "Rest" position

Figure 8 shows the graph of the palm accelerometer when it is shaken in the X-direction. Only the X-axis is strongly affected by the motion; the Y- and Z-axes are only slightly affected. This analysis can help eliminate false data obtained from the accelerometers by ignoring data from the other axes. The results are the same for the Y- and Z-directions.
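The false-data filtering described above can be sketched as keeping only the axis with the largest offset-corrected deviation and discarding the minor cross-axis noise. The threshold value and function name are assumptions for illustration.

```python
THRESHOLD = 5  # assumed minimum deviation (raw units) to count as motion

def dominant_axis(delta_xyz, threshold=THRESHOLD):
    """Return ('x'|'y'|'z', value) for the dominant axis, or None if static.

    delta_xyz: offset-corrected (x, y, z) deltas for one accelerometer.
    """
    axes = list(zip("xyz", delta_xyz))
    name, value = max(axes, key=lambda a: abs(a[1]))
    if abs(value) < threshold:
        return None  # all axes near the offset: treat as static / noise
    return name, value
```

For example, a shake along X that produces deltas like (40, 3, -2) would be reduced to the X-axis alone, matching the behavior reported for Figure 8.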



Fig. 8. Graphs of the palm accelerometer shaken along each of its axes

C. Number of Samples and Recognition Rates

In this experiment, the relationship between the number of samples and the recognition rate was examined. A set of gestures was performed, and for each gesture a number of samples was saved in the database (library). The average recognition rate was then calculated based on the system's ability to recognize each gesture from the previously saved samples. The experimental data are shown in Table 2.

Table 2: Relationship between Recognition Rate and Number of Samples

Gesture    No. of saved samples   Average recognition rate, %
Rock       1                      30
Paper      5                      50
Scissor    10                     75
Rest       15                     95
Thumb-up   25                     98

From the table, the gesture "Rock" has the lowest recognition rate, while "Thumb-up" has the highest at about 98%. This is due to the number of saved samples in the database. With only one sample, "Rock" is very hard to recognize: the only way it can be recognized is by accurately reproducing the single saved sample. To achieve a recognition rate above 95%, fifteen or more samples must be collected for each gesture.

This means that, to create a feasible database for gesture recognition, each gesture must have more than fifteen samples, as this increases the probability of that gesture being recognized.

IV. CONCLUSION

This research has verified the ability to design a wireless gesture recognition system using six 3-axis MEMS accelerometers. The use of a Bluetooth module and a battery pack proved useful, as it gives the user more mobility by minimizing the wires and cables between the glove system and the computer (database).

The analyses performed in this research provide a deeper understanding of the behavior of the accelerometer and of the complete system. The analyzed data can serve as key design considerations in more detailed future work.

The main drawback of the system is that it still needs a computer to store the database. This limits the portability of the system, so it can only be used in a limited set of applications.

Future work will include embedding the database system in a handheld device (e.g., a smartphone or tablet PC) for greater mobility and portability, which will enable a wider range of applications for the system.

ACKNOWLEDGEMENT

This research is fully supported by Universiti Sains Malaysia's APEX Delivering Excellence Grant (1002/PCEDEC/910349). The authors fully acknowledge Universiti Sains Malaysia for the approved fund, which makes this important research viable and effective.

REFERENCES

[1] S. Mitra and T. Acharya, "Gesture recognition: a survey," IEEE Transactions on Systems, Man, and Cybernetics, vol. 37, no. 3, pp. 311-324, May 2007.

[2] C. L. Lisetti and D. J. Schiano, "Automatic classification of single facial images," Pragmatics and Cognition, vol. 8, pp. 185-235, 2000.

[3] V. I. Pavlovic, R. Sharma, and T. S. Huang, "Visual interpretation of hand gestures for human-computer interaction," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 677-695, Jul. 1997.

[4] M. Rehm, N. Bee, and E. André, "Wave like an Egyptian: accelerometer based gesture recognition for culture specific interactions," British Computer Society, 2007.

[5] V. Pavlovic, R. Sharma, and T. Huang, "Visual interpretation of hand gestures for human-computer interaction: a review," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, pp. 677-695, Jul. 1997.

[6] R. Cipolla and A. Pentland, Computer Vision for Human-Machine Interaction. Cambridge University Press, 1998.

[7] A. Jaimes and N. Sebe, "Multimodal human-computer interaction: a survey," Computer Vision and Image Understanding, vol. 108, no. 1-2, pp. 116-134, Oct.-Nov. 2007. Special Issue on Vision for Human-Computer Interaction.

[8] T. Starner and A. Pentland, "Visual recognition of American Sign Language using hidden Markov models," Massachusetts Institute of Technology.

[9] K. Nickel and R. Stiefelhagen, "Visual recognition of pointing gestures for human-robot interaction," Image and Vision Computing, vol. 25, no. 12, pp. 1875-1884, Dec. 2007.

[10] E. Benoit, T. Allevard, T. Ukegawa, and H. Sawada, "Fuzzy sensor for gesture recognition based on motion and shape recognition of hand," VECIMS 2003 - International Symposium on Virtual Environments, Human-Computer Interfaces, 2003.

[11] J.-H. Kim, N. D. Thang, and T.-S. Kim, "3-D hand motion tracking and gesture recognition using a data glove," IEEE International Symposium on Industrial Electronics (ISIE 2009), 2009.
