TangoHapps - An Integrated Development Environment for Smart Textiles
Juan Haladjian
INSTITUT FÜR INFORMATIK DER TECHNISCHEN UNIVERSITÄT MÜNCHEN
Forschungs- und Lehreinheit I Angewandte Softwaretechnik
TangoHapps - An Integrated Development Environment for Smart Textiles
Juan I. Haladjian Madrid
Complete reprint of the dissertation approved by the Fakultät für Informatik of the Technische Universität München for the award of the academic degree of Doktor der Naturwissenschaften (Dr. rer. nat.).
Chair: Univ.-Prof. Dr. Hans-Joachim Bungartz
Examiners of the dissertation:
1. Univ.-Prof. Bernd Brügge, Ph.D.
2. Prof. Daniel Siewiorek, Ph.D.
The dissertation was submitted to the Technische Universität München on 28.06.2016 and accepted by the Fakultät für Informatik on 05.09.2016.
To Chechu, with affection.
Acknowledgments
This work would not have been possible without the support of many other people. First of all, I would like my professor Bernd Brügge to know how much of a positive influence he has been on my personal and professional development, and to thank him for encouraging me to grow in his team. I would also like to thank my second supervisor Daniel Siewiorek for his great insight and exhaustive feedback on my Interactex and KneeHapp papers.
I am very grateful to all the members of the Chair for Applied Software Engineering at the TUM. A special mention goes to my colleagues Zardosht Hodaie, Han Xu, Stephan Krusche, Emitzá Guzmán and Yang Li for proofreading my papers and different parts of my thesis, and to Jan Knobloch for his support during the Custodian project. Furthermore, I would like to thank my friends Mohammed Souiai and Jelena Prsa for their emotional support throughout my time at TUM and for our motivational meetings at the Mensa.
TangoHapps was developed in collaboration with the Design Research Lab at the Universität der Künste in Berlin. In particular, TangoHapps came to life thanks to the great insights and efforts of Katharina Bredies. I would like to thank Katharina and her colleagues Sara Diaz Rodriguez, Pauline Vierne, Hannah Perner-Wilson and Gesche Joost for our fruitful four-year collaboration. Thanks also to Martijn ten Böhmer and Oscar Tomico from the Technische Universiteit Eindhoven, Kyle Zhang from the Universiteit Twente and Jussi Mikkonen from Aalto University for contributing to TangoHapps’ code base.
The research leading to these results has received funding from:
• Federal Ministry of Education and Research (BMBF) under grant number “01IS12057” as part of the German program “Software Campus”.
• EIT ICT “Connected Textiles” activity Nr. 13087 under the “Smart Spaces” Action Line.
• Bavaria California Technology Center (BaCaTeC).
The responsibility for the content of this dissertation lies with the author.
Abstract
Construction kits and development environments have dramatically reduced the time and expertise required for the development of physical interactive devices. The field of smart textiles, however, is relatively new and still lacks established tools that support its development. In this dissertation, we introduce TangoHapps, a visual programming and simulation environment specifically designed for smart textiles. TangoHapps supports different activities related to smart textile development, including application software development, testing and circuit design. TangoHapps’ visual programming semantics span across the mobile phone and the smart textile, enabling users to take advantage of capabilities present on both devices without writing source code. TangoHapps has a high ceiling, which we demonstrate by recreating two smart textiles from different application domains. Furthermore, we provide evidence from two user studies that TangoHapps lowers the entrance barrier to smart textile development.
Contents
1 Introduction 1
1.1 Research Process 2
1.1.1 Use Case Development 3
1.1.2 Use Case Validation 4
1.1.3 IDE Extension 6
1.2 Outline 6

2 Foundations 7
2.1 History 9
2.1.1 Wearable Devices 9
2.1.2 Smart Textiles 12
2.2 Fabrication 13
2.3 Anatomy 16
2.3.1 Sensors 17
2.3.2 Output Devices 20
2.3.3 Connections 21
2.4 Applications 23

3 TangoHapps Framework 25
3.1 Related Work 25
3.1.1 Development Tools for Smart Textiles 26
3.1.2 Development Tools for Physical Devices 26
3.1.3 Hardware Construction Toolkits 28
3.1.4 Circuit Layout Software 30
3.1.5 Simulation Environments and Operating Systems 30
3.2 Requirements 31
3.2.1 Functional Requirements 31
3.2.2 Non-Functional Requirements 33
3.3 Analysis 35
3.3.1 Model of Smart Textiles 35
3.3.2 Model of an Application 36
3.3.3 Model of the Editor 37
3.3.4 Model of the Simulator 38
3.3.5 Model of the Deployer 39

4 TangoHapps Design 41
4.1 Design Decisions 41
4.2 High-Level Design 45
4.3 Architecture 46
4.4 Hardware/Software Mapping 47
4.5 Running Engine 49
4.5.1 Electronic Devices 50
4.5.2 UI Widgets 52
4.5.3 Programming Objects 53
4.6 Plugin Interpreter 54
4.7 Firmata Library 55
4.8 Editor 57
4.8.1 Palette 58
4.8.2 Toolbar 59
4.8.3 Canvas 60
4.8.4 Circuit Layout 61
4.9 Plugin Editor 63
4.10 Simulator 66
4.11 Deployer 67

5 TangoHapps User Interface 69
5.1 Interactex Designer 69
5.1.1 Project Selection Screen 69
5.1.2 Editor Screen 71
5.1.3 Canvas 72
5.1.4 Circuit Layout 74
5.1.5 Simulator Screen 74
5.2 Interactex Client 75
5.2.1 Download Screen 75
5.2.2 User Applications Screen 76
5.2.3 Default Applications Screen 76
5.2.4 Application Screen 78
5.3 TextIT 78
5.4 Usage Example 78
5.4.1 Development of an Interactex Application 79
5.4.2 Development of a TextIT Plugin 81

6 Applications 85
6.1 Application 1: KneeHapp Bandage 85
6.1.1 Problem 86
6.1.2 KneeHapp Bandage 87
6.1.3 Implementation with TangoHapps 93
6.2 Application 2: Custodian Jacket 102
6.2.1 Problem 102
6.2.2 Custodian Jacket 103
6.2.3 Implementation with TangoHapps 106

7 Evaluation 115
7.1 User Study 1: Novice Users 115
7.1.1 Results 116
7.2 User Study 2: Professional Smart Textile Developers 117
7.2.1 Study Results 118

8 Conclusions and Future Work 121
8.1 Future Work 122

A Application Objects 125
A.1 Events, Methods and Variables 125
A.2 Palette Objects 135
A.3 Variables 135
A.4 Simulation Objects 144
A.5 TextIT Objects 150
A.5.1 Code Editor 157
A.5.2 Line Chart Displaying Jogging Signal 159
A.5.3 TextIT Plugin for Counting Peaks in a Signal 159

Bibliography 167
Chapter 1
Introduction
"The most profound technologiesare those that disappear. Theyweave themselves into the fabric ofeveryday life until they areindistinguishable from it."
Mark Weiser
In the 1950s, computing devices occupied an entire room. They were so expensive that only governments and big corporations could afford them, and only technicians with special training were able to use them. Miniaturization of electronics, reduction in hardware prices and advances in usability engineering have enabled the appearance of smaller, yet more powerful, cheaper and more usable computing devices. Computing devices evolved from personal computers to mobile phones and wearable devices. Each new generation of computing device is smaller and integrates more seamlessly into users’ lives.
Mark Weiser foresaw that computing devices would disappear entirely from a user’s perspective, becoming “indistinguishable from the fabric of everyday life”. His metaphor has become a reality today, as electronic devices are being integrated in the fibers used to create fabrics. For example, tiny electronic components can be hidden inside yarn by wrapping fiber strands around them [96]. Textiles with integrated electronic devices are called smart textiles and are the focus of this dissertation.
Smart textiles worn on the body, also called smart garments, are particularly interesting if they access information about their wearer such as his/her posture [74], physical activity [43], heart rate [17], pH levels [87] or even the presence of an infection [1]. This information can be used for monitoring the vital signs of patients [89], soldiers [90] and firefighters [20], tracking athletes’ performance [51] and helping blind users avoid obstacles [6]. Smart textiles enable new ways of interaction that can be integrated seamlessly into users’ lives. For example, smart textiles do not require users to look at a screen (eyes-free) or to use their hands (hands-free) for interaction.
Several construction kits [49, 54, 97] and development environments [24, 48, 7] have been created, which lower the entrance barrier and reduce the time needed to develop physical electronic devices. Smart textiles, however, still have no established tools that support their development. Furthermore, smart textile developers are still
using tools originally designed for other purposes, such as circuit layout software for conventional electronics and programming environments designed for general-purpose microcontrollers. Software development environments for smart textiles face the following challenges:
1. The development of a smart textile is time consuming. In contrast to the field of physical devices, ready-to-use textile components can rarely be bought. Instead, they usually have to be designed and fabricated manually and tailored to the textile. Textile sensors and connections often have to be created in the same fabrication process as the rest of the textile. In addition, the production of a smart textile is relatively risky because it is uncertain during its creation whether the chosen materials and structuring patterns (e.g. sewing, knitting) will fulfill the requirements of the application.
2. The development of a smart textile requires knowledge in multiple disciplines. Smart textile development encompasses activities of different nature: textile design (e.g. choosing a sewing pattern), circuit design (e.g. deciding on the shape and placement of components) and software development. This knowledge is rarely present in a single individual and hence, a collaboration of people with a background in different disciplines is required. These disciplines include computer science, electrical engineering, material science and textile design [16].
3. Smart textiles require the development of complex algorithms. One of the facts that make smart garments so interesting is that they can provide diverse kinds of information about the wearer. However, high-level information needs to be extracted from the sensor data using complex signal processing and classification algorithms. Furthermore, data produced by smart garments tends to be noisy and unreliable for two reasons. First, garments change their position due to the wearer’s movements when they slide and fold during usage. Second, textile sensors and connections based on conductive yarn or thread typically used in smart textiles are not as accurate and stable as traditional silicon-based sensors and connections.
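As an illustration of the kind of signal processing involved (a sketch of ours, not code from TangoHapps; the window size, threshold and signal values are assumptions), a simple moving-average filter can suppress a spurious single-sample spike in a noisy textile sensor signal before peaks are counted:

```python
def moving_average(signal, window=3):
    """Smooth a raw sensor signal with a simple trailing moving average."""
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        smoothed.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return smoothed

def count_peaks(signal, threshold):
    """Count rising crossings of the threshold, i.e. one count per peak."""
    peaks, above = 0, False
    for value in signal:
        if value > threshold and not above:
            peaks, above = peaks + 1, True
        elif value <= threshold:
            above = False
    return peaks

# Noisy signal: two sustained peaks plus one single-sample spike (index 2).
raw = [0, 1, 9, 1, 0, 8, 9, 8, 0, 1, 0, 7, 8, 9, 0]
print(count_peaks(raw, 5))                  # 3 (the spike is miscounted)
print(count_peaks(moving_average(raw), 5))  # 2 (the spike is filtered out)
```

Real smart garments need considerably more sophisticated filtering and classification than this, which is precisely what makes tool support for such algorithms valuable.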
To tackle these problems, we have developed TangoHapps. TangoHapps is an Integrated Development Environment (IDE) specifically designed for the development of smart textiles.
1.1 Research Process
The field of smart textiles is still relatively new and likely to change in the future due to the new technologies and functionalities that will become available. In order to explore the field as it evolves, we applied a formative iterative model-based research process. Formative approaches have the goal to form a technology or process by iteratively applying, assessing and improving it. Summative approaches, in contrast, do not
Figure 1.1: a) In the summative research approach, the artifacts produced by a technology are assessed. b) In the formative research approach, the artifacts produced by a technology are used to assess and improve the technology.
intend to improve the technology or process but rather assess the artifacts produced by them. The difference between the formative and summative approach is sketched in Figure 1.1. In the context of education, a formative approach would assess the teaching methodology with the goal to identify possible improvements to the course’s program. A summative approach, on the other hand, would be an examination written by students at the end of the course to assess how much they have learned.
The formative assessment approach we applied consists of three phases: Use Case Development, Use Case Validation and IDE Extension. We applied these three phases iteratively. In every iteration, we refined TangoHapps’ abstractions to support the use cases described in the following sections.
1.1.1 Use Case Development
We iteratively extended TangoHapps to explore the development of abstractions with a set of use cases. First, the core requirements of TangoHapps were elicited based on a set of smart textiles. These smart textiles were realized during several workshops in collaboration with researchers from the Universität der Künste (UdK) in Germany, the Technische Universiteit Eindhoven and the Universiteit Twente in The Netherlands, Aalto University in Finland, Telekom Innovation Laboratories and Philips1. We chose the smart textiles according to two main criteria. First, we aimed for smart textiles with a high complexity in terms of development time, amount of functionality and types of sensors used. Second, we chose smart textiles from different application domains in order to derive general core requirements that are usable also in new application domains. Table 1.1 describes four of the smart textiles we chose for the core requirements elicitation.
1Part of this collaboration was funded by the European Institute of Innovation & Technology (EIT ICT) and is documented in the EIT ICT’s “Connected Textiles” Activity Nr. 13087 under the “Smart Spaces” Action Line.
Once an initial version of TangoHapps was developed, we extended its functionality to support the development of two additional smart textiles developed at the Technical University Munich (TUM). The first one is the KneeHapp Bandage and the second one the Custodian Jacket. KneeHapp is a smart bandage and sock that tracks the rehabilitation progress of a patient after a knee injury. The bandage uses its integrated motion sensors to measure patients’ performance in different rehabilitation exercises. For example, KneeHapp measures the amount of shaking of the leg while a patient performs squats. The sock has an integrated textile pressure sensor at the sole used to determine the duration of a one-leg hop, an important measurement used by orthopedists to assess whether the patient is ready to go back to sports. The Custodian Jacket supports technicians during maintenance activities in a supercomputer center with the goal to improve their safety and work performance. The jacket uses motion sensors to determine the user’s physical activity (e.g. walking, running, lying down on the ground) and proximity sensors to help technicians find servers in a rack. Both smart textiles and their implementation with TangoHapps are described in detail in Chapter 6.
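To illustrate the kind of measurement described above: the airborne duration of a one-leg hop can be estimated from the sole’s pressure samples, since the pressure drops near zero while the foot is off the ground. The sketch below is purely illustrative (the sampling rate, threshold and function names are our assumptions, not KneeHapp’s actual implementation):

```python
SAMPLE_RATE_HZ = 100      # assumed sampling rate of the textile pressure sensor
AIRBORNE_THRESHOLD = 0.2  # normalized pressure below which the foot is airborne

def hop_duration(pressure_samples, sample_rate=SAMPLE_RATE_HZ,
                 threshold=AIRBORNE_THRESHOLD):
    """Return the longest airborne interval (in seconds) in the recording."""
    longest = current = 0
    for pressure in pressure_samples:
        if pressure < threshold:   # foot off the ground
            current += 1
            longest = max(longest, current)
        else:                      # ground contact resets the interval
            current = 0
    return longest / sample_rate

# 0.5 s of stance, 0.3 s in the air, then landing.
samples = [1.0] * 50 + [0.0] * 30 + [0.9] * 20
print(hop_duration(samples))  # 0.3
```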
1.1.2 Use Case Validation
After a use case was developed, we used TangoHapps to replicate it. We validated the version of the smart textile developed with TangoHapps against the following metrics:
• Coverage. The versions of the smart textiles that we developed with TangoHapps usually had less functionality than the original smart textiles. An important metric to validate the use cases was the amount of functionality covered by the replication.

• Wearability. This metric is used to assess how comfortable the smart textile is to the user.

• Accuracy. Most of the smart textiles we developed had the goal to extract information about the user’s context based on sensor data. A metric we used to validate these use cases was the accuracy with which the smart textile could extract user information.

• Performance. TangoHapps facilitated the development of the use cases by making high-level reusable functionality components available to developers. These functionality components were general-purpose and not designed for any specific use case. As a consequence, the replication might have incurred a loss in performance. This metric tried to assess execution performance and energy consumption.
We used different research techniques for the validation of the use cases. One technique included case studies to estimate the degree of functionality coverage. Another used semi-structured interviews with users and application domain experts to inquire about the wearability and usability of the application. A third one shadowed developers while replicating the use case. We also conducted controlled experiments to estimate accuracy by comparing measurements of the smart textile against base values.
Name: WaveCap
Domain: Music
Description: A knitted hood and shawl that plays radio through a textile speaker inside the hood. The ends of the shawl, made of conductive yarn, close a switch when knotted together, causing the radio to be turned on. The volume is controlled by pulling strings attached to the hood. A textile-based switch and sensor on the hood are used to control the radio sender and frequency.

Name: Knit Alarm
Domain: Elderly
Description: A knitted jacket for elderly patients to call for help in case of emergency. Knit Alarm sends an emergency signal when users either pull the left sleeve or touch their right chest with the left sleeve. An analog textile sensor integrated in the left sleeve changes resistance depending on the deformation.

Name: CTS-Gauntlet
Domain: Rehabilitation
Description: A sleeve for patients with Carpal Tunnel Syndrome (CTS) that detects strain on the wrist. The sleeve triggers a visual and auditory signal when the strain goes beyond a certain threshold for a specific amount of time.

Name: Shuffle Sleeve
Domain: Music
Description: A knitted sleeve that plays music. A handful of coins is inserted into the sleeve, which has the shape of a circular tube. Rotating the sleeve causes coins to fall due to gravity, closing textile switches that control the playlist and volume. Four conductive strings can be knotted together to start, stop and pause the music.

Table 1.1: Smart textiles used to elicit the initial requirements of TangoHapps.
1.1.3 IDE Extension
The research process we followed was iterative. We developed TangoHapps in a series of sprints based on feedback provided by smart textile developers from the Universität der Künste in Berlin. In each iteration, we refined and extended TangoHapps by adding the necessary hardware support and the software abstractions to support the use cases. For instance, the functionality that enabled the Custodian Jacket to detect user physical activity (e.g. running, walking, climbing, not moving) is currently available in TangoHapps as a “first class citizen” type called “Activity Classifier”. Furthermore, the need for a text-based programming interface and signal processing algorithms was identified in the KneeHapp project.
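The “Activity Classifier” itself is an object in TangoHapps’ visual language; purely as a hypothetical sketch of the underlying idea (the feature choice and thresholds below are our assumptions, not the classifier’s actual logic), a coarse activity label can be derived from the variance of the accelerometer magnitude over a window of samples:

```python
import statistics

def classify_activity(accel_magnitudes):
    """Map the variance of a window of accelerometer magnitudes (in g)
    to a coarse activity label; thresholds are illustrative only."""
    variance = statistics.pvariance(accel_magnitudes)
    if variance < 0.01:
        return "not moving"
    if variance < 0.5:
        return "walking"
    return "running"

print(classify_activity([1.0, 1.0, 1.0, 1.0]))  # not moving
print(classify_activity([0.8, 1.3, 0.7, 1.4]))  # walking
print(classify_activity([0.2, 2.5, 0.1, 2.8]))  # running
```

Offering such behavior as a ready-made, configurable object spares smart textile developers from writing this kind of code by hand.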
1.2 Outline
Chapter 2 provides an overview of the field of smart textiles, including definitions and the evolution from wearable computers in the early 90s, and describes the different types of technologies used to construct smart textiles. We also provide an overview of common smart textile applications. The concepts, techniques and processes provided in this chapter set the basis for the requirements elicitation process we followed in order to develop TangoHapps. Chapter 3 first describes similar development tools for smart textiles, wearable computers and other physical devices. Furthermore, it lists the core requirements, models and abstractions behind TangoHapps. Chapter 3 serves as a starting point for the design of TangoHapps, which is introduced in Chapter 4. Chapter 4 starts with a detailed description of the design decisions we made in the design of TangoHapps. Then it describes the IDE’s internal structure, including its architecture, how we decided to map the different software components to hardware nodes and how the functionality of each software component is organized in different classes. Chapter 5 focuses on TangoHapps’ user interface. It describes every view of TangoHapps and provides an example application to illustrate the usage of the IDE. In Chapter 6, we demonstrate the applicability of TangoHapps in the development of the KneeHapp Bandage and the Custodian Jacket. We chose these smart textiles for two reasons. First, because we had access to real-world projects with real customers, end users and target environments. Second, these smart textiles differ considerably from each other. This allows us to draw more solid conclusions about the applicability of TangoHapps to different domains. Chapter 7 presents the results of two user studies we conducted in order to gain insight into the usage of TangoHapps.
Chapter 2
Foundations
Smart textiles, also known as intelligent textiles, electro textiles or e-textiles, are textiles that have the ability to sense and respond to stimuli of diverse nature, including mechanical, electrical, thermal, chemical and optical environmental stimuli [16][117]. Functional textiles are textiles with a specific function added, for instance, by using additives or coatings [18]. Examples of functional textiles include fire-resistant or phase-changing1 textiles. Smart textiles are sometimes defined to include functional textiles. However, this is technically incorrect because smart textiles have special properties (e.g. electrical and optical conductivity) or special functionality (e.g. the ability to sense mechanical stimuli such as pressure) which are not present in functional textiles.
Smart textiles have been categorized according to the extent of their “intelligence” into passive, active and very smart textiles [113]:
• Passive smart textiles are textiles that can sense the environment.

• Active smart textiles are textiles that can sense stimuli from the environment and react to them.

• Very smart textiles are textiles that can sense, react and adapt to the environment.
Smart textiles can be wearable (garments) or non-wearable (carpets, curtains). Wearable smart textiles are called smart garments or smart clothing. Smart garments can be compared to other wearable devices according to their level of integration with
1Phase-changing materials are materials that release energy in the process of changing their phase (e.g. melting or freezing).
Figure 2.1: a) Smart garments compared to other devices according to their integration with the human body, based on previous work from Steve Mann [71]. b) Wearable devices represented as a taxonomy.
the human body. Steve Mann categorized devices according to their wearability (or portability) into: environmental intelligence (e.g. cameras and microphones installed in a building), hand-held devices (e.g. mobile phones, tablets), in-pocket devices (e.g. smart card), clothing, underclothing and implants [71]. We refine Mann’s categorization into devices that are carriable, wearable, attachable or insertable in the body, as shown in Figure 2.1.
The weakest level of integration consists of devices that users carry, for example in their pockets, or hand-held devices. Devices that are worn on the body include smart garments and wearable computers. Wearable computers are fully functional, self-powered and self-contained computers worn on the body [10]. The difference between smart garments and wearable computers is that the electronics of a smart garment are integrated in the textile, whereas a wearable computer has no integration with the garments of the wearer [18]. The third level of integration includes devices that are attached to specific parts of the body, such as smart contact lenses and hearing aids. These devices are added to parts of the body without surgical intervention. Attachment and detachment of devices in the third level of integration is usually done by the users themselves. The strongest level of integration consists of devices that are inserted into the human body such as smart tattoos, prostheses and smart implants. The insertion and removal of such devices is done by specialized individuals and usually requires surgery.
Smart textiles have also been categorized according to the way in which the electronic components are integrated into the textile [18]. The first generation of smart textiles used the textile only as a substrate for the attachment of electronics (e.g. sensors,
Figure 2.2: The three generations of smart textiles.
output devices or printed circuit boards). Smart textiles in the first generation were similar to wearable computers where there was almost no integration between the electronics and the textile. The second generation adapted traditional textile fabrication methods for adding special functionality to the textile [18]. E-broidery is a textile fabrication technique which uses embroidery machines adapted to sew or weave conductive textiles using a numerically controlled process [94]. In the third generation, electronics and materials with sensing properties are integrated directly in the fibers of the textile. This technique is also referred to as fibertronics. Figure 2.2 displays the different categories ranging from weak to strong integration into the textile. In the next section, we discuss these categories in more detail.
2.1 History
In this section, we summarize the history of wearable computing and its transition into smart textiles. We first provide an overview of the different wearable devices created until 1990 and then describe in more detail the different generations of smart textiles, based on the previous work done by Cherenack et al. [18].
2.1.1 Wearable Devices
Wearables with “computation” capabilities date back to the 17th century. A silver ring with an inlaid abacus, created during the Qing Dynasty and recently found, is considered by some to be the first wearable “device” in history. The ring is shown in Figure 2.3 a).
Wrist watches emerged in the 19th century because pocket watches became impractical in some situations, such as while riding horses or driving cars2. Wrist watches became popular during the wars at the end of the 19th century and the beginning of the 20th century, mainly due to the need to coordinate attacks during battles.
In 1957, Earl Bakken invented the first wearable pacemaker [57, 63]. The pacemaker’s pulse generator and battery were worn by the patient, and the leads (the parts
2http://www.smithsonianmag.com/innovation/pocket-watch-was-worlds-first-wearable-tech-game-changer-180951435/?no-ist
Figure 2.3: a) Qing Dynasty’s abacus ring. Image taken from http://www.chinaculture.org. b) Eudaemonics’ shoe used to cheat at the casino. Image taken from http://eyetap.org/wearcam/eudaemonic/ with permission of Steve Mann. c) Ivan Sutherland’s HMD [111]. Communications of the ACM 2002, Vol. 11:2. Copyright ©2002 by ACM, Inc. Reprinted with permission of ACM, Inc.

Figure 2.4: a) Steve Mann’s HMD [69]. Reprinted with permission of Steve Mann. b) CMU’s VuMan 2 wearable computer [105]. Reprinted with permission of Daniel Siewiorek. c) Forms for wearability [35]. Reprinted with permission of Francine Gemperle.
that conduct the electrical impulses into the heart) were implanted through the tissue inside the heart.
Most people consider Edward Thorp and Claude Shannon's smart shoe for cheating at a casino to be the first wearable device in history [116]. The shoe was developed in 1961 and had switches controlled by the toes that enabled the user to time when a roulette ball crossed a reference line. The electronics in the shoe were connected over a wire to a computer hidden in a cigarette box. The computer in the cigarette box communicated wirelessly with a receiver device that provided auditive output to the user. A similar shoe, shown in Figure 2.3 b), was created later by a group that called themselves the 'Eudaemonics'.
The first Head-Mounted Display (HMD), or Heads-Up Display (HUD), was developed in 1968 by Ivan Sutherland. The HMD recognized the user's head position and rotation within a room using an arm that hung from the ceiling of the room. The HMD is also widely considered to be the first Augmented Reality (AR)3 and Virtual Reality (VR)4 HMD [111], although the terms would be defined only a decade later. Sutherland's head-mounted display is shown in Figure 2.3 c).
Until 1980, wearable devices were not general purpose, but rather bound to specific use cases or testing environments. In 1980, Steve Mann created a general-purpose multimedia wearable computer [70]. Mann attached a camera, a display and two antennas to a helmet and connected them to a 6502 computer that was carried in a backpack. During subsequent years, Mann continued shrinking the size of his wearable device and transmitted live images from his head-mounted camera to the internet for the first time in history [69]. An early version of Mann's device is shown in Figure 2.4 a).
Research on wearable computing gained more attention after 1990 and was led primarily by institutions such as Carnegie Mellon University (CMU), the Massachusetts Institute of Technology (MIT), Georgia Tech and Columbia University. Technologies used to develop wearable computers between 1990 and 2000 included HMDs such as the Private Eye, speech recognition units, cameras, GPS, and wearable mice and keyboards. Popular application fields of wearable computers during this period were navigation and the maintenance of vehicles and machinery.
Relevant wearable systems created after 1990 include CMU's VuMan and Navigator series, Steve Mann's backpack-based multipurpose wearable computer and the Java Ring from Sun Microsystems. The VuMan and Navigator series were developed between 1991 and 1996. VuMan 1 included an HMD and a unit fixed to the user's belt with three buttons, and was used to navigate the user through the blueprints of a building. VuMan 2 displayed maintenance information and helped the user locate people and buildings. VuMan 3 provided maintenance information during the inspection of amphibious military tractors. Navigator 1 consisted of a computer mounted on a backpack, the Private Eye HMD, a speech recognition unit, a portable mouse for user input and a GPS for position tracking. Navigator 2 supported maintenance workers
3Augmented Reality refers to a system that 1) combines real and virtual imagery, 2) is interactive in real time and 3) is registered in three dimensions [5]
4Virtual Reality is a system that “senses the participant's position and actions and replaces or augments the feedback to one or more senses, giving the feeling to the user of being immersed or present in a virtual world” [99]
during aircraft inspection. An image of the VuMan 3 is shown in Figure 2.4 b). The Java Ring was a ring introduced by Sun Microsystems in 1998 [21]. The Java Ring had an integrated microprocessor able to run small Java programs and had no internal power source; it received power when in contact with the receiver device. One of the use cases of the Java Ring was unlocking doors after identifying the wearer with a unique identifier integrated within the ring.
In 1991, Mark Weiser coined the term Ubiquitous Computing, or ubicomp, referring to a vision where computing devices embedded in everyday artifacts support people in daily activities and work [62, 120].
In 1997, the first International Symposium on Wearable Computers was co-hosted by CMU, MIT and Georgia Tech. Most wearable devices presented during the event were based on head-mounted displays and eyeglasses [22, 32, 102, 106, 115, 81, 69, 108]. Voice was commonly used as an input strategy [22, 104]. Application fields included blue-collar work [22, 81, 86, 102, 115], navigation [32], sports [88], affective wearables [92] and support for the disabled [108]. Research presented at the conference focused on the usability of wearables, such as wearable keyboard-based input [114] and haptic output based on vibrations [112]. During the conference, Post and Orth introduced the concept of smart fabric together with some techniques to create textile connections and sensors [93].
In a study published in 1998, Gemperle et al. investigated wearability (i.e. the interaction between a wearable device and the body) [35, 56]. The study provided guidelines for the design of wearable devices. For example, it identified the positions around the body where wearable devices can be placed, based on the size and amount of movement of the different body parts. The guidelines led to the term wearable forms. A wearable form is a proven way of mounting wearable devices on the body. Wearable forms are shown in Figure 2.4 c).
2.1.2 Smart Textiles
The first generation of smart textiles used the finished textile as a substrate for the integration of electronic devices. The textile itself did not play any role besides that of a carrier for electronic devices and connections. Examples of this category include Georgia Tech's Wearable Motherboard (GTWM) and Virginia Tech's Beam-Forming textile [82]. GTWM, shown in Figure 2.5 a), was a soldier's vest able to detect bullet wounds and monitor soldiers' vital signs during combat [18, 90]. GTWM had an integrated grid of optical fibers and attached off-the-shelf sensors. Virginia Tech's Beam-Forming textile was used to estimate the position and direction of moving vehicles [18, 82]. For this purpose, it contained several microphones attached to a woven grid of interconnection lines.
In the second generation of smart textiles, the textile played an essential role in the electrical system, rather than just being a substrate for the attachment of electronic devices. Traditional textile fabrication techniques (e.g. embroidery) were adapted to provide the textile with special functionality [18]. Another characteristic of this generation was that it merged electronic design techniques with technologies from the
Figure 2.5: a) Georgia Tech’s Wearable Motherboard (GTWM) [90]. Copyright © 2002 by ACM, Inc. Reprinted with permission of ACM, Inc. b) Musical Jacket [94]. Reprinted with permission of Maggie Orth.
textile industry. For example, the Musical Jacket, developed by MIT in the late 1990s, used Computer-Aided Design (CAD) to specify the circuit layout and stitch pattern of a textile keyboard, which was embroidered in the jacket. The jacket and embroidered textile keyboard are shown in Figure 2.5 b).
The third and most recent generation, fibertronics, is characterized by the integration of electronics and materials with special properties into the textile during the fabrication of the textile. One way to achieve this is by replacing traditional fibers (e.g. cotton) with conductive or sensitive fibers in the creation of yarn. Another approach is the integration of miniaturized electronics during the fabrication of yarn by twisting strands of fiber around an electronic device [121, 96]. Zysset et al. developed a technique to create woven textiles with electronic circuits [125]. Electronic circuits are integrated in thin plastic film lines, which are woven with conductive and traditional yarn lines.
2.2 Fabrication
The textile fabrication process consists of different phases, which may vary depending on the materials used and the purpose of the textile. At the beginning of this chapter, we described what smart textiles are. In this section, we describe how smart textiles are created. We start by describing how traditional textiles are created and then explain how special functionalities are added. It should be noted that the textile fabrication process described in this section has been simplified for the purpose of this dissertation; textile fabrication processes include other phases and activities that are not shown in this section. The textile fabrication process we describe in this section is illustrated in Figure 2.6.
Textiles are hierarchically structured fibrous materials [16]. Fibers are the smallest unit in the textile hierarchy [16]. Fibers are raw materials which are either extracted
Figure 2.6: Textile fabrication process.
from natural sources or created synthetically. Natural sources of fibers are either animal (sheep, silkworm), vegetable (cotton, flax) or mineral (asbestos). Fibers with no further processing have little tensile strength, which limits their use to the stuffing of objects such as pillows and jackets. For this reason, fibers are spun into long strands (a process called spinning), which are then twisted together in order to form yarn. Thread is a type of yarn which is usually finer and suitable for sewing. Fabric is created by structuring yarn in specific patterns. Two popular ways to structure yarn into fabric are knitting and weaving. Knitting is the process of intertwining two sets of yarn in a series of consecutive loops [18]. Weaving is the process of interlacing two sets of yarns perpendicularly [18]. Fabric panels are cut into parts, and the different parts are sewn together in order to form cloth. The terms cloth and textile are often used interchangeably. In this document, we use the definitions provided by Castano et al. [16]: we refer to “cloth” as the finished textile product and use the term “textile” for fibers in any phase of the textile hierarchy.
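The fiber-to-cloth progression and the processes between its stages can be summarized in a short sketch. The stage and process names follow the text above; the code itself is purely illustrative:

```python
from enum import Enum

class TextileStage(Enum):
    FIBER = 1   # raw material, natural or synthetic
    YARN = 2    # spun and twisted strands of fiber
    FABRIC = 3  # yarn structured by knitting or weaving
    CLOTH = 4   # fabric panels cut and sewn into a finished product

# Process that turns each stage into the next one in the hierarchy.
NEXT_STEP = {
    TextileStage.FIBER: ("spinning/twisting", TextileStage.YARN),
    TextileStage.YARN: ("knitting/weaving", TextileStage.FABRIC),
    TextileStage.FABRIC: ("cutting/sewing", TextileStage.CLOTH),
}

def fabrication_steps(stage=TextileStage.FIBER):
    """Yield the (process, resulting stage) pairs from `stage` to cloth."""
    while stage in NEXT_STEP:
        process, stage = NEXT_STEP[stage]
        yield process, stage

for process, stage in fabrication_steps():
    print(f"{process} -> {stage.name}")
```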
Special functionality can be added to the textile at any phase of the fabrication process (i.e. by making modifications to the fibers, yarns or fabrics used to create cloth). Fibers can be naturally conductive or be treated to become conductive [78]. Naturally conductive fibers are made from metallic materials such as stainless steel, titanium or aluminum. Metallic fibers can be created by “shaving off” fibers from the edge of a thin metal sheet [78]. Electrically conductive fibers can be created by coating non-conductive fibers with metals, galvanic substances or metallic salts [78]. Metallic fiber coatings produce highly conductive fibers. However, it can be challenging to achieve a metallic coating that adheres to the fiber and does not corrode [78]. Galvanic coatings are coverings rich in zinc that provide high conductivity and corrosion resistance, but they can only be applied to conductive substrates [78]. Metallic salt solutions achieve low conductivity, which is further reduced during laundry [78]. Fibers can also be mixed with materials sensitive to mechanical or chemical stimuli [16]. For example, piezoresistive materials5 can be used to coat fibers in order to make them sensitive to strain [16, 52]. We discuss textile-based sensors in more detail in Section 2.3.1.
Yarns can also be made conductive and sensitive to stimuli. Conductive yarns
5Piezoresistive materials react to mechanical stress by presenting a change in their electrical resistance, which can be measured
Figure 2.7: Different types of conductive yarns. (a) A copper foil wrapped around a non-conductive yarn. Reprinted with permission of Maggie Orth of International Fashion Machines. (b) A wire wrapped around non-conductive fiber strands [109]. Licensed under Creative Commons BY 4.0. (c) Multifilament yarn coated with a metallic layer [109]. Licensed under Creative Commons BY 4.0.
are created using either fully metallic wires or by combining traditional yarn with conductive materials. Metallic wires tend to have a high conductivity but are less flexible and therefore more likely to break during weaving and knitting [18]. Yarn and conductive material can be combined in different ways, including: wrapping a metallic foil around a non-conductive yarn [94], twisting non-conductive fiber strands around a metallic core [78] or twisting a metallic wire together with non-conductive fiber strands [109]. Like fibers, yarns can be made conductive by coating them with conductive metals or polymers [78], and sensitive by wrapping sensitive fibers around an elastic core [16]. Figure 2.7 displays different types of conductive yarn.
Special functionality can also be added to fabric after its creation. A common way to achieve this is by attaching conventional electronic devices (e.g. microcontrollers, memory units, wireless links) to interconnect lines in the textile [18]. Electronic devices can be attached to circuits in the textile using cables, conducting adhesives, conductive thread or mechanical methods. Cables add rigidity to the textile and are likely to get entangled or break due to strain in areas of the body where movement occurs. The elements of the LilyPad Arduino [14] hardware kit feature broad circular pins to facilitate their attachment to the fabric using conductive thread. The same conductive thread is used to connect the circuit. Mechanical methods (e.g. crimping 6) apply force to the electronic devices or conductive elements in order to attach them to the textile. Figure 2.8 shows electronic devices attached to a fabric using different techniques.
Other techniques to add functionality to the fabric after its creation include lamination and coating. Lamination is the process of adding a layer (e.g. a film) with special functionality (e.g. an LED display) to a fabric substrate, usually by applying adhesives [18]. Leah Buechley has attached electronics to fabric by ironing them onto the fabric using heat-activated adhesive [14]. Coatings can be used to add conducting and sensing properties to a fabric [16]. For example, polymer coatings have been used to create fabric sensitive to temperature [11] and strain [66].
6Crimping: deforming one or two pieces of metal such that one can hold the other
Figure 2.8: Electronic devices attached to fabric. (a) Flexible electronic module glued to the fabric and connected with embroidered conductive lines, developed by the Fraunhofer Institute for Reliability and Microintegration [67]. (b) Microcontroller connected with conductive thread to crimped conductive snap buttons. Developed in collaboration between the Technische Universität München and the Universität der Künste [43].
2.3 Anatomy
Figure 2.9: Model of a smart textile.
Current generations of computing devices, such as personal computers and mobile phones, are constructed on rigid plastic substrates. The trend in smart textiles is to replace the traditional silicon-based plastic substrates with textile-based components and connections that integrate more seamlessly into the textile substrate. In this section, we provide an overview of the different types of textile-based components and connections used in smart textiles.
Figure 2.9 displays a simplified UML model of a smart textile. Smart textiles consist of textile materials and electronic devices. Electronic devices can be sensors, output devices and microcontrollers. Connections transmit signals and power between two or more electronic devices. An electronic device can be connected to several other electronic devices. In particular, electronic devices usually have two or more connections to a microcontroller.
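The model in Figure 2.9 can be transcribed into code roughly as follows. The class names mirror the UML diagram; everything else (attribute names, the example devices) is our own illustration and not part of a real TangoHapps API:

```python
from dataclasses import dataclass, field

@dataclass
class ElectronicDevice:
    name: str

class Sensor(ElectronicDevice):
    pass

class OutputDevice(ElectronicDevice):
    pass

class Microcontroller(ElectronicDevice):
    pass

@dataclass
class Connection:
    # Transmits signals and power between two or more electronic devices.
    devices: tuple

@dataclass
class SmartTextile:
    textile: str                                   # the textile substrate
    devices: list = field(default_factory=list)
    connections: list = field(default_factory=list)

# A sensor usually needs two or more connections to the microcontroller,
# e.g. one for power and one for the signal line:
mcu = Microcontroller("microcontroller")
ecg = Sensor("heart-rate electrode")
shirt = SmartTextile(
    textile="knitted fabric",
    devices=[mcu, ecg],
    connections=[Connection((ecg, mcu)), Connection((ecg, mcu))],
)
```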
2.3.1 Sensors
Sensors measure physical properties and transform them into digital signals that can be processed [18]. Physical properties include parameters from the environment, such as temperature and light intensity, and physiological parameters from humans, such as heart rate. Most textile sensors operate following one of three principles: resistive, capacitive or optical sensing. Next, we describe these sensing principles.
Sensing Principles
Resistance is the degree to which an electrical conductor opposes the flow of an electric current; conductivity is the opposite of resistance. Resistive sensing works by measuring the resistance of a material, which changes depending on the physical property to be measured. For example, a temperature sensor can be made with a specific type of fiber that changes its resistance depending on temperature [18].
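As a concrete illustration, resistive sensors are commonly read out through a voltage divider: a known reference resistor in series with the sensor, with the midpoint sampled by a microcontroller's ADC. The circuit topology and all component values below are a generic sketch, not taken from any specific smart textile:

```python
def sensor_resistance(adc_value, adc_max=1023, vcc=3.3, r_ref=10_000.0):
    """Infer a resistive sensor's resistance from an ADC reading, assuming
    the sensor sits on the low side of a voltage divider with a known
    reference resistor r_ref (values here are illustrative)."""
    v_out = vcc * adc_value / adc_max        # ADC counts -> volts
    if v_out >= vcc:
        return float("inf")                  # open circuit / saturated reading
    return r_ref * v_out / (vcc - v_out)     # divider equation solved for R_sensor

# A midpoint reading means the sensor's resistance equals the reference resistor:
print(round(sensor_resistance(adc_value=511.5)))
```

Reading the sensor then reduces to sampling the ADC periodically and mapping the inferred resistance to the physical quantity via the sensor's calibration curve.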
Capacitive sensing works by measuring the amount of electric charge transferred from two charged objects, called electrodes, to a third conductive object (e.g. the human finger). The amount of electric charge transferred between the electrodes changes depending on the parameter to be measured. This principle can be used to measure environmental parameters such as humidity and skin proximity; indeed, it is the technology used by most modern mobile phones to detect human touch. Capacitive sensing can be realized in smart textiles using conductive fabric and coatings [16].
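One common way to measure such capacitance changes is the RC time-constant method: charge the electrode through a large known resistor and time how long it takes to reach a threshold voltage; a touching finger adds capacitance and slows the charging. A generic sketch, in which the component values and the 1.5× touch factor are illustrative assumptions:

```python
import math

def capacitance_from_charge_time(t_charge, r=1_000_000.0, v_th_frac=0.632):
    """Invert t = -R*C*ln(1 - v_th/Vcc) to estimate the electrode capacitance
    from the measured charge time through a known resistor r."""
    return t_charge / (-r * math.log(1.0 - v_th_frac))

def is_touched(t_charge, t_baseline, factor=1.5):
    """Report a touch when charging takes noticeably longer than the
    calibrated no-touch baseline."""
    return t_charge > factor * t_baseline
```

With v_th_frac = 0.632 the threshold sits one time constant into the charge curve, so the charge time is approximately R*C.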
Optical fibers are flexible and transparent strands of plastic or glass. Due to the transparency of optical fibers, light can travel through them. Optical sensing works by measuring different properties of the light transmitted through optical fibers (e.g. intensity, wavelength), which can change due to external stimuli such as strain. The GTWM used optical sensing to detect wounds in military scenarios by detecting cuts in the optical fibers [90].
Sensor Types
This subsection describes the different types of sensors that are used in smart textiles and presents examples of each sensor type. Figure 2.10 displays a UML model of textile sensors and their relationship with the environment.
Figure 2.10: Taxonomy of textile sensors.
Pressure sensors, also called force sensors, measure the amount of force applied to a surface. Pressure sensors integrated within textiles can be used as input sources (e.g. buttons integrated in a garment) or to monitor physiological parameters of the wearer (e.g. weight distribution when integrated into socks). Pressure sensors can be built using the resistive or the capacitive principle. Resistive pressure sensors can be created by mixing conductive and non-conductive fibers or yarn in a textile and measuring the resistance of the material. When pressure is applied to the textile, the conductive fibers are squeezed together, which increases the conductivity of the material. Another way to create a resistive pressure sensor is by using pressure-sensitive polymer coatings (i.e. materials that generate an electric signal when pressure is applied to them) [16, 13]. At the yarn level, specific yarn structures can be created so that the resistance of the textile structure changes when deformations are applied to it [55]. Capacitive pressure sensors have been achieved by separating two conducting textile surfaces with a spacer material and measuring the changes in capacitance that occur when the conductive surfaces get closer to each other due to pressure [79]. Pressure sensors have been used to create keyboards [3], touchpads [18] and sportswear [51, 110].
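For the button use case, raw pressure readings must still be turned into discrete press/release events; a hysteresis (two-threshold) scheme is a usual way to keep fabric noise near the threshold from causing chatter. The threshold values below are illustrative, not from any particular sensor:

```python
def pressure_events(readings, press_threshold=600, release_threshold=400):
    """Convert raw pressure-sensor readings into press/release events.
    Hysteresis: a press fires above press_threshold, and the button is
    considered released only once the reading falls below the (lower)
    release_threshold, so small fluctuations do not toggle the state."""
    pressed = False
    events = []
    for r in readings:
        if not pressed and r > press_threshold:
            pressed = True
            events.append("press")
        elif pressed and r < release_threshold:
            pressed = False
            events.append("release")
    return events

print(pressure_events([100, 650, 620, 500, 450, 300, 700]))
# -> ['press', 'release', 'press']
```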
Heart rate sensors, also called electrocardiogram (ECG) sensors, measure the electrical changes on the skin caused by heart contractions. Several textile heart rate sensors have been developed to date, such as the MagIC vest [25], the NuMetrex sports bra [103] and the Intellitex suit [109]. Respiratory activity sensors are created using strain sensors that change their electrical resistance due to the stretching of the thorax. The WEALTHY suit used strain fabric sensors and piezoresistive yarns to measure users' breathing patterns [89].
Temperature sensors can be of the resistive or the optical type. Resistive temperature
sensors work by measuring the conductivity of metal fibers woven or knitted into the textile, which changes depending on the temperature [18]. Fiber-optic sensors and temperature-sensitive coatings have also been used for temperature sensing [16, 101]. Alternatively, thin-film temperature sensors can be integrated in yarn [18].
Strain sensors can be used to detect body postures or joint movements. This can be useful in sports and rehabilitation scenarios. Strain sensors can be resistive or optical. Resistive strain sensors can be created by knitting piezoresistive metal fibers [16]. Zhang et al. structured yarn in a special pattern that favors changes in resistance when strain occurs [123]. Optical strain sensors are based on the fact that the strain applied to the textile alters the strength of the optical signal traveling along the optical fiber, which can be measured by an optical sensor [18, 25].
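The resistance change of a resistive strain sensor is conventionally related to strain through the gauge factor, GF = (ΔR/R0)/ε. A minimal conversion sketch; GF = 2 is a typical value for metallic gauges, while piezoresistive yarns vary widely, so the number is an assumption:

```python
def strain_from_resistance(r, r_unstrained, gauge_factor=2.0):
    """Estimate strain (relative elongation) of a resistive strain sensor
    from its measured resistance using the gauge-factor relation
    GF = (dR / R0) / strain, solved for strain."""
    return (r - r_unstrained) / (r_unstrained * gauge_factor)

# A 4 % resistance increase with GF = 2 corresponds to 2 % strain:
print(strain_from_resistance(r=1040.0, r_unstrained=1000.0))  # -> 0.02
```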
Biological sensors detect the presence and composition of biological fluids such as sweat, urine and blood. These sensors are created using chemicals that react electrically to specific substances present in the biological fluids. The chemicals are either applied as a coating to textile fibers or screen printed 7 on fabric [18].
Gas sensors detect the presence of gases such as hydrogen (H2) or carbon monoxide (CO) in an area [26]. Gas sensors are of particular interest in hazardous or contaminated environments (e.g. firefighting) because of their ability to detect gases toxic to humans or animals. Similar to biological sensors, gas sensors are realized using coatings that react electrically when exposed to specific gases [16]. The PROeTEX firefighter's suit incorporated a CO2 sensor within the boot and a CO sensor in the jacket next to the collar [19].
Humidity sensors can be resistive or capacitive. Resistive humidity sensors change their conductivity depending on the humidity in the air, while capacitive humidity sensors react to water vapor [16]. Humidity sensors can also be built using coatings [16].
7Screen printing is a technique to transfer ink onto a textile by using a “mold” to fix the distribution of ink on the textile.
2.3.2 Output Devices
Figure 2.11: Taxonomy of output devices.
Output devices, also called actuators, are devices used to communicate changes in the smart textile to the environment [18]. Output devices provide mainly visual, auditive and haptic information to users. Figure 2.11 displays a taxonomy of output devices.
Visual output technologies suitable for smart textiles include Light-Emitting Diodes (LEDs), electroluminescent (EL) fabrics and chromic materials. Arrays of LEDs have been combined in a grid fashion using conductive yarn in order to create a textile display [16]. EL fabrics emit light when they receive an electrical stimulus and can be constructed by applying coatings with EL polymers [16]. Chromic materials are materials that change their color depending on external stimuli such as light (photochromic), temperature (thermochromic), pressure (piezochromic) or electricity (electrochromic) [16]. Chromic fabric can be obtained using chromic fibers or material coatings. Figure 2.12 a) displays a thermochromic display developed by the UdK. A chromic material coating has been applied to the textile; wires behind the textile heat up due to an electrical stimulus, causing a change in the color of the textile.
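Grid-connected LED arrays of this kind are usually driven by row scanning: energize one row line at a time, apply that row's column pattern, and rely on persistence of vision to merge the rows into a stable image. A sketch of one refresh pass; the `set_row`/`set_cols` callbacks stand in for whatever hardware layer drives the conductive-yarn grid and are not a real API:

```python
import time

def scan_frame(frame, set_row, set_cols, dwell=0.002):
    """Refresh a row/column LED matrix once: for each row of the frame,
    select that row line, apply its column pattern, and hold briefly.
    `frame` is a list of rows; 1 = LED on, 0 = LED off."""
    for row_idx, row in enumerate(frame):
        set_row(row_idx)   # energize one row line of the grid
        set_cols(row)      # column pattern for the LEDs in that row
        if dwell:
            time.sleep(dwell)  # keep the row lit before moving on
```

Calling `scan_frame` repeatedly in a loop at a sufficient rate (e.g. above ~50 full frames per second) makes the image appear steady to the eye.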
Smart textiles can provide auditive output using miniaturized off-the-shelf speakers. These speakers are smaller than a fingertip and can be sewn onto interconnect lines on the textile. Speakers can also be constructed on fabric by arranging conductive yarn or ink in a spiral form. The signal flowing along the spiral generates a magnetic field that interacts with a magnet placed in the center of the spiral, producing vibrations and thus sound. Figure 2.12 b) shows a textile speaker integrated into a sweatshirt, developed by the UdK.
Haptic output technologies include miniaturized vibration motors and shape-changing materials (i.e. materials that change their shape upon certain stimuli). Shape-changing materials are either polymers or alloys. Electrically active polymers are materials able to change their shape or dimensions when they receive an electrical stimulus [15]. Shape-memory alloys and shape-memory polymers are materials that “remember” a shape to which they return when influenced by external stimuli such as heat. Haptic output can also be produced using motors. The Adidas smart shoes
Figure 2.12: Textile output devices. a) Textile thermochromic display. Reprinted with permission of Katharina Bredies. b) Textile speaker integrated into a sweatshirt. Reprinted with permission of Katharina Bredies.
use a motor connected to the cushioning system in order to adapt their cushioning depending on the surface on which the wearer runs [29].
Visual output is the most common output modality of many computing devices, such as desktop computers and mobile phones. However, the usefulness of visual output on a smart garment is limited to the relatively small range of the wearer's vision: around the face and on the upper-front body. Instead, visual output on smart garments is often used to address other individuals around the wearer of the smart garment. For example, a textile display attached to the back of the wearer has been used to communicate performance parameters to athletes during group sports [77]. Auditive output devices, on the other hand, rely on a suitable environment (e.g. one that is not too loud and where it is socially acceptable to produce sound). Haptic output is more discreet than the visual and auditive output modalities and therefore suffers less from social acceptance issues. However, haptic output devices might not be suitable if the user is wearing several layers of clothing.
2.3.3 Connections
Connections suitable for smart textiles can be realized in several ways: attaching wires to the textile, building the textile using conductive yarn or thread, applying coatings or inks, or integrating optical fibers in the textile. Figure 2.13 displays a taxonomy of connections.
Figure 2.13: Taxonomy of connections used in smart textiles.
One way to connect electronic devices in a smart textile is by attaching metallic wires to the textile. However, wires tend to have limited flexibility and to be less resistant to strain. Particularly in the case of smart garments, which are subject to strain and movement, wires have a higher risk of breaking or getting entangled with objects in the wearer's environment.
Conductive yarn and conductive thread can be woven into a textile using traditional weaving machinery. Conductive thread can additionally be sewn and embroidered into a textile [78]. Embroidery with conductive thread offers advantages such as the ability to create multiple layers of fabric in a single step and to specify circuit layouts using Computer-Aided Design (CAD) [94].
Connections with varying degrees of conductivity can be realized using conductive coatings. Some polymers used to create conductive coatings are even more conductive than metals and also have high adhesion and non-corrosive properties [78]. Additionally, coatings do not alter the textile properties (e.g. flexibility) significantly and are applicable to most textile substrates [16].
Conductive inks are created by adding conductive materials such as carbon, copper or silver to traditional printing inks [78]. Conductive inks can be applied to a textile using inkjet printers or by means of screen printing. Inkjet printers propel drops of ink onto the textile [78]. Screen printing comprises the printing of a viscous paste through a patterned fabric screen, followed by a drying process [109]. Screen printing is also suitable for the integration of electronics in a textile [109]. In contrast to some conductive coatings, conductive inks do not lose conductivity due to bending or laundering [78].
Optical fibers are transparent fibers through which signals are transmitted as pulses of light [16]. Optical fibers possess good strength and sunlight resistance and are in general not affected by electromagnetic interference [78]. However, optical fibers are relatively stiff and possess poor flexibility, drapability (i.e. the ability to wrap loosely with folds around an object) and abrasion resistance [78].
2.4 Applications
Smart textiles have been developed for a wide range of fields. In this section, we focus on three main areas: health and rehabilitation, sports and fitness, and security and safety. We provide examples of relevant research and industrial smart textiles developed in these domains.
Health and Rehabilitation
Several smart T-shirts are able to detect physiological parameters of the wearer such as heart rate, respiration rate, electrocardiogram, activity and posture. Examples include the LifeShirt by Vivometrics [39], the MagIC system [25] and the T-shirts developed in the EU-funded projects WEALTHY [87, 89] and MyHeart [40].
Smart textiles for rehabilitation measure performance in different rehabilitation exercises and provide live feedback to the wearer or track the exercising over time [43, 4, 36, 2]. Most of these systems use motion sensors; some use the textile itself to provide feedback to the user [2]. The UdK collaborated with Philips on a sleeve worn on the wrist for the rehabilitation of Carpal Tunnel Syndrome (CTS). The sleeve uses strain sensors to measure the range of motion of the user's wrist and contains blue light LEDs, which can help relax muscles and increase blood flow. Zhang et al. created a textile band able to react to muscle electrical signals (also called electromyography, or EMG). The textile band classifies the electrical signals and enables amputees to control their prostheses [124].
Sports and Fitness
Holleczek et al. developed a sock with integrated pressure sensors that tracks and classifies movements during snowboarding [51]. Sundholm et al. developed a smart textile mattress with integrated pressure sensors that classifies physical exercises performed on the mattress, such as push-ups and squats [110]. More recent research studied the use of conductive ink and flexible LED displays for supporting athletes during group sports [77]. Smart textile products have been introduced to the market by Nike, Polar and Adidas. Nike+ and Polar collaborated on a heart rate monitor that straps around the wearer's chest, and Adidas developed smart jogging shoes [29]. Another smart textile that reached the market was NuMetrex's sports bra, which used knitted conductive fibers as electrodes to monitor the wearer's heart rate [18].
Security and Safety
An early smart textile for safety was the Georgia Tech Wearable Motherboard (GTWM). As mentioned in Section 2.1.2, the GTWM was able to monitor soldiers' vital signs during combat [90]. For this purpose, the GTWM integrated heart rate and temperature sensors as well as a network of optical fibers for detecting possible wounds caused by projectiles into a soldier's vest. Another smart garment for safety was the ProeTEX firefighter's jacket. The ProeTEX jacket used several sensors to measure physiological
parameters of the wearer (e.g. breathing rate, heart rate, body temperature) and other parameters from the environment, such as the presence of toxic gases [20], in order to foresee dangerous situations. The University of Twente in the Netherlands developed a T-shirt, also for firefighters, able to track the user's motion and interpret heartbeats using accelerometer and microphone data.
Chapter 3
TangoHapps Framework
This chapter starts by describing the state of the art in development tools for smart textiles and other related domains, such as wearable computers. Then, we describe the core requirements upon which we created TangoHapps and provide a detailed description of the models and abstractions behind TangoHapps.
3.1 Related Work
Our related work research encompasses not only IDEs but also other tools that support the development of smart textiles (e.g. circuit design tools). Furthermore, we do not focus only on smart textiles but include tools designed for related fields (e.g. wearable computers) for which more research and industrial tools are available.
Most existing integrated development environments and tools do not target smart textiles. Therefore, they do not fulfill the specific requirements of smart textile development, which we address in the next section. Only a few development environments have been created for smart textiles, and most of them have an educational purpose: their goal is to teach programming and basic electronics to children and novices. Furthermore, these tools support only a few of the activities involved in smart textile development (e.g. software development or electronic circuit design). In contrast to existing tools, TangoHapps is intended for rapid prototyping of smart textiles and offers high-level reusable software components such as user posture and motion classifiers. TangoHapps targets a broad audience of smart textile developers, including professional software developers and users without experience in programming or electronics. Furthermore, TangoHapps is the first attempt to support the circuit design, application development and testing of a smart textile in a single environment.
3.1.1 Development Tools for Smart Textiles
Few IDEs have been developed that support smart textiles. Indeed, we were able to find only five:
The i*Catch framework consists of a set of plug-and-play components and a flow-based visual programming tool that produces source code as output [84]. Users deploy the source code manually by copy-pasting it into the Arduino IDE1. i*Catch targets novice users (e.g. children) with little background experience in electronics and programming.
The Co-eTex framework is similar to i*Catch. It consists of craft materials and a visual block-based programming environment to support the creation of smart textiles [83]. Block-based programming is a visual programming paradigm in which language constructs (e.g. if-conditions, for-loops) are represented as Lego-like blocks. Blocks feature color codings and quasi-physical constraints to facilitate comprehension of the language's syntax. One of Co-eTex's distinguishing features is its support for collaborative work: it enables users to work on different activities (e.g. programming, circuit layout) asynchronously.
Plushbot is a tool to facilitate the creation of interactive stuffed toys [53]. Its current prototypical state includes features to lay out hardware elements from the Arduino Lilypad family and to create textile circuit layouts. Plushbot targets young users with little experience. It facilitates the design of stuffed textile toys with electronic circuits embedded in them but lacks any support for software development.
Eduwear is a construction kit for wearables and tangible textile interfaces [58]. Eduwear consists of actuators such as LEDs, buzzers and vibration motors. A development environment based on a block-based visual programming language facilitates application development using the parts in the Eduwear kit. The development environment outputs Arduino code that is uploaded to an Arduino Lilypad microcontroller.
Harms et al. developed a system for rapid prototyping of smart garments for activity recognition applications based on a shirt called SMASH [47]. SMASH consists of a technology to quickly attach sensors to clothing, a platform to interconnect sensors and a processing architecture optimized for distributed on-body signal processing and pattern recognition.
3.1.2 Development Tools for Physical Devices
Tools that facilitate the creation of physical devices have had a longer history than tools for smart textiles. IDEs for the creation of devices, including smart textiles, support the development of software, the construction of hardware (e.g. building a device by plugging together its parts), or both. Two common goals of such IDEs are: 1) to lower the entrance barrier (i.e. enable usage by users with little experience) and 2) to reach a high ceiling (i.e. to offer enough flexibility to support complex
1 https://www.arduino.cc/
solutions). With the goal of lowering the entrance barrier, many IDEs are based on a visual programming language. Some of these IDEs are:
a CAPpella is a tool for building context-aware applications that exploits the programming by demonstration technique [23]. Programming by demonstration is a programming paradigm that enables a system to learn behaviors based on demonstrations performed by humans. a CAPpella records sensor data and displays a visual representation of the data, on top of which users annotate and label time fragments. The data and user annotations are then used for training the classifier.
Exemplar is a tool similar to a CAPpella for programming input sensor devices using the programming by demonstration technique [48]. The programming by demonstration semantics in Exemplar enables users to grab a physical device and perform a gesture in order to program the system to detect that particular gesture. Exemplar divides its workflow into three phases: demonstrate, edit and review. During the demonstrate phase, users demonstrate the usage of an input device. During the edit phase, users label and apply signal-processing algorithms to different parts of the received signal stream. During the review phase, users test whether the system recognizes gestures as expected.
VoodooSketch is a toolset for customizing physical interfaces by drawing on a sheet of paper with a smart pen [12]. The content drawn with the pen is digitized automatically. At the same time, physical components are plugged into the paper. Next to a physical component, a label can be written with the smart pen in order to associate a functionality to it. Controls can also be drawn on paper and are associated to a UI widget based on their shape.
Circuit Stickers is an IDE to create electronic circuits on paper using an inkjet printer filled with conductive ink [50]. Circuit Stickers offers a widget for each electronic element available in the kit. Users build circuits by dragging widgets onto a canvas. After printing the circuit on a sheet of paper, electronic elements glued to a flexible substrate with conductive adhesive are attached to the circuit.
PaperPulse is an IDE similar to Circuit Stickers that additionally enables users to demonstrate program behavior by recording actions [95]. PaperPulse also implements simulation functionality to test the circuit before printing it. In addition to the printable file containing the circuit's layout, it outputs the code that has to be uploaded to the microcontroller.
Topiary is a tool for rapid prototyping of location-enhanced applications [65]. It features a storyboard-like UI where users define transitions between screens based on location events, such as the arrival of the user at a specific location. Topiary targets mobile devices.
CRN Toolbox is a visual flow-based programming tool that focuses solely on the rapid prototyping of activity recognition algorithms [9]. Some of its features (e.g. multiple threads of execution, data stream merging and splitting) target software developers with knowledge of machine learning algorithms.
Intuino is a visual authoring tool that enables users to define the behavior of physical devices using graphical representations (e.g. time-line operations and drawn splines) [119]. For instance, users can define the pattern in which an LED should increase and decrease its intensity by drawing a spline on a canvas, thus saving users from having to program such behavior.
Node-RED2 is a visual flow-based environment for developing software for different microcontrollers, including the Arduino Lilypad. Its goal is to facilitate integrating devices into the Internet of Things (IoT). Cloud93 is a text-based development environment in the cloud that facilitates simultaneous collaboration between developers. Espruino4 is a hybrid text and visual block-based programming environment for the Espruino microcontroller. ModKit5 is a block-based programming environment for microcontrollers.
These IDEs simplify the communication with the hardware (e.g. reading from sensors and writing to output elements) but do not provide higher-level functionality specific to the domain of smart textiles.
3.1.3 Hardware Construction Toolkits
During the last decade, a large number of hardware construction kits, also called toolkits, have been introduced in the research community. A hardware construction kit is a set of physical devices that users connect to each other in order to create an electronic device. Toolkits are usually accompanied by an IDE that facilitates the development of software using the toolkit's components. Components include sensors, output devices and communication modules (e.g. a Bluetooth module) that are connected to a microcontroller.
Phidgets was one of the first hardware construction kits [37, 72]. Hardware components in the kit are called “physical widgets” (i.e. phidgets). Phidgets are a means to facilitate the development of physical user interfaces, in analogy to the UI widgets that developers reuse when building graphical user interfaces. The framework consists of the phidgets, a software API to access and control them and an architecture for communicating and connecting to them. Phidgets also offers a way to simulate the runtime of the hardware device.
Input Configurator Toolkit is a toolkit used to create software for computers visually [28, 27]. Applications developed with the Input Configurator Toolkit accept inputs from a variety of computer peripherals such as keyboards, mice and microphones. The Input Configurator Toolkit uses a flow-based programming paradigm to support the development of behaviors that react to user input (e.g. scrolling an application's window based on a keystroke).
Calder is an IDE for rapidly building physical devices similar to Phidgets, consisting of a variety of reusable input and output devices equipped with a wireless transmitter [64]. Calder supports product and interaction designers during early prototyping activities.
Papier Mache is a toolkit for building tangible user interfaces that use computer vision, electronic tags and barcodes [61]. The input sources sense the presence, removal
2 http://nodered.org/
3 https://c9.io/
4 http://www.espruino.com/Web+IDE
5 http://www.modkit.com/
and modification of objects and notify the application. Developers can choose how the application should react to such events.
d.tools is a tool for the visual design and programming of physical devices built with off-the-shelf components [49]. d.tools divides its workflow into three phases: design, testing and analysis. During the design phase, designers drag and drop input and output elements and define links or transitions between them. Users can demonstrate transition events in the physical world in order to teach the tool desired behaviors. For example, to make the tilting of a device cause a transition from one screen to another, the user selects the transition and performs the tilting. d.tools features a tight coupling between the digital and physical world: plugging a sensor into the microcontroller board causes the sensor to appear in the canvas immediately. During the testing phase, users use the physical device while being recorded. The recording contains every state transition performed by users and can later be reviewed together with a usage video during the analysis phase.
e-blocks is based on reusable off-the-shelf components that are plugged into a main board [68]. A rule-based system defines the behavior of the e-blocks, and a simulation environment enables developers to test the behavior before producing the actual device.
Midas is a software and hardware toolkit to support the design, construction and programming of flexible capacitive touch sensors that are attached to physical objects [97]. Developers use a high-level specification to define the shape, layout and type of the touch-sensitive areas. After the touch-sensitive area has been defined, Midas provides instructions for creating and assembling the parts using conductive ink and vinyl cutters.
Boxes is a toolkit consisting of easy-access materials like cardboard, foil, tape and thumbtacks to create very rapid prototypes (in seconds or minutes) [54]. Boxes can react to users pinning or unpinning thumbtacks on a sensitive material and features a tool to specify software reactions to user pinning actions.
iStuff is a toolkit of physical devices (e.g. buttons, sliders, speakers, buzzers, microphones) and a software framework to support the development of applications using such physical devices [8].
iStuff Mobile is a tool based on the iStuff environment that helps designers explore interactions with mobile phones [7]. iStuff Mobile makes use of a visual flow-based programming paradigm and enables the development of applications that use mobile phones as output or input devices to control other devices.
Amarino is a toolkit that enables the rapid prototyping of mobile ubiquitous computing applications [59]. Amarino consists of a mobile phone application, a set of tutorials and a library to facilitate the communication between the mobile phone and the microcontroller.
.NET Gadgeteer offers a set of hardware components that can be plugged into a motherboard [118]. The .NET Gadgeteer uses different types of connections to prevent users from connecting two electrically incompatible components, thus eliminating the risk of short circuits. An IDE facilitates electronic layout and application development using the .NET Gadgeteer hardware components.
Arduino Lilypad is a construction kit for smart textiles [14]. The Arduino Lilypad kit consists of conductive thread, needles, small batteries and hardware components.
Its hardware components are lightweight, flat and feature broad pin holes. Thread, in particular conductive thread, is wrapped around the pin holes. Conductive thread fulfills the purposes of attaching the hardware components to the textile and interconnecting them at the same time.
Perner-Wilson et al. propose the usage of craft materials for the creation of electronic textiles [91] and present evidence of the increased freedom this approach gives creators compared to the approach of kits of parts.
3.1.4 Circuit Layout Software
A challenging task in the creation of an electronic device is the design of its circuits. In the case of smart textiles, it is particularly important to design circuits in advance because of the large amount of manual effort required to construct textile circuits. Circuit design tools range in their complexity and target audience. Smart textile designers use either tools for professional electronic circuit design or tools for prototyping electronic circuits. Eagle6 enables the creation of custom hardware designs that can then be printed on a circuit board and targets users with a high level of expertise in electronics. Fritzing7, on the other hand, targets hobbyists and users with minimal experience in electronics and facilitates the creation of hardware layouts using breadboards and jumper wires. Fritzing contains diverse hardware components available on the market, including the Arduino Lilypad and its set of sensors and actuators.
3.1.5 Simulation Environments and Operating Systems
Further tool support for smart textiles includes simulation environments and operating systems.
Martin et al. developed an environment that enables the simulation of different parameters related to the design of a smart textile (e.g. the behavior of sensors when attached to unstable fabrics) [73, 107].
Mattmann et al. developed a simulation environment to investigate how different postures and physical activity influence the elongation of clothing [75]. One of the findings obtained thanks to the simulation environment is that a T-shirt can be subject to elongations of up to 20% of its original size on the upper back.
Schneegass et al. developed an operating system called Garment OS [98]. Garment OS facilitates the development of applications for smart garments with software components for data processing and user activity recognition.
6 http://www.eagle.org/
7 http://www.fritzing.org/
3.2 Requirements
In this section, we list the functional and non-functional requirements upon which we developed TangoHapps. As described in Chapter 1, these requirements were elicited iteratively during a four-year collaboration with professional smart textile developers. Before we present the requirements, we would like to define the target user of TangoHapps. Throughout this dissertation, we refer to the target users of TangoHapps as developers because they use TangoHapps to develop applications for smart textiles. The term developer, in the context of this dissertation, refers to a smart textile developer and not necessarily to a software developer. Indeed, most smart textile developers have no or very limited experience in programming, such as people with a background in fashion or textile design. We use the term user to refer to the target users of the smart textiles and applications developed with TangoHapps. It should be noted that the term user is not always equivalent to wearer. In most cases, the user will also wear the smart textile. However, in at least two cases it would be wrong to refer to users as wearers. First, as described in Chapter 2, not every smart textile is wearable. Second, the user of a smart garment might be a different individual than the one wearing it [77].
3.2.1 Functional Requirements
Software functionality
Similar software functionality repeats across different smart textiles. Software functionality includes:
• Transferring sensor signals to remote devices for processing.
• Filtering noise produced by resistive and capacitive textile sensors.
• Calibrating custom-made textile sensors.
• Extracting data from raw sensor signals (e.g. mean, deviation, correlation, peak detection).
• Recognizing physiological states such as human posture and motion (e.g. walking, standing, running, lying down).
• Showing plots of sensor signals for development purposes.
TangoHapps should make software functionality commonly used in smart textile development available for reuse.
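As an illustration of how such reusable feature-extraction components might look, the following sketch implements three of the listed operations. The function names and signatures are our own assumptions for illustration, not part of TangoHapps:

```python
# Hypothetical sketch of reusable signal-processing components for smart
# textile applications; names and signatures are illustrative only.
from math import sqrt

def mean(signal):
    """Arithmetic mean of a list of sensor samples."""
    return sum(signal) / len(signal)

def deviation(signal):
    """Population standard deviation of the samples."""
    m = mean(signal)
    return sqrt(sum((x - m) ** 2 for x in signal) / len(signal))

def peaks(signal, threshold):
    """Indices of local maxima that exceed a threshold."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] > signal[i - 1]
            and signal[i] >= signal[i + 1]]
```

A posture or motion classifier could then be built on top of such primitives instead of re-implementing them for every garment.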
Hardware support
There exists a limited number of electronic devices suitable for attachment to or integration into smart textiles. Electronic devices include:
• Sensors able to measure environmental parameters (e.g. light intensity, temperature).
• Sensors that measure physiological parameters from the wearer of a smart garment (e.g. motion, heart rate, pressure applied on a specific surface).
• Output devices to communicate information to users in a visual, auditory or haptic manner (e.g. LEDs, speakers, vibration motors).
• Microcontrollers that execute the application (e.g. Arduino Lilypad, Intel's Curie).
TangoHapps should offer high-level APIs to control these electronic devices.
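A high-level device API of this kind might hide pin-level details behind small device classes, roughly as follows. All class and method names here are our assumptions for illustration, not TangoHapps' actual API:

```python
# Hedged sketch of a high-level API for output devices.
class LED:
    def __init__(self, pin):
        self.pin = pin      # data pin the LED is wired to
        self.on = False

    def turn_on(self):
        # On real hardware this would drive the digital pin high.
        self.on = True

    def turn_off(self):
        self.on = False

class VibrationMotor:
    def __init__(self, pin):
        self.pin = pin
        self.intensity = 0

    def vibrate(self, intensity):
        # Clamp to a PWM-style 0..255 range before writing to the pin.
        self.intensity = max(0, min(255, intensity))
```

Developers would instantiate such classes per device rather than manipulating raw pin voltages themselves.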
Smartphone support
As opposed to smart textiles, smartphones are widespread, and most individuals across different cultures nowadays carry a smartphone throughout the entire day. Modern smartphones offer capabilities that can complement those of a smart textile. Some capabilities of smartphones include:
• High resolution displays that can render text, images and videos.
• User interfaces with a variety of input mechanisms (e.g. sliders, switches, multi-touch gestures).
• Access to contacts and functionality to make calls.
• Access to music libraries and functionality to play music.
• Ability to use the camera to take photos and videos.
TangoHapps should enable the creation of applications that use capabilities available on the smart textile and on modern smartphones at the same time.
Circuit design
The design of electronic circuits on textile is more challenging than the design of circuits for printed circuit boards, mainly due to the flexibility of the textile substrate and the strain it is subject to. Uninsulated textile connections can break or produce short circuits when the substrate deforms or folds. The circuit design for a smart garment must additionally consider user comfort and wearability [35], adding constraints to the placement, shapes and materials to use. TangoHapps should facilitate the design of circuits on top of a model of the textile so that its folding and usage patterns can be taken into consideration. Because a smart textile developer might not have experience in electronics, the IDE should additionally provide feedback about the correctness of circuits.
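Feedback about circuit correctness could start with simple structural checks. One such rule, that every device must be wired to both a power and a ground pin, might be sketched as follows. The data model is hypothetical and only illustrates the idea:

```python
def unpowered_devices(circuit):
    """Return names of devices lacking a power or ground connection.

    `circuit` maps a device name to the set of pin kinds it is wired to,
    e.g. {"led1": {"power", "ground", "data"}}. This is an illustrative
    model only, not TangoHapps' internal representation.
    """
    return [name for name, pins in circuit.items()
            if not {"power", "ground"} <= pins]
```

An IDE could run checks like this continuously while the developer arranges devices on the textile model and flag incomplete wiring.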
Simulation
The production of a smart textile object is relatively risky, as it is unclear at the moment of creation whether the choice of materials and structuring pattern (e.g. sewing, knitting) will fulfill the requirements of the application. Malfunctioning circuits and the unsuitability of the data delivered by sensors might be discovered only after the smart textile has been created and its application developed. Furthermore, the creation of a textile circuit can be time consuming because connections and custom-made textile sensors usually have to be created (e.g. sewn) manually. Simulations allow flaws to be identified sooner, before considerable costs have been invested in a product. Therefore, there is a high motivation for simulating the smart textile before its creation. TangoHapps should provide a way to execute applications within the development environment and to simulate the behavior of sensors and output devices.
Debugging
In order to facilitate comprehension of the runtime behavior of an application, TangoHapps should communicate runtime information to developers. In particular, TangoHapps should display sensor signals as plots and communicate the state of output devices during the simulation of an application. For example, the sound emitted by a speaker could be played within the environment, and the vibration produced by a vibration motor could be communicated to developers by means of an animation.
Deployment
In Circuit Stickers [50] and PaperPulse [95], circuit layouts are printed on a sheet of paper using an inkjet printer filled with conductive ink. After printing the circuit sheet, developers attach the electronic elements to the sheet of paper. Most development environments for physical devices facilitate software deployment by generating source code, which is uploaded to a microcontroller either manually by users or automatically by the environment. TangoHapps should transform a development representation of the application (e.g. source code) into an executable application. Furthermore, TangoHapps should be able to upload the generated executable onto the smart textile.
3.2.2 Non-Functional Requirements
Low entrance barrier
Similar development tools for physical devices facilitate development by offering ready-to-use software components [37, 72, 28, 27, 12, 118, 50, 95, 68]. For example, Microsoft's .NET Gadgeteer offers software components to perform the most common operations with the devices available in the kit (e.g. functionality to make the camera start recording a video) [118]. Developers reuse the provided software components without having to understand their internal behavior. Target users of TangoHapps
might not have experience in software development or electronics. In order to lower the entrance barrier for such users, TangoHapps should provide common smart textile functionality in ready-to-use software components.
High ceiling
The ceiling of a development environment refers to the flexibility it offers developers to develop new use cases. Low-level software abstractions tend to allow more flexibility at the cost of a higher degree of expertise required from developers. High-level software abstractions tend to be easier to grasp but might limit development freedom to specific domains or use cases. TangoHapps should offer software abstractions that provide enough flexibility to experienced software developers, while maintaining a low entrance barrier for users with less experience in software development.
Performance
Hardware integrated into a smart textile is ideally small, thin and lightweight to preserve the positive properties of the textile (flexibility, softness, lightness). The more computation is performed on the smart textile hardware, the higher the memory and processing power requirements, and the bigger and bulkier the microcontrollers and batteries become. Software functionality available in TangoHapps should be optimized for execution on hardware with limited resources.
Extensible with respect to software
In the functional requirements, we identified software functionality used across different smart textile applications. However, this functionality is unlikely to cover every use case smart textile developers might want to address. TangoHapps should make it possible to add new software functionality components without forcing developers to refactor large parts of the environment.
Extensible with respect to hardware devices
Smart textiles are not yet widespread in the market, and new target hardware platforms are likely to appear as the field grows. TangoHapps should make it possible to add support for new electronic devices without having to refactor large parts of the environment.
3.3 Analysis
[UML component diagram not reproduced in this transcript.]
Figure 3.1: Overview of TangoHapps.
TangoHapps consists of various components that fulfill specific purposes in the creation of Applications for smart textiles. The Editor is the main means by which developers access the functionality of the IDE in order to create the Application. The Running Engine executes applications created in TangoHapps. The Simulator supports developers in understanding an application's behavior and fixing undesired application behavior. The Deployer is similar to a compiler: it transforms high-level representations of applications into executable applications and uploads them onto the Smart Textile. Figure 3.1 shows an overview of TangoHapps. In the next subsections, we explain each component in more detail.
3.3.1 Model of Smart Textiles
Smart Textiles consist of a Textile and Electronic Circuits, which are integrated into the Textile. Electronic Circuits consist of Electronic Devices and Connections between them. Electronic Devices can be sensors, output devices and microcontrollers, as we have seen in Chapter 2. Electronic Devices have Pins, which are an interface for the transmission of electric signals to other Electronic Devices. There are two main categories of Pins: Electric Pins and Data Pins. Electric Pins cause current to flow through Electronic Devices, while Data Pins are used to transfer electric signals from Sensors to Microcontrollers and from Microcontrollers to Output Devices. There are, in turn, two types of Electric Pins: Power Pins and Ground Pins. The current in a Circuit flows from Power Pins, through Electronic Devices, into Ground Pins.
Data Pins can be digital or analog depending on how signals are transmitted along them. Digital Pins operate with two possible values: high current and
[UML class diagram not reproduced in this transcript.]
Figure 3.2: Model of a Smart Textile.
low current, and can be used for input (i.e. transmitting data from a Sensor to a Microcontroller) and for output (i.e. transmitting data from a Microcontroller to an Output Device). In contrast to Digital Pins, Analog Pins operate with a range of values and can be used only for input purposes. GPIO Pins (General-Purpose Input/Output Pins) are Pins that can be configured at runtime to be used for input or output. Connections connect two or more Electronic Devices through their Pins. A UML model of smart textiles is shown in Figure 3.2.
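The pin taxonomy described above can be captured in a small class hierarchy. The following sketch uses the class names from the text, but the methods are our own assumptions:

```python
# Illustrative model of the pin taxonomy; the methods are assumptions.
class DataPin:
    """Base class for pins that carry signals between devices."""
    def __init__(self):
        self.mode = None  # "input" or "output"

class DigitalPin(DataPin):
    """Two-valued pin, usable for both input and output."""
    def set_mode(self, mode):
        if mode not in ("input", "output"):
            raise ValueError(mode)
        self.mode = mode

class AnalogPin(DataPin):
    """Range-valued pin; input-only in this model."""
    def set_mode(self, mode):
        if mode != "input":
            raise ValueError("analog pins support input only")
        self.mode = mode

class GPIOPin(DigitalPin):
    """General-purpose pin, configurable for input or output at runtime."""
    pass
```

Modeling the input-only restriction of Analog Pins in code makes it possible to reject invalid configurations before deployment.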
3.3.2 Model of an Application
Applications in TangoHapps are object-oriented; hence they consist of a collection of objects, called Application Objects, that interact with each other. Application Objects also interact with the Electronic Devices on the smart textile (e.g. by reading values from Sensors and writing values to Output Devices).
Application Objects contain data in the form of Variables and code in the form of Methods. Methods might require parameters (called Variables) in order to execute their code. The interaction between objects occurs by means of method invocations. A Method might invoke another Method if the appropriate number and type of parameters required by the target Method are provided.
Application Objects can emit Events. Events are occurrences that might be handled by the application, such as the user pressing a button on his smart jacket. Events can deliver Variables, such as the temperature measured by a temperature sensor. An Event can trigger one or more Methods. However, Events invoking a Method are also constrained by the number and type of parameters required by the Method. The
model of Applications is depicted in Figure 3.3.
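The constraint that an Event may only trigger a Method whose parameter count and types it can satisfy could be expressed roughly as follows. This is a sketch; the names are illustrative, not TangoHapps' implementation:

```python
class Method:
    def __init__(self, name, param_types, body):
        self.name = name
        self.param_types = param_types  # expected parameter types, in order
        self.body = body

    def accepts(self, args):
        # Number and type of the provided Variables must match.
        return (len(args) == len(self.param_types)
                and all(isinstance(a, t)
                        for a, t in zip(args, self.param_types)))

class Event:
    def __init__(self, name):
        self.name = name
        self.triggers = []  # Methods this Event may invoke

    def fire(self, *args):
        # Only Methods whose parameter list matches the delivered
        # Variables are invoked.
        return [m.body(*args) for m in self.triggers if m.accepts(args)]
```

An event such as temperatureChanged delivering a float would thus invoke only Methods declared with a single float parameter.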
[UML class diagram not reproduced in this transcript.]
Figure 3.3: Main abstractions of an Application in TangoHapps.
3.3.3 Model of the Editor
[UML class diagram not reproduced in this transcript.]
Figure 3.4: Model of the Editor.
The Editor offers developers the means to design electric circuits, to implement the functionality of the application and to create user interfaces for smartphones. For this purpose, the Editor consists of three main components: the Circuit Designer, the Code Editor and the UI Designer. The Circuit Designer is used to create Circuit Layouts
visually by arranging electronic devices on top of a model of the smart textile. The Code Editor is an interface for implementing, extending and reusing Code. Instances of the Code class specify the runtime behavior of the Application, including how Electronic Devices and capabilities of a smartphone should be controlled. The User Interface Designer enables the creation of Smartphone User Interfaces by making components that wrap common smartphone capabilities (e.g. music playing) available for reuse. A UML model of the Editor is shown in Figure 3.4.
3.3.4 Model of the Simulator
[UML class diagram not reproduced in this transcript.]
Figure 3.5: Model of the Simulator.
The Simulator displays Debugging Information with the purpose of helping developers understand the runtime behavior of an Application. Debugging Information is provided by the Running Engine during the execution of an Application. In order to execute an Application without having to deploy it to the real hardware, the behavior of the different devices is simulated within TangoHapps. Virtual Devices fulfill two main purposes. First, they communicate the state of the device to the developer by means of images, animations and sounds. For example, a buzzer will begin to shake and produce a sound to indicate that it has been turned on. Second, they accept User Input in order to simulate user interaction with the device and environmental parameters. For example, touching a simulated temperature sensor causes it to deliver a higher temperature reading. The Virtual Device superclass provides the means for its subclasses to control an Application by modifying its internal state in the same way as the Running Engine would during real execution. In order for simulations to be as close to real executions as possible, the Application class does not offer special functionality or a special interface for simulation purposes. A UML model of the Simulator is shown in Figure 3.5.
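The way a Virtual Device feeds simulated readings into an Application through the same interface a real device would use can be sketched as follows (all names are hypothetical):

```python
class VirtualTemperatureSensor:
    """Simulated sensor: 'touching' it raises the delivered reading."""
    def __init__(self, baseline=20.0):
        self.baseline = baseline
        self.touched = False

    def on_user_input(self, touched):
        # User Input that simulates touching the sensor in the editor.
        self.touched = touched

    def read(self):
        return self.baseline + (10.0 if self.touched else 0.0)

class Application:
    """Polls its sensor without knowing whether it is real or simulated."""
    def __init__(self, sensor):
        self.sensor = sensor
        self.readings = []

    def step(self):
        self.readings.append(self.sensor.read())
```

Because the Application sees only the sensor's read interface, the same application code runs unchanged against real hardware or the Simulator.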
3.3.5 Model of the Deployer
Figure 3.6: Model of the Deployer.
The main tasks of the Deployer are to “compile” Projects into Applications and to upload executable Applications to the target devices. Projects are high-level representations of Applications that contain the assets necessary to create an Application. The main difference between a Project and an Application is that, in contrast to Projects, Applications are executable. The Deployer first transforms a Project into an executable Application and then transfers the executable Application to the target Device. Target Devices include a Smartphone and the Smart Textile. The Deployer delegates the task of serializing and transferring applications wirelessly to the Uploader. A UML model of the Deployer is shown in Figure 3.6.
Chapter 4
TangoHapps Design
In the previous chapter, we elicited the functional and non-functional requirements and created first models that serve as a basis for the development of TangoHapps. This chapter first describes the trade-offs we encountered and the decisions we made in the design of TangoHapps. In particular, we describe how the different alternatives we analyzed would have affected the system in terms of the non-functional requirements elicited in the previous section. The rest of the chapter describes TangoHapps' internal structure in a top-down manner. We start with TangoHapps' high-level design and architecture and then focus on the details of each subsystem.
4.1 Design Decisions
In this section, we describe the design decisions we made in order to optimize for the qualities of the system described in Section 3.2.2. These are:
Hybrid text and visual programming
Visual programming languages have been shown to lower the entrance barrier for users with little programming knowledge [68, 49, 7, 83, 84]. In TangoHapps, software functionality components, electronic devices and the elements composing the user interface of a smartphone are represented visually. However, a visual programming language might not scale well for complex applications and might be inconvenient for users of TangoHapps who already possess software development experience. In order to fulfill both non-functional requirements of Low entrance barrier and High ceiling, we decided to make TangoHapps support both a visual and a textual programming language.
Prototype-based programming
Prototype-based programming is an object-oriented programming paradigm in which new objects are created by cloning existing object prototypes. After an object has been cloned, functionality is added to it that distinguishes it from its prototype. As an example, to create a tennis ball out of a soccer ball, one might clone the soccer ball, change its color to green and reduce its size to 6.5 cm. This is in contrast to the class-based programming paradigm, where a model has to be created before any object can exist. TangoHapps should offer high-level reusable components. In order to facilitate the reuse of these components, we decided to make them available in a palette of object prototypes that developers can clone by simply dragging and dropping them into a canvas.
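JavaScript, the language of TextIT plugins, is itself prototype-based, so the soccer-ball example can be expressed directly: a new object is created by cloning an existing prototype and then specialized, with no class in sight.

```javascript
// A prototype object: a ball with a color and a diameter in cm.
const soccerBall = {
  color: "white",
  diameter: 22,
  describe() {
    return `${this.color} ball, ${this.diameter} cm`;
  }
};

// Clone the prototype, then specialize the clone.
const tennisBall = Object.create(soccerBall);
tennisBall.color = "green";
tennisBall.diameter = 6.5;

console.log(soccerBall.describe()); // white ball, 22 cm
console.log(tennisBall.describe()); // green ball, 6.5 cm
```

The clone inherits `describe()` through its prototype chain; only the properties that distinguish it from the prototype are stored on the clone itself.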
Flow-based programming
Flow-based programming is a programming paradigm in which data is passed through different components, or boxes. Each box executes specific computations in order to transform the data. Program behavior is defined by connecting boxes to each other, causing the output of one box to be the input of a subsequent box. The flow-based programming paradigm has been successfully implemented in several development environments for physical devices [37, 72, 28, 27, 84, 9] and in development tools for other domains, such as Node-RED, Max/MSP and PureData. For example, the CRN Toolbox represents ready-to-use software components, including filters, feature extraction and classification algorithms, as boxes that are shown in a canvas. Users of the CRN Toolbox connect the outputs of a box to the inputs of another box in order to create machine learning algorithms.
According to the requirements we elicited in Section 3.2, TangoHapps should provide high-level reusable components. Furthermore, we decided that TangoHapps should feature a visual programming language. The flow-based programming paradigm would enable developers of TangoHapps to develop applications by connecting visual representations of the functionality components.
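A minimal sketch of the flow-based idea follows; the `Box` class and its API are illustrative, not part of TangoHapps. Each box applies one transformation and forwards the result to every box wired to its output.

```javascript
// A box performs one computation and forwards the result to connected boxes.
class Box {
  constructor(transform) {
    this.transform = transform;
    this.targets = [];
  }
  connect(box) {
    this.targets.push(box);
    return box; // allow chaining: a.connect(b).connect(c)
  }
  receive(value) {
    const out = this.transform(value);
    this.last = out; // remember the most recent output
    this.targets.forEach(t => t.receive(out));
  }
}

// Wire three boxes into a pipeline: scale -> offset -> threshold.
const scale = new Box(x => x * 2);
const offset = new Box(x => x + 1);
const threshold = new Box(x => x > 10);
scale.connect(offset).connect(threshold);

scale.receive(6);
console.log(threshold.last); // true (6 * 2 + 1 = 13 > 10)
```

The program is entirely defined by the wiring; no box knows what precedes or follows it, which is what makes the boxes reusable.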
Tablet device for design
We decided to implement the visual programming interface on a tablet device for the following reasons, aligned with our requirement of Low entrance barrier:
1. Tablet devices allow for direct manipulation of objects on the touchscreen. Representing software functionality components visually and further making them tangible could help make programming less abstract to novices.
2. Tablet devices enable usage by multiple users simultaneously, which can be beneficial for developers with less experience in programming and electronics, who can learn by collaborating on the same project.

3. TangoHapps should support the creation of applications for smart textiles that use the capabilities within smartphones. Because tablet devices have similar capabilities to smartphones, they allow for a realistic design and simulation of the smartphone application.
Delegation of computations to a smartphone
As we discussed in Section 3.2.2, complex computations would lead to heavier and bulkier hardware attached to the textile. In order to allow for lightweight constructions of smart textiles, we decided to delegate most of TangoHapps' runtime computations to a smartphone. As possible devices for the delegation of computations, we considered mobile and non-mobile devices. We decided on a mobile device based on three facts:
1. As discussed in the Functional Requirements, we decided that TangoHapps should support smartphones because they offer useful capabilities not provided by current smart textiles. The choice to delegate computations to other devices would require the smart textile to communicate with yet another external device, which adds a point of failure and might limit performance. Instead, the computations can be delegated to the same smartphone.
2. Non-mobile devices would require the transmission of sensor data over longer-range communication technologies, which in general lead to higher power consumption rates. Instead, modern mobile devices enable communication over Bluetooth Low Energy (also called Bluetooth 4.0), a Bluetooth technology optimized for low energy consumption.
3. The delegation of computations to an external device makes the smart textile dependent on it for its functioning. However, long-range communication technologies might not be available everywhere (e.g. in the metro). Hosting the computations on a mobile device with a direct communication link to the smart textile would enable the usage of smart textiles everywhere, assuming the mobile device is carried by the user.
Interpreted language
We chose to have TangoHapps applications interpreted rather than compiled for the following reasons:
1. An advantage of interpreted languages over compiled languages is platform independence. Code interpreters handle platform-specific issues, making applications reusable across different platforms. The use of an interpreted language, therefore, would make TangoHapps more extensible to new hardware, as described in the non-functional requirement Extensibility with respect to hardware.
2. Related to the previous point is the fact that an interpreted language would allow users of TangoHapps to deal with a high-level syntax independent of specific platforms rather than forcing them to deal with the nuances of different platforms.
3. Interpreted languages can be easier to debug because interpreters process the same representation of the software as the one used by developers. Compiled languages, instead, transform the representation used by developers, making it harder to relate execution errors to mistakes in the language. An interpreted language would offer more potential for TangoHapps to display useful runtime information, which is aligned with the non-functional requirement of Low entrance barrier.
4.2 High-Level Design
Figure 4.1: High-level design of TangoHapps.
TangoHapps consists of two main components: Interactex and TextIT. Interactex is a visual programming environment that includes three tools: Interactex Designer, Interactex Client and Interactex Firmware [42]. Interactex Designer is used to develop and debug applications. Applications created in Interactex Designer are deployed to Interactex Client, where they are executed. Interactex Client controls the hardware on the smart textile through APIs offered by Interactex Firmware. Interactex Firmware runs on the microcontroller attached to the smart textile. TextIT is a hybrid text and visual programming environment used to extend the functionality of Interactex. Software components are developed using TextIT and imported into Interactex Designer, where they are linked to a specific hardware setup and reused in the context of an application. Figure 4.1 depicts the high-level design of TangoHapps.
4.3 Architecture
Figure 4.2: TangoHapps' layered architecture. Software components on the Hardware layer appear grayed out because they consist of libraries developed by third-party developers and reused within TangoHapps.
TangoHapps is based on a layered architecture. The Development layer is the uppermost layer in the hierarchy and contains the functionality to develop, debug and deploy smart textile applications. For this purpose, the Development layer has three main components: the Editor, the Simulator and the Deployer. The Editor is the equivalent of a source code editor in a traditional IDE and is the main interface developers use in order to create applications. The Simulator contains the main functionality to simulate and debug an application from within the environment, without having to deploy it to the smart textile. The Deployer transforms the application's source code into executables and uploads the executables to the target devices. The Development layer is distributed among TextIT and Interactex Designer.
The Execution layer contains the main functionality to execute applications developed in TangoHapps. Software components in the Execution layer include the Running Engine, the Plugin Interpreter and the Firmata components. The Running Engine executes applications developed in Interactex Designer and the Plugin Interpreter interprets plugins developed in TextIT. Firmata executes input and output operations on the smart textile's hardware and offers a service for other components to control the hardware remotely. The Execution layer is distributed among Interactex Client and Interactex Firmware.

The lowest layer in the architecture is the Hardware layer. The Hardware layer
contains drivers and libraries to control the electronic devices attached to the textile and the user's smartphone. The Hardware layer offers an interface to upper layers for accessing the hardware capabilities without having to deal with the complexity of the hardware implementation. The Hardware layer is composed of Hardware Drivers, the Microcontroller Firmware and a set of APIs provided by the smartphone's OS. The Hardware Drivers software component consists of a set of source code libraries used to control specific sensor and output devices. Similarly, the Microcontroller Firmware component consists of a set of source code libraries for controlling capabilities of different microcontrollers. The Mobile Phone OS component manages the mobile phone's hardware and software resources and provides high-level APIs to other components to control it. TangoHapps' architecture is shown in Figure 4.2.
The following sections in this chapter describe in detail each software component of TangoHapps in a bottom-up manner, starting at the Hardware layer and moving up to the Development layer.
4.4 Hardware/Software Mapping
TangoHapps runs on four different hardware devices. TextIT runs in a browser. Therefore, it executes on any computing device able to run a browser. Interactex Designer executes on multi-touch tablet devices. Interactex Client executes on smartphones. The current implementations of Interactex Designer and Interactex Client are made for iOS. Therefore, Interactex Designer can be installed on an iPad device and Interactex Client on an iPhone or iPod Touch. Interactex Firmware executes on a microcontroller. The current implementation of Interactex Firmware is made for Arduino. Figure 4.3 displays the distribution of the software components introduced in the previous section onto hardware nodes.
Figure 4.3: Deployment of software components in TangoHapps.
4.5 Running Engine
Figure 4.4: Main classes of the Running Engine software component.
The Running Engine is the component mainly responsible for executing applications. Runner is the class that initiates execution of an application by instantiating the Application class and invoking its execute() method. Applications are containers for Application Objects and the interactions between them. We mentioned earlier that Application Objects consist of Events, Methods and Variables and that Events of an Application Object can invoke Methods of other Application Objects if the number and types of parameters match. Types supported in TangoHapps are Numbers, Boolean values and Strings. Objects in TangoHapps are treated like every other Type, which enables Events to provide and Methods to expect Objects as parameters. Instances of the Invocation class store the necessary information to perform the invocation at runtime. More specifically, Invocations reference the Event that triggers the invocation, the Method that should be invoked, and the Variables that should be passed as parameters to the Method. Figure 4.4 displays the main classes of the Running Engine.
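The Event–Invocation–Method relationship can be sketched as follows. This is a strong simplification of the actual classes; the names and the stub LED object are illustrative.

```javascript
// An Invocation links an Event to a Method on a target and the arguments
// to pass when the Event fires.
class Invocation {
  constructor(target, method, args) {
    this.target = target;
    this.method = method;
    this.args = args;
  }
  perform() {
    this.target[this.method](...this.args);
  }
}

// An Event triggers every Invocation registered for it.
class Event {
  constructor(name) {
    this.name = name;
    this.invocations = [];
  }
  trigger() {
    this.invocations.forEach(inv => inv.perform());
  }
}

// Example: a button-press event wired to turn an LED on.
const led = {
  on: false,
  turnOn() { this.on = true; }
};
const pressed = new Event("pressed");
pressed.invocations.push(new Invocation(led, "turnOn", []));
pressed.trigger();
console.log(led.on); // true
```

The Event knows nothing about LEDs; the Invocation is the only piece that binds a specific Event to a specific Method, which mirrors the role the Invocation class plays in the Running Engine.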
The functionality that is executed by the Running Engine is distributed among Application Objects. Application Objects are reusable abstractions of physical or digital entities. For example, the Light Sensor Application Object is an abstraction in TangoHapps that represents a physical light sensor. Application Objects include Electronic Devices, UI Widgets and Programming Objects. Electronic Devices encapsulate low-level microcontroller instructions to access sensors, output devices and microcontrollers and offer a high-level API to control such devices. UI Widgets represent the widgets present in a smartphone's user interface. Programming Objects are constructs commonly present in programming languages, such as arithmetic operators and functions. In the following subsections, we list every Application Object available in TangoHapps. A complete description of Application Objects, including their Events, Methods and Variables, is available in Chapter A in the Appendix. In order to distinguish between an Application Object and the physical or digital entity it represents, Application Objects appear in italics throughout the rest of this Ph.D. thesis. A taxonomy of Application Objects is presented in Figure 4.5.
Figure 4.5: Taxonomy of Application Objects.
4.5.1 Electronic Devices
TangoHapps supports every sensor, output device and microcontroller from the Arduino Lilypad1 hardware kit and additional sensors that are also suitable for integration in smart textiles, such as the MPU 6050 Inertial Measurement Unit (IMU). Sensors supported in TangoHapps are listed next.
• Buttons are similar to UI Widget Buttons. Buttons can be pressed and emit events when pressed and when released.
• Switches are on/off switches and, like UI Widget Switches, they emit events when their state changes.
• Light Sensors measure the intensity of the light and emit an event when a new sensor reading is available.
1 https://www.sparkfun.com/products/retired/10354
• Temperature Sensors are similar to Light Sensors. Temperature Sensors measure temperature and emit an event when a new sensor reading is available.
• Accelerometers measure acceleration forces in a 3D space and emit an event when a new sensor reading is available.
• LSM Compass represents the LSM303C 3-axis accelerometer and 3-axis compass2. The LSM Compass emits events when either new accelerometer or compass readings become available.
• MPU 6050 represents the MPU 6050 3-axis accelerometer and 3-axis gyroscope3. Like the LSM Compass, the MPU 6050 emits events when new accelerometer or gyroscope readings become available.
• Potentiometers represent a generic analog textile sensor, including sensors based on the capacitive, resistive and optical principles.
Output Devices supported in TangoHapps are:
• LEDs stand for Light Emitting Diodes. LEDs can be turned on and off and their intensity can be set. These LEDs emit light of a single color.
• Buzzers emit a sound at a specified frequency. Like LEDs, Buzzers can be turned on and off and their frequency can be set.
• Three-Color LEDs emit light in different colors. Three-Color LEDs can be turned on and off, and the intensities of their red, green and blue components can be set in order to produce different light frequencies.
• Vibration Boards produce a vibration at a specified frequency. Like Buzzers, they can be turned on and off and their vibration frequency can be set.
TangoHapps supports three Arduino Lilypad microcontroller models. All of them operate with supply voltages between 2.7 and 5.5 V at a frequency of 8 MHz. The specific capabilities of these microcontrollers are listed below:
• Lilypads represent the microcontroller officially called “Lilypad Arduino Main Board”4. Lilypads contain 14 digital pins, 6 analog pins and 16 KB of flash memory. Lilypads have no placeholder for a battery and do not include a module for wireless communication; these hardware parts need to be attached externally to the board.
• Lilypad Simple represents the microcontroller officially called “Lilypad Arduino Simple”5. Lilypad Simple microcontrollers have fewer pins and more memory than Lilypads: 9 digital pins, 4 analog pins and 32 KB of flash memory. Lilypad Simple microcontrollers have a placeholder for a battery but no wireless communication module.
2 https://www.sparkfun.com/products/13303
3 http://www.invensense.com/products/motion-tracking/6-axis/mpu-6050/
4 https://www.arduino.cc/en/Main/ArduinoBoardLilyPad
5 https://www.arduino.cc/en/Main/ArduinoBoardLilyPadSimple
• BLE-Lilypad represents a microcontroller developed by the Universität der Künste in Berlin based on the Lilypad Arduino Main Board. The BLE-Lilypad has the same features as the Lilypad and additionally features a placeholder for a battery and an integrated Bluetooth Low Energy module.
4.5.2 UI Widgets
UI Widgets (i.e. reusable elements of a graphical user interface) are used to create the interface shown on the mobile device. UI Widgets range from simple elements, such as buttons, labels and sliders, to more complex elements used to access specific capabilities of a smartphone, such as its contacts and music lists. UI Widgets have Variables that enable developers to configure their visual appearance, such as their position and dimensions within the mobile device's screen and their background color. UI Widgets can be simple (e.g. a button) or contain other UI Widgets (e.g. the Music Player, which contains Buttons, Labels, Switches and Sliders). The subclasses of UI Widgets supported in TangoHapps are listed next.
• Buttons are equivalent to push buttons on mechanical devices. Buttons emit events when they start to be pressed and when they are released.
• Labels are used to display text. Their text can be set statically or dynamically.
• Switches are equivalent to an on/off switch (e.g. a light switch). Switches have two possible states, on and off, and emit events when their state changes.
• Sliders have a handle that allows users to select a value from a range of allowed values. Sliders emit events when the handle is moved.
• Touchpads represent an area on the smartphone where multi-touch gestures are recognized. Multi-touch gestures supported by the Touchpad are: tap, double tap, long tap, pinch and pan. Touchpads emit events when a multi-touch gesture is recognized.
• Image Views are used to display an image, which is set statically.
• Music Players are composite widgets that contain Buttons, Labels, Image Views and Sliders. Music Players contain functionality to iterate through and play music stored on the user's mobile device. Music playlists can be controlled by users through the Music Player's user interface or by other Application Objects through its methods.
• Contact Books are composite widgets, similar to Music Players, that contain Buttons, Labels and Image Views. Contact Books contain functionality to iterate through the user's contacts and to make phone calls if the mobile device supports it. In a similar way to Music Players, Contact Books offer an interface to users as well as to other Application Objects to access their functionality.
• Monitors are composite widgets used to display series of values in a plot. This is useful for applications where users need to monitor sensor values over time.
4.5.3 Programming Objects
Programming Objects are the main means by which developers reuse, extend and implement behavior in TangoHapps. Programming Objects range from abstractions over simple programming constructs (e.g. variables, arithmetic operators) to high-level software components (e.g. an accelerometer-based human posture classifier). Programming Objects can be either Operators, Variables or Functions.
Operators can be Arithmetic, Comparison or Logical Operators. Arithmetic Operators take two numeric inputs and produce a numeric output. Supported Arithmetic Operators are Addition, Subtraction, Multiplication, Division and Modulo. Comparison Operators compare two numeric values and provide the result of the comparison as a Boolean Variable. Supported Comparison Operators are: Bigger, BiggerEqual, Smaller, SmallerEqual, Equal and NotEqual. Logical Operators take two Boolean inputs and provide the result of a logical operation as output. Supported Logical Operators are the AndOperator (an AND-conjunction) and the OrOperator (an OR-disjunction).
Variables store values and emit events when the stored value has changed, so that other objects can react to the change. Supported Variables in TangoHapps are: Booleans, Numbers, Strings and Objects. Booleans store Boolean values, Numbers store real numbers including integer and floating point numbers, and Strings store chains of characters.
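A Variable that notifies its observers only when the stored value actually changes can be sketched as follows (an illustrative implementation, not the TangoHapps class):

```javascript
// A Variable stores a value and notifies subscribers on change.
class Variable {
  constructor(value) {
    this.value = value;
    this.subscribers = [];
  }
  subscribe(fn) {
    this.subscribers.push(fn);
  }
  set(value) {
    if (value === this.value) return; // only real changes emit an event
    this.value = value;
    this.subscribers.forEach(fn => fn(value));
  }
}

const steps = new Variable(0);
const log = [];
steps.subscribe(v => log.push(v));
steps.set(1);
steps.set(1); // no change, no event
steps.set(2);
console.log(log); // [1, 2]
```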
Functions are executable code procedures, comparable to functions in text-based programming. An example of a Function is the Mean Extractor. The Mean Extractor calculates the mean of a set of values it receives as input. Functions can be Predefined Functions, Plugins or Composite Functions. Predefined Functions are functions already available in TangoHapps. In contrast, Plugins are code not originally available in TangoHapps but imported from TextIT. We explain the concept of Plugins in detail in Section 4.6. Predefined Functions for signal processing are:
• Windows take streams of values, buffer them and deliver chunks of values. Windows are useful for signal processing algorithms that process chunks of a signal. The number of samples contained in a chunk and the number of overlapping samples between two consecutive chunks can be configured by developers.
• Low-Pass Filters smooth out a signal by attenuating its high frequencies.
• High-Pass Filters are the opposite of Low-Pass Filters. High-Pass Filters attenuate the low frequencies of a signal and pass its high frequencies.
• Mean Extractors compute the mean of a set of values.
• Deviation Extractors compute the deviation of a set of values.
• Peak Detectors compute the highest peak of a set of values and deliver its value and index within the set of values.
• Activity Classifiers determine the user's physical activity (walking, running, climbing, quiet) based on accelerometer input.
• Posture Classifiers determine the user's posture (standing, lying down on the belly, lying down on the back) based on accelerometer and gyroscope input.
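The Window behavior described above, buffering a stream and emitting chunks with a configurable size and overlap, can be sketched as follows (an illustrative implementation):

```javascript
// Buffers a stream and delivers chunks of `size` samples, with `overlap`
// samples shared between two consecutive chunks.
class Window {
  constructor(size, overlap, onChunk) {
    this.size = size;
    this.overlap = overlap;
    this.onChunk = onChunk;
    this.buffer = [];
  }
  receive(sample) {
    this.buffer.push(sample);
    if (this.buffer.length === this.size) {
      this.onChunk(this.buffer.slice());
      // keep the last `overlap` samples for the next chunk
      this.buffer = this.buffer.slice(this.size - this.overlap);
    }
  }
}

const chunks = [];
const win = new Window(4, 2, c => chunks.push(c));
[1, 2, 3, 4, 5, 6].forEach(s => win.receive(s));
console.log(chunks); // [[1, 2, 3, 4], [3, 4, 5, 6]]
```

With size 4 and overlap 2, every new chunk repeats the last two samples of the previous one, which is the usual setup for sliding-window feature extraction.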
Other Predefined Functions available in TangoHapps are:
• Timers generate events after a specified amount of time.
• Sounds reproduce audio files over the user's mobile device. Sounds differ from Music Players in that they do not access the device's music lists, but rather play a sound predefined at development time. Sounds are useful for user interfaces.
• Recorders offer functionality to store a set of values on the mobile device's hard drive and to reproduce it later.
• Mappers scale numeric values within a specified range. They are useful for adapting either sensor input to a specific range or values to the range required by an output device.
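A Mapper amounts to a linear rescaling from one range to another; a one-function sketch (the function name and ranges are illustrative):

```javascript
// Linearly map `x` from [inMin, inMax] to [outMin, outMax].
function map(x, inMin, inMax, outMin, outMax) {
  return outMin + (x - inMin) * (outMax - outMin) / (inMax - inMin);
}

// Example: adapt a 10-bit analog reading (0..1023) to an LED intensity (0..255).
console.log(map(1023, 0, 1023, 0, 255)); // 255
console.log(map(0, 0, 1023, 0, 255));    // 0
```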
Composite Functions group Programming Objects and Invocations between them in order to construct complex Programming Objects by aggregating simpler ones. Composite Functions were designed following the Composite design pattern [34]. Composite Functions enable developers to dynamically define their Methods and Events. Methods and Events of a Composite Function do not contain any specific behavior, but simply redirect to a Method or Event of a Programming Object contained within the Composite Function.
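The redirection behavior of Composite Functions can be sketched as follows. The names are illustrative, and only the Method side is shown; the actual classes also redirect Events.

```javascript
// A Composite Function exposes methods that simply redirect to a method of
// one of the Programming Objects it contains (Composite pattern).
class CompositeFunction {
  constructor() {
    this.children = [];
    this.exposed = {}; // exposed name -> { target, method }
  }
  add(child) {
    this.children.push(child);
  }
  exposeMethod(name, target, method) {
    this.exposed[name] = { target, method };
  }
  invoke(name, ...args) {
    const { target, method } = this.exposed[name];
    return target[method](...args); // pure redirection, no own behavior
  }
}

// A simple child: a mean extractor.
const mean = { compute: xs => xs.reduce((a, b) => a + b, 0) / xs.length };
const composite = new CompositeFunction();
composite.add(mean);
composite.exposeMethod("run", mean, "compute");
console.log(composite.invoke("run", [2, 4, 6])); // 4
```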
4.6 Plugin Interpreter
The Plugin Interpreter executes JavaScript plugins developed in TextIT. The Interpreter class is the Facade of the Plugin Interpreter [34] and offers a service to other components for interpreting Plugins. Plugins have code, which is contained within the Script class. A Script might reuse code contained in other Scripts. The Plugin class is a subclass of Application Object. This allows Plugins to be executed by the Running Engine in the same way as any other Application Object. The Interpreter delegates the execution of Scripts to the JSContext.
The JSContext is a class provided by Apple's JavaScriptCore framework that represents a JavaScript execution environment. The data used by the JSContext class is stored in instances of the JSValue class. JSValues represent JavaScript variables. Supported variable types include numbers, arrays, objects and functions. The JSContext produces execution information when it executes JavaScript code.
The Interpreter parses the execution results provided by the JSContext and creates an instance of the Debug Info class for each error, warning and output message. The Debug Info contains the message represented as a String and the line number where the message originated. The Interpreter passes instances of Debug Info over to the Debugger. The Debugger displays syntax errors, warnings and output messages in order for developers to identify issues in the code syntax and undesired runtime behavior. The main classes of the Plugin Interpreter and their relationships are shown in Figure 4.6.

Figure 4.6: Main classes of the Plugin Interpreter component.
4.7 Firmata Library
The Firmata Library is the component closest to the Hardware layer and acts as an adapter between the Execution layer and the Hardware layer. The Firmata Library transforms high-level operations on hardware devices into instructions that the microcontroller attached to the smart textile is able to execute. For example, an operation to turn an LED on will be transformed by the Firmata Library into an instruction to execute a digital output command on the microcontroller pin connected to the LED.
The communication between the mobile device and the microcontroller attached to the smart textile conforms to the Firmata protocol6. Firmata is a generic protocol that enables the communication between a microcontroller and a host computer. The Firmata class is the Facade of the component and offers a set of high-level methods to send messages to the microcontroller and to handle messages received from it. Internally, Firmata converts high-level method calls into sequences of bytes and parses sequences of bytes in order to invoke the appropriate handler method.
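For illustration, a digital I/O message in the Firmata protocol consists of a command byte followed by the port's pin states split into two 7-bit payload bytes. The sketch below follows the protocol's published message format; consult the Firmata specification for the full command set.

```javascript
// Encode a Firmata digital I/O message: command byte 0x90 | port,
// then the 14 pin bits of the port split into two 7-bit payload bytes
// (Firmata payloads never set the high bit, which marks command bytes).
function encodeDigitalMessage(port, pinBits) {
  return [
    0x90 | (port & 0x0f), // DIGITAL_MESSAGE for this port
    pinBits & 0x7f,        // least significant 7 bits
    (pinBits >> 7) & 0x7f  // most significant 7 bits
  ];
}

// Drive pin 2 of port 0 high (bitmask 0b100 = 4):
console.log(encodeDigitalMessage(0, 1 << 2)); // [144, 4, 0]
```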
The Communication Module defines an interface for sending and receiving data.
6 http://www.firmata.org
Figure 4.7: Main classes of the Firmata Library component. The Virtual Communication Module belongs to the Simulator software component.
There are currently two implementations of the Communication Module. The BLE Communication Module sends data over the device's Bluetooth Low Energy interface. The Virtual Communication Module belongs to the Simulator software component and is used to display data on a debugging interface in order for developers to keep track of messages sent to the microcontroller.
The Firmata Library follows the Observer [34] and Delegation patterns for sending data to the microcontroller and handling received data. The Observer pattern is used by the Firmata class, which observes changes in Pins and I2C Devices and synchronizes their state with the actual hardware. The Delegation pattern is used between the Communication Module and the Firmata class. The Communication Module delegates data to Firmata without knowing the details of its implementation beyond the fact that it implements the Communication Module Delegate interface. The Communication Module Delegate interface enforces the implementation of the receiveData() method, which handles data received over the current Communication Module. The use of the Observer and Delegation patterns decouples the Pin, I2C Device and Communication Module classes from the Firmata class. Furthermore, the usage of these design patterns improves the maintainability of the Firmata class and the reusability of the classes it communicates with. A UML model of the Firmata Library is shown in Figure 4.7.
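The delegation between a Communication Module and its delegate can be sketched as follows. JavaScript has no compile-time interfaces, so the required method is checked at registration time; the names are illustrative.

```javascript
// A delegate must implement receiveData(); the Communication Module forwards
// incoming bytes to it without knowing anything else about its type.
class CommunicationModule {
  setDelegate(delegate) {
    if (typeof delegate.receiveData !== "function") {
      throw new TypeError("delegate must implement receiveData()");
    }
    this.delegate = delegate;
  }
  onBytes(bytes) {
    this.delegate.receiveData(bytes);
  }
}

const received = [];
const firmata = { receiveData: bytes => received.push(bytes) };
const ble = new CommunicationModule();
ble.setDelegate(firmata);
ble.onBytes([0x90, 4, 0]); // received is now [[144, 4, 0]]
```

Because the module only depends on the delegate interface, the same class can serve the BLE implementation and the Simulator's virtual implementation unchanged.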
4.8 Editor
Figure 4.8: Overview of the Editor software component.
The Editor offers an interface for developers to design circuits and create applications. The Editor consists of three main components: the Palette, the Toolbar and the Canvas. The Palette fulfills two tasks. First, it displays a list of Application Objects that developers drag into the Canvas in order to reuse their functionality. Second, it offers an interface for modifying the Variables of the object that is currently selected in the Canvas. The Toolbar contains Toolbar Buttons that are used to manipulate objects in the Canvas and modify the Editor's state. The Canvas displays Edition Objects and enables developers to manipulate them (e.g. move them, delete them, define Invocations between them). Developers manipulate Edition Objects by means of Multi-touch Gestures. The Multi-touch Gesture is an abstract class that defines an interface for recognizing multi-touch gestures. Subclasses of Multi-touch Gesture implement the behavior for recognizing a gesture based on sequences of user touches. Currently supported multi-touch gestures are taps, double taps, pans, pinches and rotations. A UML model of the Editor's main classes is shown in Figure 4.8. We continue this section with a more detailed description of the components in the Editor and finally describe how they can be used to design a circuit and develop an application.
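The abstract Multi-touch Gesture class can be sketched as below. The recognition rules and all names except the class name are simplified assumptions for illustration; the actual recognizers work on full touch sequences.

```javascript
// Abstract interface: subclasses recognize a gesture from user touches.
class MultiTouchGesture {
  recognize(touches) {
    throw new Error("abstract: implemented by subclasses");
  }
}

class TapGesture extends MultiTouchGesture {
  // Simplified rule: a single touch that moved less than 10 points.
  recognize(touches) {
    if (touches.length !== 1) return false;
    const t = touches[0];
    return Math.hypot(t.endX - t.startX, t.endY - t.startY) < 10;
  }
}

class PanGesture extends MultiTouchGesture {
  // Simplified rule: a single touch that moved a noticeable distance.
  recognize(touches) {
    if (touches.length !== 1) return false;
    const t = touches[0];
    return Math.hypot(t.endX - t.startX, t.endY - t.startY) >= 10;
  }
}

const tap = { startX: 100, startY: 100, endX: 102, endY: 101 };
const drag = { startX: 100, startY: 100, endX: 180, endY: 140 };
const tapRecognized = new TapGesture().recognize([tap]);
const panRecognized = new PanGesture().recognize([drag]);
```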
4.8.1 Palette
Figure 4.9: Main classes involved in the Palette.
The Palette enables developers to add objects to a Project and edit them. The Palette contains two views, the Objects View and the Configuration View. The Objects View displays a list of every object available in TangoHapps for reuse and the Configuration View enables developers to modify an object's configuration.
Objects displayed in the Objects View are called Palette Items. A Palette Item is a visual representation of an Application Object that contains an image and a name. Palette Items can be dragged into the Canvas, but can only be dropped at specific positions within it. For example, UI Widgets can only be dropped on top of the smartphone and Hardware Devices can only be dropped on top of a textile. For this reason, each Palette Item subclass has a different implementation of the canBeDropped() method.
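The per-subclass canBeDropped() behavior can be sketched as follows. The subclass names and the string-based drop targets are our assumptions for illustration only.

```javascript
// Each Palette Item subclass decides where it may be dropped.
class PaletteItem {
  canBeDropped(target) {
    return true; // default: droppable anywhere on the Canvas
  }
}

class UIWidgetItem extends PaletteItem {
  // UI Widgets may only be dropped on the smartphone.
  canBeDropped(target) {
    return target === "smartphone";
  }
}

class HardwareDeviceItem extends PaletteItem {
  // Hardware Devices may only be dropped on the textile.
  canBeDropped(target) {
    return target === "textile";
  }
}

const ledItem = new HardwareDeviceItem();
const labelItem = new UIWidgetItem();
const okDrop = ledItem.canBeDropped("textile");    // accepted
const badDrop = labelItem.canBeDropped("textile"); // rejected
```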
When dropped, each Palette Item instantiates a different Edition Object. We have described the concept of Composite Functions in Section 4.5. Composite Functions are Programming Objects that have been created by aggregation of simpler Programming Objects in order to improve the reusability of the functionality developed with TangoHapps. Composite Functions that have been created in the Canvas can be dragged into the Palette for later reuse in the same or in a different Project. Palette Items that were originally available in TangoHapps are called Static Palette Items; they are programmed to instantiate a specific Edition Object. In contrast, Dynamic Palette Items instantiate a Composite Function that has been added to the Palette at runtime.
It is unlikely that the default configuration of an Application Object fulfills the requirements of a specific use case. Application Objects can expose their Variables in order for developers to modify their default behavior. Variables exposed by an Application Object are modified through the Variables View. Each Edition Object generates a different instance of a Variables View containing a user interface with controls to modify an Application Object's exposed Variables. A complete list of every Application Object's Variables is available in Chapter A in the Appendix. A UML model of the main classes of the Palette and the relationships between them is provided in Figure 4.9.
4.8.2 Toolbar
Figure 4.10: Main classes involved in the Toolbar.
Toolbar Buttons available in the Toolbar are the Connection Button, the Copy Button, the Remove Button and the Hardware Button. Each Toolbar Button sets the Editor into a different state. The Editor reacts differently to user input depending on its current state. We have designed the Editor's response to user input as a State pattern [34]. Every subclass of Tool State conforms to the Tool State interface by implementing the handleGesture() method. Each Tool State handles a Multi-touch Gesture differently. The Editor's state is modified by setting its reference to the current instance of Tool State. Figure 4.10 displays the relationship between the Toolbar and the Editor in a class diagram.
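The State pattern described above can be sketched as follows. The gesture payload and logging are illustrative assumptions; only the Tool State interface with handleGesture() is taken from the design.

```javascript
// Tool State interface: every state handles the same gesture differently.
class ToolState {
  handleGesture(editor, gesture) {
    throw new Error("abstract: implemented by subclasses");
  }
}

class ConnectionState extends ToolState {
  handleGesture(editor, gesture) {
    editor.log.push(`connect from ${gesture.target}`);
  }
}

class DeleteState extends ToolState {
  handleGesture(editor, gesture) {
    editor.log.push(`delete ${gesture.target}`);
  }
}

class Editor {
  constructor() {
    this.state = new ConnectionState();
    this.log = []; // records how gestures were handled, for illustration
  }
  setState(state) {
    this.state = state; // invoked when a Toolbar Button is pressed
  }
  handleGesture(gesture) {
    this.state.handleGesture(this, gesture); // behavior depends on state
  }
}

const editor = new Editor();
editor.handleGesture({ target: "LED" }); // handled as a connection
editor.setState(new DeleteState());      // e.g. the Delete Button pressed
editor.handleGesture({ target: "LED" }); // the same gesture now deletes
```

The Editor itself never branches on which tool is active; swapping the Tool State instance is the whole mode switch.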
4.8.3 Canvas
Figure 4.11: Main classes involved in the Canvas.
The Canvas is where the main visual programming in Interactex Designer takes place. Applications in Interactex Designer are created by adding Edition Objects to the Canvas and by defining Invocations between them. Edition Objects are representations of Application Objects that expose the Application Objects' Events, Methods and Variables in order for developers to define how Events should be handled, when Methods should be invoked and how Variables should be set at runtime. Edition Objects are shown as an image with a label on screen. Edition Objects have so-called Hooks. Hooks are visual representations of Methods, Events and Variables that developers use to define method invocations. Connecting an object's Event Hook to another object's Method Hook creates an Invocation that will invoke the specified Method when the Event triggers at runtime.
Invocation Views are a visual representation of an Invocation. Invocation Views draw lines between objects in the Canvas and display Variable Hooks. Variable Hooks fulfill two purposes. First, they display the types of parameters being passed in an Invocation. Second, Variable Hooks enable developers to pass an Application Object's Variables as parameters into any Invocation. For example, an Invocation between the Button's buttonPressed() event and the LED's setIntensity(Number) method would not be valid because the event does not provide a parameter of type Number, as expected by the setIntensity(Number) method. In this case, the Invocation View will still be shown with an icon indicating that the parameters provided by the event do not comply with the parameters expected by the method. The different icons used by Invocation Views are shown in Section 5.1.3. To complete such an Invocation, the Variable Hook of a third object with a Variable of Number type might be dragged into the Invocation View. At runtime, when the buttonPressed() event triggers, the Running Engine will fetch the third object's Variable and pass it in the setIntensity(Number) method call. Figure 4.11 displays a model of the main classes involved in the Canvas.
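The parameter-compliance check behind the Invocation View icons can be sketched as below. The helper function and its type-string representation are our assumptions; the buttonPressed()/setIntensity(Number) example is the one from the text.

```javascript
// Do the types an Event provides match the types a Method expects?
function parametersComply(eventParams, methodParams) {
  if (eventParams.length !== methodParams.length) return false;
  return methodParams.every((type, i) => eventParams[i] === type);
}

// buttonPressed() provides no parameters; setIntensity expects a Number.
const buttonPressed = { name: "buttonPressed", params: [] };
const setIntensity = { name: "setIntensity", params: ["Number"] };

// Non-compliant: shown with a hollow icon in the Invocation View.
const direct = parametersComply(buttonPressed.params, setIntensity.params);

// Dragging a third object's Number Variable into the Invocation View
// supplies the missing parameter, so the call can be completed at runtime.
const withVariable = parametersComply(["Number"], setIntensity.params);
```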
4.8.4 Circuit Layout
Figure 4.12: Main classes involved in the Editor's functionality to create circuit layouts.
In order to create circuits, developers lay out Electronic Devices on top of the Smart Textile and draw Connections between them. Connections connect two or more Pins. An Electronic Device's Pin, however, can have more than one Connection. Connections can be straight lines between two Pins or a collection of line segments. In order to split a Connection into two or more line segments, developers add Routing Nodes to a Connection. Routing Nodes are fixed positions defining the edges of the segments composing the Connection.
Not every Pin should be connected to every other Pin. A set of Constraints limits the Pins a Pin can be connected to. The Allowed Pin Constraint enforces that a Pin is connected to a specific type of Pin. The Allowed Device Constraint enforces that a Pin is connected to Pins of a specific type of Electronic Device. The Requires Resistor Constraint enforces that a Pin is connected to another Pin through a Resistor. A Pin can have multiple Constraints at the same time. A UML model of the main classes involved in the layout of electronic circuits and the relationships between them is provided in Figure 4.12.
Next, we describe the concept of pin constraints with an example of a Button connected to a digital pin of the Arduino Lilypad. The Button has two Pins: a Power Pin and a Digital Pin. The Power Pin has an Allowed Pin Constraint that enforces it to be connected to other Power Pins. The Digital Pin has three constraints: a Constraint to enforce the usage of a Resistor, a Constraint to enforce its connection to a Digital Pin, and a Constraint to enforce its connection to a Microcontroller. All three Constraints are satisfied because the Digital Pin is connected to the Lilypad's Digital Pin 9 and to the Lilypad's Ground Pin through a Resistor. Figure 4.13 shows the described circuit layout created with Interactex Designer.
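The Button example can be sketched in code as below. The fulfills() interface comes from the design; the connection data structure and string type tags are our assumptions.

```javascript
// Each Constraint kind implements fulfills(pin) over the pin's connections.
class AllowedPinConstraint {
  constructor(pinType) { this.pinType = pinType; }
  fulfills(pin) {
    return pin.connections.some((c) => c.type === this.pinType);
  }
}

class AllowedDeviceConstraint {
  constructor(deviceType) { this.deviceType = deviceType; }
  fulfills(pin) {
    return pin.connections.some((c) => c.device === this.deviceType);
  }
}

class RequiresResistorConstraint {
  fulfills(pin) {
    return pin.connections.some((c) => c.throughResistor);
  }
}

// The Button's Digital Pin, wired as in Figure 4.13: to the Lilypad's
// Digital Pin 9 and, through a resistor, to the Lilypad's Ground Pin.
const digitalPin = {
  connections: [
    { type: "Digital", device: "Microcontroller", throughResistor: false },
    { type: "Ground", device: "Microcontroller", throughResistor: true },
  ],
};

const constraints = [
  new AllowedPinConstraint("Digital"),
  new AllowedDeviceConstraint("Microcontroller"),
  new RequiresResistorConstraint(),
];
const allSatisfied = constraints.every((c) => c.fulfills(digitalPin));
```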
Figure 4.13: Layout (a) and object diagram (b) of an electric circuit that consists of a Button connected to an Arduino Lilypad.
4.9 Plugin Editor
Figure 4.14: Main classes in the Plugin Editor.
The Plugin Editor offers an interface to create code plugins in TextIT that can be imported into Interactex. In order to create a plugin, developers drag and drop TextIT Objects from a palette into the Canvas. There are two main categories of objects, the Control Objects and the Visualization Objects. Control Objects execute code that modifies input data and Visualization Objects display input data without modifying it. With the exception of the Library Manager, every object in TextIT has input and output Ports used to send data to and receive data from other objects. Developers define the flow of data between TextIT Objects by creating Connections. Connections connect the output Port of a TextIT Object to the input Port of another TextIT Object.
The Plugin Editor follows the Observer pattern to handle the flow of data between TextIT Objects. When a Connection between two TextIT Objects is defined, the receiving TextIT Object subscribes to the sending object's Object Observer. Figure 4.14 displays the main classes of the Plugin Editor and their relationships.
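The Observer-based data flow can be sketched as follows. The names follow the class diagram in Figure 4.14, but the method signatures and the connect() helper are illustrative assumptions.

```javascript
// Defining a Connection subscribes the receiver to the sender's observer,
// so data sent on an output Port reaches the connected input Port.
class ObjectObserver {
  constructor() { this.subscribers = []; }
  addObserver(object) { this.subscribers.push(object); }
  notify(data) { this.subscribers.forEach((o) => o.handleInput(data)); }
}

class TextITObject {
  constructor(name) {
    this.name = name;
    this.outputObserver = new ObjectObserver();
    this.received = null;
  }
  handleInput(data) { this.received = data; }
  send(data) { this.outputObserver.notify(data); }
}

function connect(sender, receiver) {
  // Creating a Connection subscribes the receiver to the sender.
  sender.outputObserver.addObserver(receiver);
}

const dataSource = new TextITObject("Data Source");
const lineChart = new TextITObject("Line Chart");
connect(dataSource, lineChart);
dataSource.send([[0, 1], [1, 3]]); // flows to the Line Chart's input port
```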
Next, we describe every Control Object in TextIT.
• The Start Object initiates the execution of an application. The Start Object does not have input ports and has a single output port that triggers an event when the Start Object's button is clicked.
• The Data Source accesses files containing data in JSON format. The Data Source contains an input port that can be triggered by other objects to load the data file and an output port that delivers the data organized as a key/value dictionary.
• The Conditional Statement is equivalent to an if/else condition in text-based programming. It is similar to the Code Editor in that it also offers a view with syntax highlighting where developers can edit and debug code. However, it differs from the Code Editor in that its output is restricted to a single value of Boolean type.
• The Iterator iterates through an array or list of data. The Iterator accepts an input of the JavaScript types List or Array and has two output ports. The Iterator's "result" output port delivers elements of the input array or list sequentially, together with their index within the array or list. The "finished" output port is triggered after the last element of the list or array has been reached.
• The Delayer is used to delay the internal execution of the object it connects to. The delay time is specified by developers. The Delayer is useful to simulate sensor sampling rates and to deliver data at a convenient speed to Visualization Objects.
• The Code Editor enables code-based programming and features syntax highlighting and debugging functionality. The Code Editor has a single input port, which accepts any type of data. Output ports are generated dynamically by the Code Editor after execution of the code.
• The Library Manager enables developers to specify which libraries will be needed at runtime. Libraries are JavaScript files containing JavaScript functions. When loaded, every function of the imported library becomes available to every other object in the canvas. The Library Manager has no input or output ports.
TextIT's Visualization Objects are:
• Labels display a value and have an input port providing the value.
• The Line Chart displays a series of 2D points in a line chart. The Line Chart has an input port that receives an array of value pairs, each pair representing a point. Developers can decide whether data is fed as a single array or as a series of value pairs. Alternatively, developers can also configure the Line Chart to use "time" as one of the axes and supply only single values.
• The Pie Chart displays data as a pie chart. The Pie Chart has an input port that accepts key/value pairs. The Pie Chart can be used to compare values (e.g. the accuracies of two algorithms).
• The Scatter Chart is similar to the Line Chart in terms of input and output ports. However, the Scatter Chart displays the input values as non-connected points.
• Gauge Charts display a numeric value that is known to be within a range. The value is provided by the Gauge Chart's single input port.
• 3D Viewers render an object in a 3D space. Object rotation values are passed to the 3D Viewer in the form of quaternions. The 3D Viewer might be helpful to understand body postures and limb movements.
• JSON Viewers provide a visualization of data in the JSON format. The JSON Viewer receives a dictionary over its single input port and displays it as nested key/value pairs. Because data between TextIT Objects is exchanged in a JSON format, the JSON Viewer might be useful for debugging TextIT applications.
• The Table Viewer displays data in a tabular format. The Table Viewer has a single input port that accepts 2D arrays and an output port that triggers after the table data has been rendered on screen.
• Video Players are used to play video files that are loaded from the developer's filesystem. Video Players have an input port to control the video (i.e. start, stop) and an output port that emits events to communicate the state of the player (i.e. started, stopped). Furthermore, while playing, the Video Player sends the elapsed time through an output port called tick. Video Players are useful to relate sensor values and algorithm results to situations in the real world. For example, displaying a slow-motion video of a user jumping next to the motion data collected during the jump might help developers relate the specific phases of the jump (jump preparation, take off, height peak, landing) to the signal.
Every Visualization Object has an output port that triggers an event when the last data sample has been rendered.
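To make the Control Objects concrete, the following is an illustrative sketch of the kind of JavaScript a Code Editor object might hold; the exact TextIT plugin API is not shown in this chapter, so the function name and the input shape are assumptions. It computes a crude step count from accelerometer samples, as a speed-estimation plugin might.

```javascript
// Hypothetical Code Editor body: count peaks in the acceleration
// magnitude above a threshold as a crude step detector.
function processInput(input) {
  // input: { samples: [{x, y, z}, ...] }, e.g. delivered by a Data Source
  const magnitudes = input.samples.map((s) =>
    Math.sqrt(s.x * s.x + s.y * s.y + s.z * s.z)
  );
  const threshold = 1.5; // illustrative value, in g
  let steps = 0;
  for (let i = 1; i < magnitudes.length - 1; i++) {
    if (
      magnitudes[i] > threshold &&
      magnitudes[i] > magnitudes[i - 1] &&
      magnitudes[i] >= magnitudes[i + 1]
    ) {
      steps++;
    }
  }
  return { steps };
}

const result = processInput({
  samples: [
    { x: 0, y: 0, z: 1 },
    { x: 1, y: 1, z: 1 },   // magnitude above threshold: a peak
    { x: 0, y: 0, z: 1 },
    { x: 1.2, y: 1, z: 1 }, // a second peak
    { x: 0, y: 0, z: 1 },
  ],
});
```

In a TextIT canvas, such a function would receive its input from a Data Source or Iterator and feed its output to a Label or Line Chart for inspection.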
4.10 Simulator
The Simulator is responsible for executing an Application in Interactex Designer. This enables developers to test Applications without having to deploy them to the smart textile. The Simulator Controller class is the Facade of the Simulator. The Simulator Controller offers an interface to other components for starting and stopping a simulation.
Applications in TangoHapps are an aggregation of Application Objects. Therefore, in order to test an Application, developers test the different states of each Application Object. The functionality to simulate applications is distributed along the subclasses of the Simulation Object class. Each subclass of Simulation Object is a visual representation of an Application Object. Simulation Objects display runtime information about the Application Object they represent and react to User Gestures by modifying an Application Object's state. Section A.4 in the Appendix lists the behavior of each Simulation Object subclass.
Figure 4.15: Main classes in the Simulator.
Simulation Objects are designed following the Decorator and Observer design patterns [34]. The Decorator design pattern enables adding behavior to Application Objects without modifying them. In agreement with the Decorator pattern, Application Objects and Simulation Objects share a common interface. Having Application Objects and Simulation Objects share a common interface enables the Simulator Controller to interact with Simulation Objects in the same way as the Runner class from the Running Engine component would interact with an Application Object. Simulation Objects handle user input during simulation time and update the state of the Application Object accordingly. The Observer design pattern enables Simulation Objects to observe the state of Application Objects. Simulation Objects update their visual representation to match the state of the Application Objects they represent.
Figure 4.16: Usage of the Decorator pattern in the Simulator.
Figure 4.16 illustrates the application of the Decorator and Observer patterns to extend the functionality of the LED for simulation purposes. The LED and the LED Simulation Object share a common interface, called "LED Interface". The LED Simulation Object handles user taps by invoking the LED's turnOn() method. When the LED has been turned on, it notifies its observers about the state change. When notified about the LED's state change, the LED Simulation Object displays a sprite of a light beam on top of the LED Simulation Object's image. Note that the LED Simulation Object could directly respond to user taps by displaying and hiding the sprite. However, Application Objects might change their state independently from user input applied to a Simulation Object. Having the Simulation Objects respond to changes in the Application Objects' state ensures that Simulation Objects are updated regardless of how the Application Object's update happened.
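The LED example can be sketched as follows. The turnOn() method and the tap-to-sprite flow come from the text; the observer wiring and remaining names are simplified assumptions based on Figure 4.16.

```javascript
// Decorator: the LED Simulation Object wraps an LED and shares its
// interface. Observer: it updates its visuals only when the LED itself
// reports a state change, however that change was triggered.

class LED {
  constructor() {
    this.on = false;
    this.observers = [];
  }
  addObserver(observer) { this.observers.push(observer); }
  turnOn() {
    this.on = true;
    this.observers.forEach((o) => o.update(this)); // notify state change
  }
}

class LEDSimulationObject {
  constructor(led) {
    this.led = led;              // Decorator: wraps the real object
    this.spriteVisible = false;  // the light-beam sprite
    led.addObserver(this);
  }
  // Shares the LED interface, so the Simulator treats both alike.
  turnOn() { this.led.turnOn(); }
  handleTap() { this.turnOn(); } // user input is delegated to the LED
  update(led) {
    // React to the LED's state change, regardless of its origin.
    this.spriteVisible = led.on;
  }
}

const led = new LED();
const simulation = new LEDSimulationObject(led);
simulation.handleTap(); // tap -> turnOn() -> notification -> sprite shown
```

If the Running Engine turned the LED on without any user tap, the same update() path would still refresh the sprite, which is exactly the property the text argues for.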
4.11 Deployer
The Deployer executes on Interactex Designer and is responsible for two main tasks. First, the Deployer converts a Project into an executable Application. For each Edition Object in the Project, the Deployer instantiates an Application Object. If Invocations between Edition Objects have been defined in the Project, the references between them have to be mapped to references between the corresponding Application Objects in the Application. After every Application Object has been created, the Deployer converts references between Edition Objects into references between Application Objects. In order to resolve a reference in constant O(1) time, the Deployer uses the Reference Lookup Table, which contains a reference to an Application Object for each Edition Object.
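The Reference Lookup Table can be sketched with a map keyed by Edition Object, giving the constant-time resolution described above. The class and field names are our assumptions for illustration.

```javascript
// A Map from Edition Objects to their Application Objects supports O(1)
// reference resolution while converting a Project into an Application.
class Deployer {
  constructor() {
    this.lookup = new Map(); // the Reference Lookup Table
  }
  instantiate(editionObject) {
    const applicationObject = { name: editionObject.name, invokes: [] };
    this.lookup.set(editionObject, applicationObject);
    return applicationObject;
  }
  convertReference(sourceEdition, targetEdition) {
    // Constant-time lookups instead of searching all created objects.
    const source = this.lookup.get(sourceEdition);
    const target = this.lookup.get(targetEdition);
    source.invokes.push(target);
  }
}

const deployer = new Deployer();
const buttonEdition = { name: "Button" };
const ledEdition = { name: "LED" };
deployer.instantiate(buttonEdition);
deployer.instantiate(ledEdition);
deployer.convertReference(buttonEdition, ledEdition);
const buttonApp = deployer.lookup.get(buttonEdition);
```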
Second, the Deployer is responsible for serializing an application and uploading it to Interactex Client. The main class responsible for serializing and uploading applications is the Deployer Controller. The Deployer Controller is also the Facade of the Deployer and offers a single method to other software components to deploy an application. The deploy method first converts a Project into an executable Application, as described earlier in this section. Once the instance of the Application has been created, the Deployer Controller passes it over to the Application Uploader.
Every Application Object in TangoHapps implements the Serializable protocol. The Serializable protocol contains a method to serialize and another to deserialize the Application Object. The serialize method is an instance method that returns a byte stream representing the instance. The deserialize method is a class method that creates an instance based on a byte stream. The Application Uploader discovers nearby smartphones running an instance of Interactex Client, serializes an Application by invoking the serialize method of each Application Object, and transfers the serialized Application to the Application Downloader.
The Application Downloader runs in Interactex Client and is responsible for downloading a byte stream and deserializing it into an Application. In order to deserialize an Application, the Deployer Controller invokes the deserialize() method of every Application Object.
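The Serializable protocol's pair of methods can be sketched as below. JSON stands in for the actual byte stream, and the class is illustrative; only the instance-method/class-method split is taken from the text.

```javascript
// Serializable protocol: serialize() is an instance method producing a
// stream; deserialize() is a class method recreating an instance.
class ApplicationObject {
  constructor(name, variables) {
    this.name = name;
    this.variables = variables;
  }
  serialize() {
    return JSON.stringify({ name: this.name, variables: this.variables });
  }
  static deserialize(stream) {
    const data = JSON.parse(stream);
    return new ApplicationObject(data.name, data.variables);
  }
}

const led = new ApplicationObject("LED", { intensity: 255 });
const stream = led.serialize();                         // Uploader side
const restored = ApplicationObject.deserialize(stream); // Downloader side
```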
Figure 4.17: Main classes of the Deployer.
Chapter 5
TangoHapps User Interface
This chapter describes every view of Interactex and TextIT and the navigation between them. Furthermore, at the end of this chapter, we present a step-by-step description of how a smart textile application is developed in TangoHapps.
5.1 Interactex Designer
Interactex Designer has three main views. The first view the user sees when the application has been opened is the Project Selection Screen. The Project Selection Screen enables developers to start a new project or select an already existing project to continue working on it. Developers use the Editor Screen to develop applications for smart textiles and the Simulator Screen to test applications. Usually, developers open a project and then navigate back and forth between the Editor Screen, where they add new functionality, and the Simulator Screen, where they test the latest developed functionality.
5.1.1 Project Selection Screen
The Project Selection Screen displays the projects stored on the iPad. Projects can be arranged in a grid or list fashion. When projects are arranged as a grid, a screenshot of each project's last state and the project's name are displayed. When projects are displayed as a list, only their name and creation date are shown. The navigation bar at the top of the Project Selection Screen contains two buttons: Edit and New. The Edit button turns displayed projects into editable mode. In editable mode, projects can be renamed, rearranged, duplicated and removed. The New button starts an empty project. A project is selected by tapping on its screenshot. Selecting an existing project or starting a new project leads to the Editor Screen. Figure 5.1 shows a screenshot of the Project Selection Screen.
Figure 5.1: Interactex Designer's Project Selection Screen.
5.1.2 Editor Screen
The Editor Screen is the main interface for developers to create an Application for smart textiles. The main components of the Editor Screen are the Canvas, the Palette and the Toolbar, which we have already introduced in Section 4.8. In order to create an Application, developers drag elements from the Palette into the Canvas.
The Palette has two main tabs labelled Library and Properties. The Library tab displays the Objects View and the Properties tab displays the Configuration View. Figures A.1 and A.2 in the Appendix display the Objects View and Figure A.3 displays the properties of the Label and Timer objects.
The Toolbar contains five buttons to edit and manipulate objects in the Canvas and to perform specific actions on the current application. Table 5.1 lists the different buttons available in the Toolbar and describes their functions. The Simulation Button is used to start and stop the simulation of the current application. Starting a simulation hides the Palette and every connection between objects. Figure 5.2 shows an overview of Interactex Designer's Editor Screen.
Figure 5.2: Interactex Designer UI overview.
Name: Description
Connect: Sets the Editor in Connection mode so that lines can be drawn between objects in order to create Invocations.
Duplicate: Sets the Editor in Duplicate mode, causing the creation of copies of objects dragged in the Canvas.
Delete: Sets the Editor in Delete mode. During Delete mode, user taps on objects on the Canvas cause the objects to be deleted.
Circuit Layout: Switches the Editor to the circuit layout view.
Deploy: Deploys the project into Interactex Client. The Deploy toolbar button becomes enabled when an instance of the Interactex Client is found nearby.
Table 5.1: Toolbar Buttons in Interactex Designer.
5.1.3 Canvas
The Canvas is where the visual programming in Interactex takes place. The Canvas displays Edition Objects and Invocations between them. Figure 5.3 shows the Invocations between a Button and an LED. Developers draw lines between an Edition Object's Event Hook and another Edition Object's Method Hook in order to define a new Invocation. Hooks display different images depending on the type of parameters provided by an Event or required by a Method. Hook images also indicate whether the parameters provided by the Event comply with those required by the Method (filled shapes) or not (hollow shapes). Table 5.2 displays the different images used by Hooks.
Hooks exist for the types Boolean, Numeric, String and Object, each with a compliant (filled) and a non-compliant (hollow) variant.
Table 5.2: Images of Hooks in Interactex Designer. Filled shapes indicate that a parameter provided by an Event complies with the one expected by a Method. Hollow shapes indicate that the Event does not provide a parameter of the type expected by the Method.
Figure 5.3: Invocations between Events of a Button and Methods of an LED.
5.1.4 Circuit Layout
The Canvas is also where developers create the circuit layout of a smart textile. Figure 5.4 shows the circuit layout of the CTS Gauntlet, described in Section 1.1. Circuit layouts are created by drawing connections between Hardware Devices' Pins. Hardware Device Pins that can be assigned to only one Microcontroller Pin (e.g. the power pin of a Button) are wired automatically when the Hardware Device is added to the Canvas. The Canvas can be zoomed in and out to facilitate the accurate drawing of connections.
Different types of connections are drawn in different colors. Connections to a Power Pin are drawn in red; connections to a Ground Pin are drawn in gray; and connections between Data Pins are drawn in purple by default and can be configured by developers in the Properties View.
Figure 5.4: Circuit layout of the CTS-Gauntlet developed in Interactex Designer.
5.1.5 Simulator Screen
The Simulator Screen displays an interactive representation of the smart textile and smartphone. Multi-touch gestures and iPad sensors are used to simulate sensor values. Touching (covering) a light sensor causes the light intensity to decrease. The analog textile sensor displays a handle to set the value it should deliver. Accelerometers and compasses attached to a smart textile respond to acceleration and compass readings measured by the tablet device. The behavior of output devices is represented with animations, sounds and images. Vibration motors shake and emit a noise with an intensity and sound frequency proportional to their vibration intensity. LEDs display a semitransparent image of light that gets brighter the higher the LED's intensity is. Sensors and variables display their values on the Canvas for debugging purposes. Figure 5.5 displays different objects in the Simulator Screen.
Figure 5.5: Simulation of different objects in Interactex Designer.
5.2 Interactex Client
Interactex Client has four screens. The Download Screen is used to download applications from Interactex Designer. The User Applications and Default Applications Screens are used to choose and open an application. The Application Screen displays the application.
5.2.1 Download Screen
The Download Screen is used to download applications from Interactex Designer. When the user enters the Download View, Interactex Client begins advertising itself to nearby instances of Interactex Designer using Bluetooth or Wi-Fi. The Download Screen displays a status label to inform users about the connection state. The status label displays the following connection states: "Connecting", "Connection Request Sent", "Connected", "Downloading Project" and "Download Finished". Once an application download has started, the Download Screen displays the application's name and download progress in addition to the status label. Applications that have finished downloading appear in the User Applications Screen.
5.2.2 User Applications Screen
The User Applications Screen displays the projects downloaded from Interactex Designer arranged into a grid or list, configurable by users. Projects are represented with an icon and a name. If no project has been downloaded into Interactex Client, the grid or list is empty and a label is shown to users with the message: "No projects available.". The User Applications Screen displays a navigation bar at the top of the screen with two buttons: Edit and +. The Edit button enables users to change applications' names and to delete them. The + button is used to navigate to the Download View. Tapping on an application's icon leads to the Application View. A screenshot of the User Applications Screen is shown in Figure 5.6 a).
5.2.3 Default Applications Screen
Interactex Client offers nine default applications for testing the communication between the smartphone and the smart textile and the proper functioning of Interactex Firmware. The default applications are:
• Digital Output. Two Buttons on the smartphone's interface are used to turn an LED on and off.
• Digital Input. A Label displays the state of a Button attached to the textile.
• Buzzer. Two Buttons and a Slider are used to turn a Buzzer on and off and to control its vibration frequency.
• Analog Input. A Label displays the value of an analog sensor (e.g. Temperature Sensor).
• Three Color LED. Three Sliders enable the user to control the color of a Three-Color LED.
• LSM303. Three Labels display the accelerometer values along the x-, y- and z-axes delivered by the LSM303 sensor.
• Accelerometer. Three Labels display the accelerometer values along the x-, y- and z-axes delivered by the Accelerometer sensor.
Figure 5.6: Interactex Client's User Applications Screen (a) and Default Applications Screen (b).
• Music Player. A Button attached to the smart textile is used to start and stop the music on the smartphone.
• MPU 6050. Three Labels display the linear acceleration values along the x-, y- and z-axes delivered by the MPU 6050 sensor.
The default applications provide information to developers about how they should connect the electronic devices to the microcontroller. The Default Applications Screen is displayed in Figure 5.6 b).
5.2.4 Application Screen
The Application Screen displays the smartphone's user interface developed in Interactex Designer. The smartphone's user interface in Interactex Client has high fidelity to the design created in Interactex Designer because both tools use the same UI Widgets available in Apple's UIKit framework. In addition to the user interface developed in Interactex Designer, the Application Screen displays a navigation bar at the top of the view containing two buttons. The Back button is used to navigate back to the project selection screen and the Connect button connects the smartphone with the smart textile over Bluetooth Low Energy and starts the execution of the application.
5.3 TextIT
TextIT's user interface was designed to be consistent with Interactex Designer's user interface. TextIT also has a Canvas, a Palette and a Toolbar. An annotated screenshot of TextIT's user interface is shown in Figure 5.7. The Palette contains Palette Items for every Control and Visualization Object available in TextIT. A full list of TextIT's Control and Visualization Objects is provided in Section A.5 in the Appendix. TextIT Objects are added to the Canvas by clicking on the appropriate Palette Item.
Toolbar Buttons are used to save the state of the current project, zoom the Canvas in and out and upload the current project to Interactex Designer. The upload button only becomes available when a nearby instance of Interactex Designer is detected. Clicking the upload button sends the code in every Code Editor object to Interactex Designer.
Figure 5.8 displays the main elements of TextIT objects. TextIT objects are displayed as a box with a name. With the exception of the Library Manager, every object has input and output Ports. Input Ports are displayed as green circles above the element's box, and output Ports as red dots below it. Furthermore, every element has an icon, an edit button to open the Edit Window and a cross to remove the element. The Edit Window enables developers to configure the behavior of the object. For example, the Edit Window of the Scatter Chart enables developers to define the data series and axis labels.
5.4 Usage Example
In this section, we demonstrate the usage of Interactex by describing how WaveCap, the application described in Table 1.1, is developed. We then extend the Interactex application with a TextIT plugin that estimates jogging speed based on motion data. The speed estimate is used to adapt the volume of the music player so that the faster the user runs, the louder the music plays.
Figure 5.7: Overview of TextIT's user interface.
5.4.1 Development of an Interactex Application
Figure 5.9 shows the WaveCap project implemented in Interactex Designer. The application consists of a textile, four Hardware Devices and eight UI Widgets. We used the following Hardware Devices: one Textile Speaker, two Textile Sensors and a Switch; and the following UI Widgets: five Labels, two Switches and a Slider. The behavior of the application is determined by six Invocations between objects. The Textile Speaker can be turned on and off with a smartphone switch or with the hardware on/off switch. The Frequency textile sensor is coupled to the Textile Speaker's frequency. The Volume textile sensor sets the value of a Slider on the smartphone. When the Slider's value changes, the Radio Speaker's volume is set. This makes it possible to set the volume either by pulling the strings attached to the hood or through the smartphone. Another smartphone switch sets the radio sender to AM or FM. Values passed in an Invocation are represented with a shape: yellow circles represent Boolean types and brown squares represent Numeric types.
Figure 5.8: Visual components of a TextIT Object.
Figure 5.9: WaveCap implemented in Interactex Designer.
5.4.2 Development of a TextIT Plugin
In order to develop an algorithm that estimates jogging speed, we start by displaying accelerometer data of a user jogging. A file containing the accelerometer data stored in JSON format can be imported into TextIT with the Data Loader TextIT object. After loading the file containing the jogging data, the Data Loader parses it and initializes the output ports named "output", "startTime", "endTime", "frequency" and "data". These ports correspond to keys in the JSON file. The data port provides the sensor signal. When the Data Loader's output port is connected to the Line Chart's data port, the Line Chart draws every sample in the signal as a line chart. The plot produced by the Line Chart is shown in Section A.5.2 in the Appendix.
The Data Loader is connected to a Code Editor, where the jogging speed is calculated. To estimate the user's jogging speed, we filter the acceleration signal and compute the norm of the deviation vector. A TextIT project that loads the data, calculates the jogging speed and displays it in a Gauge Chart is shown in Figure 5.10. The Code Editor contains the code shown in Figure 5.11.
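The filter-and-deviation computation can be sketched in Python as follows. This is a simplified illustration, not the plugin's actual code: the exponential low-pass filter and its smoothing factor are assumptions, and the function name is hypothetical.

```python
import math

def estimate_speed(samples, alpha=0.5):
    """Relative jogging-speed score from 3-axis accelerometer samples.

    samples: list of (x, y, z) linear-acceleration tuples.
    alpha: smoothing factor of an exponential low-pass filter (assumed value).
    """
    # Low-pass filter each axis to suppress high-frequency noise.
    filtered = [samples[0]]
    for x, y, z in samples[1:]:
        px, py, pz = filtered[-1]
        filtered.append((alpha * x + (1 - alpha) * px,
                         alpha * y + (1 - alpha) * py,
                         alpha * z + (1 - alpha) * pz))

    # Standard deviation per axis, then the norm of the deviation vector.
    n = len(filtered)
    axes = list(zip(*filtered))
    means = [sum(a) / n for a in axes]
    devs = [math.sqrt(sum((v - m) ** 2 for v in a) / n)
            for a, m in zip(axes, means)]
    return math.sqrt(sum(d * d for d in devs))
```

A larger deviation norm corresponds to more vigorous motion and hence a higher estimated speed.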
After the plugin has been developed, it is uploaded to Interactex by simply clicking the Upload Button on TextIT's toolbar. Once imported, the plugin appears at the bottom of Interactex's Palette with the name "customComponent". We renamed the plugin to "runningSpeed" via the Properties View in the Palette. The MPU 6050 accelerometer and gyroscope delivers linear acceleration values to the Window object. The Window object is configured via its Properties View to gather 50 acceleration samples. Once the 50th sample is collected, the Window emits the filled() event, which triggers runningSpeed's execute() Method passing in all 50 samples. Once the speed has been computed, runningSpeed triggers its executionFinished() event, which invokes the setValue() Method of the Mapper. The Mapper maps the value provided by runningSpeed to a range between 0 and 1 and invokes the Slider's setValue() Method, passing in the normalized speed. The speaker's volume is automatically adapted when the Slider's value changes due to the previously defined Invocation between the Slider's valueChanged() event and the Textile Speaker's setVolume() Method.
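The normalization step the Mapper performs can be illustrated with a small sketch. The clamping behavior and the input-range parameters are assumptions; the actual Mapper's configuration may differ.

```python
def map_to_range(value, in_min, in_max, out_min=0.0, out_max=1.0):
    """Linearly map value from [in_min, in_max] into [out_min, out_max],
    clamping to the output range (assumed Mapper behavior)."""
    if in_max == in_min:
        return out_min
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)
```

For example, a speed score halfway through the assumed input range maps to a Slider value of 0.5.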
Next, we enter Hardware Mode, add the BLE-Lilypad microcontroller to the WaveCap and draw connections between every Hardware Component and the appropriate pin on the BLE-Lilypad. Figure 5.13 displays WaveCap's circuit.
The project can be simulated to ensure the application behaves as expected. The Recorder Application Object can be used to record and replay jogging data. The Textile Speaker will shake to indicate that it has been turned on. The Textile Speaker's volume property can be displayed on a Label to test whether the volume changes as expected. Finally, the project is uploaded to Interactex Client by tapping the Deploy Button on the toolbar. Once opened from the User Applications Screen, the project starts.
Figure 5.10: Development of an algorithm to detect jogging speed in TextIT.
Figure 5.11: TextIT code to estimate jogging speed for the WaveCap application.
Figure 5.12: Usage of a TextIT plugin in Interactex Designer to control the smartphone's volume based on estimated jogging speed.
Figure 5.13: Circuit layout of WaveCap developed in Interactex Designer. Hardware Devices from left to right: a Textile Sensor, a BLE-Lilypad and a Textile Speaker (renamed to "Radio Speaker").
Chapter 6
Applications
TangoHapps was developed iteratively based on a series of smart textile applications. In order to ensure a wide breadth of coverage, we have chosen applications from different domains. In this Chapter, we describe two real-world smart textile applications. The first smart textile is the KneeHapp Bandage, a smart bandage that tracks the rehabilitation progress after a knee injury. The second one is the Custodian Jacket, a jacket that monitors and supports technicians' maintenance activities in a supercomputer center. Both applications were developed in collaboration with experts in the field and with access to end users.
In this Chapter, we first describe how we developed each smart textile application without TangoHapps and then demonstrate how each application can be realized with TangoHapps.
6.1 Application 1: KneeHapp Bandage
A tear of the Anterior Cruciate Ligament (ACL) is a common knee injury that occurs mostly among athletes [38]. In the US, the number of ACL injuries is estimated to be between 80,000 and 200,000 per year [33, 100]. The rehabilitation after an ACL injury can last as long as a year and often includes physical therapy, strength exercises and frequent visits to physiotherapists and doctors. Successful recovery from an ACL injury relies on the appropriate design of rehabilitation exercises performed daily by patients with the goal of recovering full range of motion, strength and coordination.
Currently, patients sustaining an ACL injury perform the rehabilitation exercises mostly unsupervised and lack quantitative ways to measure the quality of their exercising and to track their performance. Orthopedists also lack tools to assess patients' rehabilitation progress and still have to rely on subjective observations, such as how much a patient's leg shakes during a squat due to muscular instability. Furthermore, orthopedists and patients meet at intervals as long as three months, and the treatment is decided upon observations made during these meetings, without consideration of the patient's recovery progress in the periods between visits.
6.1.1 Problem
We interviewed two orthopedists who perform ACL surgeries on a daily basis and recorded and shadowed six patients after ACL-reconstruction surgery during their doctor visits. During these visits, patients had to perform a series of rehabilitation exercises. Based on our interviews and observations, we identified the problems described below.
Range of Motion (RoM)
After a knee surgery, the patient loses range of motion of the knee, mainly due to pain and swelling. Recovering a certain range of motion is required before patients can continue with the rehabilitation program. Patients can walk without crutches only after being able to fully extend their injured leg, and can start pedaling a bicycle only after being able to bend their knees by at least 90 degrees. Therefore, it is important for patients to recover the flexibility of their injured leg as early as possible. The range of motion is measured every day at the beginning of the rehabilitation.
Orthopedists measure the range of motion of a patient's knee using a mechanical device called a goniometer. Goniometers are placed at the knee's center of rotation and opened such that their arms are in line with the patient's upper and lower leg. Aligning the goniometer with the leg is difficult due to muscle and fat in the leg. Goniometer measurements do not always correlate among different specialists [85]. Besides, the usage of a goniometer requires a second individual to take the measurement.
One-Leg Squat
After the ACL injury, patients begin to lose strength in their injured leg. Therefore, the second phase of the rehabilitation focuses on muscle building. One of the exercises orthopedists teach patients for this purpose is the one-leg squat and different variations of it. During a one-leg squat, patients stand on the injured leg, bend it as much as they can and then return to their initial position without deviating their knee from the line between the ankle and hip. The more patients bend their leg during a one-leg squat, the more difficult the exercise becomes. Therefore, it is desirable to know how much patients bend their leg during the squat. However, orthopedists lack convenient ways to measure the degree of flexion and only count the number of repetitions while observing the quality of the movement.
During a squat, many patients are not able to stabilize the leg; they deviate from the optimum axis and rotate their knees inwards (called medial collapse). Bending the knee inwards indicates a weakness of the hip and leg due to pain or lack of strength and might result in new knee injuries. Despite the importance of preventing medial collapses, orthopedists lack ways to quantify the degree of medial collapse during a squat. To the best of our knowledge, there are currently no solutions for quantifying the degree of medial collapse during a squat. The shaking of the leg during a squat is another important parameter indicating muscular instability; it is currently assessed only subjectively by orthopedists.
One-Leg Hop, Side Hops and Running in Eights
At the end of the rehabilitation phase, orthopedists must assess whether patients are ready to start doing sports again. In order to do this, orthopedists observe the injured leg's performance in different exercises and compare it to the performance of the patient's healthy leg. While orthopedists have not reached a common agreement on the exercises for the assessment, commonly accepted exercises are one-leg hops, side hops and running in eights.
The one-leg hop is an exercise in which patients jump forward as far as possible using only one leg and land stably. Orthopedists measure the distance of the hop by placing a measuring tape on the ground next to the area where the patient jumps.
During side hops, patients are asked to hop sideways over a distance of 20 cm using only one leg as many times as possible during a period of 15 to 60 seconds. So that patients can concentrate on the exercise, orthopedists count the hops. Orthopedists say that counting the hops is cumbersome and that they sometimes lose count.
Running in eights is another exercise used by orthopedists to assess patients' rehabilitation progress. Performance in the running-in-eights exercise is determined by the time it takes patients to complete the figure of eight. In current practice, orthopedists use a timer to measure how long patients take to complete the exercise.
6.1.2 KneeHapp Bandage
KneeHapp is a compression bandage that patients wear after the knee surgery, with two Inertial Measurement Units (IMUs) inserted into pockets at the upper and lower leg. The bandage is shown in Figure 6.1 a). Other research [4, 2, 122] and commercial products such as CoreHab1 use leg bands which are fastened to the upper and lower part of the leg. Because the bandage is a single piece, its sensors are always placed at the same relative distance to each other, avoiding the risk of inconsistent measurements caused by a misplacement of the sensors. This cannot be achieved using leg bands because they consist of two pieces that are placed independently on the upper and lower leg. Leg bands must be fastened tightly so that they do not slide while patients perform several repetitions of a rehabilitation exercise. Fastening a leg band tightly might cause discomfort, especially as muscles need to tense and relax several times during an exercise session. Because the compression bandage has a bigger contact surface with the patient's leg, it is unlikely to slide during exercising. KneeHapp measures the quality of the different rehabilitation exercises described in the previous section, gives feedback to patients and shares the measurements with orthopedists.
Use Case 1: Range of Motion (RoM)
KneeHapp calculates the angle of flexion of the leg by computing the difference between the Yaw values delivered by the upper and lower IMUs. KneeHapp's coordinate
1 http://www.corehab.it/en/
Figure 6.1: a) KneeHapp Bandage. b) KneeHapp sock.
system is shown in Figure 6.2. By convention, the angle of flexion should be equal to zero when the leg is relaxed on a flat surface. Because bones are not perfectly straight lines, and because of additional muscle and swelling, a direct angle computation does not usually yield zero degrees even when the leg is relaxed on a flat surface. Therefore, a calibration is performed to determine the alignment of the IMUs to the leg. KneeHapp supports two calibration techniques. The first one determines the sensor alignment while patients extend their leg on a flat surface. However, because most patients are not able to fully extend their leg after the surgery, we designed the following alternative calibration approach that uses the healthy leg to estimate the offset angle:
1. Wear the bandage on the healthy leg and measure the orientation of the IMUs on a flat surface.
2. Place the leg above any object and measure the angle of flexion.
3. Wear the bandage on the injured leg, place the leg above the same object and measure the angle of flexion.
Because the angle measurements when using the object should be equal on both legs, after these calibration steps KneeHapp knows the sensor alignment on the injured leg. Other approaches we considered were pose calibrations, a static calibration using a wedge and functional calibration approaches. Pose calibration approaches compare the orientation measured by motion sensors to expected measurements while the subject performs a specific pose [80]. However, simple pose-based calibration is not suitable for rehabilitation right after the surgery because of the difference in the range of motion of both knees. Ayoade et al. use a static calibration technique that requires patients to place their knee on a wedge with a predefined degree of flexion [4]. The accuracy of this approach depends on the length of each individual's leg, and it requires a wedge that patients might not have at home. Functional calibration approaches require the individual to perform a series of movements used to infer the alignment of the motion sensors to the body [31, 30]. These movements should be performed by a specialist, which makes them unsuitable for home-based applications with inexpert users [80]. Our calibration approach is suitable for the home and does not require additional equipment such as cameras, or other individuals to take the measurements.
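The three-step calibration can be summarized in a short sketch. The function names are hypothetical and the sign conventions are assumptions based on the description above; each measurement is given as a (yaw_upper, yaw_lower) pair.

```python
def flexion_angle(yaw_upper, yaw_lower, offset):
    """Knee flexion angle as the yaw difference between the two IMUs,
    corrected by a per-leg calibration offset."""
    return (yaw_upper - yaw_lower) - offset

def calibrate_with_object(healthy_flat, healthy_on_object, injured_on_object):
    """Estimate the injured leg's offset using the healthy leg.

    Each argument is a (yaw_upper, yaw_lower) pair measured in one of the
    three calibration steps described above.
    """
    # Offset of the healthy leg: the raw angle when the leg lies flat
    # (by convention this position corresponds to zero degrees).
    healthy_offset = healthy_flat[0] - healthy_flat[1]
    # True flexion over the object, known from the calibrated healthy leg.
    true_angle = (healthy_on_object[0] - healthy_on_object[1]) - healthy_offset
    # The injured leg over the same object must show the same true angle;
    # the remainder is the injured leg's sensor-alignment offset.
    return (injured_on_object[0] - injured_on_object[1]) - true_angle
```

Once the offset is known, flexion_angle() returns a reading that is zero for a fully extended leg regardless of sensor placement.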
Figure 6.2: a) KneeHapp coordinate system. b) Illustration of the medial collapse during a squat and how it affects the pitch.
Use Case 2: One-Leg Squat
Using the orientation vectors provided by both IMUs, KneeHapp tracks the angle of flexion of the leg during the squat. The technique used to calculate the angle of flexion during a squat is the same as the one used in Use Case 1: Range of Motion. However, the calibration for a one-leg squat is done while the patient is standing straight. The iPad application triggers visual and auditory feedback when the minimal angle of the squat has been reached.
During the squat, medial collapse causes the upper and lower leg to bend in opposite directions. In our coordinate system, the pitch rotation component indicates how much the IMUs have rotated due to medial collapse. Figure 6.2 b) shows that the upper and lower IMUs rotate in opposite directions, which is reflected in the Pitch rotation component delivered by the IMUs. Therefore, in order to determine the degree of medial collapse during a squat, we add the difference of the Pitch rotation angles of the upper and lower IMUs to the difference of the Pitch values stored during the calibration. We negate the upper sensor's pitch because a smaller value indicates a higher medial collapse. The iPad application provides auditory feedback when the amount of medial collapse is bigger than four degrees so that users can correct their deviation.
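The medial-collapse computation can be sketched as follows. The function names and the exact sign conventions are assumptions; the text above fixes only that the upper sensor's pitch is negated and that the calibration difference serves as the reference.

```python
def medial_collapse(pitch_upper, pitch_lower, cal_upper, cal_lower):
    """Degree of medial collapse during a squat, in degrees.

    pitch_upper/pitch_lower: current Pitch of the upper and lower IMUs.
    cal_upper/cal_lower: Pitch values stored during calibration.
    The upper sensor's pitch is negated (a smaller value indicates a
    higher medial collapse); the calibration difference is subtracted out.
    """
    current = -pitch_upper + pitch_lower
    reference = -cal_upper + cal_lower
    return current - reference

# Auditory feedback is triggered above four degrees of collapse.
THRESHOLD_DEG = 4

def needs_feedback(collapse_deg):
    """True when the deviation warrants an audible warning."""
    return abs(collapse_deg) > THRESHOLD_DEG
```

With the leg held exactly as during calibration, the collapse reads zero and no feedback is produced.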
KneeHapp also determines the degree of shaking of the leg. To measure it, we compute the standard deviation of the linear acceleration produced by the upper IMU along the x-axis. We chose to use only the upper IMU to measure the shaking because squats mostly challenge the muscles of the upper leg and cause shaking there due to muscular instability.
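The shaking measure itself is a plain standard deviation over the x-axis samples; a minimal sketch (the function name is hypothetical):

```python
import math

def shaking_score(x_accel):
    """Standard deviation of the upper IMU's x-axis linear acceleration,
    used as the leg-shaking measure."""
    n = len(x_accel)
    mean = sum(x_accel) / n
    return math.sqrt(sum((v - mean) ** 2 for v in x_accel) / n)
```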
Use Case 3: One-Leg Hop
In order to quantify a patient's performance during one-leg hops, orthopedists measure the distance of their hops on the injured leg and compare it to the same measurement on the healthy leg. According to the two orthopedists we interviewed, the duration of the hop can be an equivalent measurement of a patient's performance during this exercise. Therefore, we chose the duration of one-leg hops as the performance parameter for this exercise.
To measure the duration of one-leg hops, we studied two possible solutions based on different types of sensors. The first one uses KneeHapp's accelerometers and the second uses a sock with integrated textile pressure sensors that can be attached to the bandage. Figure 6.3 shows the linear acceleration along the y and z-axes of the upper-leg IMU recorded during a one-leg hop, with annotations for each phase of the hop. The highest peak in the signal represents the moment when the patient is at the highest point of the hop. The lowest negative peak corresponds to the moment when the patient landed. An average one-leg hop has a duration of 0.23 seconds. A sampling rate of 100 Hz enabled us to detect the different phases of the one-leg hops. KneeHapp uses a peak detection algorithm to detect the highest and lowest peaks and then measures the difference between both peaks. Since the highest peak in the signal does not correspond to the moment when the patient jumped off the ground, the computed difference is scaled.
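The accelerometer-based estimate can be sketched as follows. This is a simplified illustration: the peak search assumes the trough occurs after the apex, and the scale constant 1.36 is taken from the TangoHapps implementation described in Section 6.1.3.

```python
def hop_duration(signal, rate_hz=100, scale=1.36):
    """Estimate the one-leg-hop duration in seconds.

    signal: filtered linear-acceleration samples covering one hop.
    Finds the highest peak (apex of the hop), then the lowest trough
    after it (landing), and scales the index difference, since the
    highest peak does not coincide with take-off.
    """
    hi = max(range(len(signal)), key=lambda i: signal[i])
    lo = hi + max(range(len(signal) - hi), key=lambda j: -signal[hi + j])
    return (lo - hi) / rate_hz * scale
```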
The second approach uses textile pressure sensors attached to the sole of a sock. The smart sock is shown in Figure 6.1 b) and the raw pressure sensor signal recorded by the sock during a one-leg hop is shown in Figure 6.4. The signal produced by the textile pressure sock contains two peaks that correspond to the take-off and landing phases of the hop, when higher pressure is applied to the sole. The duration of the hop is also estimated by measuring the distance between both peaks. However, in this case, the peaks indicate the exact times when the foot left and came back into contact with the ground during a hop.
Figure 6.3: One-leg hop linear acceleration on the y and z-axes, raw (a) and filtered (b).
Figure 6.4: One-leg hop average pressure.
Use Case 4: Side Hops
For the side hops we also use an approach based on the morphology of the signal produced by the accelerometers. Figure 6.5 a) shows the linear acceleration on the y-axis for seven side hops. In order to eliminate noise in the signal, we apply several iterations of a low-pass filter until the signal's standard deviation becomes smaller than a specific threshold, which we call the low-pass filter threshold. We found the optimal low-pass filter threshold to vary depending on the algorithm used for counting the number of hops. We tested an RC low-pass filter with a time constant c = 0.25 and a weighted moving average filter with coefficients [1/4, 1/2, 1/4]. After filtering the signal, we use a peak detection algorithm to count the high and low peaks in the signal. A naive peak detection algorithm also counts local maxima in the signal as hops, resulting in too many side hops being counted. To overcome this issue, we analyze the surroundings of a peak: if the peak is sufficiently larger than its surroundings, it is counted as a hop. We call this factor the peak threshold. We found that the peak threshold leading to the most accurate results depends on the filtering algorithm used. Figure 6.5 b) shows the same hops after applying three iterations of the RC filter and running the peak detection algorithm.
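The iterative filtering step can be sketched as follows, here using the weighted moving average variant mentioned above. The pass limit is an added safeguard (a fixed boundary handling and the function names are assumptions).

```python
import math

def wma_filter(signal):
    """One pass of the weighted moving average filter [1/4, 1/2, 1/4].
    Endpoints are kept unchanged; signal needs at least two samples."""
    out = [signal[0]]
    for i in range(1, len(signal) - 1):
        out.append(0.25 * signal[i - 1] + 0.5 * signal[i] + 0.25 * signal[i + 1])
    out.append(signal[-1])
    return out

def std(signal):
    """Population standard deviation."""
    mean = sum(signal) / len(signal)
    return math.sqrt(sum((v - mean) ** 2 for v in signal) / len(signal))

def filter_until_smooth(signal, lowpass_threshold, max_passes=50):
    """Filter repeatedly until the standard deviation drops below the
    low-pass filter threshold (bounded by max_passes as a safeguard)."""
    passes = 0
    while std(signal) >= lowpass_threshold and passes < max_passes:
        signal = wma_filter(signal)
        passes += 1
    return signal
```

Each pass attenuates the high-frequency noise, so the standard deviation decreases monotonically toward the threshold.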
Figure 6.5: Linear acceleration of seven side hops, raw (a) and filtered (b).
Use Case 5: Running in Eights
KneeHapp determines whether patients are standing or running by comparing the standard deviation of the magnitude of the linear acceleration vectors of the upper IMU with a specific threshold. We use windows of 50 samples with a sampling rate of 100 Hz and an overlap of 50%. This solution removes the need for additional devices and for humans to measure the time. Furthermore, it removes external factors that affect the measured performance, making the measurement dependent solely on the patient's actions.
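The windowed classification can be sketched as follows. The window size and overlap follow the description above; the threshold value and function name are assumptions.

```python
import math

def detect_activity(samples, window=50, overlap=0.5, threshold=1.0):
    """Classify standing vs. running per window of acceleration samples.

    samples: list of (x, y, z) linear-acceleration tuples at 100 Hz.
    Uses windows of `window` samples with the given overlap and compares
    the standard deviation of the magnitudes with `threshold` (assumed).
    """
    step = int(window * (1 - overlap))
    labels = []
    for start in range(0, len(samples) - window + 1, step):
        w = samples[start:start + window]
        mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in w]
        mean = sum(mags) / len(mags)
        sd = math.sqrt(sum((v - mean) ** 2 for v in mags) / len(mags))
        labels.append("running" if sd > threshold else "standing")
    return labels
```

The running-in-eights time is then the span between the first and last window labeled "running".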
6.1.3 Implementation with TangoHapps
In this section we describe how the KneeHapp Bandage can be implemented with TangoHapps. The KneeHapp Bandage has one IMU on the upper leg and one on the lower leg. TangoHapps supports the MPU 6050 IMU, which fuses accelerometer and gyroscope signals in order to deliver linear acceleration along three axes and Yaw, Pitch, Roll (YPR) rotation values. The circuit design of the KneeHapp Bandage requires two microcontrollers with an integrated Bluetooth Low Energy module and two MPU 6050 units.
Figure 6.6: KneeHapp’s Range of Motion use case implemented with TangoHapps.
Use Case 1: Range of Motion
A TangoHapps application that computes the range of motion requires the following Application Objects:
• Three Subtraction Arithmetic Operators. We name the Subtraction operators "Upper Sensor Subtraction", "Lower Sensor Subtraction" and "Sensor Subtraction".
• Two Number Variables. We name the Number Variables "Upper Sensor Number" and "Lower Sensor Number".
• A Button UI Widget. We name the Button “Calibrate Button”.
• A Label UI Widget.
Furthermore, the application should define the following Invocations between objects:
1. The Calibrate Button's buttonPressed() Event triggers the Upper Sensor Number's setValue() Method passing the upper MPU 6050's Yaw as a parameter.
2. The Calibrate Button's buttonPressed() Event triggers the Lower Sensor Number's setValue() Method passing the lower MPU 6050's Yaw as a parameter.
3. The upper MPU 6050's yawChanged() Event triggers the Upper Sensor Subtraction's setOperand1() Method passing the measured yaw as a parameter.
4. The lower MPU 6050's yawChanged() Event triggers the Lower Sensor Subtraction's setOperand1() Method passing the measured yaw as a parameter.
5. The Upper Sensor Number's valueChanged() Event triggers the Upper Sensor Subtraction's setOperand2() Method passing the value stored in the variable as a parameter.
6. The Lower Sensor Number's valueChanged() Event triggers the Lower Sensor Subtraction's setOperand2() Method passing the value stored in the variable as a parameter.
7. The Upper Sensor Subtraction's computed() Event triggers the Sensor Subtraction's setOperand1() Method passing the result of the subtraction as a parameter.
8. The Lower Sensor Subtraction's computed() Event triggers the Sensor Subtraction's setOperand2() Method passing the result of the subtraction as a parameter.
9. The Sensor Subtraction's computed() Event triggers the Label's setText() Method passing the result of the subtraction as a parameter.
In order to perform the sensor calibration, patients should extend their leg on a flat surface and press the Calibrate Button. When the user presses the Calibrate Button, the application stores the current Yaw values of the upper and lower IMUs into the Upper Sensor Number and Lower Sensor Number Variables. After this initial calibration phase, the values stored in both Variables are used as a reference to compute how much the IMUs have offset from their initial position. Therefore, the current Yaw values of the upper and lower IMUs are subtracted from the values stored in the Variables. Finally, in order to measure the angle of flexion of the leg, the application computes the difference between the upper and lower yaw offsets and displays it on a Label. A sample application that computes the range of motion of a patient's knee is displayed in Figure 6.6.
Use Case 2: One-Leg Squat
An application that computes the angle of flexion and medial collapse of the leg during a squat can be created in TangoHapps with the same functionality used in Use Case 1: Range of Motion to compute an angle using two IMUs. However, in order to compute the degree of medial collapse, the MPU 6050's Pitch rotation component should be used. Furthermore, the application should indicate to users when the desired angle of flexion during a squat has been reached. The functionality to determine whether an angle is bigger than a threshold can be realized with the Bigger Comparison Operator. The Bigger Comparison Operator's setOperand1() and setOperand2() Methods should be invoked with the angle and with a Variable containing the threshold. Next, the Bigger Comparison Operator's conditionIsTrue() Event should invoke the Sound's play() Method. This functionality is used to inform the user that a specific squat angle has been reached or that a medial collapse bigger than four degrees has occurred. A simple modification to the above function to use the value provided by a Slider UI Widget in the comparison would enable users to configure the angle for receiving sound notifications at runtime. A screenshot of an Interactex Designer application that implements this functionality is shown in Figure 6.7.
Figure 6.7: KneeHapp's functionality to produce auditory feedback when the user's knee deviates more than 4 degrees during a one-leg squat.
The application should also display live feedback about the shaking of the patient's leg. This can be achieved with the Monitor Application Object. The upper IMU's xChanged() Event should trigger the Monitor's addValue1() Method. This causes every linear acceleration value along the x-axis to be added to a line chart rendered by the Monitor Application Object.
The degree of shaking of the leg can be computed using the Deviation Extractor Application Object and displayed on a Label. This functionality can be achieved with a Window Function, a Deviation Extractor Function and a Label UI Widget and the following Invocations:
1. The upper MPU 6050's xChanged() Event triggers the Window's addValue() Method passing in the linear acceleration along the x-axis.
2. The Window's filled() Event triggers the Deviation Extractor's addValue() Method passing in the buffered acceleration values.
3. The Deviation Extractor's featureExtracted() Event triggers the Label's setText() Method passing in the computed deviation.
A sample application that supports the squat exercises is displayed in Figure 6.8.
Figure 6.8: KneeHapp’s One-Leg Squat use case implemented with TangoHapps.
Use Case 3: One-Leg Hop
The algorithm to compute hop durations can be implemented in TangoHapps using the following Application Objects:
1. A Window.
2. Two Peak Detectors. We name them Lower and Upper Peak Detector.
3. A Subtraction Arithmetic Operator.
4. A Multiplication Arithmetic Operator.
5. A Variable.
6. Two Buttons. We name them “Start” and “Stop” Buttons.
Furthermore, the following Invocations should be defined:
1. The Start Button's buttonDown() Event triggers the upper MPU 6050's start() Method.
2. The Stop Button's buttonDown() Event triggers the upper MPU 6050's stop() Method.
3. The upper MPU 6050's yChanged() Event triggers the Window's addValue() Method passing in the linear acceleration along the y-axis.
4. The Stop Button's buttonDown() Event triggers the Upper Peak Detector's compute() Method passing in the array of linear acceleration values stored in the data property of the Window.
5. The Upper Peak Detector's indexExtracted() Event triggers the Lower Peak Detector's setRangeStart() Method.
6. The Stop Button should also trigger the Lower Peak Detector in a similar way as in step 4. For this purpose, the Stop Button's buttonDown() Event triggers the Lower Peak Detector's compute() Method passing in the array of linear acceleration values stored in the data property of the Window.
7. The Lower Peak Detector's indexExtracted() Event triggers the Subtraction's setOperand1() Method passing in the index of the detected peak.
8. The Upper Peak Detector's indexExtracted() Event triggers the Subtraction's setOperand2() Method passing in the index of the detected peak.
9. The Subtraction's computed() Event triggers the Multiplication's setOperand1() Method passing in the result of the subtraction.
10. The Number's valueChanged() Event triggers the Multiplication's setOperand2() Method passing in the fixed value of 1.36.
11. The Multiplication Arithmetic Operator's computed() Event triggers the Label's setText() Method passing in the result of the multiplication.
Figure 6.9: KneeHapp’s One-Leg Hop use case implemented with TangoHapps.
The Invocations described above are necessary to create an application that estimates the duration of a hop as described in Section 6.1.2. A screenshot of an application containing every object and Invocation mentioned so far is displayed in Figure 6.9. It should be noted that the Invocations between the Start and Stop Buttons and the upper MPU 6050 are not displayed in Figure 6.9 for clarity purposes. The Start and Stop Buttons are controlled by users and cause the upper MPU 6050 to start and stop recording values. The Window's size Variable is set to a very large number so that the Window has enough buffer space to store the accelerometer samples produced during the hop. The Stop Button first causes the Window to stop storing values and then triggers the execution of the algorithm that uses the data in the Window to estimate the duration of the hop. The Upper Peak Detector is configured to detect high peaks and the Lower Peak Detector to detect low peaks over their peak Variables. Since the data passed to both Peak Detectors is the accelerometer signal of an entire hop, the Upper Peak Detector will detect the highest and the Lower Peak Detector the lowest peak in the signal. An additional optimization is done by setting the Lower Peak Detector's starting range index to the Upper Peak Detector's index. This reduces the number of computations in the algorithm by limiting the search range of the lowest peak. The Subtraction Operator is used to measure the index difference between the peaks detected. Because accelerometer samples are produced at a fixed rate, the time difference between the two peaks is proportional to their index difference. The final hop duration in seconds is computed by scaling the difference between the peak indexes. For this purpose, the result of the subtraction is multiplied by the constant 1.36 using the Multiplication Operator.
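The peak-based duration estimate can be summarized in a short sketch. The function name and the test signal are illustrative; the 1.36 scale constant is taken from the description above, and the sketch assumes the window holds the accelerometer signal of one complete hop.

```python
def hop_duration_seconds(samples, scale=1.36):
    """Illustrative sketch of the hop duration estimate.

    The index of the highest peak marks take-off; the lowest peak,
    searched only from the highest peak's index onward (the range
    optimization described above), marks landing. The index
    difference is scaled by a constant to obtain seconds.
    """
    # Upper Peak Detector: index of the highest value in the window.
    upper = max(range(len(samples)), key=lambda i: samples[i])
    # Lower Peak Detector: lowest value, limited to indexes after
    # the upper peak to reduce the search range.
    lower = min(range(upper, len(samples)), key=lambda i: samples[i])
    # Subtraction and Multiplication Operators.
    return (lower - upper) * scale
```

For instance, a window whose highest sample sits two indexes before its lowest sample yields a duration of 2 × 1.36 seconds under this scaling.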
Use Case 4: Side Hops
An algorithm to count the number of side hops based on an accelerometer signal can be implemented with a TextIT plugin. The plugin identifies and returns the number of peaks in an array of input values. The algorithm iterates over every sample in the input signal. Once an upper peak is detected, the algorithm starts looking for lower peaks. Once a lower peak is detected, the algorithm starts looking for upper peaks again. This process repeats. A counter is increased every time an upper peak is detected. The function receives an additional parameter called “delta”, a threshold that a peak must exceed in order to be counted. This threshold is necessary to avoid counting local maxima in the signal. The pseudo-code contained in the TextIT plugin is shown in Section A.5.3 in the Appendix.
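The alternating upper/lower peak search can be sketched as follows. This is not the TextIT plugin itself (whose pseudo-code is in Section A.5.3) but an illustrative Python reimplementation of the logic described above; the function name is hypothetical.

```python
def count_side_hops(samples, delta):
    """Count hops as alternating upper/lower peaks.

    Once a value rises more than `delta` above the running minimum,
    an upper peak is counted and the search switches to lower peaks;
    once a value falls more than `delta` below the running maximum,
    the search switches back. `delta` suppresses local maxima.
    """
    count = 0
    looking_for_upper = True
    running_min = running_max = samples[0]
    for value in samples[1:]:
        if looking_for_upper:
            running_min = min(running_min, value)
            if value >= running_min + delta:   # upper peak found
                count += 1
                running_max = value
                looking_for_upper = False
        else:
            running_max = max(running_max, value)
            if value <= running_max - delta:   # lower peak found
                running_min = value
                looking_for_upper = True
    return count
```

Small oscillations below `delta` never trigger a state change, which is exactly why the threshold avoids counting local maxima.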
An Interactex application that uses the TextIT plugin to count peaks additionally requires a Low-Pass Filter and a Number Variable where the delta parameter needed by the TextIT plugin is stored. The Low-Pass Filter receives the signal directly from the upper IMU and filters it. The filtered signal is passed over to the TextIT plugin. The Low-Pass Filter's filtering factor and the delta parameter are configurable by developers. We configure the Low-Pass Filter's factor Variable to 0.5 and the delta parameter to 60. The UI is controlled in a similar way as in the One-Leg Hop application described earlier in this chapter. The Start and Stop buttons cause the upper IMU to start and stop receiving data. The result returned by the TextIT plugin is displayed on a label on the smartphone. A screenshot of an Interactex application that computes the side hops is shown in Figure 6.10. The TextIT plugin has been renamed to “sideHops”.
Figure 6.10: KneeHapp’s Side Hops use case implemented with TangoHapps.
Use Case 5: Running in Eights
An application that starts and stops a timer when it detects that the user has started or stopped running can be implemented with TangoHapps using the Activity Classifier and Timer Application Objects. The Activity Classifier requires an array of filtered linear acceleration data. The application should define the following Invocations in order to classify user activity based on linear acceleration:
1. The MPU 6050's yChanged() Event invokes the Window's addValue() Method passing in the linear acceleration along the y-axis as a parameter.
2. The Window's filled() Event invokes the Low-Pass Filter's addValues() Method passing in the array of buffered linear acceleration samples.
3. The Low-Pass Filter's filtered() Event invokes the Activity Classifier's classifySamples() Method passing in the filtered array of linear acceleration samples.
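The Window and Low-Pass Filter stages of this pipeline can be sketched in a few lines. The function names are illustrative, and the semantics of the filter's factor (weight of the new sample in an exponential smoother) is an assumption, not a confirmed detail of TangoHapps' implementation.

```python
def sliding_windows(stream, size=50, overlap=0.5):
    """Yield overlapping windows, mirroring the Window object
    (size 50 and 50% overlap are the values used below)."""
    step = int(size * (1 - overlap))
    for start in range(0, len(stream) - size + 1, step):
        yield stream[start:start + size]

def low_pass(samples, factor=0.5):
    """Exponential smoothing sketch of the Low-Pass Filter;
    `factor` is assumed to weight the incoming sample."""
    out = [samples[0]]
    for s in samples[1:]:
        out.append(factor * s + (1 - factor) * out[-1])
    return out
```

Each filled window would then be filtered and handed to the Activity Classifier, as the three Invocations above describe.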
We set the Window's size Variable to 50 and the overlapping Variable to 0.5. The Activity Classifier will emit the running() Event after the patient started running and the notMoving() Event after the patient stopped running. In order to measure the duration of the run, these Events are used to start and stop a Timer. We call this Timer the “Running Timer”. The Timer in TangoHapps emits an Event after the time specified by developers has elapsed. In this application, the timer should be stopped when the user stops running. This can be achieved by setting the Timer's time Variable to a high value (to ensure it will not finish before the patient completed the run) and by defining two Invocations. First, the Activity Classifier's running() Event invokes the Timer's start() Method. Second, the Activity Classifier's notMoving() Event invokes the Timer's stop() Method.
Finally, the Running Timer's elapsed time can be displayed on a Label on the smartphone. In order to continuously update the Label as time passes, a second Timer that triggers the Label updates should be used. We call the second Timer the “Update Timer”. The Update Timer is started and stopped by the same Events as the Running Timer. A Timer in TangoHapps can be configured to trigger periodically by setting its repeat Variable to true. We set the Update Timer's repeat Variable to true and its frequency Variable to 0.05 seconds so that the Label is updated every 0.05 seconds. Next, we define an Invocation between the Update Timer's triggered() Event and the Label's setText() Method passing in the Running Timer's elapsed time Variable as a parameter.
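The event wiring of the Running Timer can be illustrated with a minimal stand-in class. The class and its injectable clock are hypothetical; they only model the start()/stop() behavior driven by the running() and notMoving() Events.

```python
class RunTimer:
    """Minimal stand-in for the Running Timer: start() is wired to
    the Activity Classifier's running() Event, stop() to its
    notMoving() Event. `clock` is an injectable time source so the
    logic can be exercised without real time passing."""
    def __init__(self, clock):
        self.clock = clock
        self.started_at = None
        self.elapsed = 0.0

    def start(self):                 # running() Event
        self.started_at = self.clock()

    def stop(self):                  # notMoving() Event
        if self.started_at is not None:
            self.elapsed = self.clock() - self.started_at
            self.started_at = None
```

The Update Timer would then periodically read `elapsed` (or the running difference) and push it into the Label, as described above.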
6.2 Application 2: Custodian Jacket
Siemens is a multinational company with more than 340,000 employees worldwide. In order to store employees' data and provide services needed for the functioning of the company, Siemens built a supercomputer center in the south of Munich. The supercomputer center contains 154 racks that store over 1230 servers. Some of the data and services stored in the supercomputer center are highly critical to the organization. Siemens might suffer major losses if a service became unavailable for a few hours or if employees' data was lost. Furthermore, some maintenance operations done at the supercomputer center can be dangerous, such as dealing with high voltages. Therefore, the main goals of this project were to prevent costly human mistakes and to increase technicians' safety while performing maintenance activities in the supercomputer center.
6.2.1 Problem
In order to gain insight into the application domain, we conducted several interviews with security managers at Siemens over a period of 14 months. We identified three main problems, which we address as “use cases” in the next sections.
Error prevention
Racks contain up to 46 “height units”. A height unit is a slot where a server can be inserted. A server might occupy one or several height units. A height unit has a standard height of approximately 4.5 cm. Racks have a label attached next to each height unit with a numeric identifier. In order to identify a server, technicians find in their worksheets the height unit where the server is installed. Technicians operate servers from the front and back side. Labels might not be visible on the back side of the rack due to the presence of cables, as shown in Figure 6.11 b). Technicians have pulled a working server by mistake in the past, causing significant losses to the organization.
In order to help technicians identify servers that need to be replaced or repaired, Siemens considered installing an LED for each height unit in a rack. The LED would be remotely controlled and turned on to indicate a faulty server. However, this approach would require the installation and maintenance of more than 6000 LEDs. Instead, Siemens is more interested in a solution with lower maintenance costs.
Technician’s Safety
In order to repair faulty servers as fast as possible, a pool of technicians is ready to access the supercomputer room at any day and any time of the week (even at 3 am on a Sunday). At specific times of the day, technicians are likely to have to enter the supercomputer center alone. In order to perform a maintenance operation, technicians perform hazardous tasks such as dealing with high voltages, lifting heavy equipment, climbing ladders and crouching into tight spaces. The closest person that can assist technicians in case of an accident is the gatekeeper, who sits in a different building. Siemens is interested in a solution to monitor and assist technicians in case of accidents.
Emergency Situation
Siemens installed speakers inside the supercomputer center in order to notify technicians about emergency situations such as the presence of fire in the building. Each rack in the supercomputer center is equipped with fans on both the front and back side in order to keep a constant air flow that cools the servers' temperature down. These fans produce a considerable amount of background noise. Therefore, the volume of the speakers had to be loud enough to be heard from different positions within the supercomputer center. Siemens had to remove the speakers after realizing that the vibrations produced by the speakers damaged nearby hard drives.
6.2.2 Custodian Jacket
The Custodian Jacket is a smart jacket worn by technicians during maintenance activities in a supercomputer center. The main purpose of the Custodian Jacket is to reduce the chances of human mistakes, which might cause high losses to the company, and to increase technicians' safety while operating in the supercomputer center. We have developed four versions of the Custodian Jacket. The first version was developed in a 2-week summer school by a group of 16 students and was based on the .NET Gadgeteer hardware family. The other three versions of the Custodian Jacket were developed in collaboration between the TUM and Siemens during a one-year project involving three doctoral students and five managers at Siemens. These later versions were based on the Arduino microcontroller family. An image of the last version of the Custodian Jacket is shown in Figure 6.11 a).
In this chapter, we focus on the last version of the Custodian Jacket (version 4). The last version of the Custodian Jacket has two 6-axis accelerometer and gyroscope IMUs on the chest and sleeve, a proximity sensor on the sleeve and capacitive textile sensors on the shoulders and chest. The textile sensors on the shoulders are used to detect whether a technician is wearing the jacket. The textile sensor on the chest is used as a button. Furthermore, the jacket provides visual, haptic and auditory output to users with an LED and vibration motor on the wrist and a buzzer at the chest. Hardware devices are connected with conductive thread to a custom-made BLE-Lilypad attached to the chest. The BLE-Lilypad communicates over BLE with a smartphone. In the rest of the section, we describe three use cases that address the three problems we identified in the previous section.
Use Case 1: Error prevention
The Custodian Jacket has an ultrasonic proximity sensor attached to the sleeve. This sensor measures the distance between the user's sleeve and the ground. By comparing the distance to the ground, the jacket is able to know which server the user's arm is in front of. Technicians swipe their arms from the top to the bottom of a rack, passing through every server. The jacket produces a light output to indicate whether the current server should be repaired or not.

Figure 6.11: a) Custodian Jacket - fourth version. b) Backside of a rack in a supercomputer center. Some height unit labels are hidden by cables.
The ultrasonic proximity sensor emits an ultrasonic beam and measures the time it takes the beam to return to the sensor. In order to measure the distance to the ground accurately, the sensor should be held straight such that the beam forms a 90 degree angle with the ground. The Custodian Jacket uses a 6-axis accelerometer and gyroscope in order to detect whether the proximity sensor points straight to the ground. An additional LED on the sleeve gives technicians feedback about the proper orientation of the proximity sensor. A green LED indicates that the proximity readings are valid and that the technician's hand is in front of the server that needs to be repaired. A blue LED indicates that the proximity readings are valid but the technician's hand is not in front of a server that needs to be manipulated. A red LED indicates that the sensor is not pointing straight down, hence the proximity readings are not valid. Figure 6.11 displays a technician swiping through servers on a rack to locate a specific server.
Use Case 2: Technician’s Safety
The Custodian Jacket monitors the technician's physical activity in order to determine whether a technician might need assistance. The IMU on the torso is used to determine whether he/she is walking, standing, running or lying down on the ground. If the technician is found to be lying down on the ground for more than 20 seconds, the jacket sends an emergency signal to the gatekeeper. In order to avoid sending emergency signals when the user has taken off the jacket, the application determines whether the user is wearing the jacket with the capacitive sensor on the shoulder.
The activity recognition is done based on the linear acceleration, which is delivered by the IMU attached to the Custodian Jacket's torso. The linear acceleration is sampled with a frequency of 100 Hz. Then, the linear acceleration along the y-axis is buffered in windows of 50 samples with an overlapping factor of 50%. We apply a low-pass filter and extract the signal deviation. We determine whether the user is standing, walking, or running based on the signal deviation. The linear acceleration along the y-axis recorded while the user was standing (a) and walking (b) is shown in Figure 6.12. The distribution of the different activities in a 3D space is shown in Figure 6.13.
Figure 6.12: Linear acceleration along the y-axis while the user is standing (a) and walking (b).
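The deviation-based classification can be sketched as a simple thresholding function. The two threshold values below are hypothetical placeholders, not the values trained for the Custodian Jacket; only the overall scheme (smaller deviation while standing, larger while running) comes from the description above.

```python
import statistics

def classify_activity(filtered_window,
                      standing_max=0.05, walking_max=0.5):
    """Classify a filtered linear-acceleration window by its
    standard deviation. Thresholds are illustrative assumptions."""
    dev = statistics.pstdev(filtered_window)
    if dev < standing_max:
        return "standing"
    if dev < walking_max:
        return "walking"
    return "running"
```

In the real system the decision is made in a 3D feature space (Figure 6.13); this one-dimensional version only illustrates the principle.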
Use Case 3: Emergency Situation
The Custodian Jacket is equipped with a speaker and a vibration motor in order to propagate emergency events to the technician. Emergency events are sent out by the gatekeeper and have to be acknowledged by technicians by holding a capacitive button on their chest for three seconds. Furthermore, technicians can request assistance by pressing the capacitive button on their chest, also for three seconds. In that case, an emergency signal is sent to the gatekeeper as described in “Use Case 2: Technician's Safety”.
Figure 6.13: Classification of physical activities based on IMU data. Axes correspond to the deviation of the linear acceleration along the x, y and z-axes.
6.2.3 Implementation with TangoHapps
In this section, we describe in detail how the three use cases of the Custodian Jacket are implemented with TangoHapps. The Custodian Jacket can be designed using a Proximity Sensor, a Three-Color LED, two MPU 6050 IMUs and a Textile Sensor.
Use Case 1: Error prevention
An Interactex application that provides visual output to users based on proximity sensor and IMU readings can be realized with the following Application Objects:
1. Three Bigger Comparison Operators.
2. Three Smaller Comparison Operators.
3. Three AND Logical Operators.
4. Three Number variables. We name them “Tolerance”, “255” and “0”.
5. A Slider UI Widget.
We will describe how this application can be developed with TangoHapps in three steps. First, we create an interface for users of the Custodian Jacket to set the height unit they are looking for over their smartphones. The height unit is set over a slider. In order for the application to compare the proximity values delivered by the proximity sensor to the height unit input by the users, the application should calculate the distance from the height unit to the ground. This is achieved by first multiplying the number of the height unit by 4.5 cm and then adding a distance of 3.0 cm, which is the distance to the ground of the first height unit in the rack.
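The distance calculation above amounts to one line of arithmetic; the constant and function names in this sketch are illustrative.

```python
HEIGHT_UNIT_CM = 4.5   # standard height of one height unit
BASE_OFFSET_CM = 3.0   # ground distance of the first height unit

def height_unit_distance_cm(unit_number):
    """Distance from the ground to a given height unit, as the
    application computes it before comparing against the
    proximity sensor reading."""
    return unit_number * HEIGHT_UNIT_CM + BASE_OFFSET_CM
```

For example, height unit 10 sits at 10 × 4.5 + 3.0 = 48.0 cm above the ground.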
Second, we develop a function that checks whether the value we calculated in the first step is within a range. This is needed in order to detect whether the user's arm is in front of a specific server. This function can be created as follows:
1. We store the middle value of the range in a Variable Application Object. We call this Variable MidValue.
2. We store a constant in another Variable called Tolerance. This function will check for values in the range [MidValue - Tolerance ... MidValue + Tolerance]. MidValue and Tolerance are configurable by developers according to specific applications.
3. We use the Addition and Subtraction Operators to compute the lower and upper limits of the range. We set the MidValue Variable as first operand and the Tolerance Variable as second operand using the operators' setOperand1() and setOperand2() Methods.
4. We use the BiggerEqual Comparison Operator to compare the reading from the Proximity Sensor to the lower limit of the range. The Subtraction Arithmetic Operator's computed() Event provides the range's lower limit.
5. Similarly, we use the SmallerEqual Comparison Operator to compare the proximity to the upper limit of the range.
6. We use the AND Logical Operator to ensure that the comparisons from steps 4. and 5. are both true. We connect its setOperand1() and setOperand2() Methods to the SmallerEqual and BiggerEqual Operators' conditionChanged() Events.
7. The AND Operator will emit the conditionIsTrue() Event when the value is within the specified range and the conditionIsFalse() Event when the value is not in the range.
Because height units have a height of 4.5 cm, a tolerance of 2.25 cm leads to a range that covers the entire height unit. The same function can be applied to check whether the IMU on the sleeve is oriented properly. An IMU that is held parallel to the ground would yield a yaw and pitch of 0 degrees. Therefore, the MidValue Variable would be set to 0 and the Tolerance Variable to 2. Figure 6.14 a) displays the implementation of the function to check whether a value is within a range in Interactex Designer.
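The seven steps above implement what is, in essence, a single comparison; a compact sketch (function name illustrative):

```python
def in_range(value, mid_value, tolerance):
    """Range check built from the steps above: true iff value lies
    in [mid_value - tolerance, mid_value + tolerance]. Mirrors the
    BiggerEqual/SmallerEqual comparisons joined by the AND
    operator."""
    return (mid_value - tolerance) <= value <= (mid_value + tolerance)
```

With MidValue = 35 and Tolerance = 2, a proximity reading of 35.5 cm is in range, while 38 cm is not; with MidValue = 0 and Tolerance = 2, the same check validates the IMU's yaw and pitch.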
Third, we provide feedback to users by setting the Three-Color LED's color to either red, green or blue depending on the results of the different comparisons. The Three-Color LED offers Methods to set its red, blue and green components independently. We define three Invocations between the AND Operator's conditionIsTrue() and conditionIsFalse() Events and the Three-Color LED's setRed(), setGreen() and setBlue() Methods. Each Invocation passes the value Variable of the “0” or “255” Variable as a parameter according to the RGB coding of the color we need in each case. Figure 6.14 b) displays the implementation of this function in Interactex Designer.
Figure 6.14: TangoHapps implementations of: a) Function that checks whether the proximity measured by a Proximity Sensor is in the range 35 ± 2 cm. b) Function that sets the color of the RGB LED to red.
Finally, we aggregate the results of all three range comparisons (proximity, yaw and pitch) using the AND Logical Operator and set the Three-Color LED's color accordingly. If the yaw or pitch are not in the [-2 ... 2] range, then the proximity sensor is not properly aligned. In this case, we set the LED's color to red. If the yaw and pitch are properly aligned and the proximity measured by the proximity sensor is in the range we defined, then the user's hand is in front of the server he/she is looking for. In this case, we set the LED's color to green. If the yaw and pitch are properly aligned but the proximity range comparison returns false, then the user's arm is not in front of the server he/she is looking for. In this case we set the LED's color to blue. A schematic of this algorithm is shown in Figure 6.15.
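The decision logic above reduces to a small pure function; the tolerances are the ones discussed in this section, while the function signature is illustrative.

```python
def led_color(yaw, pitch, proximity, target_distance,
              angle_tol=2.0, distance_tol=2.25):
    """Aggregate the three range checks into an LED color:
    red = sensor not aligned, green = aligned and in front of the
    target server, blue = aligned but in front of another server."""
    aligned = abs(yaw) <= angle_tol and abs(pitch) <= angle_tol
    if not aligned:
        return "red"
    if abs(proximity - target_distance) <= distance_tol:
        return "green"
    return "blue"
```

Note that the alignment check is evaluated first, since proximity readings are meaningless when the sensor does not point straight down.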
Figure 6.15: Sketch of a TangoHapps implementation of the Custodian's Error Prevention use case.
Use Case 2: Technician’s Safety
We demonstrate an Interactex application that monitors the technician's physical activity in three steps. First, the application should detect whether the user is wearing the jacket. This is done with the capacitive textile sensor sewn to the jacket at the shoulder, which delivers high analog values when the user's skin is nearby. We determine whether the technician is wearing the jacket by comparing the value delivered by the Textile Sensor with a certain threshold. We chose a threshold of 12.00 based on the analog values measured by the last version of the Custodian Jacket when being worn over a T-shirt and a pullover. The threshold can be tuned by developers to suit different capacitive textile sensors. The comparison with the threshold is done by the Bigger Comparison Operator. The Bigger Comparison Operator causes the MPU 6050 IMU to start and stop receiving linear acceleration values.
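The wearing check is a single comparison; a sketch with the threshold stated above (function name illustrative):

```python
WEAR_THRESHOLD = 12.0  # analog value tuned for this jacket

def is_wearing(textile_sensor_value):
    """Bigger Comparison Operator: the jacket counts as worn when
    the capacitive shoulder sensor exceeds the tuned threshold."""
    return textile_sensor_value > WEAR_THRESHOLD
```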
Second, the application uses an IMU and the Activity Classifier to determine user activity and the Contact Book to make emergency calls in case the user is detected to be inert for a certain period of time. The linear acceleration along the y-axis measured by the MPU 6050 IMU is passed over to the Window Application Object, which buffers the linear acceleration values. In order to achieve this, the MPU 6050's yChanged() Event is connected to the Window's addValue() Method passing in a linear acceleration sample as a parameter. The Window's size and overlapping Variables are configured over the Configuration View to gather 50 linear acceleration samples and to overlap 50% of the samples with the last set of buffered values. The Window's filled() Event invokes the Low-Pass Filter's addValues() Method passing in the buffered samples. The Low-Pass Filter's factor Variable is set to 0.5 over the Configuration View. This implies that the filtered signal's standard deviation will be smaller than or equal to half of the original signal's standard deviation. The Low-Pass Filter's filteredValues() Event invokes the Activity Classifier's classifySamples() Method passing in the array of filtered samples. So far, the application is able to classify user activity.
In the third and last step, the Activity Classifier's output should be used to trigger an emergency signal. In order to avoid false positives, the Custodian Jacket triggers an emergency signal only if the user has not moved for 20 seconds. The Timer Application Object can be used to make the application count 20 seconds when the user is found to be inert. The Activity Classifier's notMoving() Event invokes the Timer's start() Method. The Activity Classifier's walking(), running() and climbing() Events invoke the Timer's stop() Method, causing the Timer to stop counting time after the user has moved in any way. The Timer's frequency Variable is configured to trigger after 20 seconds. Finally, the Timer's triggered() Event should invoke the Contact Book's makeEmergencyCall() Method. Figure 6.16 displays the implementation of this use case in TangoHapps.
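The timer-based emergency logic can be modeled with a small stand-in class. The class, its `tick()` polling method and the injectable clock are hypothetical scaffolding; only the 20-second timeout and the start/stop wiring come from the description above.

```python
class InactivityMonitor:
    """Sketch of the 20-second emergency logic: notMoving() starts
    a countdown, any movement Event cancels it, and when the
    countdown elapses an emergency call is made. `clock` is an
    injectable time source; `on_emergency` stands in for the
    Contact Book's makeEmergencyCall() Method."""
    def __init__(self, clock, on_emergency, timeout=20.0):
        self.clock = clock
        self.on_emergency = on_emergency
        self.timeout = timeout
        self.inert_since = None

    def not_moving(self):                 # notMoving() Event
        if self.inert_since is None:
            self.inert_since = self.clock()

    def moving(self):                     # walking()/running()/climbing()
        self.inert_since = None

    def tick(self):                       # periodic check
        if (self.inert_since is not None
                and self.clock() - self.inert_since >= self.timeout):
            self.on_emergency()
            self.inert_since = None
```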
Figure 6.16: Custodian’s Technician’s Safety use case implemented with TangoHapps.
Use Case 3: Emergency Situation
We describe the implementation of an application in TangoHapps that produces sounds and vibrations when an emergency signal is received. This application simulates emergency signals with a button on the smartphone. The real application would receive emergency signals remotely over Wi-Fi.
This application relies on a function to detect whether a textile button is being pressed for more than three seconds. Two Invocations are needed in order to make sound and vibrations start and stop playing. First, an Invocation between the smartphone Button's buttonPressed() Event and the Buzzer and Vibration Board's turnOn() Method should be defined. Second, the Timer's triggered() Event should invoke the Buzzer and Vibration Board's turnOff() Method. This causes the Timer to turn off the Buzzer and Vibration Board. The Timer's frequency Variable is configured to trigger three seconds after it has been started. The Timer is started whenever the textile button is detected to be pressed. In order to detect whether the button has been pressed, we check whether the capacitance measured by the Textile Sensor is bigger than a threshold. We chose a threshold of 97.00 based on the analog values measured on the textile fabric we used in the last version of the Custodian Jacket. An additional Invocation between the Bigger Comparison Operator's conditionIsFalse() Event and the Timer's stop() Method stops the Timer whenever the user has released the textile button. The Timer, when stopped, resets its counter so that users have to press the textile button for three consecutive seconds again. The implementation of this use case is shown in Figure 6.17. In order to additionally emit emergency calls when the textile button has been pressed for three consecutive seconds, an additional Invocation between the Timer's triggered() Event and the Contact Book's makeEmergencyCall() Method should be defined.
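The press-and-hold detection can be sketched as follows. The class and its sampling interface are hypothetical; the three-second hold time and the 97.00 capacitance threshold are the values stated above.

```python
class HoldDetector:
    """Three-second press-and-hold on the textile button: a press
    starts the countdown; releasing before it elapses resets it,
    so the next press must again last the full hold time.
    `clock` is an injectable time source; `on_hold` stands in for
    the Timer's triggered() Event."""
    def __init__(self, clock, on_hold, hold_time=3.0, threshold=97.0):
        self.clock = clock
        self.on_hold = on_hold
        self.hold_time = hold_time
        self.threshold = threshold
        self.pressed_since = None

    def sample(self, capacitance):
        if capacitance > self.threshold:           # Bigger comparison
            if self.pressed_since is None:
                self.pressed_since = self.clock()  # Timer.start()
            elif self.clock() - self.pressed_since >= self.hold_time:
                self.on_hold()                     # triggered() Event
                self.pressed_since = None
        else:
            self.pressed_since = None              # Timer.stop()/reset
```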
Figure 6.17: Custodian's Emergency Situation use case implemented with TangoHapps.
Chapter 7
Evaluation
New toolkits and development environments are often validated with the breadth of coverage of applications they support and by their ability to simplify development [7, 8, 37, 60, 61]. The breadth of coverage of TangoHapps is demonstrated by the use cases we described in Chapter 6. In this chapter, we present the results of two user studies with the goal of assessing TangoHapps' ability to simplify the development of smart textiles. The first user study was conducted with novice users and the second one with professional smart textile developers. Both user groups had little or no background experience in electronics and programming.
7.1 User Study 1: Novice Users
To assess the ease of use of Interactex, we conducted a study in a middle-school classroom with 18 teenagers (7 female, 11 male, 14 and 15 years old). None of the participants had experience in programming or electronics. Participants were divided into six teams of three members each. Each team received an Arduino UNO kit containing vibration motors, buzzers, LEDs, light and temperature sensors to test their applications, and a “cheat sheet” with circuit layout diagrams describing how electronic devices should be connected to the Arduino UNO microcontroller. The study lasted two days, four hours each day. During the first day, we taught participants the basics of creating a circuit layout and programming using the Arduino IDE. During the second day, we provided an introduction to Interactex that lasted one hour. We showed participants different Application Objects, created Invocations between them and demonstrated how to create a circuit layout and deploy the application to the smartphone. We then gave participants an iPad and iPhone with Interactex already installed and asked them to solve the following three warm-up tasks:
1. Turn an LED on and off using a switch on the smartphone.
2. Iterate through the music playing on the smartphone using a physical push button.
3. Make a call to any number when the temperature decreases below a certain threshold.
Finally, participants were given 45 minutes to develop their own application using Interactex.
7.1.1 Results
Participants solved warm-up tasks 1. and 2. without our support in less than ten minutes each. In order to solve these tasks, participants had to understand Interactex's workflow and event-function coupling. Warm-up task 3. required the usage of the Comparator object to compare the value measured by the temperature sensor to the value contained within a Variable object. Most participants attached the temperature sensor to the Comparator and forgot to specify the Variable it should be compared to. This highlighted a potential improvement to Interactex's visual programming mode: making the events and methods of objects visible in the canvas, as done in other flow-based programming environments. However, every team was able to finish this task after we showed them how the Comparator object works in a separate application.
Participants developed the following applications of their own:
1. A jacket with 12 integrated LEDs that the wearer can turn on and off from two buttons on a smartphone. The LEDs turn on in a sequential fashion.
2. A jacket with a button that plays and stops the music on the smartphone. The volume of the music changes according to the light intensity sensed by a light sensor.
3. A T-Shirt that adapts an LED's intensity according to the light intensity in the room.
4. A T-Shirt that measures the user's body temperature and displays it on a smartphone.
5. A T-Shirt that plays beep sounds. The frequency of the tone can be controlled over a slider on the smartphone.
6. A jacket to help find victims of an avalanche. The jacket produces sound and vibration while a button on the jacket is being pressed or while the light intensity is smaller than a threshold.
These applications would require extensive expertise to develop without Interactex (e.g. data transmission, synchronization and smartphone app development). The reason why most teams chose a T-Shirt for their applications is that the default textile in Interactex is a T-Shirt. The T-Shirt can be replaced by another image stored on the iPad after adding the object to the canvas. Figure 7.1 a) shows three female participants developing an electronic circuit during the study, Figure 7.1 b) shows Application 1, consisting of 12 LEDs connected to the Arduino UNO, and Figure 7.2 a) displays the paper prototype developed by the same team.
Figure 7.1: a) Three female subjects develop an electronic circuit with Interactex Designer. b) Application 1 consisting of 12 LEDs.
7.2 User Study 2: Professional Smart Textile Developers
In our second study, we conducted interviews with eight professional smart textile developers (2 male, 6 female, 23-35 years old) with varying backgrounds: textile design (5), interaction design (2) and computer science (1).
The study was conducted in five steps:
1. To enable participants to test their applications on an actual textile, we developed a textile testbed. The textile testbed is shown in Figure 7.2 b). Interactex Firmware was uploaded onto the testbed prior to this study.
2. We started every interview by explaining our intentions to participants, emphasizing our desire to assess Interactex's suitability for rapid prototyping of smart textiles, including positive and negative aspects. We then demonstrated the usage of the IDE with an application that turns on and off an LED on the textile testbed using two buttons on the smartphone's UI.
3. Participants had to develop any application while thinking aloud for around 20 minutes.
4. We conducted a semi-structured interview to gain insight into participants' impressions about Interactex and to identify missing functionality. We recorded and transcribed participants' think-aloud protocols and interview answers for post-analysis.
5. Participants filled in a questionnaire that inquired about their professional practices regarding smart textile development and their general impression about Interactex.
7.2.1 Study Results
When asked about their general impression of Interactex, every participant made at least one comment praising it. P3: ”Good to know that this is available in the App Store!”, P4: ”even though I know how to program, I could use it and welcome that you don’t need to program”, P6: ”I love it!”. The design of the environment was one of the most liked features. P4: ”the simplicity is the beauty of it”, P7: ”good! nice graphic interface” - right after being shown Interactex Designer for the first time. Participants also praised the ease of deploying applications to the smart textile. P6: ”the workflow was perfect and easy to understand”. The ability to use a smartphone to read sensor data and manipulate output devices on the testbed was perceived as one of the most convincing and impressive features. P3: ”Really nice that iPhone and hardware are programmed directly and visually in the iPad environment”. P5: ”... it is convenient that a phone manages the interaction with the hardware”.
From the feedback provided by participants, we extracted a total of 85 feature requests. Five participants requested support for new sensors, output devices and microcontrollers. Wishes included pressure sensors, servo motors and hardware components from other hardware brands (e.g. Sparkfun, Adafruit). Participants also questioned the lack of support for custom-made textile sensors, although these sensors correspond in their functionality to elements supported in Interactex (e.g. Switch and Textile Sensor). Two participants additionally suggested having the iPad communicate directly with the smart textile.
When asked about what they did not like about Interactex, participants mentioned different usability flaws. The most frequently mentioned usability flaw (by five participants) was the lack of immediate information about which Invocations were defined in a coupling. P4 explained it as: "though in the properties of the components things are marked in blue when selected [...] there is nothing to say if one connection is this event or this method" - referring to a coupling as "connection". Other usability flaws mentioned were the lack of a way to scale the textile independently from the electronic devices attached to it and the lack of a clear indication whether a defined event-function coupling is valid.
Questionnaire results. The results from the questionnaire revealed that every participant had only basic programming knowledge with the exception of P2, who was a professional software developer. Participants mentioned the Arduino IDE and Processing as their usual programming environments and P2 additionally mentioned Android Studio. Furthermore, every participant reported being used to dealing with code editors rather than visual programming environments.
The questionnaire also contained two questions on a 5-point Likert scale. The overall suitability of Interactex for rapid prototyping of smart textiles was rated by participants as: Very good (3), Good (3), Satisfactory (2), Sufficient (0) and Poor (0).
Figure 7.2: a) A paper prototype of a jacket developed during Study 1. b) Textile testbed containing two accelerometers, a push button, switch, light sensor, buzzer, vibration motor and a three-color LED. Every electronic device is connected with conductive fabric to a custom-made Arduino Lilypad with an integrated BLE module (in the center).
The overall positive ratings are consistent with participants' reactions during the interviews. The Satisfactory ratings were given by P2 and P8. P2 had extensive experience in programming and electronics and considered Interactex to be less suitable for herself. This is in agreement with Interactex's target user group of individuals with little experience in programming. P8's rating was related to her thought that the visual programming abstractions offered in Interactex would be too difficult for inexperienced developers. While our findings from the study with novice users show that Interactex's visual programming abstractions can be grasped by users who had never programmed before, we agree with P8 in that the realization of more complex applications might require a longer learning phase than the average 20 minutes participants took for this study. Other participants perceived the difficulty of using Interactex as normal or easy. Difficulty ratings were: Very easy (0), Easy (2), Normal (5), Difficult (1) and Very difficult (0).
Chapter 8
Conclusions and Future Work
Several solutions introduced in the research community greatly reduce the effort and knowledge required to create physical electronic devices. For example, hardware toolkits provide hardware components that can be plugged together in order to create a functional electronic device within minutes. Thanks to these solutions, hardware devices are today being built by hobbyists who have no previous background in electrical engineering. On the other hand, tools that simplify the development of software for smart textiles are either limited to rapid prototyping scenarios or require experience in programming. In this dissertation, we have made three main contributions. First, we have presented a comprehensive literature review of the field of smart textiles, including definitions, their evolution from the so-called wearable computers and the technologies currently being used to create them. We have also listed and categorized tools that support the development of smart textiles, wearable computers and physical devices and summarized their common features. Furthermore, this dissertation has contributed to the body of knowledge in the field of smart textile development by eliciting generic requirements for an IDE for smart textiles. We elicited these requirements based on an exhaustive literature review that is summarized in Chapter 2 and a four-year collaboration with professional textile designers. We have described the research process we followed to iteratively and incrementally explore the requirements for a development environment for smart textiles. The requirements we elicited and the research process we described can be reused in the development of further development tools for smart textiles.
Second, we have described the models, design decisions, architecture and internal structure of TangoHapps. TangoHapps supports the design, implementation, testing and debugging of smart textiles, including the placement of electronics, circuit layout and application development. The high-level abstractions available in TangoHapps lower the entrance barrier to smart textile development. TangoHapps' visual programming semantics span the smartphone and the smart textile, enabling users to take advantage of the capabilities of both devices without writing source code. In order to support other researchers and professionals in the prototyping and further development of smart textiles, we made TangoHapps publicly available. Its source code is
available on GitHub [1] and its executable can be downloaded from Apple's App Store [2][3]. Furthermore, we have presented exhaustive documentation of the IDE, which should facilitate its usage and further development. In particular, a detailed description of every Application Object supported in TangoHapps is available in the Appendix. We also created a platform for smart textile developers to share their Interactex projects [4] with other users. The platform also provides step-by-step tutorials on the usage of Interactex.
Third, we demonstrated that TangoHapps has a high ceiling and a low entrance barrier. We demonstrated the high ceiling of the IDE with two use cases from different application domains: rehabilitation and blue-collar work. Furthermore, we addressed the low entrance barrier of TangoHapps with two user studies. The first user study was conducted with middle school students who had never programmed or worked with electronic circuits before. The second user study was conducted with experienced smart textile developers who tested TangoHapps and gave us their insight as end users of the IDE. Both user studies provide evidence that TangoHapps enables users with no experience in programming or electronics to develop applications for smart textiles within minutes. Parts of the contributions presented in this dissertation have been published in [42, 43, 41, 44, 46, 45].
8.1 Future Work
Before we see people walking around the streets wearing smart garments, challenges such as washability, robustness and social acceptance must be overcome. Washability refers to the ability of a textile to resist washing without deteriorating. The strain (and, surprisingly, not the water, heat or detergent) a smart textile is exposed to during a washing cycle can damage its electronic devices and connections. During normal usage, textiles are also exposed to high degrees of bending and strain. A study by ETH Zurich revealed that the upper back of a T-shirt, for instance, is subject to an elongation as high as 20% of its normal size [76]. Additionally, the integration of electronics into garments raises safety concerns and is perceived as socially awkward by some audiences (e.g. the elderly) [18].
A long-term vision of TangoHapps is that the users of a smart textile also develop their own smart textiles. Users could design and develop the applications for their own smart garments using TangoHapps and snap or sew the electronic devices onto the textile. TangoHapps currently supports multiple applications on the same smart garment. Hence, users can already switch their smart garment's application by simply selecting a different application on their smartphones.
Furthermore, an online store could offer applications for smart textiles, in a similar way to how Apple's App Store offers applications for mobile devices. We use the term "Happ" for an application for a smart textile, analogously to the term "App".
[1] https://github.com/ls1intum/Interactex
[2] https://itunes.apple.com/us/app/interactex-designer/id973912620?mt=8
[3] https://itunes.apple.com/us/app/interactex-client/id1031238223?mt=8
[4] http://www.interactex.de/
Users could develop applications using TangoHapps and then upload their executables and textile assembly instructions to the TangoHapps HappStore for other users to download.
Appendix A
Application Objects
A.1 Events Methods and Variables
In this section we list every Application Object's Methods, Events and Variables.
Button. Triggers events when pressed and when released. Events: buttonPressed(), buttonReleased(). Methods: none. Variables: pressed: Boolean.
Label. Displays text and numbers. Events: none. Methods: setText(String), appendText(String). Variables: text: String.
Switch. Triggers events when switched on or off. Events: switchedOn(), switchedOff(), onChanged(Boolean). Methods: none. Variables: on: Boolean.
Slider. Used to select a value from a range of values. Triggers events when its handle is moved by the user. Events: valueChanged(Number). Methods: none. Variables: value: Number.
Touchpad. Triggers events when the user performs the following multi-touch gestures on it: tap, double tap, long tap, pinch and pan. Events: pannedX(Number), pannedY(Number), tapped(), doubleTapped(), pinched(Number). Methods: none. Variables: none.
Music Player. Accesses the mobile device's music library, offers functionality to iterate through the music list, plays songs and displays music information. Events: started(), stopped(). Methods: play(), pause(), next(), previous(), setVolume(Number). Variables: volume: Number.
Contact Book. Accesses the user's contacts and offers functionality to iterate through contacts and make calls. Events: none. Methods: call(), previous(), next(). Variables: none.
Image View. Displays an image. Events: none. Methods: none. Variables: none.
Monitor. Displays values over time (e.g. sensor readings). Events: none. Methods: addValue1(Number), addValue2(Number). Variables: none.
Table A.1: UI Widgets supported by TangoHapps.
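The Events and Methods listed for each Application Object are the attachment points for couplings: an Event of one object can be wired to a Method of another. As an illustration only - the class names and wiring mechanism below are hypothetical and not Interactex's actual implementation - such an event-to-method coupling can be sketched as:

```python
class Emitter:
    """Minimal event source: couplings map an event name to callbacks."""
    def __init__(self):
        self._couplings = {}

    def couple(self, event, method):
        # Wire an Event of this object to a Method of another object.
        self._couplings.setdefault(event, []).append(method)

    def emit(self, event, *args):
        for method in self._couplings.get(event, []):
            method(*args)

class Button(Emitter):
    def press(self):
        self.pressed = True
        self.emit("buttonPressed")

class Label(Emitter):
    def __init__(self):
        super().__init__()
        self.text = ""

    def setText(self, text):
        self.text = text

button, label = Button(), Label()
# Coupling: Button.buttonPressed() -> Label.setText("pressed")
button.couple("buttonPressed", lambda: label.setText("pressed"))
button.press()
print(label.text)  # -> pressed
```

The same pattern would apply to any Event/Method pair in the tables, e.g. a Switch's switchedOn() wired to an LED's turnOn().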
Button. A button that can be pressed. Triggers events when pressed and when released. Events: buttonPressed(), buttonReleased(). Methods: none. Variables: pressed: Boolean.
Switch. An on/off switch. Triggers events when switched on or off. Events: switchedOn(), switchedOff(), switchChanged(Boolean). Methods: none. Variables: none.
Light Sensor. Measures light intensity. Events: valueChanged(Number). Methods: start(), stop(). Variables: lightIntensity: Number.
Temperature Sensor. Measures temperature. Events: valueChanged(Number). Methods: start(), stop(). Variables: temperature: Number.
Accelerometer. Measures acceleration forces in a 3D space. Events: xChanged(Number), yChanged(Number), zChanged(Number). Methods: start(), stop(). Variables: x: Number, y: Number, z: Number.
LSM Compass. Measures acceleration forces and the direction of magnetic fields in a 3D space. Events: valueChanged(Object), headingChanged(Number). Methods: start(), stop(). Variables: acceleration: Object, heading: Number.
MPU 6050. Measures acceleration forces and its orientation in a 3D space. Events: accelerationChanged(Object), orientationChanged(Object), yawChanged(Number), pitchChanged(Number), rollChanged(Number), xChanged(Number), yChanged(Number), zChanged(Number). Methods: start(), stop(). Variables: acceleration: Object, orientation: Object.
Textile Sensor. Represents an analog textile sensor. Events: valueChanged(Number). Methods: start(), stop(). Variables: value: Number.
Proximity Sensor. Measures the distance to any object within a range of up to 10 meters. Events: proximityChanged(Number). Methods: start(), stop(). Variables: proximity: Number.
Table A.2: Sensors supported by TangoHapps.
LED. A Light Emitting Diode (LED). Can be turned on or off and its intensity can be set. Events: turnedOn(), turnedOff(). Methods: turnOn(), turnOff(), setIntensity(). Variables: on: Boolean, intensity: Number.
Buzzer. An element that produces a sound frequency. It can be turned on, turned off and its sound frequency can be set. Events: none. Methods: turnOn(), turnOff(), setFrequency(Number). Variables: frequency: Number.
Three-Color LED. An LED that emits light in multiple colors. Events: none. Methods: turnOn(), turnOff(), setRed(Number), setGreen(Number), setBlue(Number). Variables: none.
Vibration Board. Produces vibrations. It can be turned on, off and its vibration frequency can be set. Events: none. Methods: turnOn(), turnOff(), setFrequency(Number). Variables: frequency: Number.
Textile Speaker. Represents a radio module. Events: onChanged(Boolean). Methods: turnOn(), turnOff(), setFrequency(Number), setVolume(Number). Variables: none.
Table A.3: Output Devices supported by TangoHapps.
Lilypad. Official Lilypad Arduino Main Board. Events: none. Methods: none. Variables: none.
Lilypad Simple. Official Lilypad Arduino Simple. Has fewer pins than the Lilypad Arduino Main Board. Events: none. Methods: none. Variables: none.
BLE-Lilypad. Lilypad Arduino Main Board with an integrated Bluetooth Low Energy module. Events: none. Methods: none. Variables: none.
Table A.4: Microcontrollers supported by TangoHapps.
Boolean. Stores a Boolean value. Events: valueChanged(Boolean). Methods: setValue(Boolean). Variables: value: Boolean.
Number. Stores a number. Events: valueChanged(Number). Methods: setValue(Number). Variables: value: Number.
String. Stores a String. Events: valueChanged(String). Methods: setValue(String). Variables: value: String.
Table A.5: Variables supported by TangoHapps.
Bigger. Emits the conditionIsTrue() and conditionChanged() Events with a Boolean Variable set to true if the first input Variable is bigger than the second input Variable; otherwise emits conditionIsFalse() and conditionChanged() with a Boolean Variable set to false. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Number), setValue2(Number). Variables: isTrue: Boolean.
BiggerEqual. As Bigger, with the condition: the first input Variable is bigger than or equal to the second input Variable. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Number), setValue2(Number). Variables: isTrue: Boolean.
Smaller. As Bigger, with the condition: the first input Variable is smaller than the second input Variable. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Number), setValue2(Number). Variables: isTrue: Boolean.
SmallerEqual. As Bigger, with the condition: the first input Variable is smaller than or equal to the second input Variable. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Number), setValue2(Number). Variables: isTrue: Boolean.
Table A.6: Comparison Operators supported by TangoHapps.
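All Comparison Operators share one behavior: they compare two numeric inputs whenever one is set, and emit conditionIsTrue() or conditionIsFalse(), plus conditionChanged(Boolean) when the outcome flips. A sketch of that shared logic - the event names mirror the table, but the implementation below is illustrative, not Interactex's code:

```python
import operator

class ComparisonOperator:
    """Re-evaluates its condition whenever an input value is set and
    records which events would be emitted (illustrative sketch)."""
    def __init__(self, compare):
        self.compare = compare          # e.g. operator.gt for Bigger
        self.value1 = self.value2 = 0
        self.isTrue = False             # Variables: isTrue: Boolean
        self.emitted = []               # event log, for the sketch only

    def _evaluate(self):
        result = self.compare(self.value1, self.value2)
        self.emitted.append("conditionIsTrue()" if result else "conditionIsFalse()")
        if result != self.isTrue:
            self.emitted.append(f"conditionChanged({result})")
        self.isTrue = result

    def setValue1(self, n):
        self.value1 = n
        self._evaluate()

    def setValue2(self, n):
        self.value2 = n
        self._evaluate()

bigger = ComparisonOperator(operator.gt)   # the Bigger operator
bigger.setValue1(5)   # 5 > 0 -> conditionIsTrue(), conditionChanged(True)
bigger.setValue2(9)   # 5 > 9 -> conditionIsFalse(), conditionChanged(False)
print(bigger.isTrue)  # -> False
```

Swapping `operator.ge`, `operator.lt` or `operator.le` for the comparison yields BiggerEqual, Smaller and SmallerEqual.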
Equal. Emits the conditionIsTrue() and conditionChanged() Events with a Boolean Variable set to true if the first input Variable is equal to the second input Variable; otherwise emits conditionIsFalse() and conditionChanged() with a Boolean Variable set to false. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Number), setValue2(Number). Variables: isTrue: Boolean.
NotEqual. As Equal, with the condition: the first input Variable is not equal to the second input Variable. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Number), setValue2(Number). Variables: isTrue: Boolean.
And. Emits the conditionIsTrue() and conditionChanged() Events with a Boolean Variable set to true if both boolean input variables are true; otherwise emits conditionIsFalse() and conditionChanged() with a Boolean Variable set to false. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Boolean), setValue2(Boolean). Variables: isTrue: Boolean.
Or. As And, with the condition: either boolean input variable is true. Events: conditionIsTrue(), conditionIsFalse(), conditionChanged(Boolean). Methods: setValue1(Boolean), setValue2(Boolean). Variables: isTrue: Boolean.
Table A.7: Logical Operators supported by TangoHapps.
Addition. Adds two numbers. Events: computed(Number). Methods: setOperand1(Number), setOperand2(Number), compute(). Variables: result: Number.
Subtraction. Subtracts operand2 from operand1. Events: computed(Number). Methods: setOperand1(Number), setOperand2(Number), compute(). Variables: result: Number.
Multiplication. Multiplies two numbers. Events: computed(Number). Methods: setOperand1(Number), setOperand2(Number), compute(). Variables: result: Number.
Division. Divides operand1 by operand2. Events: computed(Number). Methods: setOperand1(Number), setOperand2(Number), compute(). Variables: result: Number.
Modulo. Calculates the residual of the division between operand1 and operand2. Events: computed(Number). Methods: setOperand1(Number), setOperand2(Number), compute(). Variables: result: Number.
Table A.8: Arithmetic Operators supported by TangoHapps.
Window. Takes signal values as input and returns chunks of the same signals, which might overlap. Events: filled(Object). Methods: addValue(Number). Variables: data: Object.
Low-Pass Filter. Low-passes a signal. Events: filteredValues(Object). Methods: addValue(Number), addValues(Object), removeAllValues(). Variables: none.
High-Pass Filter. High-passes a signal. Events: filteredValues(Object). Methods: addValue(Number), addValues(Object), removeAllValues(). Variables: none.
Mean Extractor. Calculates the mean of a set of values. Events: featureExtracted(Number). Methods: addValue(Number), addValues(Object), removeAllValues(), compute(). Variables: none.
Deviation Extractor. Calculates the deviation of a set of values. Events: featureExtracted(Number). Methods: addValue(Number), addValues(Object), removeAllValues(), compute(). Variables: none.
Peak Detector. Calculates the peak in a set of values and provides the peak's value and index in an event. It offers methods to set the range in a set of samples where the peak should be searched. Events: featureExtracted(Number), indexExtracted(Number). Methods: addValue(Number), removeAllValues(), compute(), setRangeStart(Number), setRangeEnd(Number). Variables: none.
Activity Classifier. Classifies user motion based on accelerometer input. Activities supported are: 'walking', 'running', 'climbing' and 'not moving'. Events: walking(), running(), climbing(), notMoving(). Methods: classifySamples(Object). Variables: none.
Posture Classifier. Classifies user postures based on IMU input. Supported postures are: 'standing', 'lying down on stomach' and 'lying down on back'. Events: standing(), lyingDown(), lyingUp(). Methods: classifySamples(Object). Variables: none.
Table A.9: Signal processing and classification Functions supported by TangoHapps.
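The Window function buffers incoming samples and emits chunks that may overlap, which downstream feature extractors (Mean, Deviation, Peak Detector) then consume. A sketch of that buffering behavior in terms of the Window's Size and Overlapping Variables - illustrative only, not the actual implementation:

```python
def window_chunks(samples, size, overlapping):
    """Split a signal into fixed-size chunks; consecutive chunks share
    `overlapping` samples, as the Window's filled(Object) event delivers."""
    step = size - overlapping  # how far the window advances each time
    chunks = []
    for start in range(0, len(samples) - size + 1, step):
        chunks.append(samples[start:start + size])
    return chunks

signal = [1, 2, 3, 4, 5, 6]
print(window_chunks(signal, size=4, overlapping=2))
# -> [[1, 2, 3, 4], [3, 4, 5, 6]]
```

With Overlapping set to 0 the chunks are disjoint; larger overlaps trade more computation for smoother feature sequences.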
Timer. Generates an event after x time. Its trigger time and whether it should trigger once or many times can be configured over the Configuration View in the Palette. Events: triggered(). Methods: start(), stop(). Variables: none.
Sound. Represents a sound. It offers a single method to play it. Events: none. Methods: play(). Variables: none.
Recorder. Provides a way to store a set of values (e.g. from a sensor) and to feed a set of values into other objects. Events: startRecording(), stopRecording(). Methods: startRecording(), stopRecording(), addValue(Number). Variables: none.
Mapper. Scales values. This is useful to make values fit the range required by other objects. For example, the Slider UI widget produces by default values in the range [0 255] and the Buzzer hardware element requires a frequency in the range [0 20000]. The Mapper would take the Slider's input range [Min1 Max1] and scale it linearly to the Buzzer's output range [Min2 Max2]. Generates an event whenever the value changes. Events: valueChanged(Number). Methods: setMin1(Number), setMax1(Number), setMin2(Number), setMax2(Number), setValue(Number). Variables: value: Number.
Table A.10: Utility programming elements supported by TangoHapps.
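The Mapper's linear scaling can be written out explicitly: in the Slider-to-Buzzer example, a slider value in [0, 255] becomes a frequency in [0, 20000]. The formula is standard linear interpolation; the function name below is ours, not the Mapper's API:

```python
def map_value(value, min1, max1, min2, max2):
    """Scale `value` linearly from the input range [min1, max1]
    to the output range [min2, max2], as the Mapper object does."""
    return min2 + (value - min1) * (max2 - min2) / (max1 - min1)

# Slider value 51 out of [0, 255] mapped to a Buzzer frequency out of [0, 20000]:
print(map_value(51, 0, 255, 0, 20000))  # -> 4000.0
```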
A.2 Palette Objects
Figures A.1 and A.2 display every object available in Interactex Designer.
A.3 Variables
In addition to the Variables listed in Section A.1, Application Objects expose Variables that developers use to change the Application Object's default configuration. These Variables are displayed over the Configuration View in the Palette. It should be noted that the Variables of a superclass are shared among all its subclasses and that Microcontrollers, Arithmetic Operators and Logical Operators have no Variables. Figure A.3 in the Appendix displays the Variables of the Label and Timer objects.
Figure A.1: Palette Objects for instantiating Smart Textiles, Electronic Devices (e.g. Hardware Devices and Microcontrollers) and UI Widgets available in Interactex Designer.
Figure A.2: Palette Objects for instantiating Signal Processing Functions, Variables and Operators available in Interactex Designer.
UI Widget:
- Width. Used to set the width of the UI Widget.
- Height. Sets the height of the UI Widget.
- Color. Sets the background color of the widget. Not every UI Widget has a background color. Changes to this Variable will be effective on the widgets that have a background color.
Button: Title. The text on the Button.
Label: Text. The text on the Label.
Switch: On at start. Whether the Switch should be turned on at the beginning of the application.
Slider:
- Value. The value that should be assigned to the Slider initially.
- Min. The minimum allowed value.
- Max. The maximum allowed value.
Touchpad:
- X-Multiplier. Constant used to scale the displacement values along the x-axis delivered by the Touchpad.
- Y-Multiplier. Constant used to scale the displacement values along the y-axis delivered by the Touchpad.
Table A.11: Variables of UI Widgets (Part 1/2).
Music Player:
- Play button. Whether the Music Player should show the play button.
- Next button. Whether the Music Player should show a button to play the next song in the playlist.
- Previous button. Whether the Music Player should show a button to play the previous song in the playlist.
- Volume view. Whether the Music Player should show the volume slider.
- Visible. Whether the Music Player should be visible in the mobile device's interface.
Contact Book:
- Call button. Whether the Contact Book should show the call button.
- Next button. Whether the Contact Book should show a button to iterate to the next contact.
- Previous button. Whether the Contact Book should show a button to iterate to the previous contact.
Image View: Image. Opens a pop-up for selection of the image among the user's mobile phone image albums.
Monitor:
- X-Axis. Whether the Monitor should display the horizontal axis.
- Y-Axis. Whether the Monitor should display the vertical axis.
Table A.12: Variables of UI Widgets (Part 2/2).
Hardware Device:
- Name. Enables developers to provide a name to the Hardware Device to identify it more easily during development. The name will be displayed in a label below the Editable Object's image in the canvas.
- Autoroute. If there is a single Microcontroller added to the canvas, this button causes the Hardware Device to be wired to any compatible available pins of the Microcontroller.
Button: Exposes no Variables.
Switch: Exposes no Variables.
Light Sensor: Exposes no Variables.
Temperature Sensor: Exposes no Variables.
Accelerometer: Exposes no Variables.
LSM Compass: Exposes no Variables.
MPU 6050: Exposes no Variables.
Potentiometer:
- Notify strategy. Determines how the Potentiometer notifies other Objects about new values being available. Can be set to Always, which causes the Potentiometer to always emit events when new values are available; inRange, which causes the Potentiometer to emit events only when the values lie within a specific range; or Once, which causes the Potentiometer to emit an event when the value enters a specified range.
- Min notify value. The minimum value in the range used by the notification strategies inRange and Once.
- Max notify value. The maximum value in the range used by the notification strategies inRange and Once.
Proximity Sensor: Exposes no Variables.
Table A.13: Variables of Hardware Devices (Part 1/2).
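The three notify strategies of the Potentiometer differ only in when an event is emitted for a new value. The distinction can be sketched as a single decision function - the strategy names follow the table, the function itself is illustrative:

```python
def should_notify(strategy, value, previous, lo, hi):
    """Decide whether a new Potentiometer value triggers an event.
    Always: every new value. inRange: only values inside [lo, hi].
    Once: only when the value enters [lo, hi] from outside it."""
    in_range = lo <= value <= hi
    if strategy == "Always":
        return True
    if strategy == "inRange":
        return in_range
    if strategy == "Once":
        was_in_range = lo <= previous <= hi
        return in_range and not was_in_range
    raise ValueError(f"unknown strategy: {strategy}")

print(should_notify("inRange", 512, 0, 300, 700))  # -> True
print(should_notify("Once", 512, 600, 300, 700))   # -> False (was already inside)
```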
LED:
- Intensity. The initial intensity of the LED.
- On at start. Whether the LED should be turned on when the application starts.
Buzzer:
- Frequency. The initial frequency of the Buzzer.
- On at start. Whether the Buzzer should be turned on when the application starts.
Three-Color LED:
- Red. The Three-Color LED's initial red component intensity.
- Green. The Three-Color LED's initial green component intensity.
- Blue. The Three-Color LED's initial blue component intensity.
Vibration Board: Frequency. The Vibration Board's initial frequency.
Table A.14: Variables of Hardware Devices (Part 2/2).
Number: Value. Text box to input the Number's initial value.
Boolean: Value. Switch to input the Boolean's initial value.
String: Value. Text box to input the String's initial value.
Window:
- Size. The amount of samples the Window should buffer.
- Overlapping. The amount of samples in the current buffer that were also present in the previous buffer.
Low-Pass Filter: Factor. Sets the intensity of the Filter. A factor of 0 leaves the signal unmodified. The bigger the factor, the smoother the signal becomes.
High-Pass Filter: The High-Pass Filter has the same Variables as the Low-Pass Filter.
Mean Extractor: Exposes no Variables.
Deviation Extractor: Exposes no Variables.
Peak Detector:
- Min. Sets the minimum value of the range in which peaks are searched for.
- Max. Sets the maximum value of the range in which peaks are searched for.
- Upper peak. Defines whether the Peak Detector should search for upper or lower peaks.
Activity Classifier: Exposes no Variables.
Posture Classifier: Exposes no Variables.
Table A.15: Variables of Programming Objects (Part 1/2).
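The Low-Pass Filter's Factor Variable (0 leaves the signal unmodified; larger factors smooth more) matches the behavior of a simple exponential moving average. A sketch under that assumption - the dissertation does not specify the exact filter implementation, so this is illustrative:

```python
def low_pass(samples, factor):
    """Exponentially smooth a signal. A factor of 0 returns the input
    unchanged; factors closer to 1 produce a smoother output."""
    filtered, previous = [], None
    for sample in samples:
        if previous is None:
            previous = sample  # seed the filter with the first sample
        else:
            previous = factor * previous + (1 - factor) * sample
        filtered.append(previous)
    return filtered

print(low_pass([0, 10, 10, 10], 0.0))  # factor 0: signal unmodified
print(low_pass([0, 10, 10, 10], 0.5))  # smoothed: [0, 5.0, 7.5, 8.75]
```

A high-pass filter with the same Factor could be sketched as the input signal minus this low-passed version.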
Timer:
- Frequency. The trigger frequency of the Timer in seconds.
- Time Elapsed. The amount of seconds since the timer has been started (0 if the timer has not been started).
- Repeats. Whether the Timer should trigger once or indefinitely until stopped.
Sound: Sound. Opens a popup for selection of a sound file.
Recorder: Exposes no Variables.
Mapper:
- Input Min. Defines the minimum value of the input range.
- Input Max. Defines the maximum value of the input range.
- Output Min. Defines the minimum value of the output range.
- Output Max. Defines the maximum value of the output range.
Based on these four Variables, the Mapper defines a linear function which it uses to scale values in the input range to values in the output range.
Table A.16: Variables of Programming Objects (Part 2/2).
A.4 Simulation Objects
Simulation Objects provide an interface for users to modify the state of Application Objects and display runtime information useful for debugging purposes. Tables A.17, A.18 and A.19 list what each Simulation Object subclass displays and how it reacts to user input. It should be noted that UI Widget Simulation Objects behave in an identical way to the UI Widgets themselves.
Figure A.3: Configuration View of a Label and Timer.
Button. Input: Finger touches on the Button Simulation Object cause the Button to be pressed/released. Output: Pressing and releasing the button produce a sound effect.
Switch. Input: Taps on the Switch Simulation Object cause the Switch to be turned on/off. Output: Pressing and releasing the button produce a sound effect; a different image is shown displaying the switch turned on or off.
Light Sensor. Input: Finger touches on the Light Sensor Simulation Object cause the light intensity measured by the Light Sensor to increase. The light intensity starts decreasing automatically when the user is not touching the Light Sensor Simulation Object. Output: A sprite of a light beam turns brighter or darker depending on the simulated light intensity; a label is shown below the element containing the current light intensity value.
Temperature Sensor. Input: Clockwise rotation gestures around the Temperature Sensor Simulation Object cause the temperature measured by the object to increase. Counter-clockwise rotation gestures cause the simulated temperature to decrease. Output: An image of a thermometer with a moving arrow sprite indicates the current simulated temperature; a label is shown below the element containing the current temperature value.
Table A.17: Subclasses of Sensor Simulation Object.
Accelerometer. Input: The Accelerometer Simulation Object reacts to the acceleration measured by the accelerometer integrated in the tablet device. Output: A ball is shown within a circular frame and moves in the direction of the acceleration.
LSM Compass. Input: The LSM Compass Simulation Object reacts to the acceleration and compass heading measured by the accelerometer and magnetometer integrated in the tablet device. Output: Displays the same output as the Accelerometer and an additional ring representing an analog compass labelled with four coordinates (North, South, West, East) that rotates to match the coordinates as the user rotates the tablet.
MPU 6050. Input: The MPU 6050 Simulation Object reacts to the acceleration and rotation measured by the accelerometer and gyroscope integrated in the tablet device. Output: Displays the same output as the Accelerometer.
Potentiometer. Input: The Potentiometer Simulation Object offers a knob which users can drag to simulate different values. Output: The Potentiometer's simulated value can be read implicitly from the position of the knob; additionally, as with the Light Sensor and Temperature Sensor, a label is shown below the element containing the current simulated value.
Table A.18: Subclasses of Sensor Simulation Object.
APPENDIX A. APPLICATION OBJECTS
Simulation Object | Input | Output

LED
Input: A single tap gesture on the LED Simulation Object turns the LED on/off.
Output: Displays a sprite of a light beam which gets brighter or darker depending on the intensity of the LED. The sprite is hidden if the LED is turned off.

Buzzer
Input: A single tap gesture on the Buzzer Simulation Object turns it on/off.
Output:
• The Buzzer Simulation Object plays a shaking animation when it is turned on.
• A sound at the same frequency as the Buzzer is played.

Three-Color LED
Input: A single tap gesture on the Three-Color LED Simulation Object turns it on/off.
Output: Displays a sprite of a light beam that changes color depending on the current simulated color configuration of the Three-Color LED Simulation Object. The sprite is hidden if the LED is turned off.

Vibration Board
Input: A single tap gesture on the Vibration Board Simulation Object turns it on/off.
Output: The Vibration Board plays a shaking animation when it is turned on.

Table A.19: Subclasses of Output Device Simulation Object.
Class | Input | Output

Number: Accepts no user input. Displays a label with the currently stored value.
Boolean: Accepts no user input. Displays a label with the currently stored value.
String: Accepts no user input. Displays a label with the currently stored value.
Window: Accepts no user input. Displays no output.
Low-Pass Filter: Accepts no user input. Displays no output.
High-Pass Filter: Accepts no user input. Displays no output.
Mean Extractor: Accepts no user input. Displays a label with the current mean.
Deviation Extractor: Accepts no user input. Displays a label with the current deviation.
Peak Detector: Accepts no user input. Displays no output.
Activity Classifier: Accepts no user input. Displays an animation of a person walking, running, standing or climbing.
Posture Classifier: Accepts no user input. Displays an image of a person standing, lying down on the stomach or lying down on the back.

Table A.20: Subclasses of Programming Simulation Objects.

Class | Input | Output

Timer: A single tap on the Timer Simulation Object causes the Timer countdown to reset. Displays a label with the current countdown time.
Sound: Accepts no user input. The sound is played.
Recorder: Accepts no user input. The recorder's image changes to indicate whether it is recording or not.
Mapper: Accepts no user input. Displays no output.

Table A.21: Subclasses of Programming Simulation Objects.
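As a rough illustration of what the Mean Extractor, Deviation Extractor and Low-Pass Filter objects listed above compute, the following is a hypothetical JavaScript sketch. The function names and the choice of a first-order exponential filter for the low-pass case are assumptions for illustration, not TangoHapps source code.

```javascript
// Hypothetical sketches of three programming objects' computations;
// not TangoHapps source code.
function mean(samples) {
  return samples.reduce((sum, x) => sum + x, 0) / samples.length;
}

function deviation(samples) {
  const m = mean(samples);
  const variance = samples.reduce((sum, x) => sum + (x - m) ** 2, 0) / samples.length;
  return Math.sqrt(variance); // population standard deviation
}

// First-order (exponential) low-pass filter; alpha in (0, 1],
// smaller values smooth more aggressively.
function lowPass(samples, alpha) {
  const out = [];
  let y = samples.length > 0 ? samples[0] : 0;
  for (const x of samples) {
    y = y + alpha * (x - y); // move a fraction alpha toward the new sample
    out.push(y);
  }
  return out;
}
```

A chain such as low-pass filtering an acceleration signal and then extracting its mean and deviation mirrors how these objects are composed on the canvas.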
A.5 TextIT Objects
In this section we describe every TextIT object. Control Objects are shown in Tables A.22 and A.23. Visualization Objects are shown in Tables A.24, A.25, A.26 and A.27.
Image | Name | Description

Start Block: Clicking the Play button causes a "start" message to be sent to the connected object.
Data Source: Clicking on the Edit button opens a file selection window to choose the data file from the file system.
Conditional Element: Clicking on the Edit button leads to a code editing window.

Table A.22: TextIT's Control Objects.
Image | Name | Description

Iterator: Receives input data on the "input" port. Events triggered on the "next" port cause the Iterator to process the next element in the input array.
Delayer: In this case, the Delayer will trigger a signal every 50 milliseconds.
Library Manager: Clicking on the Edit button leads to a file selection window to choose the data file from the file system.

Table A.23: TextIT's Iterator, Delayer and Library Manager Control Objects.
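The Iterator's port behavior described in Table A.23 can be sketched as a small stateful object. The object shape and method names below are invented for illustration and do not reflect TextIT's actual plugin API.

```javascript
// Illustrative model of the Iterator control object (API names assumed).
function makeIterator() {
  let data = [];
  let index = 0;
  return {
    // Data arriving on the "input" port replaces the array and resets the cursor.
    input(array) { data = array; index = 0; },
    // Each event on the "next" port yields the next element, or undefined when exhausted.
    next() { return index < data.length ? data[index++] : undefined; }
  };
}
```

Wiring a Delayer's periodic signal to the "next" port would then step through the array one element per tick.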
Image | Name | Description

Label: The Label displays the value provided through its input port.
Line Chart: The Line Chart displays a series of values as lines or bars. Line series can be defined from the element's Edit Window. Values provided through the input port have to match the amount of line series defined. The Line Chart has additional buttons to show/hide the series, save a screenshot of the chart and to switch between line or bar visualization styles.
Pie Chart: The Pie Chart displays values as a pie chart. It features configurable functionality similar to the Line Chart's: showing/hiding the series, saving the chart's current state in a screenshot, and switching between pie and funnel chart visualization styles.

Table A.24: TextIT's Label, Line Chart and Pie Chart Visualization Objects.
Image | Name | Description

Scatter Chart: Displays points in a 2D Cartesian coordinate system. Values of every point can be seen when hovering the mouse over it. Series and axis labels can be configured over the Edit Window.
Gauge Chart: The Gauge Chart is similar to a label in that it displays a single value. However, the Gauge Chart displays values by animating a needle moving clockwise.

Table A.25: TextIT's Scatter Chart and Gauge Chart Visualization Objects.
Image | Name | Description

3D Viewer: The 3D Viewer renders a 3D scene containing an object rotated according to the quaternion provided through its input port. If fed with a series of quaternions, it will animate the rotation of the 3D object.
JSON Viewer: The JSON Viewer displays a JSON file in a formatted manner. The JSON Viewer does not enable editing the JSON files.

Table A.26: TextIT's 3D Viewer and JSON Viewer Visualization Objects.
Image | Name | Description

Table Viewer: The Table Viewer is similar to the JSON Viewer. The Table Viewer displays a dictionary provided through its input port as a table.
Video Player: As its name indicates, the Video Player plays video files. Video files are added over the Edit Window. Volume and screen mode (i.e. normal, full screen) are configurable.

Table A.27: TextIT's Table Viewer and Video Player Visualization Objects.
A.5.1 Code Editor
A TextIT object of particular importance is the Code Editor. The Code Editor's Edit Window is divided into left and right frames. Developers write JavaScript source code on the left and visualize execution results on the right side. Syntax errors and compile warnings (e.g. unused variables) are displayed next to the source code line. When the source code in the Code Editor is executed, the Code Editor creates its output ports dynamically. A screenshot of the Code Editor is shown in Figure A.4.
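Purely as an illustration of dynamic port creation, an editor could derive one output port per field of the object a user script returns. None of the names below come from TextIT itself; this is an invented sketch of the general idea.

```javascript
// Invented example -- not TextIT's actual API. One way an editor could
// derive output ports from the object returned by a user's script.
function derivePorts(userScript) {
  const result = userScript();  // execute the JavaScript the developer wrote
  return Object.keys(result);   // expose one output port per result field
}
```

Under this sketch, a script returning `{speed: 3.2, steps: 140}` would yield output ports named "speed" and "steps".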
Figure A.4: TextIT's Code Editor. The window is divided in two views. Code is shown at the left and execution results are shown on the right view. Compile warnings are shown as exclamation marks and errors are shown as red crosses on the left margin.
A.5.2 Line Chart Displaying Jogging Signal
Figure A.5 shows a TextIT project that displays linear acceleration data recorded while a user was jogging. When the Data Loader's output port is connected to the Line Chart's data port, the Line Chart renders the signal in a line chart plot.
Figure A.5: Jogging signal displayed by a Line Chart View in TextIT.
A.5.3 TextIT Plugin for Counting Peaks in a Signal
This section presents the pseudo-code used to count the number of peaks in a signal. This pseudo-code is used by the KneeHapp bandage to count the number of side hops a patient performs in a specified period of time. Section 6.1.3 describes the use case and the usage of this pseudo-code in more detail.
function peakCount = countPeaks(v, delta)
    % Counts peaks in signal v. A peak is registered when the signal
    % drops by more than delta below the most recent local maximum.
    peakCount = 0;
    minValue = Inf;
    maxValue = -Inf;
    minIdx = NaN;
    maxIdx = NaN;
    lookForMax = 1;

    for i = 1:length(v)
        currentSample = v(i);

        if currentSample > maxValue
            maxValue = currentSample;
            maxIdx = i;
        end
        if currentSample < minValue
            minValue = currentSample;
            minIdx = i;
        end

        if lookForMax
            % Falling more than delta below the last maximum counts as a peak.
            if currentSample < maxValue - delta
                peakCount = peakCount + 1;
                minValue = currentSample;
                minIdx = i;
                lookForMax = 0;
            end
        else
            % Rising more than delta above the last minimum: resume looking for a peak.
            if currentSample > minValue + delta
                maxValue = currentSample;
                maxIdx = i;
                lookForMax = 1;
            end
        end
    end
end
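Since TextIT plugins are written in JavaScript, the same logic can be rendered directly in that language. The following is a straightforward port of the pseudo-code above with unchanged threshold semantics; it is not the actual KneeHapp/TextIT plugin source.

```javascript
// JavaScript rendering of the peak-counting pseudo-code above;
// not the actual KneeHapp/TextIT plugin source.
function countPeaks(v, delta) {
  let peakCount = 0;
  let minValue = Infinity;
  let maxValue = -Infinity;
  let lookForMax = true;

  for (const sample of v) {
    if (sample > maxValue) maxValue = sample;
    if (sample < minValue) minValue = sample;

    if (lookForMax) {
      // A drop of more than delta below the last maximum marks a peak.
      if (sample < maxValue - delta) {
        peakCount += 1;
        minValue = sample;
        lookForMax = false;
      }
    } else if (sample > minValue + delta) {
      // The signal rose again; start looking for the next peak.
      maxValue = sample;
      lookForMax = true;
    }
  }
  return peakCount;
}
```

For example, `countPeaks([0, 1, 0, 1, 0], 0.5)` counts two peaks, while a flat signal yields zero; `delta` rejects small oscillations such as sensor noise between hops.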
List of Figures
1.1 a) In the summative research approach the artifacts produced by a technology are assessed. b) In the formative research approach, the artifacts produced by a technology are used to assess and improve the technology. . . . 3

2.1 a) Smart garments compared to other devices according to their integration with the human body, based on previous work from Steve Mann [71]. b) Wearable devices represented as a taxonomy. . . . 8
2.2 The three generations of smart textiles. . . . 9
2.3 a) Qing Dynasty's abacus ring. Image taken from http://www.chinaculture.org. b) Eudaemonics' shoe used to cheat at the casino. Image taken from: http://eyetap.org/wearcam/eudaemonic/ with permission of Steve Mann. c) Ivan Sutherland's HMD [111]. Communications of the ACM 2002, Vol. 11:2. Copyright © 2002 by ACM, Inc. Reprinted with permission of ACM, Inc. . . . 10
2.4 a) Steve Mann's HMD [69]. Reprinted with permission of Steve Mann. b) CMU's VuMan 2 wearable computer [105]. Reprinted with permission of Daniel Siewiorek. c) Forms for wearability [35]. Reprinted with permission of Francine Gemperle. . . . 10
2.5 a) Georgia Tech's Wearable Motherboard (GTWM) [90]. Copyright © 2002 by ACM, Inc. Reprinted with permission of ACM, Inc. b) Musical Jacket [94]. Reprinted with permission of Maggie Orth. . . . 13
2.6 Textile fabrication process. . . . 14
2.7 Different types of conductive yarns. (a) A copper foil wrapped around a non-conductive yarn. Reprinted with permission of Maggie Orth of International Fashion Machines. (b) A wire wrapped around non-conductive fiber strands [109]. Licensed under Creative Commons BY 4.0. (c) Multifilament yarn coated with a metallic layer [109]. Licensed under Creative Commons BY 4.0. . . . 15
2.8 Electronic devices attached to fabric. (a) Flexible electronic module glued to the fabric and connected with embroidered conductive lines developed by the Fraunhofer's Institute for Reliability and Microintegration [67]. (b) Microcontroller connected with conductive thread to crimped conductive snap buttons. Developed in collaboration between the Technische Universität München and the Universität der Künste [43]. . . . 16
2.9 Model of a smart textile. . . . 16
2.10 Taxonomy of textile sensors. . . . 18
2.11 Taxonomy of output devices. . . . 20
2.12 Textile output devices. a) Textile thermochromic display. Reprinted with permission of Katharina Bredies. b) Textile thermochromic display. Reprinted with permission of Katharina Bredies. . . . 21
2.13 Taxonomy of connections used in smart textiles. . . . 22

3.1 Overview of TangoHapps. . . . 35
3.2 Model of a Smart Textile. . . . 36
3.3 Main abstractions of an Application in TangoHapps. . . . 37
3.4 Model of the Editor. . . . 37
3.5 Model of the Simulator. . . . 38
3.6 Model of the Deployer. . . . 39

4.1 High-level design of TangoHapps. . . . 45
4.2 TangoHapps's layered architecture. Software components on the Hardware layer appear grayed out because they consist of libraries developed by third-party developers reused within TangoHapps. . . . 46
4.3 Deployment of software components in TangoHapps. . . . 48
4.4 Main classes of the Running Engine software component. . . . 49
4.5 Taxonomy of Application Objects. . . . 50
4.6 Main classes of the Plugin Interpreter component. . . . 55
4.7 Main classes of the Firmata Library component. The Virtual Communication Module belongs to the Simulator software component. . . . 56
4.8 Overview of the Editor software component. . . . 57
4.9 Main classes involved in the Palette. . . . 58
4.10 Main classes involved in the Toolbar. . . . 59
4.11 Main classes involved in the Canvas. . . . 60
4.12 Main classes involved in the Editor's functionality to create circuit layouts. . . . 61
4.13 Layout (a) and object diagram (b) of an electric circuit that consists of a Button connected to an Arduino Lilypad. . . . 62
4.14 Main classes in the Plugin Editor. . . . 63
4.15 Main classes in the Simulator. . . . 66
4.16 Usage of the Decorator pattern in the Simulator. . . . 67
4.17 Main classes of the Deployer. . . . 68

5.1 Interactex Designer's Project Selection Screen. . . . 70
5.2 Interactex Designer UI overview. . . . 71
5.3 Invocations between Events of a Button and Methods of an LED. . . . 73
5.4 Circuit layout of the CTS-Gauntlet developed in Interactex Designer. . . . 74
5.5 Simulation of different objects in Interactex Designer. . . . 75
5.6 Interactex Client's User Application Screen (a) and Default Application Screen (b). . . . 77
5.7 Overview of TextIT's user interface. . . . 79
5.8 Visual components of a TextIT Object. . . . 80
5.9 WaveCap implemented in Interactex Designer. . . . 80
5.10 Development of an algorithm to detect jogging speed in TextIT. . . . 82
5.11 TextIT code to estimate jogging speed for the WaveCap application. . . . 83
5.12 Usage of a TextIT plugin in Interactex Designer to control the smartphone's volume based on estimated jogging speed. . . . 83
5.13 Circuit layout of WaveCap developed in Interactex Designer. Hardware Devices from left to right: a Textile Sensor, a BLE-Lilypad and a Textile Speaker (renamed to "Radio Speaker"). . . . 84

6.1 a) KneeHapp Bandage. b) KneeHapp sock. . . . 88
6.2 a) KneeHapp coordinate system. b) Illustration of the medial collapse during a squat and how it affects the pitch. . . . 89
6.3 One-leg hop linear acceleration on y and z-axis raw (a) and filtered (b). . . . 91
6.4 One-leg hop average pressure. . . . 91
6.5 Linear acceleration of seven side hops, raw (a) and filtered (b). . . . 92
6.6 KneeHapp's Range of Motion use case implemented with TangoHapps. . . . 93
6.7 KneeHapp's functionality to produce auditive feedback when the user's knee deviates more than 4 degrees during a One-Leg Squat. . . . 95
6.8 KneeHapp's One-Leg Squat use case implemented with TangoHapps. . . . 96
6.9 KneeHapp's One-Leg Hop use case implemented with TangoHapps. . . . 98
6.10 KneeHapp's Side Hops use case implemented with TangoHapps. . . . 100
6.11 a) Custodian Jacket - fourth version. b) Backside of a rack in a supercomputer center. Some height unit labels are hidden by cables. . . . 104
6.12 Linear acceleration along y-axis while user is standing (a) and walking (b). . . . 105
6.13 Classification of physical activities based on IMU data. Axes correspond to the deviation of the linear acceleration along the x, y and z-axes. . . . 106
6.14 TangoHapps implementations of: a) Function that checks whether the proximity measured by a Proximity Sensor is in the range 35 ± 2 cm. b) Function that sets the color of the RGB LED to red. . . . 108
6.15 Sketch of a TangoHapps implementation of the Custodian's Error Prevention use case. . . . 109
6.16 Custodian's Technician's Safety use case implemented with TangoHapps. . . . 111
6.17 Custodian's Emergency Situation use case implemented with TangoHapps. . . . 113

7.1 a) Three female subjects develop an electronic circuit with Interactex Designer. b) Application 1 consisting of 12 LEDs. . . . 117
7.2 a) A paper prototype of a jacket developed during Study 1. b) Textile testbed containing two accelerometers, a push button, switch, light sensor, buzzer, vibration motor and a three-color LED. Every electronic device is connected with conductive fabric to a custom-made Arduino Lilypad with an integrated BLE module (in the center). . . . 119

A.1 Palette Objects for instantiating Smart Textiles, Electronic Devices (e.g. Hardware Devices and Microcontrollers) and UI Widgets available in Interactex Designer. . . . 136
A.2 Palette Objects for instantiating Signal Processing Functions, Variables and Operators available in Interactex Designer. . . . 137
A.3 Configuration View of a Label and Timer. . . . 145
A.4 TextIT's Code Editor. The window is divided in two views. Code is shown at the left and execution results are shown on the right view. Compile warnings are shown as exclamation marks and errors are shown as red crosses on the left margin. . . . 158
A.5 Jogging signal displayed by a Line Chart View in TextIT. . . . 159
List of Tables
1.1 Smart textiles used to elicit the initial requirements of TangoHapps. . . 5
5.1 Toolbar Buttons in Interactex Designer. . . . 72
5.2 Images of Hooks in Interactex Designer. Filled shapes indicate that a parameter provided by an Event complies with those expected by a Method. Hollow shapes indicate that the Event does not provide a parameter of the type expected by the Method. . . . 73

A.1 UI Widgets supported by TangoHapps. . . . 126
A.2 Sensors supported by TangoHapps. . . . 127
A.3 Output Devices supported by TangoHapps. . . . 128
A.4 Microcontrollers supported by TangoHapps. . . . 129
A.5 Variables supported by TangoHapps. . . . 129
A.6 Comparison Operators supported by TangoHapps. . . . 130
A.7 Logical Operators supported by TangoHapps. . . . 131
A.8 Arithmetic Operators supported by TangoHapps. . . . 132
A.9 Signal processing and classification Functions supported by TangoHapps. . . . 133
A.10 Utility programming elements supported by TangoHapps. . . . 134
A.11 Variables of UI Widgets (Part 1/2). . . . 138
A.12 Variables of UI Widgets (Part 2/2). . . . 139
A.13 Variables of Hardware Devices (Part 1/2). . . . 140
A.14 Variables of Hardware Devices (Part 2/2). . . . 141
A.15 Variables of Programming Objects (Part 1/2). . . . 142
A.16 Variables of Programming Objects (Part 2/2). . . . 143
A.17 Subclasses of Sensor Simulation Object. . . . 146
A.18 Subclasses of Sensor Simulation Object. . . . 147
A.19 Subclasses of Output Device Simulation Object. . . . 148
A.20 Subclasses of Programming Simulation Objects. . . . 149
A.21 Subclasses of Programming Simulation Objects. . . . 149
A.22 TextIT's Control Objects. . . . 151
A.23 TextIT's Iterator, Delayer and Library Manager Control Objects. . . . 152
A.24 TextIT's Label, Line Chart and Pie Chart Visualization Objects. . . . 153
A.25 TextIT's Scatter Chart and Gauge Chart Visualization Objects. . . . 154
A.26 TextIT's 3D Viewer and JSON Viewer Visualization Objects. . . . 155
A.27 TextIT's Table Viewer and Video Player Visualization Objects. . . . 156
Bibliography
[1] S. C. Anand, J. F. Kennedy, M. Miraftab, and S. Rajendran. Medical textiles and biomaterials for healthcare. Elsevier, 2005.

[2] S. Ananthanarayan, M. Sheh, A. Chien, H. Profita, and K. Siek. Pt Viz: Towards a Wearable Device for Visualizing Knee Rehabilitation Exercises. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '13, pages 1247–1250, New York, NY, USA, 2013. ACM.

[3] F. Axisa, P. M. Schmitt, C. Gehin, G. Delhomme, E. McAdams, and A. Dittmar. Flexible technologies and smart clothing for citizen medicine, home healthcare, and disease prevention. IEEE Transactions on Information Technology in Biomedicine, 9(3):325–336, 2005.

[4] M. Ayoade and L. Baillie. A Novel Knee Rehabilitation System for the Home. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '14, pages 2521–2530, New York, NY, USA, 2014. ACM.

[5] R. T. Azuma and others. A survey of augmented reality. Presence, 6(4):355–385, 1997.

[6] S. K. Bahadir, V. Koncar, and F. Kalaoglu. Wearable obstacle detection system fully integrated to textile structures for visually impaired people. Sensors and Actuators A: Physical, 179:297–311, 2012.

[7] R. Ballagas, F. Memon, R. Reiners, and J. Borchers. iStuff mobile: rapidly prototyping new mobile phone interfaces for ubiquitous computing. In Proceedings of the SIGCHI conference on Human factors in computing systems, pages 1107–1116. ACM, 2007.

[8] R. Ballagas, M. Ringel, M. Stone, and J. Borchers. iStuff: a physical user interface toolkit for ubiquitous computing environments. In Proceedings of the SIGCHI conference on Human factors in computing systems, pages 537–544. ACM, 2003.

[9] D. Bannach, P. Lukowicz, and O. Amft. Rapid prototyping of activity recognition applications. Pervasive Computing, IEEE, 7(2):22–31, 2008.

[10] W. Barfield and T. Caudell. Fundamentals of wearable computers and augmented reality. CRC Press, 2001.
[11] S. Bielska, M. Sibinski, and A. Lukasik. Polymer temperature sensor for textronic applications. Materials Science and Engineering: B, 165(1):50–52, 2009.

[12] F. Block, M. Haller, H. Gellersen, C. Gutwin, and M. Billinghurst. VoodooSketch: extending interactive surfaces with adaptable interface palettes. In Proceedings of the 2nd international conference on Tangible and embedded interaction, pages 55–58. ACM, 2008.

[13] S. Brady, D. Diamond, and K.-T. Lau. Inherently conducting polymer modified polyurethane smart foam for pressure sensing. Sensors and Actuators A: Physical, 119(2):398–404, 2005.

[14] L. Buechley. A construction kit for electronic textiles. In 10th IEEE International Symposium on Wearable Computers, 2006, pages 83–90. IEEE, 2006.

[15] F. Carpi and D. De Rossi. Electroactive polymer-based devices for e-textiles in biomedicine. IEEE Transactions on Information Technology in Biomedicine, 9(3):295–318, sep 2005.

[16] L. M. Castano and A. B. Flatau. Smart fabric sensors and e-textile technologies: a review. Smart Materials and Structures, 23(5):53001, 2014.

[17] M.-H. Cheng, L.-C. Chen, Y.-C. Hung, and C. M. Yang. A real-time maximum-likelihood heart-rate estimator for wearable textile sensors. In Engineering in Medicine and Biology Society, 2008. EMBS 2008. 30th Annual International Conference of the IEEE, pages 254–257. IEEE, 2008.

[18] K. Cherenack and L. van Pieterson. Smart textiles: Challenges and opportunities. Journal of Applied Physics, 112(9):91301, 2012.

[19] S. Coyle, F. Benito-Lopez, T. Radu, K. Lau, and D. Diamond. Fibers and fabrics for chemical and biological sensing. Research Journal of Textile and Apparel, 14(4):64–72, 2012.

[20] D. Curone, E. L. Secco, A. Tognetti, G. Loriga, G. Dudnik, M. Risatti, R. Whyte, A. Bonfiglio, and G. Magenes. Smart Garments for Emergency Operators: The ProeTEX Project. IEEE Transactions on Information Technology in Biomedicine, 14(3):694–701, may 2010.

[21] S. Curry. An introduction to the Java Ring. JavaWorld, April 1998.

[22] R. Daude and M. Weck. Mobile approach support system for future machine tools. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 24–30, oct 1997.

[23] A. Dey, R. Hamid, C. Beckmann, I. Li, and D. Hsu. a CAPpella: programming by demonstration of context-aware applications. Proceedings of the SIGCHI conference on Human factors in computing systems, 6(1):40, 2004.
[24] A. K. Dey, R. Hamid, C. Beckmann, I. Li, and D. Hsu. a CAPpella: programming by demonstration of context-aware applications. In Proceedings of the SIGCHI conference on Human factors in computing systems, pages 33–40. ACM, 2004.

[25] M. Di Rienzo, F. Rizzo, G. Parati, G. Brambilla, M. Ferratini, and P. Castiglioni. MagIC System: a New Textile-Based Wearable Device for Biological Signal Monitoring. Applicability in Daily Life and Clinical Setting. Conference proceedings: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 7:7167–7169, 2005.

[26] B. Ding, M. Wang, J. Yu, and G. Sun. Gas sensors based on electrospun nanofibers. Sensors, 9(3):1609–1624, 2009.

[27] P. Dragicevic and J.-D. Fekete. Support for input adaptability in the ICON toolkit. In Proceedings of the 6th international conference on Multimodal interfaces, pages 212–219. ACM, 2004.

[28] P. Dragicevic and J.-D. Fekete. The input configurator toolkit: towards high input adaptability in interactive applications. In Proceedings of the working conference on Advanced visual interfaces, pages 244–247. ACM, 2004.

[29] B. Eskofier, M. Oleson, C. DiBenedetto, and J. Hornegger. Embedded surface classification in digital sports. Pattern Recognition Letters, 30(16):1448–1456, 2009.

[30] J. Favre, R. Aissaoui, B. Jolles, J. de Guise, and K. Aminian. Functional calibration procedure for 3D knee joint angle description using inertial sensors. Journal of Biomechanics, 42(14):2330–2335, oct 2009.

[31] J. Favre, B. M. Jolles, R. Aissaoui, and K. Aminian. Ambulatory measurement of 3D knee joint angle. Journal of Biomechanics, 41(5):1029–1035, 2008.

[32] S. Feiner, B. MacIntyre, T. Hollerer, and A. Webster. A touring machine: prototyping 3D mobile augmented reality systems for exploring the urban environment. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 74–81, oct 1997.

[33] F. H. Fu, C. H. Bennett, C. Lattermann, and C. B. Ma. Current trends in anterior cruciate ligament reconstruction part 1: biology and biomechanics of reconstruction. The American Journal of Sports Medicine, 27(6):821–830, 1999.

[34] E. Gamma. Design patterns: elements of reusable object-oriented software. Pearson Education India, 1995.

[35] F. Gemperle, C. Kasabach, J. Stivoric, M. Bauer, and R. Martin. Design for wearability. In Second International Symposium on Wearable Computers, 1998, pages 116–122. IEEE, 1998.
[36] G. Gioberto, C.-H. Min, C. Compton, and L. E. Dunne. Lower-limb Goniometry Using Stitched Sensors: Effects of Manufacturing and Wear Variables. In Proceedings of the 2014 ACM International Symposium on Wearable Computers, ISWC '14, pages 131–132, New York, NY, USA, 2014. ACM.

[37] S. Greenberg and C. Fitchett. Phidgets: easy development of physical interfaces through physical widgets. In Proceedings of the 14th annual ACM symposium on User interface software and technology, pages 209–218. ACM, 2001.

[38] L. Y. Griffin, J. Agel, M. J. Albohm, E. A. Arendt, R. W. Dick, W. E. Garrett, J. G. Garrick, T. E. Hewett, L. Huston, M. L. Ireland, and others. Noncontact anterior cruciate ligament injuries: risk factors and prevention strategies. Journal of the American Academy of Orthopaedic Surgeons, 8(3):141–150, 2000.

[39] P. Grossman. The LifeShirt: a multi-function ambulatory system monitoring health, disease, and medical intervention in the real world. Stud Health Technol Inform, 108:133–141, 2004.

[40] J. Habetha. The MyHeart project - fighting cardiovascular diseases by prevention and early diagnosis. In Proceedings of the IEEE Engineering in Medicine and Biology Society, pages 6746–6749. Citeseer, 2006.

[41] J. Haladjian. TangoHapps: an integrated development environment for smart garments. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, pages 471–476. ACM, 2015.

[42] J. Haladjian, K. Bredies, and B. Bruegge. Interactex: An integrated development environment for smart textiles. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2016 ACM International Symposium on Wearable Computers. ACM, 2016.

[43] J. Haladjian, Z. Hodaie, H. Xu, M. Yigin, B. Bruegge, M. Fink, and J. Hoeher. KneeHapp: a bandage for rehabilitation of knee injuries. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, pages 181–184. ACM, 2015.

[44] J. Haladjian, D. Ismailović, B. Köhler, and B. Bruegge. A quick prototyping framework for adaptive serious games with 2D physics on mobile touch devices. In IADIS Mobile Learning 2012 (ML 2012), Berlin, 2012.

[45] J. Haladjian, D. Richter, P. Muntean, D. Ismailović, and B. Brügge. A framework for the creation of mobile educational games for dyslexic children. In ML 2013 - Mobile Learning, Lisbon, Portugal, mar 2013.

[46] J. Haladjian, F. Ziegler, B. Simeonova, B. Köhler, P. Muntean, D. Ismailović, and B. Brügge. A framework for game tuning. In GET 2012 - IADIS Game and Entertainment Technologies, Lisbon, Portugal, jul 2012.
[47] H. Harms, O. Amft, D. Roggen, and G. Tröster. Rapid prototyping of smart garments for activity-aware applications. Journal of Ambient Intelligence and Smart Environments, 1(2):87–101, 2009.
[48] B. Hartmann, L. Abdulla, M. Mittal, and S. R. Klemmer. Authoring sensor-based interactions by demonstration with direct manipulation and pattern recognition. In Proceedings of the SIGCHI conference on Human factors in computing systems, pages 145–154. ACM, 2007.
[49] B. Hartmann, S. R. Klemmer, and M. Bernstein. d.tools: Integrated prototyping for physical interaction design. IEEE Pervasive Computing, 2005.
[50] S. Hodges, N. Villar, N. Chen, T. Chugh, J. Qi, D. Nowacka, and Y. Kawahara. Circuit stickers: Peel-and-stick construction of interactive electronic prototypes. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pages 1743–1746. ACM, 2014.
[51] T. Holleczek, A. Rüegg, H. Harms, and G. Tröster. Textile Pressure Sensors for Sports Applications. In Proceedings of the 9th IEEE Conference on Sensors (IEEE Sensors 2010), pages 732–737, Nov. 2010.
[52] C.-T. Huang, C.-L. Shen, C.-F. Tang, and S.-H. Chang. A wearable yarn-based piezo-resistive sensor. Sensors and Actuators A: Physical, 141(2):396–403, 2008.
[53] Y. Huang and M. Eisenberg. Plushbot: an application for the design of programmable, interactive stuffed toys. In Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, pages 257–260. ACM, 2011.
[54] S. E. Hudson and J. Mankoff. Rapid construction of functioning physical interfaces from cardboard, thumbtacks, tin foil and masking tape. In Proceedings of the 19th annual ACM symposium on User interface software and technology, pages 289–298. ACM, 2006.
[55] Z. Hui, T. X. Ming, Y. T. Xi, and L. X. Sheng. Pressure sensing fabric. In MRS Proceedings, volume 920, pages 0920–S05. Cambridge Univ Press, 2006.
[56] J. A. Jacko. Human computer interaction handbook: Fundamentals, evolving technologies, and emerging applications. CRC Press, 2012.
[57] A. R. Kahn, J. D. Hixson, J. E. Puffer, and E. E. Bakken. Three-Years' Clinical Experience with Radioisotope Powered Cardiac Pacemakers. IEEE Transactions on Biomedical Engineering, BME-20(5):326–331, Sep. 1973.
[58] E.-S. Katterfeldt, N. Dittert, and H. Schelhowe. EduWear: smart textiles as ways of relating computing technology to everyday life. In Proceedings of the 8th International Conference on Interaction Design and Children, pages 9–17. ACM, 2009.
[59] B. Kaufmann and L. Buechley. Amarino: a toolkit for the rapid prototyping of mobile ubiquitous computing. In Proceedings of the 12th international conference on Human computer interaction with mobile devices and services, pages 291–298. ACM, 2010.
[60] S. R. Klemmer, J. Li, J. Lin, and J. A. Landay. Papier-Mâché: Toolkit support for tangible interaction. In Proceedings of the 16th annual ACM symposium on user interface software and technology (UIST 2003), Vancouver, British Columbia, Canada, 2003.
[61] S. R. Klemmer, J. Li, J. Lin, and J. A. Landay. Papier-Mâché: toolkit support for tangible input. In Proceedings of the SIGCHI conference on Human factors in computing systems, pages 399–406. ACM, 2004.
[62] J. Krumm. Ubiquitous computing fundamentals. CRC Press, 2009.
[63] B. Larsson, H. Elmqvist, L. Rydén, and H. Schüller. Lessons From the First Patient with an Implanted Pacemaker. Pacing and Clinical Electrophysiology, 26(1p1):114–124, 2003.
[64] J. C. Lee, D. Avrahami, S. E. Hudson, J. Forlizzi, P. H. Dietz, and D. Leigh. The Calder toolkit: wired and wireless components for rapidly prototyping interactive devices. In Proceedings of the 5th conference on Designing interactive systems: processes, practices, methods, and techniques, pages 167–175. ACM, 2004.
[65] Y. Li, J. I. Hong, and J. A. Landay. Topiary: a tool for prototyping location-enhanced applications. In Proceedings of the 17th annual ACM symposium on User interface software and technology, pages 217–226. ACM, 2004.
[66] Y. Li, M. Y. Leung, X. M. Tao, X. Y. Cheng, J. Tsang, and M. C. W. Yuen. Polypyrrole-coated conductive fabrics as a candidate for strain sensors. Journal of materials science, 40(15):4093–4095, 2005.
[67] T. Linz, C. Kallmayer, R. Aschenbrenner, and H. Reichl. Embroidering electrical interconnects with conductive yarn for the integration of flexible electronic modules into fabric, pages 86–91. IEEE, 2005.
[68] S. Lysecky and F. Vahid. Enabling nonexpert construction of basic sensor-based systems. ACM Transactions on Computer-Human Interaction (TOCHI), 16(1):1, 2009.
[69] S. Mann. An historical account of the 'WearComp' and 'WearCam' inventions developed for applications in 'personal imaging'. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 66–73, Oct. 1997.
[70] S. Mann. Wearable computing: a first step toward personal imaging. Computer, 30(2):25–32, Feb. 1997.
[71] S. Mann. Can Humans Being Clerks make Clerks be Human? Exploring the Fundamental Difference between UbiComp and WearComp. it - Information Technology (vormals it+ti), Methoden und innovative Anwendungen der Informatik und Informationstechnik, 43(2/2001):97, 2001.
[72] N. Marquardt and S. Greenberg. Distributed physical interfaces with shared phidgets. In Proceedings of the 1st international conference on Tangible and embedded interaction, pages 13–20. ACM, 2007.
[73] T. Martin, M. Jones, J. Edmison, and R. Shenoy. Towards a design framework for wearable electronic textiles. In Seventh IEEE International Symposium on Wearable Computers, 2003. Proceedings, 2003.
[74] C. Mattmann, O. Amft, H. Harms, G. Tröster, and F. Clemens. Recognizing upper body postures using textile strain sensors. In 11th IEEE International Symposium on Wearable Computers, 2007, pages 29–36. IEEE, 2007.
[75] C. Mattmann, T. Kirstein, and G. Tröster. A method to measure elongations of clothing. In International Scientific Conference Ambience05, Tampere, Finland, page 18, 2005.
[76] C. Mattmann and G. Tröster. Design concept of clothing recognizing back postures. In 3rd IEEE/EMBS International Summer School on Medical Devices and Biosensors, 2006, pages 24–27. IEEE, 2006.
[77] M. Mauriello, M. Gubbels, and J. E. Froehlich. Social Fabric Fitness: The Design and Evaluation of Wearable E-textile Displays to Support Group Running. In Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems, CHI '14, pages 2833–2842, New York, NY, USA, 2014. ACM.
[78] D. Meoli and T. May-Plumlee. Interactive electronic textile development: A review of technologies. Journal of Textile and Apparel, Technology and Management, 2(2):1–12, 2002.
[79] J. Meyer, P. Lukowicz, and G. Tröster. Textile pressure sensor for muscle activity and motion detection. In 10th IEEE International Symposium on Wearable Computers, 2006, pages 69–72. IEEE, 2006.
[80] L. Morton, L. Baillie, and R. Ramirez-Iniguez. Pose calibrations for inertial sensors in rehabilitation applications. In IEEE 9th International Conference on Wireless and Mobile Computing, Networking and Communications (WiMob), 2013, pages 204–211, Oct. 2013.
[81] L. J. Najjar, J. C. Thompson, and J. J. Ockerman. A wearable computer for quality assurance inspectors in a food processing plant. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 163–164, Oct. 1997.
[82] Z. Nakad, M. T. Jones, and T. Martin. Communications in Electronic Textile Systems. In Communications in Computing, pages 37–46, 2003.
[83] G. Ngai, S. C. F. Chan, W. W. Y. Lau, and J. C. Y. Cheung. A framework for collaborative etextiles design - An introduction to co-etex. In Proceedings of the 2009 13th International Conference on Computer Supported Cooperative Work in Design, CSCWD 2009, pages 191–196, 2009.
[84] G. Ngai, S. C. F. Chan, V. T. Y. Ng, J. C. Y. Cheung, S. S. S. Choy, W. W. Y. Lau, and J. T. P. Tse. I*CATch: A Scalable Plug-n-play Wearable Computing Framework for Novices and Children. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '10, pages 443–452, New York, NY, USA, 2010. ACM.
[85] M. Ockendon and R. Gilbert. Validation of a Novel Smartphone Accelerometer-Based Knee Goniometer. Journal of Knee Surgery, 25(04):341–346, May 2012.
[86] J. J. Ockerman, L. J. Najjar, and J. C. Thompson. Wearable computers for performance support: initial feasibility study. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 10–17, Oct. 1997.
[87] M. Pacelli, G. Loriga, N. Taccini, and R. Paradiso. Sensing Fabrics for Monitoring Physiological and Biomechanical Variables: E-textile solutions. In 3rd IEEE/EMBS International Summer School on Medical Devices and Biosensors, 2006, pages 1–4, 2006.
[88] J. A. Paradiso and E. Hu. Expressive footwear for computer-augmented dance performance. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 165–166, Oct. 1997.
[89] R. Paradiso, G. Loriga, N. Taccini, A. Gemignani, and B. Ghelarducci. WEALTHY - a wearable healthcare system: new frontier on e-textile. Journal of Telecommunications and Information Technology, pages 105–113, 2005.
[90] S. P. S. Park, K. Mackenzie, and S. Jayaraman. The wearable motherboard: a framework for personalized mobile information processing (PMIP). In Proceedings 2002 Design Automation Conference (IEEE Cat. No. 02CH37324), pages 170–174, 2002.
[91] H. Perner-Wilson, L. Buechley, and M. Satomi. Handcrafting Textile Interfaces from a Kit-of-no-parts. In Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction, TEI '11, pages 61–68, New York, NY, USA, 2011. ACM.
[92] R. W. Picard and J. Healey. Affective wearables. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 90–97, Oct. 1997.
[93] E. R. Post and M. Orth. Smart fabric, or "wearable clothing". In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 167–168, Oct. 1997.
[94] E. R. Post, M. Orth, P. R. Russo, and N. Gershenfeld. E-broidery: Design and fabrication of textile-based computing. IBM Systems Journal, 39(3.4):840–860, 2000.
[95] R. Ramakers, K. Todi, and K. Luyten. PaperPulse: An Integrated Approach for Embedding Electronics in Paper Designs. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pages 2457–2466. ACM, 2015.
[96] A. Rathnayake and T. Dias. Yarns with embedded electronics. In Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers, pages 385–388. ACM, 2015.
[97] V. Savage, X. Zhang, and B. Hartmann. Midas: fabricating custom capacitive touch sensors to prototype interactive objects. In Proceedings of the 25th annual ACM symposium on User interface software and technology, pages 579–588. ACM, 2012.
[98] S. Schneegass, M. Hassib, T. Birmili, and N. Henze. Towards a garment OS: supporting application development for smart garments. In Proceedings of the 2014 ACM International Symposium on Wearable Computers: Adjunct Program, pages 261–266. ACM, 2014.
[99] W. R. Sherman and A. B. Craig. Understanding virtual reality: Interface, application, and design. Elsevier, 2002.
[100] Y. Shimokochi and S. J. Shultz. Mechanisms of noncontact anterior cruciate ligament injury. Journal of athletic training, 43(4):396, 2008.
[101] M. Sibinski, M. Jakubowska, and M. Sloma. Flexible temperature sensors on fibers. Sensors, 10(9):7934–7946, 2010.
[102] J. Siegel and M. Bauer. A field usability evaluation of a wearable system. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 18–22, Oct. 1997.
[103] S. G. Slater. New technology sensor fabrics to monitor health data. Home Health Care Management & Practice, 19(6):480–481, 2007.
[104] A. Smailagic. ISAAC: a voice activated speech response system for wearable computers. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 183–184, Oct. 1997.
[105] A. Smailagic and D. P. Siewiorek. Matching interface design with user tasks. Modalities of interaction with CMU wearable computers. Personal Communications, IEEE, 3(1):14–25, Feb. 1996.
[106] M. B. Spitzer, N. M. Rensing, R. McClelland, and P. Aquilino. Eyeglass-based systems for wearable computing. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 48–51, Oct. 1997.
[107] P. Stanley-Marbell, D. Marculescu, R. Marculescu, and P. K. Khosla. Modeling, analysis, and self-management of electronic textiles. IEEE Transactions on Computers, 52(8):996–1010, 2003.
[108] T. Starner, J. Weaver, and A. Pentland. A wearable computer based American sign language recognizer. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 130–137, Oct. 1997.
[109] M. Stoppa and A. Chiolerio. Wearable electronics and smart textiles: a critical review. Sensors, 14(7):11957–11992, 2014.
[110] M. Sundholm, J. Cheng, B. Zhou, A. Sethi, and P. Lukowicz. Smart-mat: Recognizing and Counting Gym Exercises with Low-cost Resistive Pressure Sensing Matrix. In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing, UbiComp '14, pages 373–382, New York, NY, USA, 2014. ACM.
[111] I. E. Sutherland. A Head-mounted Three Dimensional Display. In Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I, AFIPS '68 (Fall, part I), pages 757–764, New York, NY, USA, 1968. ACM.
[112] H. Z. Tan and A. Pentland. Tactual displays for wearable computing. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 84–89, Oct. 1997.
[113] X. M. Tao. Smart Fibres, Fabrics and Clothing: Fundamentals and Applications. Woodhead Publishing Series in Textiles. Elsevier Science, 2001.
[114] B. Thomas, S. Tyerman, and K. Grimmer. Evaluation of three input mechanisms for wearable computers. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 2–9, Oct. 1997.
[115] C. Thompson, J. J. Ockerman, L. J. Najjar, and E. Rogers. Factory automation support technology (FAST): a new paradigm of continuous learning and support using a wearable. In First International Symposium on Wearable Computers, 1997. Digest of Papers, pages 31–38, Oct. 1997.
[116] E. O. Thorp. The invention of the first wearable computer. In Second International Symposium on Wearable Computers, 1998. Digest of Papers, pages 4–8, Oct. 1998.
[117] L. Van Langenhove and C. Hertleer. Smart clothing: a new life. International journal of clothing science and technology, 16(1/2):63–72, 2004.
[118] N. Villar, J. Scott, S. Hodges, K. Hammil, and C. Miller. .NET Gadgeteer: a platform for custom devices. In Pervasive Computing, pages 216–233. Springer, 2012.
[119] A. Wakita and Y. Anezaki. Intuino: an authoring tool for supporting the prototyping of organic interfaces. In Proceedings of the 8th ACM Conference on Designing Interactive Systems, pages 179–188. ACM, 2010.
[120] M. Weiser. The Computer for the 21st Century. SIGMOBILE Mob. Comput. Commun. Rev., 3(3):3–11, Jul. 1999.
[121] Y.-L. Yang, M.-C. Chuang, S.-L. Lou, and J. Wang. Thick-film textile-based amperometric sensors and biosensors. The Analyst, 135(6):1230–1234, 2010.
[122] S.-C. Yeh, S.-M. Chang, S.-Y. Chen, W.-Y. Hwang, T.-C. Huang, and T.-L. Tsai. A lower limb fracture postoperative-guided interactive rehabilitation training system and its effectiveness analysis. In 14th International Conference on e-Health, Networking, Applications and Services (Healthcom), 2012 IEEE, pages 149–154, Oct. 2012.
[123] H. Zhang, X. Tao, T. Yu, and S. Wang. Conductive knitted fabric as large-strain gauge under high temperature. Sensors and Actuators A: Physical, 126(1):129–140, 2006.
[124] H. Zhang, L. Tian, L. Zhang, and G. Li. Using textile electrode EMG for prosthetic movement identification in transradial amputees. In IEEE International Conference on Body Sensor Networks (BSN), 2013, pages 1–5, May 2013.
[125] C. Zysset, K. Cherenack, T. Kinkeldei, and G. Tröster. Weaving integrated circuits into textiles. In 2010 International Symposium on Wearable Computers (ISWC), pages 1–8. IEEE, 2010.