HCI Issues in Extreme Computing
James A. Landay
Endeavour-DARPA Meeting, 9/21/99
UC Berkeley Endeavour Project
HCI in the eXtreme Computing Era
• Future computing devices won’t have the same UI as current PCs
– wide range of devices
  • small or embedded in the environment
  • often w/ “alternative” I/O & w/o screens
  • special-purpose applications
– “information appliances”
– lots of devices per user
  • all working in concert
• How does one design for this environment?
Design Challenges
• Design of good appliances will be hard
  – how do you design cross-appliance “applications”?
    • e.g., a calendar app: one speech-based & one GUI-based
• Hard to make different devices work together
  – multiple devices, UIs & modes: which to “display”?
• How to build UIs for a physical or virtual space?
  – take advantage of the resources as the user moves
• Information overload is a major problem
  – how do we extract just what is relevant?
Key Technologies
• Tacit information analysis algorithms
• Design tools that integrate
  – “sketching” & other low-fidelity techniques
  – immediate context & tacit information
  – interface models
Our Approach
• Evaluate rough prototypes in target domains
  – learning
  – high-speed decision making
• Build
  – novel applications on existing appliances
    • e.g., on the Palm PDA & CrossPad
  – new information appliances
    • e.g., SpeechCoder (w/ ICSI)
• Evaluate in realistic settings
• Iterate
  – use the resulting experience to build
    • more interesting appliances
    • better design tools & analysis techniques
Domains of Focus
• Group-based learning
  – groups of students teach themselves material
  – “teachers” give structure, diagnose problems, & respond
  – shown successful outcomes, but doesn’t scale well
  – key idea: use ubiquitous sensors & activity data to allow
    • teachers to stay aware of activities as class size scales
    • groups to find expertise among other groups
• Emergency response decision making
  – respond to fires, earthquakes, floods, hurricanes, ...
  – quickly allocate resources
  – situation awareness is paramount
  – key idea: use activity data to discover & exploit tacit structure
    • user expertise & information quality
    • informal work teams & hierarchies
Analyze Tacit Activity: Find People & Info
• The real world
  – who is talking? who are they looking at? what else is happening?
• The digital environment
  – who reads (or writes) what, and when?
  – who communicates with whom, and when? with what tools?
• Goal: describe an information ecology
  – people w/ various expertise, backgrounds & roles
    • quickly find human experts (e.g., how to restart pumps…)
  – documents with content, authority, intended audience…
  – structures: groups, communities, hierarchies, etc.
  – visualization that provides awareness without overload
  – feed this information back to the infrastructure
• Challenge: recognize/compute from sensor/activity data
Tacit Information Analysis Methods
• Social networks
  – centrality measures for estimating authority
• Clustering
  – discovering tacit groups and related documents
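The centrality idea can be sketched in a few lines. This hypothetical example reduces a who-talks-to-whom log to degree centrality, the simplest centrality measure; the names and interaction data are purely illustrative.

```python
# Illustrative sketch: estimating "authority" via degree centrality
# over a who-interacts-with-whom log. Names and data are made up.
from collections import defaultdict

def degree_centrality(edges):
    """Return normalized degree centrality for each node in an
    undirected interaction graph given as (a, b) pairs."""
    neighbors = defaultdict(set)
    for a, b in edges:
        neighbors[a].add(b)
        neighbors[b].add(a)
    n = len(neighbors)
    # Normalize by the maximum possible degree (n - 1).
    return {node: len(nbrs) / (n - 1) for node, nbrs in neighbors.items()}

log = [("ann", "bob"), ("ann", "carol"), ("ann", "dave"), ("bob", "carol")]
scores = degree_centrality(log)
# "ann" interacts with everyone else, so she scores highest.
expert = max(scores, key=scores.get)
```

In a real system the edges would come from sensed activity (email, chat, co-presence), and richer measures (betweenness, eigenvector centrality) would likely replace plain degree.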
Use Context: Improve Interaction
• Services to discover available devices
  – there is a wall display -> use it for my wearable
• Choose interaction modes that don’t interfere
  – context-understanding services
    • people are talking -> don’t rely on speech I/O
    • user’s hands are using tools -> use speech input & visual output
– use context as a way to search data collected by ubiquitous archiving services
-> UI design tools should understand context & support multimodal I/O
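The mode-selection rules above can be sketched as a small decision function. This is a minimal sketch, assuming a `context` dict fed by hypothetical sensing services; the rule set simply encodes the slide's examples.

```python
# A minimal sketch of context-driven mode selection. The context keys
# (hands_busy, people_talking, wall_display_available) are invented
# stand-ins for what sensing/discovery services might report.
def choose_modes(context):
    """Pick non-interfering I/O modes from sensed context."""
    modes = {"input": "gui", "output": "screen"}
    if context.get("hands_busy"):
        # User's hands are occupied with tools -> speech in, visual out.
        modes["input"] = "speech"
    if context.get("people_talking"):
        # Ambient conversation -> don't rely on speech input.
        modes["input"] = "gesture" if context.get("hands_busy") else "gui"
    if context.get("wall_display_available"):
        # Route output from the wearable to the discovered wall display.
        modes["output"] = "wall_display"
    return modes
```

A design tool that "understands context" would generate or configure such rules per application rather than hard-coding them.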
Multimodal Interaction
• Benefits
  – take advantage of more than one mode of input/output
  – computers could be used in more situations & places
  – UIs easier to use and useful to more people
• Building multimodal UIs is hard
  – often requires immature “recognition” technology
    • single-mode toolkits recently appeared (“good enough”)
  – hard to combine recognition technologies
    • few toolkits & no prototyping tools -> experts required
  – this was the state of GUIs in 1980
Multimodal Design Tools Should Support
• Rapid production of “rough cuts”
  – don’t handle all cases
  – informal techniques
    • sketching/storyboarding
    • “Wizard of Oz”
  – iterative design
    • user testing/fast mods
• Generate initial code
  – UIs for multiple devices
  – designer adds detail & improves interaction
  – programmers add code
Approach: Sketches & Models
• Infer models from design “sketches”
  – a model is an abstraction of the appliance’s UI design
• Use models to
  – semi-automatically generate UIs
  – dynamically adapt the app’s UI to changing context
[Figure: UI model diagram]
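To make the "model as abstraction" idea concrete, here is an invented sketch: one abstract model element (which, in the talk's vision, would be inferred from a designer's sketch) rendered to two different concrete UIs. The class and method names are hypothetical, not from any actual tool.

```python
# Illustrative sketch of a modality-independent UI model element.
# "ChoicePrompt", "to_gui", and "to_speech" are invented names.
class ChoicePrompt:
    """Abstract model element: ask the user to pick one option."""
    def __init__(self, question, options):
        self.question = question
        self.options = options

    def to_gui(self):
        # A GUI renderer might emit a label plus radio buttons.
        return [("label", self.question)] + [("radio", o) for o in self.options]

    def to_speech(self):
        # A speech renderer emits one prompt listing the options.
        return f"{self.question} Say one of: {', '.join(self.options)}."

# The same model drives both the GUI calendar app and the speech one.
meeting = ChoicePrompt("When should we meet?", ["today", "tomorrow"])
```

Because both renderings share one model, a context service could switch between them at runtime without touching application logic.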
Specifying Non-Visual Elements
• How do designers do this now?
  – speech
    • scripts or grammars (advanced designers only)
    • flowcharts on the whiteboard
    • “Wizard of Oz” -> fake it!
  – gestures
    • give an example & then tell the programmer what it does
• We can do the same by demonstration
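For flavor, here is the kind of finite speech grammar a designer might otherwise whiteboard as a flowchart, written as pattern rules. The intents, phrasings, and device names are made up for this example.

```python
# Hypothetical sketch: a tiny speech grammar as intent -> pattern rules.
# The rules and vocabulary are invented; a real tool would let the
# designer sketch or demonstrate these instead of writing regexes.
import re

GRAMMAR = {
    "restart": re.compile(r"restart (the )?(?P<device>pump|generator)"),
    "status":  re.compile(r"status of (the )?(?P<device>pump|generator)"),
}

def parse_utterance(text):
    """Return (intent, slots) for the first matching rule, else None."""
    for intent, pattern in GRAMMAR.items():
        m = pattern.search(text.lower())
        if m:
            return intent, {"device": m.group("device")}
    return None
```

Demonstration-based specification would effectively generate such rules from examples rather than requiring the designer to author them by hand.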
Plan for Success
• Year 1
  – evaluate context-aware prototypes in target domains (op6)
  – test & refine authority-mining algorithms (op5)
• Year 2
  – design & implement a multimodal UI design tool (op7)
  – implement tacit-mining algorithms using sensing data for (op5)
    • expert locator & query-free retrieval
    • providing visual awareness of group & task clustering
  – create new applications using the tools for (op6)
    • learning
    • high-speed decision making
• Year 3
  – evaluate tools & applications
  – integrate with S/W & H/W design tools
State of the Art
• Traditional tools & methodologies (paper, VB, …)
  – no support for multimodal UIs (especially speech)
  – do not allow targeting one app to platforms w/ varying I/O capabilities (assume PC-like I/O)
• Model-based design tools
  – force designers to think abstractly about design
• Context-aware widgets
  – how do devices communicate high-level contexts?
• XML or UIML
  – still need to understand what should be expressed
In-Class Group Learning
• Participatory learning: students work in groups of 4-7; communicate via pen or keyboard chat
  – each group has one main note-taker; others add their own comments or questions to the transcript
  – students can mark up a group transcript, the lecturer’s notes, or a private window
  – one student per group works as facilitator or TA, posing questions to the others
Emergency Decision-Making
• Tacit activity mining (from ubiquitous sensing)
  – determines where people are, what they are working on, what they know, etc.
  – quickly find human experts (e.g., how to restart pumps…)
  – automatic authority mining (quality of information)
  – visualization that provides awareness without overload
• Challenge is to recognize and compute structure
  – we borrow ideas from social network theory
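One structure-recognition idea from social network theory can be sketched simply: treat sensed interactions as ties and read informal work teams off as connected components of the tie graph. This is a minimal sketch with invented data, not the project's actual algorithm.

```python
# A minimal sketch, assuming activity data has already been reduced to
# pairwise ties: discover informal teams as connected components.
from collections import defaultdict

def find_teams(ties):
    """Group people into teams via connected components of the tie graph."""
    graph = defaultdict(set)
    for a, b in ties:
        graph[a].add(b)
        graph[b].add(a)
    seen, teams = set(), []
    for start in graph:
        if start in seen:
            continue
        # Depth-first traversal collects everyone reachable from `start`.
        stack, team = [start], set()
        while stack:
            node = stack.pop()
            if node in team:
                continue
            team.add(node)
            stack.extend(graph[node] - team)
        seen |= team
        teams.append(team)
    return teams

ties = [("ann", "bob"), ("bob", "carol"), ("dave", "eve")]
teams = find_teams(ties)
```

Real interaction data is noisy, so in practice ties would be weighted and thresholded, or a proper community-detection method used, before reading off teams.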