Some Thoughts on Near-Future Devices
TRANSCRIPT
2014: Getting Lost · 2019: Libraries · 2020: Copyright · 2030: Keys · 2033: Coins
2050+: Ugliness · Physical Pain · Nation States · Death
Biological evolution and human technology both show continual acceleration. The time between major events continues to decrease: two billion years from the origin of life to the first cells, but only 14 years between the PC and the World Wide Web.
Product Goal
Hardware components and specifications:

- Main control board: (Kontron) COM Express embedded system — CPU: Intel Atom Z530; RAM: 1 GB DDR2; graphics: Intel GMA 500, 2 RGB outputs; OS: Windows XP Embedded
- Peripheral device control board: customized embedded board — Cortex-M3 controller; ADCs for sensors (illumination sensor etc.); 3×4 touch sensors; 24 RGB LEDs
- Projector: (Optoma) PK201 pocket projector — resolution: native WVGA (854×480); brightness: 650 ANSI lumens; light source: RGB LED
- Camera: (Microsoft) LifeCam HD-5000 — 16:9 widescreen, 720p (1280×720) capture; autofocus
- Actuators: (Robotis) Dynamixel DX-117 — running degree: 0–300°; resolution: 0.29°; stall torque: 15 kgf·cm (12.0 V, 1.4 A)
- Microphone: customized sampler — 3× electret condenser microphones (ECMs); sampling frequency: 8 kHz–44.1 kHz; 16-bit resolution
As a companion (doggie level of interaction)
- Recognizes me from a distance (3–4 m),
- Looks at me and approaches when called,
- Understands my commands, if only to a limited extent,
- Follows me around when needed,
- Keeps eye contact and engages with me while we talk…
As a computing device (projector-camera interface)
- Projects information onto real space or objects (Robotic Spatial Augmented Reality)
- Lets the user control the projected screen with a finger
User Interaction in FRC: Human-Friendly Interaction
- User following
- Face recognition
- Speaker recognition
- Semi-biometric recognition
- Sound source localization
- Calling-gesture recognition
- Speech recognition
Interaction scenarios: "Who am I" · "Follow me" · "Do it"
Virtual Interface Using a Projector/Camera
Augmented Reality → Spatial Augmented Reality → Robotic Spatial Augmented Reality
• Robotic Spatial Augmented Reality
- Projection control using inverse kinematics
- Image pre-warping (anamorphic illusion)
- Image pre-warping (non-planar surface projection)
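The pre-warping step can be sketched as a planar homography: given four points on the projector image and their desired positions on the target surface, a 3×3 homography maps between them, and rendering through its inverse makes the projection appear undistorted from the viewer's position. A minimal NumPy sketch (the DLT solver below is an illustrative stand-in, not the system's actual implementation):

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src -> dst (4+ point pairs)
    via the Direct Linear Transform: stack the constraint rows A h = 0
    and take the null-space vector from the SVD."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2,2] == 1

def warp_point(H, p):
    """Apply homography H to a 2-D point p (homogeneous divide)."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return (x / w, y / w)
```

In practice the whole framebuffer would be resampled through the inverse homography (e.g. with a GPU shader or an image-warping routine), and the non-planar case generalizes this to a per-pixel displacement map.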
• Virtual mouse interaction using fingertip detection and tracking
Pipeline: Image Acquisition → Preliminary Image Processing → Fingertip Finding → Coordinate Mapping → Mouse/Keyboard Events → OS/Applications
- Preliminary image processing: grayscale conversion; adaptive background subtraction; image thresholding; noise filtering (blob labeling & erode/dilate)
- Fingertip finding: labeling/morphological processing; template matching; palm detection; triggering/finger-rules checker
- Coordinate mapping: projected-screen coordinates are mapped to mouse and keyboard events for the OS/applications
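The fingertip-finding stages can be illustrated with a toy NumPy implementation — background subtraction, thresholding, 4-connected blob labeling, and taking the topmost pixel of the largest blob as the fingertip. This is a deliberately simplified sketch; the real pipeline also uses template matching and palm detection, which are omitted here:

```python
import numpy as np
from collections import deque

def find_fingertip(gray, background, thresh=30):
    """Toy fingertip finder: subtract the background, threshold,
    label connected blobs with BFS, and return the topmost pixel
    (row, col) of the largest blob. Assumes the finger points 'up'
    in image coordinates. Returns None if nothing is detected."""
    mask = np.abs(gray.astype(int) - background.astype(int)) > thresh
    labels = np.zeros(mask.shape, dtype=int)
    blobs, next_label = {}, 1
    for r0 in range(mask.shape[0]):
        for c0 in range(mask.shape[1]):
            if mask[r0, c0] and labels[r0, c0] == 0:
                # flood-fill one 4-connected blob
                q = deque([(r0, c0)])
                labels[r0, c0] = next_label
                pixels = []
                while q:
                    r, c = q.popleft()
                    pixels.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = next_label
                            q.append((rr, cc))
                blobs[next_label] = pixels
                next_label += 1
    if not blobs:
        return None
    largest = max(blobs.values(), key=len)
    return min(largest)  # smallest row first = topmost pixel
```

A production version would replace the Python loops with morphological operations and contour analysis from an image-processing library, but the data flow is the same.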
Autonomous Behaviors: Autonomous Action and Growth Through a Continuous Relationship with the User and Environment
[Architecture diagram: a Perception Subsystem and a Behavior Subsystem connected through Motivational Drives and Behavior Selection; User Adaptation & Learning (goal, user preference, user state), Proactive Interaction, and User Feedback shape which services and tasks are executed.]
User Preference Learning
Learns the user's preferences from interaction with the user, using a dual-layer architecture (top-layer rules, bottom-layer learning).
- Implicit user model (bottom layer): associative memory holding Probability(Service | Context) — user-preference probability values for each service per context.
- Explicit user model (top layer): associative rules of the form Condition(Context) → Conclusion(Service) — rules defined explicitly or learned implicitly.
- Flow: for an input context, candidate rules are extracted from the implicit model and put through a validation test; valid rules go through description generation (interaction through explanation) and produce service suggestions — service alternatives presented as a ranked service list — leading to service execution, while invalid rules are rejected.
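A toy sketch of the dual-layer idea (the class name and thresholds are hypothetical, not from the system): the bottom layer accumulates service frequencies per context as an approximation of Probability(Service | Context), and the top layer promotes a sufficiently dominant preference to an explicit context → service rule:

```python
from collections import Counter, defaultdict

class PreferenceLearner:
    """Bottom layer: frequency counts approximating P(service | context).
    Top layer: explicit context -> service rules extracted once a
    preference is observed often and dominantly enough."""

    def __init__(self, rule_threshold=0.8, min_observations=5):
        self.counts = defaultdict(Counter)   # context -> Counter(service)
        self.rules = {}                      # context -> service
        self.rule_threshold = rule_threshold
        self.min_observations = min_observations

    def observe(self, context, service):
        """Record one interaction and try to extract a rule."""
        self.counts[context][service] += 1
        total = sum(self.counts[context].values())
        dominant = self.counts[context][service] / total
        if total >= self.min_observations and dominant >= self.rule_threshold:
            self.rules[context] = service

    def suggest(self, context):
        """Ranked service list: an explicit rule wins, otherwise rank
        service alternatives by observed frequency."""
        if context in self.rules:
            return [self.rules[context]]
        return [s for s, _ in self.counts[context].most_common()]
```

The validation-test and description-generation steps from the diagram would sit between rule extraction and suggestion; they are skipped in this sketch.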
Motivation
• Curiosity drive (Exploration)
From Input Context(t−1) and Selected Action(t−1), the system predicts Input Context(t). The prediction error Perr and its change feed a surprise component Sc through a leaky accumulator, and Sc is combined with a knowledge-gain component KG into the curiosity measure Cm:

  Sc_t = λ·Sc_(t−1) + (1 − λ)·Perr
  Cm = c·Sc + (1 − c)·KG
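One plausible reading of the curiosity drive in code — a leaky accumulator turns prediction error into a surprise component, which is mixed with a knowledge-gain component. The constants lam and c and the class itself are assumptions for illustration, not the published model:

```python
class CuriosityDrive:
    """Sketch of a curiosity measure: a leaky accumulator of prediction
    error (surprise) mixed with knowledge gain."""

    def __init__(self, lam=0.9, c=0.5):
        self.lam = lam    # leak factor of the accumulator (assumed)
        self.c = c        # surprise vs. knowledge-gain mixing weight (assumed)
        self.sc = 0.0     # surprise component state

    def step(self, prediction_error, knowledge_gain):
        # Sc_t = lam * Sc_{t-1} + (1 - lam) * Perr
        self.sc = self.lam * self.sc + (1 - self.lam) * prediction_error
        # Cm = c * Sc + (1 - c) * KG
        return self.c * self.sc + (1 - self.c) * knowledge_gain
```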
• Sociality drive (Adaptation)
For the input context(t), condition activations a_1 … a_n with weights ω_1 … ω_n, together with a time component b(t) with weight ω_t, determine the sociality drive activation d_s:

  d_s = Σ_{i=1}^{n} ω_i·a_i + ω_t·b(t)
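The sociality drive activation d_s reads as a weighted sum of condition activations plus a time component; a one-function sketch under that assumption (the function name and the [0, 1] range of inputs are illustrative):

```python
def sociality_activation(conditions, weights, time_component, time_weight):
    """d_s = sum_i w_i * a_i + w_t * b(t).
    conditions are the a_i activations, weights their w_i, and
    time_component/time_weight are b(t) and w_t."""
    assert len(conditions) == len(weights)
    return (sum(w * a for w, a in zip(weights, conditions))
            + time_weight * time_component)
```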
(a) Calling using the best available nearby device
(b) Calling using nearby objects
(c) Calling using a minimal carried device
(d) Connecting a remote space and its devices
(e) Interaction with everyday objects
(f) Displaying information on everyday objects and sharing experiences
For details about the presentation, please email [email protected]