Friday, 4/8/2011
Professor Wyatt Newman
Smart Wheelchairs
Outline
• What/Why Smart Wheelchairs?
• Incremental Modules
  – Reflexive collision avoidance
  – Localization, trajectory generation, steering, and smart buildings
  – Speech-driven wheelchair control
• Natural language interfaces
Architecture
[Layered diagram, top to bottom:]
• Natural language / speech processing
• Localization / motion control (or joystick)
• Reflexes / local mapping
• Wheelchair command
• Sensors
“Otto” instrumented wheelchair
• Sensors: Kinect, Hokuyo, “Neato”, ultrasound
Sensing the World
• All mobile vehicles should avoid collision.
• “Ranger” sensors
  – Actively emit energy to detect obstacles.
• Cameras
  – Passively absorb light and can use machine vision techniques to estimate obstacle positions.
Rangers
• Simple rangers
  – Can be sonar or infrared.
  – Limited information arises from the wide “cone” emitted by the sensor.
Laser Scanners
• Lidars (LIght Detection And Ranging)
  – Much better information.
  – Many radial points of data.
• Velodyne
  – Three-dimensional lidar.
  – Very expensive.
Laser Scanners
• Neato sensor:
  – Low-cost sensor
  – 1-deg range values
  – Not yet available as a separate unit
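A lidar like the Neato unit returns one range value per degree. The basic operation behind both local mapping and reflexive collision avoidance is converting those polar samples into Cartesian points and finding the closest obstacle. A minimal sketch in plain Python (the function names and the flat-list scan format are illustrative, not any particular driver's API):

```python
import math

def scan_to_points(ranges, angle_min=0.0, angle_step=math.radians(1.0)):
    """Convert a list of range readings (meters, one per beam) into
    (x, y) points in the sensor frame. Non-finite or zero readings
    (no return) are skipped."""
    points = []
    for i, r in enumerate(ranges):
        if r is None or not math.isfinite(r) or r <= 0.0:
            continue
        theta = angle_min + i * angle_step
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

def nearest_obstacle(ranges, **kw):
    """Return (distance, (x, y)) of the closest valid return, or None."""
    pts = scan_to_points(ranges, **kw)
    if not pts:
        return None
    return min(((math.hypot(x, y), (x, y)) for x, y in pts),
               key=lambda t: t[0])
```

A reflex layer would compare `nearest_obstacle` against a safety radius and override the commanded velocity when the distance falls below it.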
Cameras
• Monocular cameras cannot return depth information.
• Stereo cameras do return depth information.
  – This requires two sensors and has computational and calibration overhead.
• Hybrid sensor: Swiss Ranger
  – Uses infrared time-of-flight calculations with a monocular camera to produce a 3D map.
• Kinect sensor:
  – Low-cost, mass-produced camera for computer gaming
  – Uses structured light to infer 3D
Autonomous Mode
• Localization
  – Relative frame
  – Global frame
• Navigation
  – Goal planning
  – Path planning
  – Path following/steering
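Path following/steering can be sketched with a simple lookahead controller in the style of pure pursuit. The slides do not name a specific steering law; the gains and the differential-drive (linear, angular) command interface here are assumptions:

```python
import math

def steer_to_point(pose, target, v=0.3, k_heading=1.5):
    """Differential-drive steering toward a lookahead point on the path.
    pose = (x, y, heading) in the map frame; target = (x, y).
    Returns (linear_velocity, angular_velocity)."""
    x, y, th = pose
    dx, dy = target[0] - x, target[1] - y
    heading_error = math.atan2(dy, dx) - th
    # Wrap the error to [-pi, pi] so the chair turns the short way.
    heading_error = math.atan2(math.sin(heading_error),
                               math.cos(heading_error))
    return v, k_heading * heading_error
```

The path follower feeds this a point a fixed distance ahead on the planned path; the reflex layer retains authority to override the result near obstacles.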
Localization
• Local-frame sensors
  – Odometry
  – Gyros
  – Accelerometers
• Fusion with a Kalman filter
• Drifty and unreliable for long-term position estimation
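The fusion step can be illustrated with a scalar Kalman filter: a rate sensor (e.g. a gyro) drives the prediction, an occasional absolute measurement corrects it, and between corrections the variance grows, which is exactly the drift problem noted above. This is a deliberately simplified sketch, not the filter used on the chair:

```python
class Kalman1D:
    """Minimal scalar Kalman filter: fuse a rate sensor (prediction)
    with an absolute measurement (correction). Illustrative only; a
    real filter tracks the full vehicle state (x, y, heading, ...)."""
    def __init__(self, x0=0.0, p0=1.0, q=0.01, r=0.5):
        self.x, self.p = x0, p0   # state estimate and its variance
        self.q, self.r = q, r     # process and measurement noise variances

    def predict(self, rate, dt):
        self.x += rate * dt       # integrate the rate sensor
        self.p += self.q * dt     # uncertainty grows: this is the drift

    def update(self, z):
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # pull estimate toward measurement
        self.p *= (1.0 - k)              # uncertainty shrinks
```

Without the `update` step the variance only grows, which is why pure odometry/gyro integration cannot provide long-term position estimates.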
Localization
• Global frame
  – SLAM (Simultaneous Localization and Mapping)
  – AMCL (Adaptive Monte Carlo Localization)
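The idea behind Monte Carlo localization can be sketched in a few lines: maintain a cloud of pose hypotheses (particles), move them with the commanded motion, weight each by how well the predicted sensor reading matches the measured one, and resample. A toy 1-D corridor version follows; the map function and noise levels are invented for illustration, and AMCL additionally adapts the particle count:

```python
import random

def mcl_step(particles, motion, ranges_fn, measured, noise=0.1):
    """One Monte Carlo localization step on a 1-D corridor.
    particles: list of position hypotheses; motion: commanded displacement;
    ranges_fn(x): expected range reading at position x (the map);
    measured: the actual sensor reading."""
    # 1. Motion update: move every particle, with process noise.
    moved = [p + motion + random.gauss(0.0, noise) for p in particles]
    # 2. Weight by agreement between expected and measured reading.
    weights = [1.0 / (1e-6 + abs(ranges_fn(p) - measured)) for p in moved]
    # 3. Resample in proportion to weight (with replacement).
    return random.choices(moved, weights=weights, k=len(particles))
```

Repeated over many scans, the particle cloud collapses onto positions consistent with the map, even from a uniform initial guess.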
Navigation
• Rviz (robot’s perception)
• video
Smart Building
• Coordination & Cooperation
  – Smart devices work together to improve quality of life for users
  – Multi-robot path planning and congestion control
  – Robots invoke services within buildings
• video
Vocal Joystick
• A hands-free control system for a wheelchair would restore independence to users
  – Quadriplegia, ALS, MS, cognitive disorders, stroke
• Assistive technology has a high level of abandonment
  – Comfort
  – Difficult interface
  – Doesn’t properly fit the problem
  – Hard to make small adjustments
Alternative Wheelchair Control
• Voiced
  – Path selection vs. goal selection (“Go to”)
  – “Natural” language commands (Left, Right)
• Non-voiced
  – Humming controller
• Mouth-controlled
  – “Sip and puff”
  – Tongue
Alternative Wheelchair Control
• Head joystick
• Eye movement (“gaze”)
• Chin control
• EMG
Why Not Voice?
• Voice is the most natural way to interface with a wheelchair. Why have we not seen voice-activated wheelchairs on the market?
  – Recognition problems
  – Oversimplified command sets
  – Difficulty in precision control without collision avoidance
  – Difficult HMI
  – Hard to make small adjustments
Speech-Driven Wheelchair Control
• A naturalistic “vocal” joystick for a wheelchair (or any other mobile vehicle).
• Prosodic features are extracted from the user while giving a command.
  – Pitch, stress, and intensity
  – Modeled and learned (through training simulations)
• Uses a small corpus
  – Users won’t have to manage many commands.
  – Added prosodic features could provide a more natural means of control and solve the small-velocity-adjustment problem described earlier.
• video
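One way such a vocal joystick could map a small command vocabulary plus a prosodic feature onto motion is sketched below. The command set, the normalized-intensity input, and the scaling constants are invented for illustration; real prosodic feature extraction (pitch tracking, stress detection) is a separate problem not shown here:

```python
def vocal_joystick(word, intensity, max_speed=0.8, max_turn=1.0):
    """Map a recognized command word plus a normalized loudness
    (0.0..1.0) to a (linear, angular) velocity pair. Both the
    vocabulary and the mapping are illustrative, not the system
    described in the talk."""
    intensity = max(0.0, min(1.0, intensity))   # clamp prosodic feature
    commands = {
        "go":    (max_speed * intensity, 0.0),
        "left":  (0.0,  max_turn * intensity),
        "right": (0.0, -max_turn * intensity),
        "stop":  (0.0, 0.0),
    }
    # Unrecognized words fail safe: stop.
    return commands.get(word, (0.0, 0.0))
```

Modulating speed with loudness is what lets a small corpus still express fine velocity adjustments, the gap identified on the previous slide.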
A Linguistic Interface
• Longer-term research in natural human interfaces
• There are three ways to think and speak about space in order to travel through it:
  (1) MOTION driving, (2) voyage DRIVING, and (3) goal-driven speech control of motion: (1) → (2) → (3)
• We control each other’s movements, when it is relevant, by (1) motor commands, (2) indications of paths, and (3) volitive expressions of goals. So:
  – Speaking to a taxi driver, (3) mentioning a goal is normally enough to achieve proper transportation.
  – Speaking to a private driver as his navigator, we would instead give (2) indications of the trajectory by referring to perceived landmarks.
  – Speaking to a blindfolded person pushing your wheelchair, we would just use (1) commands corresponding to using a joystick in a video game.
Interface Architecture:
[Block diagram: speech recognition & production, visual display, parsing & interpretation, local ontology (incl. sites and known objects), sensor signal, motor action, and obstacle avoidance]
Future Work
• Wheelchair as personal assistant
  – Safety monitoring
  – Health monitoring
  – Assistive functions
• Wheelchair users’ focus-group input
• User trials
• Add-on modules
  – Automated seat-pressure redistribution
  – Medication reminders/monitoring
  – BP and weight monitoring
  – Distress sensing/response
Summary / Q&A
• Reflexive collision avoidance (near-term product?)
• Localization, trajectory generation, and steering
• Verbal joystick w/ prosody
• A priori maps vs. teaching/map-making
• Smart buildings / smart products
• Natural language processing and human interfaces (longer term)