Seeing and Acting in a Virtual World PSY 341K

Class hours: Tues, Thurs 9:30-11:00, Room 4-242, SEAY

Instructor: Professor Mary Hayhoe, SEAY Room 5-238, x5-9338, mary@mail.cps.utexas.edu. Office hours: anytime by appointment

TA: David Lewis david.lewis@mail.utexas.edu

Co-instructor: Gabriel Diaz gdiaz@mail.cps.utexas.edu

Web site: http://homepage.psy.utexas.edu/homepage/class/psy341K/hayhoe/

Organization

1. Four experiments, approximately 3 weeks each.

2. Background lectures, data collection, analysis, presentation; emphasis on class discussion.

3. Groups of 4 or 5 students.

4. Requirements: 4 papers, 2 exams (short answer), attendance/participation/presentations.

5. Readings, lectures, etc. are posted on the web site.

The great unsolved problem: How does the brain control behavior?

Phrenology

Localization of function

Even simple actions involve many parts of the brain.

[Diagram labels: action plan (size, direction, velocity), motivation, memory, targeting, initiate sequence, signals to muscles, coordinate feedback, respiration/heart rate.]

Classical Methods

What are the physical limits of vision?

How accurate are eye movements? What is their peak velocity? What brain regions control eye movements?

A Typical Experiment

Why do some objects “pop out”?

An Experiment on Searching for Objects

And why are they sometimes hard to find?

Questions we might like to ask:

Where do we look in a scene in everyday life?

What information do we need?

How do we locate the information we need?

How are the movements controlled?

Why virtual reality?

Technological advances:

1. Measurement of complex eye, head, and hand movements.

2. High-speed image processing allows complex virtual environments that can be controlled experimentally.

3. Head-mounted displays and tactile feedback.

Natural behavior is largely unexplored.

Need to validate (or not) results from simpler paradigms.

The CPS Virtual Reality Lab – a unique opportunity

What you’ll learn

- Basic properties of perception, movements, and attention

- Understanding the research process: the question, design of experiments, data analysis, making conclusions, communication.

- Original contributions/discoveries. Thinking independently.

Difficult things about this course

- No good textbook

- Fragmentary material

- Lack of background

- Data analysis

- Presentations

Date Topic

Jan 17 Overview of the course: understanding human actions. Introduction to the Virtual Reality lab.

Jan 19 Using our Eyes in Everyday Tasks: Lecture: The nervous system, vision, and motor control. The eye and eye movements.

Rosenbaum Ch 5, Land paper.

Jan 24 Lab: tracking the eyes while catching balls.

Jan 26 Lab: tracking the eyes.

Jan 31 Lecture: Interpreting the data

Feb 2 Discussion of Findings/ class presentations

Feb 7 Interdependence of Vision and Action: Lecture. Paper 1 due

Feb 9 Vision and movement (Rosenbaum Ch 2)

Feb 14 Lab Intercepting virtual targets

Feb 16 Lab: ctd

Feb 21 Understanding the data

Feb 23 Discussion of Findings / class presentations

Feb 28 Review. Paper 2 due

Mar 1 Mid-term

Virtual racquetball: Nvis helmet, Arrington eye-tracker, PhaseSpace head/hand/racquet tracking, ODE to control ball and racquet interactions

Gabe Diaz
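The "ODE" in this setup is presumably the Open Dynamics Engine, an open-source rigid-body physics library. The short C sketch below is illustrative only (it is not the lab's actual code, and the masses, velocities, and frame rate are invented values); it shows how such an engine steps a simulated ball under gravity once per display frame.

/* Minimal sketch, assuming "ODE" = Open Dynamics Engine.
   Steps a single ball under gravity at 60 Hz and prints its position.
   All numbers are illustrative, not the lab's parameters. */
#include <stdio.h>
#include <ode/ode.h>

int main(void) {
    dInitODE();
    dWorldID world = dWorldCreate();
    dWorldSetGravity(world, 0.0, -9.81, 0.0);    /* y axis points up here */

    /* One rigid body standing in for the racquetball. */
    dBodyID ball = dBodyCreate(world);
    dMass m;
    dMassSetSphereTotal(&m, 0.04, 0.028);        /* ~40 g, ~2.8 cm radius */
    dBodySetMass(ball, &m);
    dBodySetPosition(ball, 0.0, 1.5, 0.0);       /* launch height in metres */
    dBodySetLinearVel(ball, 0.0, 2.0, -8.0);     /* toss toward the front wall */

    /* Step the simulation at the display frame rate; in the real setup the
       resulting ball pose would be rendered in the head-mounted display while
       the racquet pose is updated from the PhaseSpace tracker. */
    const dReal dt = 1.0 / 60.0;
    for (int frame = 0; frame < 120; ++frame) {
        dWorldStep(world, dt);
        const dReal *p = dBodyGetPosition(ball);
        printf("t=%5.3f  x=%6.3f  y=%6.3f  z=%6.3f\n",
               (double)(frame * dt), (double)p[0], (double)p[1], (double)p[2]);
    }

    dWorldDestroy(world);
    dCloseODE();
    return 0;
}

Collisions between the ball, the court walls, and the tracked racquet would be handled by the engine's collision space and contact joints, which are omitted from this sketch for brevity.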

Mar 6 Learning Where to Look: lecture

Mar 8 Lecture

Mar 13, 15 Spring Break

Mar 20 Lab: Avoiding virtual pedestrians

Mar 22 Lab: ctd

Mar 27 Discussion of Outcome

Mar 29 Class Presentations

Gaze allocation when walking in a real environment

Things to do: control direction, avoid obstacles, place the feet, characterize the surroundings, etc. Cf. Walter: normal vision involves sets of sub-tasks or modules; attention must be allocated effectively between sub-tasks.

Portable ASL eye tracker; oval path around a large room; pedestrians.

(Jovancevic & Hayhoe, 2009 J Neurosci)

How are gaze targets chosen?

Apr 3 Attention & Vision: Lecture. Paper 3 due

Apr 5 Lecture: attention and eye movements in natural environments

Apr 10 Lab: Walking in a Virtual Apartment

Apr 12 Lab: ctd

Apr 17 Understanding the data

Apr 19 Class presentations

Apr 24 Lecture: Uses of virtual environments

Apr 26 Review

May 1 Review

May 3 Final Exam. Paper 4 due

Grading: Papers 1-4: 15% each; Midterm and Final: 15% each; Attendance: 5%; Presentations and class discussion: 5%.

Papers: 7-10 pages (typewritten, double spaced) reporting the results of the lab experiments. Papers may be rewritten.

Exams: short-answer questions. Midterm: first half of the course. Final: second half of the course. Exams cover class material, labs, and readings.