Recent Projects in the DiVE Virtual Reality Lab
David J. Zielinski
Duke Media Arts + Sciences Rendezvous. March 6th, 2014.
DiVE Personnel: Regis Kopper
Post Doc, University of Florida, w/ Ben Lok: Virtual Humans.
Ph.D. in Computer Science, Virginia Tech, w/ Doug Bowman: Distal Pointing (laser-pointer metaphor) interactions.
Director of the DiVE, Duke University, 2013 - Present.
DiVE Personnel: David Zielinski
Masters in Computer Science, University of Illinois at Urbana-Champaign, w/ Bill Sherman: Virtual Music Instruments.
R&D Engineer - VR Software Development, Duke University, 2004 - Present.
DiVE Personnel: Tamika Craige
Staff Specialist for the DiVE
Questions about booking tours for the DiVE?
http://virtualreality.duke.edu/visit/schedule-a-tour/
Devices - The DiVE
6-sided CAVE-type system. Each wall 2.9 m × 2.9 m, 1050×1050 resolution. Active stereo @ 110 Hz. IS-900 6-DOF tracking.
Devices - zSpace
VR workbench. 23” diagonal, 1920×1080, passive stereo, head and stylus tracking.
Devices - Oculus Rift
VR head-mounted display (HMD). Developer version: 1280×800, 110° diagonal FOV, orientation tracking.
Comparative Study of Input Devices for a VR Mine Simulation
w/ Brendan MacDonald from the National Institute for Occupational Safety and Health
Input Devices
Xbox game pad: several joysticks, lots of buttons.
Gyration Air Mouse: hold trigger to move the cursor.
IS-900 Wand: 6-DOF (position and orientation).
NIOSH utilizes the Xbox game pad and Air Mouse. Q: Would they benefit from a 6-DOF system?
User Study Tasks
Selection Time Results: Wand Wins
Preference Results: Wand is Preferred
Also, complaints that the Air Mouse was frustrating; it didn't always perform the intended action.
Selection Error Results: Interesting
The wand is fast and good for broad, sweeping movements. However, it is bad for small targets. Why?
● The user must continuously aim.
● At the moment the selection button is pressed, the aim can be displaced: the "Heisenberg effect."
Big Picture Thoughts: "More advanced" devices are not guaranteed to produce better results.
Remaining Questions: How much does the context matter? What about longitudinal training?
IEEE VR 2014
● March 29 - April 2
● Minneapolis, Minnesota
● This work will be presented as a poster.
Enabling Closed-Source Applications for Virtual Reality via
OpenGL Intercept Techniques
w/ Ryan McMahan (The University of Texas at Dallas), Solaiman Shokur and Edgard Morya (International Institute for Neuroscience of Natal)
To be presented at the SEARIS 2014 Workshop, March 30th. Co-located with IEEE VR 2014.
Motivation
● VR experiences often utilize special hardware.
● So we need to use software that can utilize the special hardware.
Motivation - Part 2
● To utilize VR, a user often needs to learn new software (language/application).
● Barrier for adoption of VR.
● Could we VR-enable the desktop application the user is already familiar with, even if it's commercial/closed source?
Answer: OpenGL intercept-based techniques.
What is OpenGL?
The same triangle in MATLAB and in OpenGL:

MATLAB:
fill3([0 1 1],[0 0 0],[0 0 1],'b');

OpenGL:
glBegin(GL_TRIANGLES);
glVertex3f(0,0,0);
glVertex3f(1,0,0);
glVertex3f(1,0,1);
glEnd();
"OpenGL is an API (Application Progamming Interface) for rendering 2D and 3D computer graphics. The API is typically used to interact with a GPU, to achieve hardware-accelerated rendering. OpenGL was developed by Silicon Graphics Inc. in 1992." -Wikipedia
vertex = corner
Intercept Technique Example
[Diagram] Standard case: the application's drawing command (e.g. "draw a triangle") goes straight to the graphics driver (opengl32.dll). Intercept case: the drawing command first passes through our intercept driver (a new opengl32.dll), which then forwards it to the real graphics driver.
// inside our custom opengl32.dll
void glColor3f(float r, float g, float b) {
    // could do some modification to the color here
    ourtable.glColor3f(r, g, b); // call the real glColor3f
}
Intercepted Function Example
Technique: In-And-Out
[Diagram] The desktop application's graphics go out through our custom opengl32.dll to the real opengl32.dll. Input device data comes in from an input device data sender, reaching the application via a plugin or built-in scripting.
Technique: Driver-Mediated Head Tracking
[Diagram] The desktop application renders through our custom opengl32.dll to the real opengl32.dll. Here the input device data sender feeds tracking data directly into the custom driver, bypassing the application entirely.
Case Study: MotionBuilder
● Commercial motion capture and animation software by Autodesk.
● Can also create interactive experiences via its constraints system, Python scripting, and a C++ software development kit (SDK).
● Has high-quality real-time inverse kinematics to modify character animations.
Application: Brain Computer Interface (BCI).
● Real-time decoding of brain activity
● Invasive or non-invasive
We believe that VR can be used to train subjects with reduced mobility to utilize a BCI to control physical devices (e.g., wheelchairs, exoskeletons) in a safe environment.
Motivation:
● Our collaborators are using MotionBuilder for BCI experiments. Can we VR-enable MotionBuilder for the Oculus Rift?
MotionBuilder + Oculus: In-And-Out
MotionBuilder + Oculus Rift: Driver Mediated Head Tracking
No extra plugins or constraints needed for MotionBuilder.
Latency seemed lower. How can we measure this?
Results: Success!
Websites and Contact
http://virtualreality.duke.edu/
http://people.duke.edu/~djzielin/
http://regiskopper.com
[email protected]@duke.edu
Thank You!
Questions?