Master’s Thesis

Posted on February 4, 2019 by 5lcht

Master's Thesis [Work-In-Progress]

Analyzing the social VR landscape to better understand how to make co-located VR experiences more social.

My thesis explores the design of interactions and hardware for co-located, collaborative virtual reality (VR) experiences, with the goal of creating design guidelines that support better communication and learning between collaborators.

My motivation for this work stems from the lack of commercial device support for co-located VR experiences. For example, I have experienced this issue personally while watching Netflix in VR using my phone and a Daydream headset. If someone else is in the room while you are watching TV in a VR headset, it is difficult to pay attention to or acknowledge them. We expect that the average consumer will find it very difficult to adopt long-format media viewing in VR until there is a way to comfortably share the experience with family and friends.

For this project, two bachelor's students are assisting with the work. Together, we are developing hardware prototypes (in cardboard), interaction designs (in sketch), and study scenarios.

Survey of Social Applications

Method, Themes, Activities [TBD]

Literature Review

Method, Time-space Matrix, Themes, Measurements, Opportunities [TBD]

Prototypes

We will use these demos as part of a user study to discover whether body representation in co-located social VR experiences encourages collaboration and increases feelings of togetherness. Scenarios will be developed to study multiple co-located VR participants as well as mixed groups of co-located VR and non-VR participants.

Low-cost body tracking for virtual reality

The purpose of this demo is to provide a way for mobile VR users to engage in VR experiences together simply by placing their phones in a low-cost VR device such as Google Cardboard. By using phone-camera-based body tracking, VR users do not have to instrument the environment to get realistic body representation in the virtual environment.

Additional work to be done

  • Use an additional sensor or marker to estimate the z coordinate.
  • Animate a 3D model with poses.
  • Add facial expression tracking of both VR and non-VR participants.
  • Provide multi-user support.
  • Make the prototype configurable to support multiple study activities.
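One way to approach the z-coordinate item above is the pinhole camera model: a marker of known physical size appears smaller in the image the farther away it is. A minimal sketch, assuming a marker of known size and a camera focal length calibrated in pixels (both values below are illustrative, not from the actual prototype):

```python
# Sketch: estimating a marker's depth (z) from its apparent size in the
# camera image via the pinhole camera model. The focal length and marker
# size here are illustrative; a real prototype would calibrate them.

def estimate_depth(focal_px: float, marker_size_m: float, marker_px: float) -> float:
    """Pinhole model: pixel_size = focal * real_size / depth, solved for depth."""
    if marker_px <= 0:
        raise ValueError("marker must be visible (pixel size > 0)")
    return focal_px * marker_size_m / marker_px

# Example: a 10 cm marker spanning 50 px with a 500 px focal length
# sits 1.0 m from the camera.
z = estimate_depth(focal_px=500.0, marker_size_m=0.10, marker_px=50.0)
print(round(z, 2))  # 1.0
```

This only recovers depth along the camera axis; combining it with the tracked 2D keypoints would give a full (x, y, z) estimate per joint.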

GitHub

Live demo

Low-cost face tracking for virtual reality

This demo uses the user's computer or phone camera to track the lower facial movements of co-located participants while both are wearing head-mounted VR devices, in order to visualize participants' facial expressions in VR.

Additional work to be done

  • Animate a 3D model with face.
  • Make the prototype configurable to support multiple study activities.
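For the "animate a 3D model with face" item, one common approach is to convert tracked lip landmarks into a normalized blendshape weight that drives the avatar's jaw. A minimal sketch, with hypothetical landmark inputs and calibration values (any 2D face tracker reporting upper/lower lip positions could feed it):

```python
# Sketch: mapping 2D mouth landmarks to a 0..1 "jaw open" blendshape
# weight for a VR avatar. Landmark coordinates and the closed/open
# calibration ratios below are hypothetical.

def mouth_open_weight(upper_lip_y: float, lower_lip_y: float,
                      face_height: float,
                      closed_ratio: float = 0.02,
                      open_ratio: float = 0.10) -> float:
    """Map lip separation (relative to face height) to a 0..1 blendshape weight."""
    ratio = abs(lower_lip_y - upper_lip_y) / face_height
    # Linearly rescale between the calibrated closed/open ratios, then clamp.
    w = (ratio - closed_ratio) / (open_ratio - closed_ratio)
    return max(0.0, min(1.0, w))

print(mouth_open_weight(100.0, 102.0, 200.0))  # near-closed mouth -> 0.0
print(mouth_open_weight(100.0, 120.0, 200.0))  # wide-open mouth -> 1.0
```

Normalizing by face height makes the weight roughly invariant to how far the participant sits from the camera.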

GitHub

Live demo

Kinect body tracking for virtual reality

This demo instruments the physical environment with a Kinect sensor to provide robust multi-user body tracking in the virtual environment.

Additional work to be done

  • Animate a 3D model with poses.
  • Add facial expression tracking of both VR and non-VR participants.
  • Provide multi-user support.
  • Make the prototype configurable to support multiple study activities.
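Animating a 3D model from Kinect poses requires mapping Kinect's camera-space joints into the VR engine's world space. A minimal sketch, assuming Kinect's right-handed camera space (meters; x to the sensor's left, y up, z out from the sensor) and a left-handed engine space like Unity's (x right, y up, z forward); the sensor offset is an assumed placement of the Kinect in the room:

```python
# Sketch: converting a Kinect camera-space joint into a left-handed
# VR world space. Handedness is flipped by negating x; the sensor's
# assumed position in the room is then added as a translation.

from typing import Tuple

Vec3 = Tuple[float, float, float]

def kinect_to_vr(joint: Vec3, sensor_offset: Vec3 = (0.0, 0.8, 0.0)) -> Vec3:
    """Flip handedness (negate x), then translate by the sensor's world position."""
    x, y, z = joint
    ox, oy, oz = sensor_offset
    return (-x + ox, y + oy, z + oz)

# A joint 0.5 m to the sensor's left, at sensor height, 2 m away:
print(kinect_to_vr((0.5, 0.0, 2.0)))  # (-0.5, 0.8, 2.0)
```

Applying this per joint, per tracked body, yields world-space skeletons that can drive one avatar per co-located participant.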

GitHub

Live demo

Study

Tests & Methods [TBD]

Results [TBD]

Reflection

Impact of research [TBD]

Credits

Research conducted with the WeAreVR group from the University of Michigan Information Interaction Lab and thesis advisor, Professor Michael Nebeling.

WeAreVR Group Members: Katy Madier, Rhea Kulkarni, Sindhu Giri, Sophie Linn

