In this research project, we merge the physical and digital worlds with a collaboration tool that combines 360° video and augmented reality. Imagine a meeting room that is live-streamed to remote participants in 360°: whatever they draw into the live stream appears in the actual room.

A research paper about 360Anywhere has been accepted at the 2018 ACM Symposium on Engineering Interactive Computing Systems.
University of Michigan
Team of 5
Jun–Sep 2017
What I was/did
Team/Project Lead
User Research
System Design + Architecture
How the process unfolded
literature review
competitive analysis
mind maps
affinity diagrams
initial prototype
iterative implementation (3×)
high-fidelity prototype
user study
Identifying Challenges.
Both 360° video and augmented reality are powerful tools for remote collaboration, yet their full potential had remained untapped. Through a literature review and a competitive analysis (including Skype on HoloLens, Sketchboard.io, and Chalk by Vuforia), we identified a number of unresolved challenges, the most important of which were:
  1. 360° video and AR had not been combined before.
  2. Users are often not aware of where others are looking in a 360° video.
  3. Users are often looking in different directions.
The UI for collaborators, showing the 360° stream in which they can look around freely. The laptop screenshot (left) shows a gaze cone indicating where the mobile user (right) is looking. This is one way to solve challenge 2 above.
Finding a Solution.
Using standard methods (sketching, storyboards, mind maps, and affinity diagrams), we brainstormed a range of scenarios and solutions and created initial wireframes. We soon noticed that there is no one-size-fits-all solution for remote collaboration. We therefore opted for a component-based design that lets users assemble just the system they need. A second key requirement was minimal hardware, so that the system would be accessible to a wide range of potential users.
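To make the component-based idea concrete, here is a minimal sketch of how a session might activate only the pieces a given meeting needs. All names are illustrative, not the actual 360Anywhere API:

```javascript
// Hypothetical session setup: users pick only the components their
// remote-collaboration scenario requires.
function createSession(components) {
  const available = ["gazeCones", "followMe", "annotations", "projectiveAR"];
  const unknown = components.filter((c) => !available.includes(c));
  if (unknown.length > 0) {
    throw new Error(`Unknown components: ${unknown.join(", ")}`);
  }
  return { active: new Set(components) };
}

// A lightweight design review might need only annotations and gaze cones.
const session = createSession(["annotations", "gazeCones"]);
session.active.has("projectiveAR"); // → false
```

The point of the design is that nothing is hard-wired: a projector-less meeting simply never activates the projective AR component.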
Anything drawn into the 360° stream is displayed directly in the room using projective augmented reality.
The collaborator UI, showing the projective augmented reality region with a blue outline.
Building and Testing 360Anywhere.
Starting from an initial prototype based on A-Frame, we built 360Anywhere in three iterations, each concluding with a user study. Based on these tests, as well as the challenges and requirements identified earlier in the process, we ultimately created a system that (among other things) provides the following components, which can be activated based on users' needs:
  • Gaze cones to indicate where other users are looking in the 360° live stream,
  • a function to take over the stream and force everyone to look in the same direction, and
  • the possibility to add text, images, videos, and drawings to the 360° live stream.
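The gaze cone component boils down to projecting a collaborator's view direction onto the equirectangular 360° frame so that an indicator can be rendered there for everyone else. A minimal sketch of that mapping (function and parameter names are hypothetical, not taken from the 360Anywhere codebase):

```javascript
// Map a viewer's gaze direction (yaw/pitch in radians) to normalized
// (u, v) coordinates in an equirectangular 360° frame, where a gaze
// cone can then be drawn for the other collaborators.
// yaw: -PI..PI (0 = camera forward), pitch: -PI/2..PI/2 (PI/2 = zenith).
function gazeToEquirectUV(yaw, pitch) {
  const u = (yaw + Math.PI) / (2 * Math.PI); // 0 = left edge, 1 = right edge
  const v = (Math.PI / 2 - pitch) / Math.PI; // 0 = top, 1 = bottom
  return { u, v };
}

// Looking straight ahead lands in the center of the frame.
gazeToEquirectUV(0, 0); // → { u: 0.5, v: 0.5 }
```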
Most importantly, when a projector is available, the system can be set up to support projective augmented reality. That is, whatever is drawn into such a projective region in the 360° live stream appears at the corresponding position in the physical room and can be saved for later sessions. In this way, we can create interactive whiteboards onto which both local and remote meeting participants can draw.
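Conceptually, the projective AR step is a change of coordinates: a point drawn inside the outlined region of the equirectangular stream is re-expressed in the projector's pixel space. A simplified sketch, assuming an axis-aligned rectangular region (the names and the rectangle assumption are illustrative; a real setup would correct perspective distortion, e.g. with a homography):

```javascript
// Map a stroke point drawn inside the projective AR region of the 360°
// stream to projector pixel coordinates. Both the point and the region
// are given in normalized equirectangular (u, v) coordinates.
function toProjectorPixels(point, region, projWidth, projHeight) {
  const relU = (point.u - region.u0) / (region.u1 - region.u0);
  const relV = (point.v - region.v0) / (region.v1 - region.v0);
  return {
    x: Math.round(relU * projWidth),
    y: Math.round(relV * projHeight),
  };
}

const region = { u0: 0.4, v0: 0.3, u1: 0.6, v1: 0.7 };
// A point in the middle of the region lands mid-projector.
toProjectorPixels({ u: 0.5, v: 0.5 }, region, 1920, 1080); // → { x: 960, y: 540 }
```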

The user studies were carried out with one group of three and two groups of four participants and were intended to identify usability issues and explore further potential scenarios. Participants solved three tasks as a group, then filled out a post-study questionnaire and gave open-ended feedback. The studies suggest that our system addresses the identified challenges and can be applied to a wide range of scenarios.

Overall, setting up 360Anywhere requires only a laptop, a 360° camera, and (optionally) a projector.