VR Ego-centric Data Navigation

Live demo

Research Paper

GitHub

Context

This research project aimed to visualize higher-dimensional data, scaled to 3D, in VR. We developed an ego-centric data-visualization technique in JavaScript for interaction and navigation across webspaces in VR.

Impact

Average visibility improved up to 97% with our method

Research Lab

DGP @ University of Toronto

Supervision

Dr. Karan Singh

Timeline

2018 - 2019

Research Focus

Build a tool to view and navigate search results in VR

Overview

Over the last few decades, technology has revolutionized our ability to create, store, and retrieve information on a whim. Information visualization, the art of representing data so that it is easy to understand and manipulate, helps us make sense of information and thus make it useful in our lives. The information-visualization tools in use today are restricted to 2D screens, but recent developments in AR/VR give us the ability to visualize information and navigate the web in 3D space. In this report, we design and develop new techniques for interacting with information in AR/VR.

1. Representing data in VR

We reduce higher-dimensional data with a force-field implementation in which the force between nodes is scaled to their affinity. Multidimensional scaling (MDS) is a visual representation of the dissimilarities between sets of objects, with the dissimilarities quantitatively represented as distances. “Objects” can be faces, colours, map coordinates, political persuasions, or any other kind of real or conceptual stimuli.
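
As a rough illustration, the affinity-to-force mapping could look like the sketch below using d3-force-3d. The `affinity` matrix, rest-length scale, and charge strength are illustrative values, not the project's tuned parameters:

```js
// Sketch: turn an N x N affinity matrix (values in [0, 1], higher = more
// similar) into a 3D force layout whose settled distances approximate the
// dissimilarities, in the spirit of MDS.
import { forceSimulation, forceLink, forceManyBody } from 'd3-force-3d';

function layoutFromAffinity(affinity) {
  const nodes = affinity.map((_, i) => ({ id: i }));
  const links = [];
  for (let i = 0; i < affinity.length; i++) {
    for (let j = i + 1; j < affinity.length; j++) {
      links.push({ source: i, target: j, affinity: affinity[i][j] });
    }
  }
  // High-affinity pairs are pulled close: the link rest length shrinks
  // and the spring strength grows with affinity.
  return forceSimulation(nodes, 3)                 // 3 = number of dimensions
    .force('link', forceLink(links)
      .distance(l => 10 * (1 - l.affinity))        // illustrative scale
      .strength(l => l.affinity))
    .force('charge', forceManyBody().strength(-1)); // mild global repulsion
}
```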


2. Radial repulsion force

We start with a dataset and its corresponding affinity matrix, assign random initial positions to the points in 3D around the user, and use a physics simulation to let the points settle into a local minimum according to their affinity. With a plethora of data points around the user to look at, there are several ways this visualization can be improved. First, from the user’s viewpoint, points that lie close together along the line of sight can intersect and occlude one another; to solve this, we introduce new forces into the system, namely radial repulsion forces between points that are radially close enough. Secondly, we designate points of higher importance as landmarks and reduce the opacity of points that are far away and not landmarks. A minimal sketch of such a radial repulsion force follows.
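
The sketch below follows d3-force-3d's custom-force protocol (a function of `alpha` with an `initialize(nodes)` hook). The viewer-at-origin assumption, angular threshold, and strength constants are illustrative, and the naive pairwise loop stands in for whatever spatial indexing a production version would use:

```js
// Sketch: repel pairs of nodes whose sight lines from the viewer (assumed
// at the origin) are within `minAngle` radians of each other, so nearly
// collinear points stop occluding one another.
function forceRadialRepulsion(minAngle = 0.1, strength = 0.5) {
  let nodes;
  const force = (alpha) => {
    for (let i = 0; i < nodes.length; i++) {
      for (let j = i + 1; j < nodes.length; j++) {
        const a = nodes[i], b = nodes[j];
        const ra = Math.hypot(a.x, a.y, a.z);
        const rb = Math.hypot(b.x, b.y, b.z);
        // Angle between the two sight lines from the viewer.
        const dot = (a.x * b.x + a.y * b.y + a.z * b.z) / (ra * rb);
        const angle = Math.acos(Math.min(1, Math.max(-1, dot)));
        if (angle < minAngle) {
          // Push the pair apart roughly tangentially to the line of sight,
          // scaled by how deep they are inside the angular threshold.
          const k = ((minAngle - angle) / minAngle) * strength * alpha;
          const dx = a.x / ra - b.x / rb;
          const dy = a.y / ra - b.y / rb;
          const dz = a.z / ra - b.z / rb;
          const len = Math.hypot(dx, dy, dz) || 1e-6;
          a.vx += (dx / len) * k; a.vy += (dy / len) * k; a.vz += (dz / len) * k;
          b.vx -= (dx / len) * k; b.vy -= (dy / len) * k; b.vz -= (dz / len) * k;
        }
      }
    }
  };
  force.initialize = (n) => { nodes = n; };
  return force;
}

// Usage: simulation.force('radialRepulsion', forceRadialRepulsion());
```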


Finally, we verify the visibility of the full-opacity nodes and tune parameters such as the spring stiffness of the radial-repulsion forces and the size of the nodes to remove minor occlusions. This new force was introduced to avoid object occlusions in VR: users should be able to see all the data points clearly in space.
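
One way to approximate that visibility check is to test each viewer-to-node sight line against every other node's bounding sphere, as in the sketch below. The viewer at the origin and the uniform node `radius` are assumptions for illustration:

```js
// Sketch: a node counts as visible if no other node's sphere intersects
// the segment from the viewer (origin) to that node. Returns the fraction
// of unoccluded nodes.
function visibleFraction(nodes, radius = 0.2) {
  let visible = 0;
  for (const target of nodes) {
    const d = Math.hypot(target.x, target.y, target.z);
    const occluded = nodes.some((other) => {
      if (other === target) return false;
      // Project `other` onto the sight line toward `target`.
      const t = (other.x * target.x + other.y * target.y + other.z * target.z) / d;
      if (t <= 0 || t >= d) return false; // behind the viewer or beyond target
      const px = (t / d) * target.x;
      const py = (t / d) * target.y;
      const pz = (t / d) * target.z;
      return Math.hypot(other.x - px, other.y - py, other.z - pz) < radius;
    });
    if (!occluded) visible++;
  }
  return visible / nodes.length;
}
```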

Implementation

d3-force-3d and WebXR

We built a tool in which any dataset of nodes and affinities can be visualized in VR with maximum visibility. Because it is built on WebXR, it is browser-based and supports multiple headsets.
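
A minimal sketch of the WebXR side is shown below, assuming the `simulation` from the layout sketch above and a hypothetical `render(pose, nodes, gl)` callback for the actual drawing. WebXR requires that the session be requested from a user gesture, such as clicking a "Live demo" button:

```js
// Sketch: enter an immersive VR session and tick the d3-force-3d
// simulation once per XR frame. Drawing the nodes per view is delegated
// to the caller's render function.
async function enterVR(simulation, render) {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported('immersive-vr'))) {
    throw new Error('immersive-vr is not supported on this browser/headset');
  }
  const session = await navigator.xr.requestSession('immersive-vr');
  const gl = document.createElement('canvas')
    .getContext('webgl', { xrCompatible: true });
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace('local');
  session.requestAnimationFrame(function onFrame(time, frame) {
    session.requestAnimationFrame(onFrame);       // keep the loop going
    simulation.tick();                            // advance the physics one step
    const pose = frame.getViewerPose(refSpace);
    if (pose) render(pose, simulation.nodes(), gl); // draw for each XR view
  });
}
```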


Results

97%

Visibility improvement with radial forces

2.1s

Avg load time

0.1s

Latency
