Friday, 22 May, 2020 UTC


Summary

As our dependence on technology continues to deepen, data privacy has become an afterthought for many. This is partly due to the lopsided relationship between the companies providing the technology and the people using it. When it comes to privacy policies there are usually only two options: you agree and are granted access to the service, or you disagree and you are not.
This is especially true of free services like Google and Facebook, which rely on user-generated data to attract advertisers to their platforms. With data now among the most valuable commodities in the world, invasive collection techniques are rampant as tech companies look to gather as much information as possible about their users.
Facebook data servers 70 miles south of the Arctic Circle
With the advent of social media and users' apparently unwavering willingness to share ever more personal data, it has become routine for services like Google and Facebook to collect and store information including geographical location, search and purchase history, social ties, photos, likes, and even public and private messages.
In 2018 the world witnessed a large-scale data privacy scandal when it emerged that Cambridge Analytica had acquired Facebook data and used it to build psychological profiles of over 87 million users. Back in 2016, Alexander Nix, then CEO of Cambridge Analytica, had spoken openly about creating psychographic profiles from this data to sort users into groups based on traits such as extroversion, agreeableness and neuroticism.
Former Cambridge Analytica CEO Alexander Nix delivering a speech on psychographics in 2016
The aim was to use these categories to micro-target users with tailor-made advertisements across online media platforms and to sway on-the-fence voters towards a particular political candidate or stance. Cambridge Analytica was found to have operated in sixty-eight countries, on work relating to both elections and referendums.
As VR advances as a new medium, privacy concerns have barely been raised. This is worrying given that VR requires a whole host of new data types simply to function. Eye gaze, hand gestures, facial expressions and even your real-life surroundings are all intimate, personal data needed to calibrate the play space and keep the headset working at all.
A heat map showing parts of a scene viewed most often by a user
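A heat map like the one above is trivial to derive from raw gaze samples. The short Python sketch below is purely illustrative (the grid size and gaze coordinates are invented, and no real headset API is used): it simply bins normalised gaze positions into a coarse grid, which is essentially all it takes to reveal where a user's attention dwells.

```python
# Minimal sketch: accumulate normalised gaze samples (0..1) into a coarse grid.
GRID = 8  # 8x8 cells over the field of view
heatmap = [[0 for _ in range(GRID)] for _ in range(GRID)]

# Hypothetical gaze samples (x, y); a real headset would emit these every frame.
samples = [(0.52, 0.48), (0.50, 0.47), (0.51, 0.49), (0.12, 0.80)]

for x, y in samples:
    col = min(int(x * GRID), GRID - 1)
    row = min(int(y * GRID), GRID - 1)
    heatmap[row][col] += 1

# The cell with the highest count is where the user's attention dwelled longest.
peak = max((heatmap[r][c], r, c) for r in range(GRID) for c in range(GRID))
print("most-viewed cell (count, row, col):", peak)
```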
Mozilla’s Privacy and Security lead for mixed reality projects, Diane Hosfelt, aptly describes the difference between processing and collecting data: “Processing information about your physical movements is required for basic functionality of most mixed reality experiences…you can’t do much without processing certain data. Collecting data implies that it is stored remotely for a time period beyond what’s necessary for simply processing it.”
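Hosfelt's distinction can be made concrete with a small sketch. The Python example below is illustrative only, with hypothetical function and field names rather than any real headset SDK: processing uses a gaze sample immediately (here, to pick a foveated-rendering region for the current frame) and then discards it, while collecting appends the same sample to a log that outlives the frame.

```python
import time

def foveation_region(gaze_x, gaze_y, radius=0.15):
    """Processing: use the gaze sample for this frame's rendering, keep nothing."""
    return {
        "x_min": max(0.0, gaze_x - radius), "x_max": min(1.0, gaze_x + radius),
        "y_min": max(0.0, gaze_y - radius), "y_max": min(1.0, gaze_y + radius),
    }

gaze_log = []  # Collecting: the same samples, retained beyond the frame that needed them.

def handle_gaze_sample(gaze_x, gaze_y, collect=False):
    region = foveation_region(gaze_x, gaze_y)  # needed only for the current frame
    if collect:
        # Stored for later use -- this retention is what privacy rules are concerned with.
        gaze_log.append({"t": time.time(), "x": gaze_x, "y": gaze_y})
    return region

# Hypothetical frame loop with made-up gaze coordinates.
for x, y in [(0.42, 0.55), (0.44, 0.53), (0.71, 0.30)]:
    handle_gaze_sample(x, y, collect=True)

print(len(gaze_log), "gaze samples retained after rendering finished")
```

Nothing about the experience changes for the user in either case; the difference lies entirely in whether the data persists once the frame has been rendered.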

It is this unnecessary storing of data after processing that threatens users' privacy and control over their data. If data is not needed for immediate processing, then ultimately it is being stored and accumulated for purposes unrelated to the experience. In a disturbing hypothetical scenario featured in an article on the World Economic Forum website, a user plays a VR game in which data about his body movements is processed and collected without his knowledge.
A month after playing the game, the user is turned down for a new life-insurance policy. His data has been sold to an insurance company which, having examined the tracking data from the game, concludes that it reveals behavioural movement patterns often seen in people in the very early stages of dementia.
Hand gesture tracking in VR
This scenario, although hypothetical, raises serious questions about how VR-related data will be handled in the future. Given the ways data has been collected and used in the past on seemingly harmless platforms like Facebook, awareness and appropriate regulation are needed to protect consumers before we end up in a situation like the one above, or face another Cambridge Analytica-style fiasco.
Created as an essential element of EU privacy and human rights law, the GDPR (General Data Protection Regulation) was enacted to protect the personal data and privacy of EU citizens. Much of the data involved in the use of VR is considered biometric data under the regulation because of its potential to uniquely identify an individual.
The GDPR does cover some aspects of data privacy for VR users, but these will need clarification and extension to be considered adequate. Minimum security standards have yet to be established for VR specifically, and providers should be required to protect users through mechanisms beyond simple consent, which is often clicked through without being read and is not fully understood by minors.
Oculus, owned by Facebook, is alongside HTC at the forefront of the race to dominate the VR headset market. Given Facebook's track record with user data, this is a worrying prospect. Oculus's current privacy policy states that it processes and collects “information about your environment, physical movements and dimensions when you use an XR (mixed reality) device.”
This covers your movements in the space, including gait, head movement and hand gestures, as well as information about your physical surroundings captured while calibrating your play space.
Passthrough camera on the Oculus Quest
As VR grows in popularity, suitable data protection for all users will remain one of the main challenges ahead. The development and distribution of VR technologies will require a considered and proactive approach to data privacy. Without adequate regulation, however, companies will no doubt continue to collect and store user data for profit.
As individuals there are things we can do to influence the direction of travel: a willingness to pay for headsets and applications that value the protection of user data is important, as is a healthy scepticism towards, and scrutiny of, so-called free applications. As Aldous Huxley said in a 1958 interview: “We mustn't be caught by surprise by our own advancing technology, this has happened again and again in history.”
