Recent research has shown that anonymising biometric data generated by VR headsets is almost impossible. Using 95 data points, researchers were able to identify individuals with 90% accuracy [UPDATE Nov. 2 2020: it’s currently 95%, without any special tasks]. Future results will only become more accurate while using fewer data points. At the same time, XR technologies are rapidly gaining ground in academia. Findings like these raise the question: how do we responsibly implement and use XR in the academic world? Is responsible use of XR in academia even possible?
It’s not just privacy implications that need to be considered: introducing a new technology into the classroom or research lab disrupts the status quo. In a previous article, I outlined the current state, advantages and disadvantages of XR in academia. Here, I want to discuss the responsible use of XR in the academic field, and how we might steer its development in the right direction.
With new technologies, the earlier you get on board, the more influence you exert over their development. The internet is a prime example: academia caught on quickly and had a large influence over its development and implementation, resulting in robust networks and tools that serve the needs of its institutions (eduroam being a prime example). However, we’re not always successful in harnessing the potential of new technologies. The smartphone could be considered such a failure: having been unable, or unwilling, to take a proactive approach to its implementation in the classroom, we now perceive it as a nuisance.
Setting the agenda
Getting on board early is important; otherwise, other parties will set the agenda for us and dictate where the technology is headed. These parties have different goals than professionals in the academic field. Tech companies, for instance, have revenue models and thus strong incentives to push new technologies into the world as quickly as possible. This might be good for innovation, but it may forgo thorough consideration of ethical, societal, and academic implications. Academic institutions should not stand on the sidelines, but actively push the use of new technologies in the direction they want them to go.
With VR gearing up to go mainstream and AR ready to leave its infancy, the academic world needs to make its voice heard now, and play a crucial, shaping role in this early stage of XR. XR’s development has been primarily tech-driven, which is common with these kinds of innovations. Because of this, there are many questions that remain unanswered, or which haven’t even been considered thus far. Examples include:
- Who has access to XR technology and its experiences?
- Who owns participant-generated data?
- Does the technology respect the participant’s privacy? Which data is collected, why, and by whom? Where is it stored, and for how long?
- Are experiences and technologies inclusive?
- How can we perform research with XR in an ethical way? How do we inform and prepare our research participants?
- How intense should we make an experience? Is it OK to emotionally affect your students?
These questions deserve our attention and scrutiny. All too often, critical evaluation, ethics, laws, regulations, and conventions lag behind the pace of innovation.
First steps: how to use XR responsibly
At XR ERA, we recognise there is an urgent need to develop knowledge, tools and policies so we can implement and use XR responsibly. In our view, ‘responsible use of XR in academia’ entails experiences which are high-quality, engaging, safe, and which adhere to academic values and standards. To meet these requirements, various ethical and practical considerations need to be taken into account. The Rathenau Institute has done some preliminary work on this already, which led us to identify these key areas:
Meeting quality and design standards
An XR experience should meet context-specific quality and design standards to ensure that it enhances, rather than diminishes, existing teaching and research methods. XR experiences of an academic standard take learning design, practical use, engagement and media design into account.
Privacy and data protection
This is often the first topic that comes up when we discuss the responsible use of XR in academia. New technologies come with new data and interactions. Headsets, for example, can track all kinds of user data, such as gaze, eye, hand and head movements. Researchers, teachers, and industry often find these data useful. But how much information should be gathered? Should participants give informed consent? And what about protection: how far can researchers go in tracking all kinds of biometric data from their participants?
Examples of gaze and eye tracking. Sources: Head Tracking in a Virtual Classroom to assess ADHD and Eye Tracking Benchmarks for an Oculus Rift IR Camera
Respecting user autonomy
Autonomy is about the power balance between user and creator. Users should feel comfortable and at least somewhat in control of their experience, instead of being at the mercy of whatever the creators throw at them. They should be able to make an informed decision on whether or not to expose themselves to a certain experience.
Preventing ecosystem lock-in
Deciding to use certain hardware and software often makes users, developers and institutions dependent on a specific ecosystem. Currently, it’s hard not to depend on a single provider. As a bloc, academia should strive to adopt open technologies and standards, such as WebXR and OpenXR. These enable cross-platform support, open development and shared ownership, making transparency and flexibility core principles.
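As a rough illustration of what cross-platform support buys in practice: a WebXR-based experience can detect headset support at runtime and fall back to a flat 2D view on ordinary devices, instead of being locked to one vendor’s hardware. A minimal sketch follows; the `chooseMode` helper and the "flat" fallback label are hypothetical, while `isSessionSupported("immersive-vr")` is the standard WebXR Device API call exposed via `navigator.xr` in the browser.

```javascript
// Sketch: graceful feature detection with the WebXR Device API.
// The XRSystem object (navigator.xr in a browser) is injected as a
// parameter, so the fallback logic runs on any platform.
async function chooseMode(xrSystem) {
  if (!xrSystem) {
    return "flat"; // no WebXR support at all: serve a 2D fallback
  }
  // isSessionSupported() is part of the WebXR spec; it resolves to a boolean
  const vrSupported = await xrSystem.isSessionSupported("immersive-vr");
  return vrSupported ? "immersive-vr" : "flat";
}

// In a browser you would call: chooseMode(navigator.xr)
```

Because the check happens at runtime against a standard API rather than a vendor SDK, the same experience can serve headset owners and laptop users alike, which is exactly the kind of flexibility lock-in takes away.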
Tackling legal issues
XR presents some challenging issues from a legal perspective. The distinct nature of this technology may create legal uncertainty for both developers and users of XR, for example around IP, data protection, tort liability, or criminal behaviour such as cyberbullying and new forms of cybercrime.
Maintaining physical and mental well-being
XR can expose people to uncomfortable, weird, or emotional experiences. This should be kept in mind while designing and developing XR experiences. It’s important to evaluate what research participants, students, and other users can handle: potential harms range from nausea and motion sickness to PTSD. We should tread lightly here, as we currently don’t know what the (long-term) effects of XR experiences on human beings are.
Fostering social wellbeing
XR should provide an environment in which users feel safe and are motivated to have beneficial social interactions with others, both inside and outside an XR experience. We should provide a variety of ways to enable interaction between teachers and students, researchers and participants, or anyone who wishes to integrate an XR experience in their work. Additionally, the leading party should have tools to monitor and ensure the social wellbeing of the users.
Ensuring inclusivity and accessibility
Responsible use of XR means we need to foster an inclusive academic environment. Therefore, we need to raise awareness about the accessibility of XR experiences for a wide range of different users. Accessibility is determined by various aspects, which can be physical (such as disabilities) as well as economical (e.g. making experiences available to a wide range of disciplines, not just those deemed economically interesting).
Continuous ethical evaluation
XR will bring to light new ethical questions which we cannot even conceive of yet. It is essential that we continuously question the technology, its effects, its development, and the way people interact with it and with each other through the technology.
As a starting point, we aim to provide a preliminary set of guidelines, tools, and best practices. These will be structured along three categories: responsible facilitation & implementation, responsible use in research, and responsible use in education.
These guidelines on responsible use of XR in academia are just a first draft, and are missing something essential: your input. Achieving the goals outlined in this article will be a monumental task that we must undertake together, in the wider academic community. Nothing is set in stone; everything is still up for debate.
We invite you, the professional in the field, the researcher, the student, the manager, or whoever you may be, to join us and help move the discussion forward. Only through close collaboration with academic partners from all over the world can we guide the use of XR in the right direction.
Industry review boards are needed to protect VR user privacy – Jessica Outlaw & Susan Persky, World Economic Forum
Responsible VR. Protect consumers in virtual reality – Dhoya Snijders, Sophie Horsman, Linda Kool, Rinie van Es. Rathenau Institute