On the 3rd of July, the XR ERA community came together to learn about and discuss the EU-funded project XR4Human, in particular, its mission to “co-create a living guidance on ethical and related policy, regulatory governance, and interoperability issues of XR technologies whilst building public trust and acceptance, and a strong and competitive European XR ecosystem”.
ABOUT THE SPEAKERS
We were thrilled to be joined by Marco Correa Pérez and Melissa Amorós Lark from LLInC (Leiden Learning & Innovation Centre), both of whom have been highly involved in XR4Human from its inception in November 2022.
An intellectual property, data protection, and digital technologies lawyer with an Advanced LLM from Leiden University, Marco acts as Legal Counsel at both LLInC and XR4Human.
Melissa, Project Manager at both LLInC and XR4Human, has a particular interest not only in innovation, but also in the development of policies and guidelines for the responsible use of data.
WHAT DOES XR4Human STAND FOR, AND FOR WHOM?
XR4Human is a three-year EU-funded project which unites XR technology experts, aiming ultimately to establish standards for XR’s development and use in the European context. In doing so, it seeks to take into account not only regulators, but also industry, academia, and consumers.
Although XR4Human remains highly aware of barriers which potentially threaten its desired outcomes, as Melissa clarifies, several efforts are being made to mitigate their impact. For example:
(1) Potential Barrier: “Lack of adherence to the European Code of Conduct by developers and producers”
Mitigation strategy: “Involving developers and producers in CoC creation process”
(2) Potential Barrier: “Fragmentation of the European XR community”
Mitigation strategy: “XR4Human’s engagement processes led by XR4Europe to leverage its established network of European XR players and stakeholders”
It is hoped that such strategies will permit the project to reach its four principal desired outcomes:
- “Widespread acceptance and adoption of the European cross-industry Code of Conduct (CoC) by developers and producers”
- “Ensuring XR technologies adhere to high standards of ethics, privacy, security, and safety”
- “Strengthening links and promoting collaboration among the XR constituency, including EU-funded projects”
- “Providing improved quality of XR experiences and applications for end users”
STRUCTURE AND STAGES OF XR4Human
In terms of its structure, XR4Human has been organised around nine fundamental stages referred to as work packages. Each has its own objectives, yet they are often interdependent, building upon one another’s outcomes and led by the consortium organisations assigned to them.
Since work packages 3 and 5 were those with which Melissa and Marco were most closely associated, these received the closest attention during the meetup.
- WP3: “Mapping the Regulatory and Governance Issues of XR”
- WP5: “To Co-create a European Code of Conduct for Equitable, Inclusive, and Human-centred XR Technologies for Developers and Producers”
Although XR4Human has not yet reached its first reporting stage (designated April 2024), its initial policy report has recently been submitted, detailing what it finds to be the greatest challenges of XR technology’s use.
WHAT, IN PARTICULAR, IS WORK PACKAGE 3?
Main goal: “Explore: Map, examine, evaluate, and provide guidance on the related regulatory and governance issues arising in XR technologies”.
To achieve this principal aim, this work package seeks to build an overview of XR’s applications across several industries, addressing the challenges and risks related to the technology’s use, whether mental, physical, privacy-related, or other.
Concerning gaming and entertainment, several mental and physical risks were identified, including manipulation, physical harm, and the adverse effects of prolonged exposure. Significantly, within these industries, vast quantities of users’ biometric data continue to be shared not only with the headset’s manufacturer but, potentially, also beyond. This data can be so detailed and comprehensive that it can enable the production of deepfakes, raising, once again, the aforementioned threats of manipulation and deception.
Gaming and entertainment are, however, far from the only industries addressed by XR4Human in WP3. Within education, training, and the broader workplace, for example, there arise further issues regarding consent and how this might be legitimately sought in light of the hierarchies and pressures of teacher/student and manager/employee relations.
In healthcare too, what is the procedure concerning XR’s use when the patient is unable to express their consent? All these questions arise in addition to those foundational to XR’s very usage: what can be done to address the problematically high price of headsets and make them more broadly affordable across the EU? What are the consequences of the accessibility gaps this price discrepancy currently creates?
Q. One participant noted that, throughout the discussion, sustained focus had been placed upon the end user and enquired as to how XR4Human perceives this end user in terms of levels of expertise. Moreover, how does the project incorporate non-expertise into its proposed regulation and Code of Conduct?
A. Melissa Amorós Lark: “Some of us are developing tools and applications that are currently being used in university classrooms, so I think it’s very important to make sure that we ask students whether they see value in using these applications, whether they’re learning in a new and different way, and whether the costly investment is actually making them learn more than they would in a traditional classroom environment. So those of us who are developing these applications have a responsibility to gather that user input and their lived experience. We can say that, maybe they are not an “expert” in the traditional sense, but if someone is using an application and they get dizzy, they’re an expert in finding that out and sharing their experience”.
Q. Another participant questioned the potential impact of overregulation, particularly within an educational context. Is there a genuine risk that, under such circumstances, it would become considerably more challenging or even impossible for XR to be used by schools and universities?
A. Marco Correa Pérez: “It’s a challenge even when having an EU framework, since Member States may have some leeway to decide how to implement it. For example, the age of consent for data processing under the GDPR varies from country to country. Here in the Netherlands it’s 16 years old but, in other countries, this varies from 13 to 16. Below this threshold, you can’t process even sensitive data, especially biometrics, without parental consent. So, the use of XR devices may be prohibited unless there are technical and organisational measures which prevent personal data from being collected without consent. Alternatively, the data collected must not be personalised. One solution could be, for example, using one device for many students with a shared account, so you cannot link data to a specific student”.
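Marco’s point about the varying digital age of consent can be sketched as a simple lookup. The sketch below is illustrative, not legal advice: the country thresholds are example values based on publicly reported national implementations of GDPR Article 8 (default 16, with Member States able to lower it to 13), and should be verified against current national law.

```python
# Illustrative sketch of the GDPR Article 8 age-of-consent variation Marco
# describes: the default digital age of consent is 16, but Member States may
# lower it to any age between 13 and 16. Example thresholds only -- verify
# against current national law before relying on them.
DIGITAL_AGE_OF_CONSENT = {
    "NL": 16,  # Netherlands
    "DE": 16,  # Germany
    "FR": 15,  # France
    "BE": 13,  # Belgium
}
GDPR_DEFAULT = 16  # fallback where a Member State has not set its own threshold


def requires_parental_consent(country_code: str, user_age: int) -> bool:
    """Return True if processing this user's personal data requires parental consent."""
    threshold = DIGITAL_AGE_OF_CONSENT.get(country_code, GDPR_DEFAULT)
    return user_age < threshold
```

Under these example thresholds, a 15-year-old student in the Netherlands would need parental consent before an XR headset could process their personal data, whereas the same student in Belgium would not, which is exactly the fragmentation challenge Marco highlights for schools operating under a single EU framework.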
Thank you to all those who joined for such an insightful and enjoyable presentation and discussion!
FOR MEMBERS: To gain access to both the video recording and slides from this meetup, simply email email@example.com and we will happily provide the details.