Social Grammars of Virtuality, no. 2

Advances in XR Technologies

Jeffrey Vadala

Introduction

This section provides an analysis of the state of extended reality (XR) technology in research. It is based on a comprehensive review of academic literature, drawing from 266 peer-reviewed articles published in 2023 across key journals dedicated to XR research. The journals examined throughout this report, including in this section, are Virtual Reality (152 articles), Frontiers in Virtual Reality (91 articles), Presence: Virtual and Augmented Reality (12 articles), and Virtual Creativity (11 articles). Focusing on XR research usage trends for 2023, this section provides an inductive thematic analysis of research abstracts to identify key topics and trends, together with a detailed compilation of the XR hardware and software tools most frequently mentioned in the literature. The goal is to provide an objective, data-driven overview of the XR landscape that can inform future research and development efforts. Thus, this section serves as a resource for researchers, developers, and stakeholders looking to understand and advance the field of XR technologies.

Methods

Thematic Analysis: To identify major themes in VR research literature, we employed an innovative approach using advanced natural language processing. Abstracts from relevant papers were compiled into a single text file and analyzed using Google Gemini 1.5, a sophisticated model with a one-million-token context window. This method leverages Gemini's ability to identify patterns and extract insights from large unstructured datasets, mirroring recent studies that have successfully used Large Language Models (LLMs) to identify themes and sentiment in research literature (Miah et al., 2024).

Recent research has demonstrated that LLM-based sentiment analysis approaches, which share methods and tools with thematic analysis, can be as robust as, and sometimes more effective than, conventional natural language tools like RoBERTa (Krugmann & Hartmann, 2024). This has led researchers to explore various ways of integrating LLMs into thematic analysis toolsets. One promising approach is the LLM-in-the-loop model, in which human coders collaborate with an LLM to establish coding parameters (Dai et al., 2023). In the analysis that follows, Gemini 1.5 was used to inductively identify key themes, determine their relative prevalence as a percentage of the total literature, and provide citations and examples for each theme. This method allows for a comprehensive and nuanced understanding of the current state of VR research literature.
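As a rough illustration, the single-file, long-context step described above might be assembled as in the sketch below. The function name and prompt wording are illustrative assumptions, not the exact procedure used in this study; the model call itself is omitted.

```python
def build_theme_prompt(abstracts: list[str]) -> str:
    """Compile abstracts into one long-context prompt asking the model to
    inductively identify themes, estimate their prevalence, and cite examples."""
    corpus = "\n\n---\n\n".join(abstracts)
    return (
        "From the research abstracts below, inductively identify the major "
        "themes, estimate each theme's prevalence as a percentage of the "
        "total literature, and provide example citations for each theme.\n\n"
        + corpus
    )

# The assembled prompt would then be sent to Gemini 1.5 in a single call,
# relying on its long context window; the API call itself is omitted here.
prompt = build_theme_prompt(["Abstract one ...", "Abstract two ..."])
```

Assembling the entire corpus into one prompt is what distinguishes this long-context approach from the chunked pipeline used for the hardware and software analysis.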

Hardware and Software Analysis: To compile a comprehensive list of the XR hardware and software tools most frequently mentioned in the literature, a custom Python script utilizing the OpenAI Application Programming Interface (API) was developed. This script first divided each article into 500-word “chunks” that could be processed by the API. The API was then prompted with the role of "identifying and reporting on XR hardware and software" for each text chunk. The resulting output was compiled into a structured database and summarized with OpenAI API calls to produce the final list of key VR hardware and software tools, along with frequency metrics. This automated approach allowed a much larger volume of literature to be analyzed than manual methods would permit. It can be adapted for general research purposes and is available at https://github.com/drquandary/Docparse.
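The chunking step can be sketched as follows. This is a minimal illustration, not the exact code from the repository linked above; the function name is an assumption.

```python
def chunk_text(text: str, chunk_words: int = 500) -> list[str]:
    """Split an article into chunks of at most `chunk_words` words so each
    chunk fits comfortably in a single API request."""
    words = text.split()
    return [" ".join(words[i:i + chunk_words])
            for i in range(0, len(words), chunk_words)]

# Each chunk would then be sent to the OpenAI API under a role such as
# "identifying and reporting on XR hardware and software" (call omitted here).
chunks = chunk_text(" ".join(f"word{i}" for i in range(1200)))
print(len(chunks))  # 1200 words split into 3 chunks
```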

Thematic Analysis Results

Virtual and augmented reality (VR/AR) technologies[1] have seen rapid advancements and widespread adoption in recent years across a diverse range of domains. This report presents a comprehensive analysis of the major themes and applications of VR/AR based on a systematic review of the current academic literature. The themes identified through this inductive analysis include: VR/AR applications in healthcare and therapy (44%), VR/AR for education and training (24%), human factors and user experience in VR/AR (18%), VR/AR for social interaction and communication (7%), and VR/AR for design and visualization (7%).

Figure: Distribution of Type of VR/AR Application in Literature (pie chart).

Within healthcare and therapy, key application areas are rehabilitation, pain management, mental health treatment, and medical training. Education and training applications span skill training, general education and learning, and realistic simulations. Research on human factors delves into presence and immersion, cybersickness, and usability evaluation. Social interaction studies explore topics like social VR and avatar embodiment. Finally, design and visualization applications include urban planning and product design. The following sections will delve into each theme in more detail, discussing key findings, trends, and implications.

1. VR/AR Applications in Healthcare and Therapy (44%):

This dominant theme encompasses a wide range of applications aimed at improving physical and mental well-being.

  • Rehabilitation (18%): Studies like "A haptic-feedback virtual reality system to improve the Box and Block Test (BBT) for upper extremity motor function assessment" (Dong et al., 2023), "Immersive virtual reality for upper limb rehabilitation: comparing hand and controller interaction" (Juan et al., 2023), and "A virtual reality bus ride as an ecologically valid assessment of balance: a feasibility study" (Gonçalves et al., 2023) explore VR/AR for motor rehabilitation, balance training, and cognitive rehabilitation.
  • Pain Management (12%): Research like "Designing effective virtual reality environments for pain management in burn-injured patients" (Phelan et al., 2023) and "When virtual reality supports patients’ emotional management in chemotherapy" (Buche et al., 2023) investigates the use of VR/AR to alleviate pain in burn patients and during chemotherapy.
  • Mental Health (9%): Studies such as "Gamified virtual reality exposure therapy for adolescents with public speaking anxiety: a four-armed randomized controlled trial" (Kahlon et al., 2023) and "Co-design of avatars to embody auditory hallucinations of patients with schizophrenia" (García et al., 2023) utilize VR/AR for treating anxiety disorders, phobias, and schizophrenia.
  • Medical Training (5%): Research like "Toward the validation of VR-HMDs for medical education: a systematic literature review" (Pedram et al., 2023) and "Utilization of virtual reality for operating room fire safety training: a randomized trial" (Katz et al., 2023) delves into the application of VR/AR for training medical professionals in surgical procedures and emergency response.

2. VR/AR for Education and Training (24%):

This theme explores the potential of VR/AR to enhance learning experiences and skill development.

  • Skill Training (8%): Studies like "Training mental imagery skills of elite athletes in virtual reality" (Wu et al., 2023) and "Investigating the effectiveness of immersive VR skill training and its link to physiological arousal" (Radhakrishnan et al., 2023) investigate VR/AR for training complex motor skills, including sports movements and fine motor skills.
  • Education & Learning (8%): Research such as "A phenomenological approach to virtual reality in psychiatry education" (Pedersen & Musaeus, 2023), "Embodied mixed reality with passive haptics in STEM education: randomized control study with chemistry titration" (Johnson-Glenberg et al., 2023), and "Incorporating AR/VR-assisted learning into informal science institutions: A systematic review" (Chen et al., 2023) delves into the use of VR/AR in various educational settings for subjects like physics, chemistry, and language learning.
  • Simulation & Training (8%): Studies like "Exploring the role of virtual reality in military decision training" (Harris et al., 2023) and "Immersive virtual reality and passive haptic interfaces to improve procedural learning in a formal training course for first responders" (Calandra et al., 2023) use VR/AR to create realistic simulations for training purposes in areas like military decision-making and fire safety.

3. Human Factors and User Experience in VR/AR (18%):

This theme focuses on understanding the human response to VR/AR technologies and optimizing the user experience.

  • Presence & Immersion (7%): Studies such as "A qualitative case study on deconstructing presence for young adults and older adults" (Pouke et al., 2022) and "Using interpretative phenomenological analysis to gain a qualitative understanding of presence in virtual reality" (Kelly, 2023) investigate the factors that influence the feeling of presence and immersion in VR/AR environments.
  • Cybersickness (6%): Research like "Cybersickness as the virtual reality sickness questionnaire (VRSQ) measures it!?–an environment-specific revision of the VRSQ" (Josupeit, 2023) and "Predicting VR cybersickness and its impact on visuomotor performance using head rotations and field (in)dependence" (Maneuvrier et al., 2023) explores the causes and mitigation strategies for cybersickness.
  • Usability & User Evaluation (5%): Studies like "Development of a customizable interactions questionnaire (CIQ) for evaluating interactions with objects in augmented/virtual reality" (Gao & Boehm-Davis, 2023) and "Effects of virtual reality and test environment on user experience, usability, and mental workload in the evaluation of a blood pressure monitor" (Hinricher et al., 2023) evaluate the usability and user experience of VR/AR applications and develop new evaluation methods.

4. VR/AR for Social Interaction and Communication (7%):

This theme examines the potential of VR/AR to facilitate social interaction and communication.

  • Social VR (4%): Research such as "Understanding the effect of a virtual moderator on people’s perception in remote discussion using social VR" (Yang et al., 2023) and "The sentiment of a virtual rock concert" (Slater et al., 2023) explores the use of VR for social interaction and communication.
  • Avatar Embodiment (3%): Studies like "Immersive role-playing with avatars leads to adoption of others’ personalities" (Sakuma et al., 2023) and "Evaluating face gender cues in virtual humans within and beyond the gender binary" (Ghosh et al., 2023) explore the impact of avatar embodiment on social interaction and self-perception.

5. VR/AR for Design and Visualization (7%):

This theme explores the use of VR/AR in design and visualization tasks.

  • Urban Planning & Design (4%): Research like "Public participation in urban design with augmented reality technology based on indicator evaluation" (Wang & Lin, 2023) and "Augmented reality as a participation tool for youth in urban planning processes: Case study in Oslo, Norway" (Reaver, 2023) investigates the use of VR/AR for visualizing and evaluating urban planning concepts.
  • Product Design & Visualization (3%): Studies such as "Digital fabrics for online shopping and fashion design" (Haghzare et al., 2023) utilize VR/AR for product design and visualization.

Future Research Themes

Although the themes here represent a broad array of topics, XR is a highly flexible set of technologies that can be utilized in a variety of research paradigms. There is ample potential for future research to establish new thematic categories and to influence other fields. For example, the following topics can be found in other disciplinary journals but have yet to be substantially integrated into the research efforts of the core four journals examined in this section.

Research into the Long-term Impact and Efficacy of VR/AR applications will help us understand their lasting effects in areas like therapy and education. Integrating VR/AR with other technologies such as artificial intelligence and the Internet of Things could lead to smarter, more responsive learning environments and therapy sessions. Improving Accessibility and Inclusivity is essential to ensure that everyone, including those with disabilities, can benefit from these tools by developing adaptable interfaces. Cross-Cultural and Ethical Considerations are critical as VR/AR technologies reach worldwide audiences, ensuring they respect privacy, security, and diverse cultural norms. Understanding the Economic Impact and Scalability of VR/AR can highlight their cost-effectiveness and the challenges of broad implementation, especially in resource-scarce settings. Integrating VR/AR with Traditional Fields like archaeology and environmental science could open new doors for advanced environmental modeling (see Vadala & Milbrath, 2016). Lastly, studying VR/AR in Real-world and Multi-user Environments will show us how these technologies can change the way we interact socially and operate in public spaces. By focusing research on these areas, we can enhance the reach and utility of XR, making these technologies more practical and beneficial across different sectors of society.

Hardware

The compilation of XR hardware and software used in studies from 2023 onward underscores a significant reliance on commercially available devices originally designed for gaming and entertainment, which are being repurposed for academic and clinical research applications. The array of hardware employed spans various brands and models, each contributing uniquely to the field of XR research.

Head-Mounted Displays (HMDs):

  • Microsoft HoloLens 2: Highlighted in studies by Ashitiani et al. (2023) for exploring spatial understanding of brain tumors and by Kildahl-Andersen et al. (2023) for mixed reality bronchoscopy, indicating its utility in medical research.
  • Meta Quest 2: Used in diverse contexts such as upper-limb motor function assessment by Evans et al. (2023), cognitive training for ADHD symptoms by Cunha et al. (2023), and multisensory VR nature immersion by De Jesus Junior et al. (2023), showcasing its versatility.
  • HTC Vive Pro and Varjo XR-3: These devices are employed for detailed studies on visuomotor tracking and text legibility in VR by Baillet et al. (2023) and Kilpelainen and Hakkinen (2023), respectively, emphasizing their application in fine-motor skill analysis and visual clarity.
  • HTC Vive: Noted for its use in investigating virtual object manipulation, glove pose estimation, and action recognition, illustrating the breadth of research from Bonfert et al. (2023) to Sakurada et al. (2023).
  • Oculus Quest and Rift S: These units are utilized for studies on locomotion techniques, the impact of physical walking on target selection in VR, and immersive session effects on pain and affect, indicating a broad interest in user experience and therapeutic applications.

Other Devices:

  • Devices like VR controllers, Vive Trackers, Leap Motion, and the Kinect sensor are critical for enhancing interaction and movement within VR environments, supporting studies on full-body tracking and hand gesture recognition.
  • Specialized Devices such as the VRPMST, Cybercopters, X-Board, MoVR therapy suite, PA suit, and Nesplora Aquarium demonstrate the innovative development of custom tools for specific research needs, ranging from memory assessment to haptic feedback and attention evaluation.

Software

The software landscape in XR research is equally rich, with a focus on both commercial game engines and bespoke applications designed to extend the utility of XR beyond its initial entertainment focus.

Game Engines:

  • Unity and Unreal Engine: Widely recognized for their role in developing immersive environments across numerous studies, these engines serve as the backbone for a vast range of VR/AR/MR applications, from educational tools to therapeutic interventions.

Other Software:

  • Tools like the Lab Streaming Layer (LSL), SPSS, and R are pivotal in collecting, synchronizing, and analyzing data, ensuring that research methodologies are robust and results are reliable.

Specific VR/AR/MR Applications:

  • Applications such as Virtual_Decisions: GANGS, EnhanceVR, RealityMedia, and the MoVR therapy suite illustrate the sector's move towards targeted interventions and explorations of spatial narratives, cognitive training, and therapeutic tools within the XR domain.

The analysis of VR/AR hardware and software mentioned in the literature revealed several notable trends and insights. The Microsoft HoloLens 2 emerged as the standard device in healthcare applications due to its see-through view of the outside environment and its research-mode functionality. However, Microsoft has since wound down the HoloLens line, making it a dead end for future software and hardware development (though the HoloLens continues to be developed for military use).

The Oculus Quest 2 was another frequently mentioned device, which is not surprising given its increasing affordability and popularity among consumers and researchers alike. The HTC Vive, once a leader in the VR market, also appeared in many studies, despite HTC's lack of significant hardware updates in recent years. The Varjo headset stood out for its exceptionally high resolution, offering a superior visual experience compared to other devices. However, its high price point remains a barrier to widespread adoption. The recently released Apple Vision Pro is expected to match or nearly match the Varjo's resolution while being dramatically cheaper, although still costly compared to other consumer-grade headsets.

Full-body tracking using hardware-based solutions was another notable trend in the literature. Given modern software development trends toward inference-based approaches using small, low-cost cameras, this hardware-based approach will likely be replaced by software-based inference of body tracking, leveraging advanced cameras, inverse kinematics, AI, and machine learning techniques to track and infer the user's body position.
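Inverse kinematics, mentioned above, is the mathematical core of such software-based tracking: recovering joint angles from an observed end-effector position. A minimal planar two-joint sketch (illustrative only, not any product's tracker) follows from the law of cosines:

```python
import math

def two_bone_ik(target_x, target_y, l1, l2):
    """Planar two-joint inverse kinematics (e.g. shoulder-elbow-wrist):
    return (shoulder, elbow) angles in radians that place the end effector
    at (target_x, target_y), or None if the target is out of reach."""
    d = math.hypot(target_x, target_y)
    if d == 0 or d > l1 + l2 or d < abs(l1 - l2):
        return None  # target unreachable (or degenerate) for these segments
    # Law of cosines gives the elbow bend...
    elbow = math.acos(max(-1.0, min(1.0, (d*d - l1*l1 - l2*l2) / (2*l1*l2))))
    # ...and the inner triangle angle at the shoulder, subtracted from the
    # direction toward the target.
    inner = math.acos(max(-1.0, min(1.0, (d*d + l1*l1 - l2*l2) / (2*l1*d))))
    shoulder = math.atan2(target_y, target_x) - inner
    return shoulder, elbow
```

Real tracking systems solve much larger versions of this problem for a full skeleton, typically combining IK with learned pose priors from camera input.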

In terms of software, Unity and Unreal Engine remain the dominant platforms for VR/AR development, despite the introduction of more WebXR technologies. It is anticipated that Unreal Engine, with its superior graphics fidelity, will eventually overtake Unity for studies requiring photorealistic environments.

These findings highlight the rapid evolution and dynamic nature of the VR/AR landscape, with new hardware and software solutions constantly emerging and reshaping the field. Researchers and developers must stay attuned to these trends to make informed decisions about the tools and platforms they use in their work.

Future Hardware/Software Adoption

The current overview of XR hardware and software relies on the established headset manufacturers and the corresponding software packages that have been in use for the past several years. However, new hardware and software approaches emerged in 2024 that are already reshaping research, especially in augmented reality (AR) and mixed reality (MR). This is most apparent with the Apple Vision Pro, a device that combines AR, MR, and VR. With its high-resolution displays, eye-tracking, and hand-gesture recognition, the Apple Vision Pro is now being used for neurosurgical work, surgical simulations, remote assistance, and more (Cheng et al., 2024; Olexa et al., 2024). Future hardware developments promise more wireless technology, higher-resolution displays, and smaller headsets. Small devices like the XReal AR glasses suggest that headsets will eventually shrink to glasses-sized devices that allow more flexible and seamless research uses. Technologies found in the Varjo XR-4, such as its very-high-resolution varifocal displays, enable visual focusing systems that replicate the minutiae of human eyesight. Sony's PSVR2, now releasing for PC, provides an immersive haptic feedback system, while several companies are releasing affordable haptic suits. Omnidirectional treadmills and similar technologies have also recently matured enough to provide a “holodeck” experience in which users move freely around a full VR environment using natural, embodied movement, rather than the awkward joystick-based teleportation and movement schemes in common use.

On the software side, AR is becoming more significant, with platforms like Apple's ARKit and Google's ARCore leading in object recognition and environmental mapping. Future software will likely focus on more intuitive, AI-driven interfaces that adapt to user behavior, making XR experiences smoother. Cross-platform compatibility and the integration of AR/VR with AI, IoT, and 5G will enhance these experiences further. The open web standard for VR, AR, and MR known as WebXR has matured enough to provide rendering performance nearly comparable to non-web platforms while offering “no-download,” single-click experiences (see Girginova et al., 2024). Several robust platforms, such as PlayCanvas and Spline, offer full packages for comprehensive application development.

Finally, with the wide availability of easy-to-adopt artificial intelligence (AI) technology that emerged late in 2023, machine learning, LLMs, and visual encoder systems will soon play a role in XR research. Visual encoders like OpenAI's GPT-4o, which can understand complex video streams, will no doubt be used in conjunction with XR recordings of user viewsheds, including mixed reality or VR content. Similar, though simpler and closed, functionality is already advertised for the Meta Quest 3 system using Llama 3.1. This will open the door to automatic, high-precision image segmentation and auto-coding of visual relations that previously required extensive training and setup in component ML frameworks like YOLO. The ease and flexibility of these systems should spur new forms of systematic, automated analysis of first-person user interactions, while also allowing researchers to develop XR visual and audio systems that adapt and present information based on the complex reasoning that systems like GPT-4o can provide.
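One practical pattern such analysis implies is sampling frames from a viewshed recording before passing them to a vision-capable model, since sending every frame is rarely affordable. The helper below is a hypothetical sketch (function name and parameters are assumptions; the model call itself is omitted):

```python
def sample_frame_indices(duration_s: float, fps: float, every_s: float) -> list[int]:
    """Pick evenly spaced frame indices from a viewshed recording, e.g. one
    frame per second, to keep per-frame vision-model calls affordable."""
    total_frames = int(duration_s * fps)
    step = max(1, int(every_s * fps))
    return list(range(0, total_frames, step))

# A 10-second, 30 fps recording sampled once per second yields 10 frames;
# each sampled frame would then be sent to a vision-capable model
# along with a coding prompt (the API call is omitted here).
indices = sample_frame_indices(10, 30, 1.0)
print(len(indices))  # 10
```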

Neural Radiance Fields (NeRF) and Gaussian Splatting technologies are set to change the graphical rendering of objects and environments for XR research. NeRF uses deep learning to generate highly realistic 3D scenes from a small set of images. This contrasts with traditional photogrammetry software, which requires a significant time investment for photo capture, model refinement, and processing. NeRF technology can capture intricate details of real-world environments, enabling more immersive and accurate virtual experiences. By reconstructing environments with remarkable precision, NeRFs are pushing the boundaries of what is possible in XR, providing more lifelike simulations for applications in gaming, training, and remote collaboration. The approach has only recently begun to be used across disciplines including neuroscience, surgical training, forensics, and a host of simulation-related fields (Kolpan et al., 2024).

Sometimes paired with NeRF techniques is Gaussian Splatting, an efficient new rendering technique. Made possible by a range of technological advances (efficient algorithms, GPU speed and memory increases, WebXR standards), Gaussian Splatting represents the surfaces and textures of a 3D scene as a collection of Gaussian functions. This allows smoother transitions and more realistic textures in 3D models, addressing some of the limitations of traditional polygon-based rendering. Its ability to produce high-quality visual output with less computational overhead makes it particularly valuable for XR applications, where performance and visual fidelity are crucial. In particular, Gaussian Splatting can render complex environments realistically without the common distortions and rendering flaws associated with the polygon-based techniques that have dominated since the 1990s.
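At the pixel level, both Gaussian Splatting and NeRF volume rendering resolve color by alpha-compositing depth-ordered contributions front to back. A one-pixel, grayscale sketch, with the 2D Gaussian falloff already folded into each splat's alpha (all values illustrative):

```python
def composite_splats(splats):
    """Front-to-back alpha compositing for a single pixel. `splats` is a
    depth-sorted list of (color, alpha) pairs, where each alpha already
    includes the 2D Gaussian falloff evaluated at this pixel."""
    color, transmittance = 0.0, 1.0
    for c, a in splats:
        color += transmittance * a * c   # light contributed by this splat
        transmittance *= 1.0 - a         # light passing through to the next
        if transmittance < 1e-4:         # early exit once effectively opaque
            break
    return color

# An opaque white splat in front fully hides a splat behind it:
print(composite_splats([(1.0, 1.0), (0.0, 0.5)]))  # -> 1.0
```

The early-exit once transmittance is exhausted is one reason splatting renders complex scenes with comparatively little computational overhead.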

Together, NeRF and Gaussian Splatting are revolutionizing XR research by improving the realism and efficiency of 3D content creation. These technologies enable researchers and developers to create more detailed and accurate virtual environments, which can be used for a wide range of applications, from immersive training simulations to virtual tourism and beyond. As these techniques continue to evolve, they hold the promise of making XR experiences more accessible, realistic, and impactful than ever before.

Conclusion

This report offers a detailed analysis of the current state of extended reality (XR) technologies through a review of academic literature from core XR research journals analyzed throughout the report. Using both traditional thematic analysis and advanced machine learning, we identified key themes, trends, and insights within the XR landscape. Additionally, we compiled the most frequently mentioned hardware and software tools in recent studies.

Our thematic analysis, supported by a custom-made “Doc Parser” tool that used the OpenAI API in conjunction with Google Gemini 1.5, identified major themes in XR research, including applications in healthcare and therapy, education and training, human factors and user experience, social interaction and communication, and design and visualization. The findings show that XR technologies significantly impact various domains, with healthcare and therapy being the most prominent. Applications in this area include rehabilitation, pain management, mental health treatment, and medical training.

The hardware and software analysis highlighted the reliance on commercially available devices repurposed for research. Notable devices include the Microsoft HoloLens 2, Meta Quest 2, HTC Vive, and Varjo XR-3, each contributing to different areas of XR research. This section also notes the introduction of new devices like the Apple Vision Pro, which integrates AR, MR, and VR, promising future advancements in resolution, wireless technology, and ergonomic design.

On the software side, platforms like Unity and Unreal Engine continue to dominate XR development, supported by specialized tools for data collection and analysis. Emerging technologies such as the Apple Vision Pro and super-lightweight AR glasses will provide new affordances to researchers, while rendering techniques like Neural Radiance Fields (NeRF) and Gaussian Splatting are set to transform 3D scene and object reconstruction and representation by providing realistic and efficient content-creation methods. These advancements enable more lifelike simulations and immersive experiences, pushing the boundaries of XR.

This section also identifies future directions for research, including the need for studies on the long-term impact and efficacy of XR applications, the integration of XR with AI and IoT, and the development of more accessible and inclusive technologies. Cross-cultural and ethical considerations, economic impact, and scalability are also critical areas for future exploration. Furthermore, drawing from fields like archaeology and environmental science could even provide new research avenues in topics like spatial analysis and human perception of the environment.

In conclusion, this report provides a data-driven overview of the current VR landscape, highlighting the most active and promising areas of development. It serves as a valuable resource for researchers, developers, and stakeholders looking to understand and advance VR/AR technologies. The rapid evolution of XR requires continuous attention to emerging trends and innovations to inform future research and development efforts.

XR Hardware and Software in Studies from 2023: A Comprehensive List

Based on the analysis described above, the following is a comprehensive list of XR hardware and software mentioned in studies from 2023 or later, along with citations to the corresponding studies:

Hardware:

Head-Mounted Displays (HMDs):

  • Microsoft HoloLens 2:
    • Ashitiani et al. (2023): Used for examining spatial understanding of brain tumors.
    • Kildahl-Andersen et al. (2023): Used for mixed reality bronchoscopy.
  • Meta Quest 2:
    • Evans et al. (2023): Used for assessing upper-limb motor function.
    • Cunha et al. (2023): Used for cognitive training in adults with ADHD symptoms.
    • De Jesus Junior et al. (2023): Used for multisensory VR nature immersion.
  • HTC Vive Pro:
    • Baillet et al. (2023): Used for studying the impact of task constraints on visuomotor tracking.
    • Kilpelainen & Hakkinen (2023): Used for measuring text legibility in VR.
  • Varjo XR-3:
    • Kilpelainen & Hakkinen (2023): Used for measuring text legibility in VR.
    • Lisle et al. (2023): Used for creating and manipulating 3D paths in mixed reality.
  • HTC Vive:
    • Bonfert et al. (2023): Used for investigating challenges of controlling rotation of virtual objects with force-feedback gloves.
    • Hsu et al. (2023): Used for studying glove pose estimation in VR.
    • Li et al. (2023): Used for studying action recognition based on multimode fusion.
    • Sakurada et al. (2023): Used for investigating perceptual attribution of a virtual robotic limb.
    • Wenk et al. (2023): Mentioned in the context of previous studies on motor training in VR.
  • Oculus Quest:
    • Ganapathi & Sorathia (2023): Used for studying user-elicited gesture-based locomotion techniques.
    • Lu et al. (2023): Used for investigating the effects of physical walking on target selection in VR.
    • Weser et al. (2023): Mentioned in the context of comparing navigation techniques in VR.
  • Oculus Rift S:
    • Baker et al. (2023): Used for examining the difference between 10- and 20-minute immersive VR sessions on pain and affect.
    • Kilpelainen & Hakkinen (2023): Used for measuring text legibility in VR.
    • Palmisano et al. (2023): Used for studying differences in virtual and physical head orientation.
  • Other HMDs:
    • Pico G2 4K: Used by Pedersen & Musaeus (2023) for VR scenarios in a study on emotional regulation.
    • VR-HMDs (Virtual Reality Head-Mounted Displays): Mentioned in various studies without specifying brand or model.

Other Devices:

  • VR controllers: Commonly used for interaction in VR experiences across various studies.
  • Vive Trackers: Employed for full-body tracking in studies by Berg et al. and Boban et al.
  • Leap Motion: Utilized for hand tracking in studies by Hsu et al. (2023) and Nguyen et al. (2023).
  • Kinect sensor: Used in some AR/MR applications, such as by Dong et al. (2023) and Bauer et al. (2023).
  • Specialized Devices:
    • VRPMST (Virtual Reality Prospective Memory Screening Task): Developed by Hogan et al. (2023) for assessing prospective memory.
    • Cybercopters: Introduced by Delcombel et al. (2023) for visualizing periodic behaviors in data.
    • X-Board: An egocentric adaptive AR assistant developed by Zhang et al. (2023).
    • MoVR therapy suite: A suite of VR therapeutic tools created by Stamenkovic et al. (2023).
    • PA suit: A multimodal haptic suit developed by Kang et al. (2023).
    • Nesplora Aquarium: A VR-based attention assessment tool used by Voinescu et al. (2023).

Software:

Game Engines:

  • Unity: Widely used for developing VR/AR/MR experiences across numerous studies.
  • Unreal Engine: Also popular for creating immersive environments, mentioned in several studies.

Other Software:

  • Lab Streaming Layer (LSL): A framework for collecting and synchronizing multimodal data in VR/AR research, highlighted by Wang et al. (2023).
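To illustrate the problem LSL addresses, the sketch below aligns two independently sampled data streams (e.g., gaze and head pose) by timestamp. This is a minimal pure-Python illustration, not LSL's API: the function name, tolerance value, and sample data are hypothetical, and LSL itself handles this more robustly via per-sample timestamps and continuous clock-offset measurement between machines.

```python
from bisect import bisect_left

def align_streams(primary, secondary, tolerance=0.05):
    """For each (timestamp, value) sample in `primary`, find the
    nearest-in-time sample in `secondary` (sorted by timestamp).
    Pairs farther apart than `tolerance` seconds are dropped."""
    sec_times = [t for t, _ in secondary]
    pairs = []
    for t, value in primary:
        i = bisect_left(sec_times, t)
        # Candidates: the sample at/after t and the one just before it.
        best = None
        for j in (i - 1, i):
            if 0 <= j < len(secondary):
                dt = abs(secondary[j][0] - t)
                if best is None or dt < best[0]:
                    best = (dt, secondary[j])
        if best is not None and best[0] <= tolerance:
            pairs.append(((t, value), best[1]))
    return pairs

# Hypothetical data: a 60 Hz gaze stream and a 90 Hz head-pose stream.
gaze = [(0.000, "g0"), (0.017, "g1"), (0.033, "g2")]
pose = [(0.001, "p0"), (0.012, "p1"), (0.023, "p2"), (0.034, "p3")]
aligned = align_streams(gaze, pose, tolerance=0.01)
```

In practice, researchers record each device as its own LSL stream and perform this kind of alignment offline; the tolerance reflects the acceptable inter-stream latency for the analysis at hand.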

Statistical Analysis Software:

  • SPSS: Frequently used for data analysis in various studies.
  • R: Also employed for statistical analysis and data visualization in several studies.

Specific VR/AR/MR Applications:

  • Virtual_Decisions: GANGS: A VR experience for adolescent risk-taking training (Bilello et al., 2023).
  • EnhanceVR: A multisensory cognitive training and monitoring tool (Borghetti et al., 2023; Cunha et al., 2023).
  • RealityMedia: A testbed for exploring spatial narratives in VR (Jang et al., 2023).
  • Cybercopters Swarm: An immersive analytics tool for visualizing data (Delcombel et al., 2023).
  • MoVR therapy suite: Includes various VR therapeutic tools (Stamenkovic et al., 2023).
  • Nesplora Aquarium: A VR-based attention assessment tool (Voinescu et al., 2023).
  • EPELI (Executive Performance in Everyday Living): A VR task for assessing goal-directed behavior (Seesjärvi et al., 2023).
  • X-Board: An egocentric adaptive AR assistant (Zhang et al., 2023).
  • VR-based fire safety module: Used by Katz et al. (2023) for training purposes.
  • Virtual Reality Sickness Questionnaire (VRSQ): Used for measuring cybersickness (Josupeit, 2023).
  • Presence Questionnaire (PQ): Used for assessing presence in VR (Palmisano et al., 2023).

References

      Ashtiani, O., Guo, H.-J., & Prabhakaran, B. (2023). Impact of motion cues, color, and luminance on depth perception in optical see-through AR displays. Frontiers in Virtual Reality, 4.
      Baillet, H., Burin-Chu, S., Lejeune, L., Le Chénéchal, M., Thouvarecq, R., Benguigui, N., & Leconte, P. (2023). Impact of task constraints on a 3D visuomotor tracking task in virtual reality. Frontiers in Virtual Reality, 4.
      Baker, N. A., Polhemus, A., Kenney, M., Bloch, R., Ward, N., Intriligator, J., & Edwards, R. (2023). Examining the difference between 10- and 20-min of immersive virtual reality on symptoms, affect, and central sensitization in people with chronic back pain. Frontiers in Virtual Reality, 4.
      Bauer, V., Bouchara, T., Duris, O., Labossière, C., Clément, M.-N., & Bourdot, P. (2023). Head-mounted augmented reality to support reassurance and social interaction for autistic children with severe learning disabilities. Frontiers in Virtual Reality, 4.
      Bilello, D., Swancott, L. J., Kloess, J. A., & Burnett Heyes, S. (2023). Adolescent risk-taking and decision making: A qualitative investigation of a virtual reality experience of gangs and violence. Frontiers in Virtual Reality, 4.
      Bonfert, M., Hübinger, M., & Malaka, R. (2023). Challenges of controlling the rotation of virtual objects with variable grip using force-feedback gloves. Frontiers in Virtual Reality, 4.
      Borghetti, D., Zanobini, C., Natola, I., Ottino, S., Parenti, A., Brugada-Ramentol, V., Jalali, H., & Bozorgzadeh, A. (2023). Evaluating cognitive performance using virtual reality gamified exercises. Frontiers in Virtual Reality, 4.
      Buche, H., Michel, A., & Blanc, N. (2023). When virtual reality supports patients' emotional management in chemotherapy. Frontiers in Virtual Reality, 4.
      Calandra, D., De Lorenzis, F., Cannavò, A., & Lamberti, F. (2023). Immersive virtual reality and passive haptic interfaces to improve procedural learning in a formal training course for first responders. Virtual Reality, 27(2), 985-1012.
      Chen, J., Zhou, Y., & Zhai, J. (2023). Incorporating AR/VR-assisted learning into informal science institutions: A systematic review. Virtual Reality, 27(3), 1985-2001.
      Cheng, R., Wu, N., Varvello, M., Chai, E., Chen, S., & Han, B. (2024). A First Look at Immersive Telepresence on Apple Vision Pro. arXiv preprint arXiv:2405.10422.
      Cunha, F., Campos, S., Simões-Silva, V., Brugada-Ramentol, V., Sá-Moura, B., Jalali, H., Bozorgzadeh, A., & Trigueiro, M. J. (2023). The effect of a virtual reality based intervention on processing speed and working memory in individuals with ADHD—A pilot-study. Frontiers in Virtual Reality, 4.
      Dai, S. C., Xiong, A., & Ku, L. W. (2023). LLM-in-the-loop: Leveraging large language model for thematic analysis. arXiv preprint arXiv:2310.15100.
      De Jesus Junior, B. J., Perreault, L., Lopes, M. K. S., Roberge, M.-C., Oliveira, A. A., & Falk, T. H. (2023). Using multisensory virtual reality nature immersion as a therapeutic modality for improving HRV and cognitive functions in post-traumatic stress disorder: A pilot-study. Frontiers in Virtual Reality, 4.
      Delcombel, N., Duval, T., & Pahl, M.-O. (2023). Cybercopters Swarm: Immersive analytics for alerts classification based on periodic data. Frontiers in Virtual Reality, 4.
      De Paoli, S., & Mathis, W. S. (2024). [Title of the work]. [Publication information not provided in the given text]
      Dong, Y., Liu, X., Tang, M., Huo, H., Chen, D., Wu, Z., An, R., & Fan, Y. (2023). A haptic-feedback virtual reality system to improve the Box and Block Test (BBT) for upper extremity motor function assessment. Virtual Reality, 27(2), 1199-1219.
      Evans, J. O., Tsaneva-Atanasova, K., & Buckingham, G. (2023). Using immersive virtual reality to remotely examine performance differences between dominant and non-dominant hands. Virtual Reality, 27(3), 2211-2226.
      Ganapathi, P., & Sorathia, K. (2023). User elicited gesture-based locomotion techniques for immersive VEs in a seated position: A comparative evaluation. Frontiers in Virtual Reality, 4.
      Gao, M., & Boehm-Davis, D. A. (2023). Development of a customizable interactions questionnaire (CIQ) for evaluating interactions with objects in augmented/virtual reality. Virtual Reality, 27(2), 699-716.
      García, A. S., Fernández-Sotos, P., Vicente-Querol, M. A., Sánchez-Reolid, R., Rodriguez-Jimenez, R., & Fernández-Caballero, A. (2023). Co-design of avatars to embody auditory hallucinations of patients with schizophrenia. Virtual Reality, 27(1), 217-232.
      Ghosh, R., Feijóo-García, P. G., Stuart, J., Wrenn, C., & Lok, B. (2023). Evaluating face gender cues in virtual humans within and beyond the gender binary. Frontiers in Virtual Reality, 4.
      Girginova, K., Vadala, J., Tan, A., Kornides, M., Cassidy, K., Lipman, T., Okker-Edging, K., (2024 – Forthcoming). Augmented Landscapes of Empathy: Community Voices in Augmented Reality Campaigns. Media and Communication.
      Gonçalves, A., Montoya, M. F., Llorens, R., & Bermúdez i Badia, S. (2023). A virtual reality bus ride as an ecologically valid assessment of balance: A feasibility study. Virtual Reality, 27(1), 109-117.
      Haghzare, S., Arnison, M., Monaghan, D., Karlov, D., Honson, V., & Kim, J. (2023). Digital fabrics for online shopping and fashion design. Frontiers in Virtual Reality, 4.
      Harris, D. J., Arthur, T., Kearse, J., Olonilua, M., Hassan, E. K., De Burgh, T. C., Wilson, M. R., & Vine, S. J. (2023). Exploring the role of virtual reality in military decision training. Frontiers in Virtual Reality, 4.
      Hinricher, N., König, S., Schröer, C., & Backhaus, C. (2023). Effects of virtual reality and test environment on user experience, usability, and mental workload in the evaluation of a blood pressure monitor. Frontiers in Virtual Reality, 4.
      Hogan, C., Cornwell, P., Fleming, J., Man, D. W. K., & Shum, D. H. K. (2023). Assessment of prospective memory after stroke utilizing virtual reality. Virtual Reality, 27(1), 333–346.
      Hsu, F.-S., Wang, T.-M., & Chen, L.-H. (2023). Robust vision-based glove pose estimation for both hands in virtual reality. Virtual Reality, 27(4), 3133-3148.
      Jang, S.-Y., Park, J., Engberg, M., MacIntyre, B., & Bolter, J. D. (2023). RealityMedia: Immersive technology and narrative space. Frontiers in Virtual Reality, 4.
      Johnson-Glenberg, M. C., Yu, C. S. P., Liu, F., Amador, C., Bao, Y., Yu, S., & LiKamWa, R. (2023). Embodied mixed reality with passive haptics in STEM education: Randomized control study with chemistry titration. Frontiers in Virtual Reality, 4.
      Josupeit, J. (2023). Cybersickness as the virtual reality sickness questionnaire (VRSQ) measures it!? --An environment-specific revision of the VRSQ. Frontiers in Virtual Reality, 4.
      Juan, M.-C., Elexpuru, J., Dias, P., Santos, B. S., & Amorim, P. (2023). Immersive virtual reality for upper limb rehabilitation: Comparing hand and controller interaction. Virtual Reality, 27(2), 1157-1171.
      Kahlon, S., Lindner, P., & Nordgreen, T. (2023). Gamified virtual reality exposure therapy for adolescents with public speaking anxiety: A four-armed randomized controlled trial. Frontiers in Virtual Reality, 4.
      Kang, D., Lee, C.-G., & Kwon, O. (2023). Pneumatic and acoustic suit: Multimodal haptic suit for enhanced virtual reality simulation. Virtual Reality, 27(3), 1647–1669.
      Katz, D., Hyers, B., Hojsak, S., Shin, D. W., Wang, Z., Park, C., & Burnett, G. (2023). Utilization of virtual reality for operating room fire safety training: A randomized trial. Virtual Reality, 27(4), 3211-3219.
      Kelly, N. J. (2023). Using interpretative phenomenological analysis to gain a qualitative understanding of presence in virtual reality. Virtual Reality, 27(2), 1173-1185.
      Kildahl-Andersen, A., Hofstad, E. F., Sorger, H., Amundsen, T., Langø, T., Leira, H. O., & Kiss, G. (2023). Bronchoscopy using a head-mounted mixed reality device—A phantom study and a first in-patient user experience. Frontiers in Virtual Reality, 4.
      Kilpeläinen, M., & Häkkinen, J. (2023). An effective method for measuring text legibility in XR devices reveals clear differences between three devices. Frontiers in Virtual Reality, 4.
      Kolpan, K. E., Vadala, J., Dhanaliwala, A., & Chao, T. (2024). Utilizing augmented reality for reconstruction of fractured, fragmented and damaged craniofacial remains in forensic anthropology. Forensic Science International, 357, 111995.
      Krugmann, J. O., & Hartmann, J. (2024). Sentiment Analysis in the Age of Generative AI. Customer Needs and Solutions, 11(1), 3.
      Li, X., Chen, H., He, S., Chen, X., Dong, S., Yan, P., & Fang, B. (2023). Action recognition based on multimode fusion for VR online platform. Virtual Reality, 27(3), 1797-1812.
      Lisle, L., Davidson, K., Gitre, E. J. K., North, C., & Bowman, D. A. (2023). Different realities: A comparison of augmented and virtual reality for the sensemaking process. Frontiers in Virtual Reality, 4.
      Lu, Y., Gao, B., Tu, H., Wu, H., Xin, W., Cui, H., Luo, W., & Duh, H. B.-L. (2023). Effects of physical walking on eyes-engaged target selection with ray-casting pointing in virtual reality. Virtual Reality, 27(2), 603-625.
      Maneuvrier, A., Nguyen, N.-D.-T., & Renaud, P. (2023). Predicting VR cybersickness and its impact on visuomotor performance using head rotations and field (in)dependence. Frontiers in Virtual Reality, 4.
      Miah, M. S. U., Kabir, M. M., Sarwar, T. B., Safran, M., Alfarhood, S., & Mridha, M. F. (2024). A multimodal approach to cross-lingual sentiment analysis with ensemble of transformer and LLM. Scientific Reports, 14(1), 9603.
      Nguyen, R., Gouin-Vallerand, C., & Amiri, M. (2023). Hand interaction designs in mixed and augmented reality head mounted display: A scoping review and classification. Frontiers in Virtual Reality, 4. https://www.frontiersin.org/articles/10.3389/frvir.2023.1171230
      Olexa, J., Trang, A., Cohen, J., Kim, K., Rakovec, M., Saadon, J., ... & Saadon, J. R. (2024). The Apple Vision Pro as a Neurosurgical Planning Tool: A Case Report. Cureus, 16(2).
      Palmisano, S., Allison, R. S., Teixeira, J., & Kim, J. (2023). Differences in virtual and physical head orientation predict sickness during active head-mounted display-based virtual reality. Virtual Reality, 27(2), 1293-1313.
      Pedersen, K., & Musaeus, P. (2023). A phenomenological approach to virtual reality in psychiatry education. Frontiers in Virtual Reality, 4.
      Pedram, S., Kennedy, G., & Sanzone, S. (2023). Toward the validation of VR-HMDs for medical education: A systematic literature review. Virtual Reality, 27(3), 2255-2280.
      Phelan, I., Furness, P. J., Matsangidou, M., Babiker, N. T., Fehily, O., Thompson, A., Carrion-Plaza, A., & Lindley, S. A. (2023). Designing effective virtual reality environments for pain management in burn-injured patients. Virtual Reality, 27(1), 201-215.
      Pouke, M., Ylipulli, J., Minyaev, I., Pakanen, M., Alavesa, P., Alatalo, T., & Ojala, T. (2022). A qualitative case study on deconstructing presence for young adults and older adults. PRESENCE: Virtual and Augmented Reality, 31, 249-269.
      Radhakrishnan, U., Chinello, F., & Koumaditis, K. (2023). Investigating the effectiveness of immersive VR skill training and its link to physiological arousal. Virtual Reality, 27(2), 1091-1115.
      Reaver, K. (2023). Augmented reality as a participation tool for youth in urban planning processes: Case study in Oslo, Norway. Frontiers in Virtual Reality, 4.
      Sakurada, K., Kondo, R., Nakamura, F., Kitazaki, M., & Sugimoto, M. (2023). Investigating the perceptual attribution of a virtual robotic limb synchronizing with hand and foot simultaneously. Frontiers in Virtual Reality, 4.
      Sakuma, H., Takahashi, H., Ogawa, K., & Ishiguro, H. (2023). Immersive role-playing with avatars leads to adoption of others' personalities. Frontiers in Virtual Reality, 4.
      Seesjärvi, E., Laine, M., Kasteenpohja, K., & Salmi, J. (2023). Assessing goal-directed behavior in virtual reality with the neuropsychological task EPELI: Children prefer head-mounted display but flat screen provides a viable performance measure for remote testing. Frontiers in Virtual Reality, 4.
      Slater, M., Cabriera, C., Senel, G., Banakou, D., Beacco, A., Oliva, R., & Gallego, J. (2023). The sentiment of a virtual rock concert. Virtual Reality, 27(2), 651-675.
      Stamenkovic, A., Underation, M., Cloud, L. J., Pidcoe, P. E., Baron, M. S., Hand, R., France, C. R., van der Veen, S. M., & Thomas, J. S. (2023). Assessing perceptions to a virtual reality intervention to improve trunk control in Parkinson’s disease: A preliminary study. Virtual Reality, 27(1), 465–479.
      Vadala, J. R., & Milbrath, S. (2016). Using Virtual Reality to Understand Astronomical Knowledge and Historical Landscapes at Preclassic Cerros, Belize. Journal of Skyscape Archaeology, 2(1), 25-44. https://doi.org/10.1558/jsa.v2i1.26915
      Voinescu, A., Petrini, K., Stanton Fraser, D., Lazarovicz, R.-A., Papavă, I., Fodor, L. A., & David, D. (2023). The effectiveness of a virtual reality attention task to predict depression and anxiety in comparison with current clinical measures. Virtual Reality, 27(1), 119–140.
      Wang, Y., & Lin, Y.-S. (2023). Public participation in urban design with augmented reality technology based on indicator evaluation. Frontiers in Virtual Reality, 4.
      Wang, Q., Zhang, Q., Sun, W., Boulay, C., Kim, K., & Barmaki, R. L. (2023). A scoping review of the use of lab streaming layer framework in virtual and augmented reality research. Virtual Reality, 27(3), 2195–2210.
      Wenk, N., Penalver-Andres, J., Buetler, K. A., Nef, T., Müri, R. M., & Marchal-Crespo, L. (2023). Effect of immersive visualization technologies on cognitive load, motivation, usability, and embodiment. Virtual Reality, 27(1), 307-331.
      Weser, V. U., Sieberer, J., Berry, J., & Tuakli-Wosornu, Y. (2023). Navigation in immersive virtual reality: A comparison of 1:1 walking to 1:1 wheeling. Virtual Reality, 28(1), 4.
      Wu, Y., Lukosch, S., Lukosch, H., Lindeman, R. W., McKee, R. D., Fukuden, S., Ross, C., & Collins, D. (2023). Training mental imagery skills of elite athletes in virtual reality. Frontiers in Virtual Reality, 4.
      Yang, C.-L., Matsumoto, K., Yu, S., Sawada, L., Arakawa, K., Yamada, D., & Kuzuoka, H. (2023). Understanding the effect of a virtual moderator on people's perception in remote discussion using social VR. Frontiers in Virtual Reality, 4.
      Zhang, Z., Pan, Z., Li, W., & Su, Z. (2023). X-Board: An egocentric adaptive AR assistant for perception in indoor environments. Virtual Reality, 27(2), 1327–1343.
  1. The studies analyzed primarily focused on virtual reality (VR) or augmented reality (AR) technologies. These two technologies therefore comprise the umbrella term extended reality (XR) in this section.
