Advances in XR Technologies
Jeffrey Vadala
Extended-reality (XR) technologies, which include Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), continue to reshape how people engage with digital information and physical space. These technologies influence industrial training, healthcare, entertainment, education, and social communication. Their immersive nature, combined with new software and hardware, drives interdisciplinary research. This section of the report presents an overview of 305 papers from three influential academic journals: Presence: Virtual and Augmented Reality; Frontiers in Virtual Reality; and Virtual Reality. The examination of these publications provides a snapshot of current technologies and research priorities.
To provide a holistic, granular understanding of the XR research ecosystem, this analysis explores the question: What specific hardware devices and software platforms are predominantly utilized and discussed in contemporary XR studies, and what does this reveal about the underlying technological infrastructure of XR research? This research rests on a structured analysis of a pre-compiled dataset of the 305 papers. Custom Python scripts using the LLM-based Paper-QA library generated the dataset by querying each study for its themes, hardware, and software. This produced a contained, focused search system, or catalogue, for addressing the question above.
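To make the extraction step concrete, the sketch below shows one way such a script could be organized. It is a minimal illustration rather than the actual pipeline: the query wording, the file layout, and the `query_paper` helper wrapping Paper-QA's `Docs` interface are assumptions for demonstration only.

```python
"""Illustrative sketch: query each paper for themes, hardware, and software,
then write the answers to a CSV that serves as the searchable catalogue."""
import csv
from pathlib import Path

# Assumed query wording; the actual prompts used in the study may differ.
QUESTIONS = {
    "themes": "What research themes does this paper address?",
    "hardware": "What XR hardware (headsets, sensors, controllers) is used?",
    "software": "What software platforms, engines, or SDKs are used?",
}

def query_paper(pdf_path: Path, question: str) -> str:
    """Hypothetical wrapper around a Paper-QA query for one document."""
    from paperqa import Docs  # assumes the classic synchronous Docs interface
    docs = Docs()
    docs.add(str(pdf_path))
    return docs.query(question).answer

def build_catalogue(pdf_dir: str, out_csv: str = "xr_catalogue.csv") -> None:
    """Run every question against every paper and save the results."""
    rows = []
    for pdf_path in sorted(Path(pdf_dir).glob("*.pdf")):
        row = {"paper": pdf_path.stem}
        for field, question in QUESTIONS.items():
            row[field] = query_paper(pdf_path, question)
        rows.append(row)
    with open(out_csv, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["paper", *QUESTIONS])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    build_catalogue("papers/")
```

Each row of the resulting CSV then serves as one entry in the catalogue referenced throughout this section.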
Technological Infrastructure: Device and Software Preferences
The technology catalogue offers insights into the specific devices and software components favored by researchers. These choices reflect device capabilities, accessibility, and market trends.
VR Headsets: The hardware analysis reveals a diverse landscape of VR headsets used across the 305 papers. This diversity shows significant technical sophistication and specialization for different research needs. The analysis identified the following VR headsets:
- High-End Research Platforms include the Oculus Quest 2 (Meta), with a resolution of 1832×1920 per eye and a 120 Hz refresh rate, and the HTC Vive Pro 2, with a 2448×2448 per-eye resolution and SteamVR Tracking 2.0. The Valve Index offers the widest field of view (130°), while the PlayStation VR2 (Sony) features an OLED screen and advanced sensors.
- Established Research Platforms feature the now widely adopted Meta (Oculus) platform alongside the early leader, the HTC Vive (multiple configurations), for room-scale tracking. Both Oculus and HTC remain dominant foundational research devices. Although eye-tracking hardware and software have been miniaturized and show great promise for research and software applications, few established research platforms beyond the HTC Vive Pro, with its dual AMOLED display, offer eye tracking as a built-in hardware feature.
- Specialized and Legacy Systems range from the nVisor SX111 for industrial applications to the FOVE 0 and Meta Quest Pro, both with integrated eye tracking.
The diversity of hardware used (see catalogue) shows that VR research no longer relies on a small number of hardware platforms. With new headset makers introducing specialized features in 2025, researchers will increasingly be able to select headsets and design specialized studies around technical needs such as tracking precision, field of view, 4K-per-eye displays, refresh rates, and eye tracking.
Game Engines and Software Development Platforms
The XR development ecosystem integrates multiple software platforms, frameworks, and tools. The catalogue below reveals both established game engines and emerging specialized platforms:
- Unity 3D (156 papers): As the most widely used platform for AR and VR model development, Unity offers notable versatility. Its popularity stems from extensive documentation, cross-platform support, an active community, and robust prototyping capabilities.
- Unreal Engine (25 papers): Known for its Blueprint scripting and multiplatform support, Unreal Engine is compatible with popular VR platforms (including SteamVR and HoloLens 2) and supports open standards like OpenXR.
AR/VR SDKs and Frameworks:
- ARCore (2 papers): Provides comprehensive platform kits for AR applications, particularly prominent in engineering education and mobile AR development.
- ARKit (3 papers): Apple's AR development framework, enabling sophisticated AR applications on iOS platforms.
- SteamVR SDK (31 papers): Supports a wide range of VR headsets, including the Valve Index, Vive Pro 2, and Windows Mixed Reality devices.
- Vuforia (8 papers): This platform offers notable versatility in AR applications across diverse engineering disciplines, with robust computer vision and tracking capabilities.
Specialized Development Tools:
- 3ds Max and Blender: Researchers use these tools extensively for 3D modeling and rendering in AR/VR applications, with Blender being particularly popular in electrical and mechanical engineering research.
- Assemblr Edu: Specialized AR platform for educational applications.
- OpenXR: As an open standard, OpenXR receives support from major engines, promoting cross-platform compatibility and reducing platform fragmentation.
The software landscape reveals a mature ecosystem. It balances accessibility (Unity, Blender) with specialized capabilities (Vuforia, ARCore) and cross-platform compatibility (OpenXR, Unreal Engine).
AR Devices
The analysis identifies several prominent AR devices, demonstrating the growing sophistication of AR research platforms.
- Microsoft HoloLens 2: This optical see-through AR head-mounted display uses laser beam scanning with waveguides and grating-based optical combiners. It achieves an angular resolution of 10 to 15 cycles per degree. HoloLens 2 is a leading platform for spatial mapping, gesture recognition, and holographic rendering. Microsoft has since shuttered development of this headset and its associated software.
- Epson Moverio BT-300: This compact AR HMD features a micro-OLED display with a 1280 × 720 resolution and a 23° field of view, making it suitable for specialized research.
- Magic Leap One: A standalone AR device, this platform overlays digital images onto the real world, offering a different set of technical specifications from the HoloLens.
These AR devices represent different approaches to augmented reality research. They range from high-end spatial computing (HoloLens 2) to portable micro-display systems (Epson Moverio) and standalone platforms (Magic Leap One), enabling diverse research. The AR field remains small, likely due to the costs associated with producing waveguide display technology at scale.
Additional Technology Categories
Researchers use a wide array of other technologies. Tracking and Motion Capture Systems include inside-out tracking (Oculus Quest, HoloLens), various SLAM systems (viSLAM, ORB-SLAM3), OptiTrack systems for high-precision capture, and SteamVR for room-scale tracking. For Input and Interaction, teams use Meta Quest and HTC Vive controllers, Oculus Touch, and Leap Motion for hand tracking. Haptic devices like Manus VR gloves and the PneuGlove provide tactile feedback.
Audio and Spatial Sound Systems are also critical. This category includes ambisonic microphones (SoundField SPS200, RØDE NT-SF1), spherical loudspeaker arrays, and spatial audio technologies like ambisonics (FOA, HOA) and binaural rendering. Computing Hardware is robust, featuring GPUs like the NVIDIA A4000, AMD Ryzen 7 CPUs, and large memory configurations. Finally, Sensors and Measurement Systems are diverse, including RGB-D cameras (Intel RealSense D455), IMU sensors, eye-tracking systems, physiological sensors (EEG, ECG), and laser scanners for 3D mapping.
To visualize how these technologies relate, the catalogue was also mapped as a network. This network uses two classes of links to capture both formal groupings and actual usage patterns. Category-membership links connect each device back to its group (for example, "HTC Vive" to "VR Headsets"). Real-world pairing links show hardware and software that work together, such as headsets to native controllers, headsets to tracking hardware, shared peripherals across models, and development platforms to headsets via standards like OpenXR. After defining these edges, a force-directed (spring-layout) algorithm places nodes so that items with many shared links pull close together and those with fewer ties drift apart. Each node is colored by category to guide interpretation: red for VR headsets, blue for AR headsets, green for controllers, purple for tracking systems, orange for other input devices, and brown for development platforms.
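As an illustration of how such a map can be built, the following sketch constructs a small version of the network and lays it out with NetworkX's spring layout. The node and edge lists here are a hand-picked illustrative subset, not the full catalogue.

```python
"""Minimal sketch of the device/software network: category-membership edges
plus real-world pairing edges, placed with a force-directed layout."""
import networkx as nx
import matplotlib.pyplot as plt

CATEGORY_COLORS = {
    "VR Headsets": "red", "AR Headsets": "blue", "Controllers": "green",
    "Tracking Systems": "purple", "Input Devices": "orange",
    "Development Platforms": "brown",
}

# Illustrative subset of the catalogue: device/software -> category.
nodes = {
    "HTC Vive": "VR Headsets", "Meta Quest": "VR Headsets",
    "HoloLens 2": "AR Headsets", "Vive Controller": "Controllers",
    "SteamVR Tracking": "Tracking Systems", "Leap Motion": "Input Devices",
    "Unity": "Development Platforms", "OpenXR": "Development Platforms",
}

# Real-world pairing links: items that work together in practice.
pairings = [
    ("HTC Vive", "Vive Controller"), ("HTC Vive", "SteamVR Tracking"),
    ("HTC Vive", "Leap Motion"), ("Meta Quest", "Leap Motion"),
    ("Unity", "HTC Vive"), ("Unity", "Meta Quest"), ("Unity", "HoloLens 2"),
    ("OpenXR", "HTC Vive"), ("OpenXR", "Meta Quest"),
]

G = nx.Graph()
for name, category in nodes.items():
    G.add_node(name, category=category)
    G.add_node(category, category=category)  # category hub node
    G.add_edge(name, category)               # category-membership link
G.add_edges_from(pairings)                   # real-world pairing links

# Force-directed placement: heavily linked items pull close together.
pos = nx.spring_layout(G, seed=42)
colors = [CATEGORY_COLORS[G.nodes[n]["category"]] for n in G.nodes]
nx.draw(G, pos, node_color=colors, with_labels=True, font_size=8)
plt.show()
```

Applied to the full catalogue, the same procedure yields the clustering interpreted in the next paragraph.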
From this map we see that VR headsets form the densest cluster (as seen in the number of edges and connections), reflecting a mature ecosystem of headsets, controllers, trackers, and engines. AR headsets sit in a smaller cluster, suggesting room to expand native peripherals and platform support. Shared peripherals such as eye trackers, haptic gloves, and VR treadmills link across headsets and highlight innovation drivers in immersion. Tracking systems, including both inside-out and outside-in, connect to VR and AR devices alike, making universal tracking a backbone of XR. Standards like OpenXR unify the landscape by linking to every compatible headset and reducing fragmentation. Devices with few cross-links mark gaps and niches where new SDKs, peripherals, or integrations can add value.
Intersections and Synergies
The review reveals an array of connections tying research themes, domains, and technologies. Applied work in training and education illustrates this interdependence. Often, studies lean on the Oculus (now Meta) platform for its low cost and dependable performance. Virtual-environment systems appear in 126 papers, while 17 projects use specialized environmental simulators. Unity, noted in 156 papers, speeds development by offering pipelines for scene building and environmental simulation (Terkaj et al., 2024; Banquiero et al., 2024). Within the key themes of education and training, researchers pair lessons with assessment tools to track skill transfer and measures such as cognitive load.
Human-factor inquiries form a second cluster. Investigations of presence, cognitive psychology, and sensory feedback sit mostly in the Psychology and General domains. Here again, the Oculus and HTC platforms provide most laboratory headsets (Hoffman et al., 2024; Glenn & Coxon, 2024). Moving beyond basic tracking, researchers are now beginning to collect physiological and behavioral signals through measurement sensors and analysis suites (Zeng et al., 2024; Cho et al., 2024). Cybersickness studies, though fewer in number than other VR/AR research areas, use evaluation and feedback systems to pinpoint discomfort and test mitigations (Vlahovic et al., 2024; Joolee et al., 2024).
Healthcare-focused papers blend performance measurement with sensory feedback. They track motor gains after VR-based therapy (Alrashidi et al., 2024) and design haptic exercises for rehabilitation (Quintana et al., 2024). Platforms like Unity combine with platform-specific tools provided by Meta to deploy applications and prototypes in practice (Pedram et al., 2024; Albeedan et al., 2024). Because patient safety demands evidence, most studies use detailed assessment instruments to confirm efficacy (Hjellvik & Mallam, 2024; Narciso et al., 2024).
Social-interaction research continued to explore the connection between technology and behavior. Work on multi-user environments is rooted in social and psychological themes while also drawing on human-computer interaction and presence theory. Researchers explored shared environments to understand interaction in shared virtual space (cite). Wireless headsets like the Meta Quest facilitated this work by providing low-cost, highly portable devices with APIs for untethered movement and group collaboration. Some researchers utilized the headset's support for voice and gesture recognition (Ban et al., 2024), while others used additional hand-tracking devices to explore interactions with non-verbal cues (Cho et al., 2024).
In sum, when we look at the maturing XR field, two key synergistic factors stand out. First, Unity continues to act as a unifying and standardizing force in research using virtual-environment systems. Because of its ubiquity across disciplines and headsets, Unity can be viewed as a key research enabler, offering researchers a straightforward and well-developed path for XR research. Second, the range of research tools, including evaluation software and multimodal sensors, is steadily growing. This underscores the field's growing methodological rigor while giving researchers more tools for their work (see OpenBCI's growing catalog of open-source sensors). The appearance of the Microsoft HoloLens in select studies hints at a parallel track for augmented reality, but the discontinuation of this headset will probably mean AR research remains underdeveloped until a new AR headset is produced by a large, well-funded hardware company.
Discussion
The analysis of research articles from Presence, Frontiers in Virtual Reality, and Virtual Reality paints a clear picture. As of mid-2025, the research field shows an increased focus on pragmatic uses of XR technologies. More specifically, a principal insight from the thematic analysis is the dual focus in XR research. First, a substantial drive exists toward applied utility, shown by the leading positions of "Immersive Training" and "Educational Technology." This reflects a clear recognition of XR's potential to deliver tangible solutions. Researchers are demonstrating the pedagogical and efficiency benefits of XR in industrial skill acquisition (Terkaj et al., 2024), high-stakes medical procedures (Singh et al., 2024), and soft-skills development (Yan et al., 2024; Geng et al., 2024). The presence of these themes indicates that research is progressing beyond feasibility studies into optimization and efficacy. This pragmatic approach drives adoption and investment in large sectors such as government and military. The ability to simulate dangerous or expensive scenarios in a safe, repeatable virtual environment is a clear driver of this trend, as seen in studies on firefighting (Narciso et al., 2024) or surgical training (Postema et al., 2024). Military use of, and interest in, XR is emblematic in the transfer of the shuttered HoloLens AR platform to Palmer Luckey's Anduril Industries (cite). Although it is unclear how the HoloLens 2 will be used, it is fair to say that training would be a component.
Second, a significant portion of research remains dedicated to understanding the human experience in XR. Themes like "Presence Immersion," "Cognitive Psychology," and "Sensory Feedback" represent a commitment to unraveling the mechanisms behind immersive experiences. Research on presence (Hoffman et al., 2024) and multisensory integration (Sun et al., 2024; Fucci et al., 2024) is crucial for designing authentic and engaging experiences. At the same time, investigations into cognitive load (Wen et al., 2024), perception (Moosavi et al., 2024; de Souza & Tartz, 2024), and memory (Monaro et al., 2024) are essential for optimizing XR applications. This dual emphasis indicates a healthy field addressing both practical needs and theoretical foundations. Insights from these psychological studies inform the design of more effective training, therapy, and user interfaces.
Despite the advancements in research, the analysis reveals persistent challenges. The theme of "Cybersickness Comfort" stands out. While representing a small proportion of studies (3.3 percent), cybersickness remains a pervasive issue, preventing many users from adopting XR technology. The quantity of research on cybersickness may be low, but its importance is high if XR is to expand further in research or commercial contexts. Researchers are exploring mitigation strategies, from physiological detection (Yalcin et al., 2024; Sameri et al., 2024) to design interventions like avatar presence (Makani et al., 2024) and adjustments to vection (Teixeira et al., 2024). A definitive solution remains elusive. This poses a barrier to mainstream XR adoption, especially for applications requiring prolonged use.
Human-Computer Interaction (HCI) is also a highly relevant theme. This indicates an ongoing quest to refine how users interact with virtual content. The work goes beyond simple input methods. It encompasses intuitive gestural interfaces (Reski et al., 2024), meaningful haptic feedback (Terenti et al., 2024; Jacucci et al., 2024), and holistic user experience design (Gong et al., 2024). As XR technologies become more complex, the demands on HCI research will intensify. Usability evaluation remains a cornerstone of this theme (Zielke et al., 2024; Wang et al., 2024).
The technology catalog reveals a sophisticated and diverse ecosystem underpinning XR research. The analysis shows strategic diversification based on research requirements, not standardization on a single platform. The VR headset landscape shows remarkable technical advancement. Devices offer resolutions from 1080×1200 per eye (older systems) to 2448×2448 per eye (newer platforms). Refresh rates range from 60 Hz to 144 Hz, and fields of view span 63° to 130°. This diversity allows researchers to select optimal platforms for specific experimental needs, whether prioritizing visual fidelity, tracking precision, or eye tracking. That said, with the recent release of micro-OLED 4K displays, these resolutions will soon be outpaced, offering researchers in 2025 and beyond displays that appear nearly photorealistic in many contexts.
The software landscape reinforces this trend. Unity 3D's prominence reflects its accessibility and versatility. Specialized tools like Vuforia and ARCore demonstrate the ecosystem's maturity. The adoption of the OpenXR standard signals a move toward interoperability and reduced fragmentation. Although still not widely adopted in the research world, Unreal Engine continues to gain traction in the commercial and entertainment industries. When paired with 4K-per-eye displays, the new lighting and rendering technologies in Unreal Engine 5 will give researchers the chance to examine how users respond to hyperrealistic environments. Holodeck-level experiences may arrive sooner rather than later.
The emergence of comprehensive tracking systems indicates an evolution toward more natural and accurate motion detection. These systems range from inside-out tracking in standalone devices to high-precision OptiTrack motion capture. The integration of advanced audio systems with ambisonics reflects growing attention to multisensory research.
Finally, the inventory of computing hardware underscores the computational intensity of modern XR research. It includes high-end GPUs like the NVIDIA A4000, specialized processors, and large memory configurations, all of which require significant infrastructure investment. Although new GPU technology makes it possible to drive such 4K displays with powerful engines like Unreal Engine 5, the scarcity and high cost of GPUs remain an issue. With the tech sector devoting massive resources to GPU consumption, the trend toward scarcity and high cost will likely continue for the foreseeable future.
Despite the field's vibrancy, the analysis highlights a critical gap: accessibility and inclusion. With only 5 studies specifically focusing on this topic, this theme is significantly under-explored. As XR technologies integrate more deeply into society, ensuring they are usable and equitable for individuals with diverse abilities is paramount. Dedicated research is crucial to foster equitable access and prevent new digital divides. The low volume of studies suggests research efforts lag behind technology development. Future work should prioritize designing for diverse populations and developing comprehensive accessibility metrics.
The growing presence of Social Virtual Environments also points to an emerging area of importance. As shared virtual spaces become more common, understanding social dynamics will become a central research imperative. This includes studying virtual identity, privacy, and moderation. It also requires integrating sociological and psychological perspectives with technical development.
Limitations
These findings have limitations. The analysis relies on a pre-compiled dataset, so the quality of the original data and its categorization influence the results. The "relevance" scores, for instance, stem from the original methodology and may be subjective. The scope is also limited to three journals. While influential, they do not represent all XR scholarship. Research from conferences, books, and other journals in fields like computer graphics or medicine is not included. Finally, details about graphics cards, generic controllers, and some tracking systems were not included, limiting the granularity of the technology analysis.
Conclusion
The hype surrounding a consumer "metaverse" faded through 2024 and had nearly disappeared by the time of this report's publication. In research, the metaverse was never a primary focus. Driven primarily by Meta and Mark Zuckerberg, the concept did spur excitement and increased investment, giving XR something of a second wind. But the metaverse never coalesced into a unified platform or commodified world used by a large population, and researchers continued the trends of years past. More specifically, they continued an evidence-based drive to solve concrete problems in industry, healthcare, and education. The dominant research themes are applied, not aspirational. The focus on immersive training and healthcare applications reflects today's major socio-technical trends: the need for scalable remote work and education, and the demand for personalized therapeutic technologies. The technology catalog tells a similar story of maturation. Hardware has diversified, and versatile game engines like Unity are common. XR is no longer a niche tool but a sophisticated ecosystem for creating high-fidelity simulations.
The practical research referenced here can be viewed as establishing the foundation for integrating XR into the global economy's critical workflows. By the time headsets become commonplace in those workflows, they will probably look nothing like the bulky devices that chain researchers to their labs now. In terms of actual use and wearability, issues in accessibility and cybersickness highlight that XR needs to evolve. The technology is powerful, but for many, it remains uncomfortable or unusable. Headsets are still too heavy for all-day wear, and lens technology has matured only slightly, leaving users with mobility or vision issues behind. Without a doubt, this limits widespread adoption in commercial settings and in research as well.
The solutions for inclusion are not obvious. As society grapples with digital inclusion, the XR community's next imperative is clear: shift focus from what is possible to what is comfortable, equitable, and accessible. The newly released Bigscreen Beyond aims to address size and weight issues, yet its individualized lenses and faceplate setup make it nearly useless for research.
New AI technologies could help address these issues. The next wave of XR advancement will be defined by the synergy of XR's immersive power and AI's adaptive intelligence. Future systems will use rich data from performance and sensory feedback to create personalized experiences, from adaptive training simulations to dynamic therapeutic applications. The challenge for the decade is to make that value accessible to everyone.
Extended Reality Research Catalogue 2024: Complete Hardware, Software and Device Listings
VR Headsets: HTC Vive
Mentioned in 108 papers - Dominant platform in academic VR research
Representative examples:
- Hoffman, H. G., Seibel, C. C., Coron, L., Simons, L. E., Drever, S., Le May, S., ... & Flor, H. (2024). Increasing presence via a more immersive VR system increases virtual reality analgesia and draws more attention into virtual reality in a randomized crossover study. Frontiers in Virtual Reality, 5, 1452486.
- Gonzalez-Franco, M., Steed, A., Berger, C. C., & Tajadura-Jiménez, A. (2024). The impact of first-person avatar customization on embodiment in immersive virtual reality. Frontiers in Virtual Reality, 5, 1436752.
Oculus Rift
Mentioned in 61 papers - Legacy platform with continued research relevance
Representative examples:
- Suzuki, T., Uhde, A., Nakamura, T., Narumi, T., Amemiya, T., & Kuzuoka, H. (2024). Be sensei, my friend: Aikido training with a remotely controlled proxy trainer. Frontiers in Virtual Reality, 5, 1392635.
- Koseki, Y., & Amemiya, T. (2024). Being an older person: modulation of walking speed with geriatric walking motion avatars. Frontiers in Virtual Reality, 5, 1363043.
Meta Quest
Mentioned in 53 papers - Popular consumer-grade VR platform
Representative examples:
- Abbas, S., & Jeong, H. (2024). Unveiling gender differences: a mixed reality multitasking exploration. Frontiers in Virtual Reality, 4, 1308133.
- Zarouali, B. (2024). People's intentions to use metaverse technology: Investigating the role of gratifications and perceptions. PRESENCE: Virtual and Augmented Reality, 33, 179-192.
Microsoft HoloLens
Mentioned in 51 papers - Leading AR headset in research
Representative examples:
- Ismael, M., McCall, R., McGee, F., Belkacem, I., Stefas, M., Baixauli, J., & Arl, D. (2024). Acceptance of augmented reality for laboratory safety training: methodology and an evaluation study. Frontiers in Virtual Reality, 5, 1322543.
Google Glass
Mentioned in 1 paper - Early AR platform
Representative example:
- Tretter, M., Hahn, M., & Dabrock, P. (2024). Towards a smart glasses society? Ethical perspectives on extended realities and augmenting technologies. Frontiers in Virtual Reality, 5, 1404890.
Software Platforms
Unity
Mentioned in 156 papers - Most popular development platform
Representative examples:
- Goh, C., Ma, Y., & Rizzo, A. (2024). Normative performance data on visual attention in neurotypical children: virtual reality assessment of cognitive and psychomotor development. Frontiers in Virtual Reality, 5, 1309176.
- Jiang, M., Guo, X., Seno, T., Remijn, G. B., & Nakamura, S. (2024). Examination of the Effect of the Real-Life Meaning of the Stimulus on the Self-Motion Illusion. PRESENCE: Virtual and Augmented Reality, 33, 145-160.
SteamVR SDK
Mentioned in 31 papers - VR development framework
Unreal Engine
Mentioned in 24 papers - High-fidelity graphics engine
Representative example:
- Krüger, M., Gilbert, D., Kuhlen, T. W., & Gerrits, T. (2024). Game engines for immersive visualization: Using unreal engine beyond entertainment. PRESENCE: Virtual and Augmented Reality, 33, 31-55.
Blender
Mentioned in 18 papers - 3D modeling and animation
Vuforia
Mentioned in 8 papers - AR development platform
ARKit
Mentioned in 3 papers - iOS AR framework
ARCore
Mentioned in 2 papers - Android AR framework
Magic Leap
Mentioned in 6 papers - Enterprise AR platform
Epson Moverio
Mentioned in 2 papers - Industrial AR glasses
Virtual Environment Systems
Virtual Environment Platforms
Mentioned in 126 papers - Core VR simulation systems
Representative examples:
- Niki, K., Egashira, S., & Okamoto, Y. (2024). A real-time virtual outing using virtual reality for a hospitalized terminal cancer patient who has difficulty going out: a case report. Frontiers in Virtual Reality, 5, 1269707.
- Ng, R., Woo, O. K. L., Eckhoff, D., Zhu, M., Lee, A., & Cassinelli, A. (2024). Participatory design of a virtual reality life review therapy system for palliative care. Frontiers in Virtual Reality, 5, 1304615.
Environmental Simulation Platforms
Mentioned in 17 papers - Specialized simulation environments
Representative examples:
- Graf, L., Sykownik, P., Gradl-Dietsch, G., & Masuch, M. (2024). Towards believable and educational conversations with virtual patients. Frontiers in Virtual Reality, 5, 1377210.
- Retz, C., Klotzbier, T. J., Ghellal, S., & Schott, N. (2024). CIEMER in action: from development to application of a co-creative, interdisciplinary exergame design process in XR. Frontiers in Virtual Reality, 5, 1376572.
Virtual Space Platforms
Mentioned in 4 papers - Spatial computing environments
Representative examples:
- Kaiser, K., Walters, S., Sheehy, K., Murray, E., & Spencer, K. (2024). Implementing the technology shift from 2D to 3D: insights and suggestions for umpire educators. Frontiers in Virtual Reality, 5, 1368648.
- Yashin, A. S., Lavrov, D. S., Melnichuk, E. V., Karpov, V. V., Zhao, D. G., & Dubynin, I. A. (2024). Robot remote control using virtual reality headset: studying sense of agency with subjective distance estimates. Virtual Reality, 28(3), 132.
Simulation Engines
Mentioned in 2 papers - Physics and simulation systems
Representative examples:
- Hertwig, A. F., Brandewiede, A., & Feufel, M. A. (2024). Using virtual reality to support the design of work systems in 3P workshops: a use case from the automotive industry. Frontiers in Virtual Reality, 5, 1268780.
Virtual Scene Platforms
Mentioned in 2 papers - Scene reconstruction systems
Representative examples:
- Tabone, W., Happee, R., Yang, Y., Sadraei, E., García de Pedro, J., Lee, Y. M., ... & de Winter, J. (2024). Immersive insights: evaluating augmented reality interfaces for pedestrians in a CAVE based experiment. Frontiers in Virtual Reality, 5, 1353941.
Contextual VR Platforms
Mentioned in 1 paper - Context-aware VR systems
Representative example:
- Kaiser, K., Walters, S., Sheehy, K., Murray, E., & Spencer, K. (2024). Implementing the technology shift from 2D to 3D: insights and suggestions for umpire educators. Frontiers in Virtual Reality, 5, 1368648.
Virtual Setting Platforms
Mentioned in 1 paper - Environment configuration systems
Representative example:
- Kaiser, K., Walters, S., Sheehy, K., Murray, E., & Spencer, K. (2024). Implementing the technology shift from 2D to 3D: insights and suggestions for umpire educators. Frontiers in Virtual Reality, 5, 1368648.
Virtual World Platforms
Mentioned in 1 paper - Persistent virtual worlds
Representative example:
- MacArthur, C., Kukshinov, E., Harley, D., Pawar, T., Modi, N., & Nacke, L. E. (2024). Experiential disparities in social VR: uncovering power dynamics and inequality. Frontiers in Virtual Reality, 5, 1351794.
Research Tools & Measurement Systems
Evaluation Systems
Mentioned in 10 papers - Assessment and evaluation frameworks
Representative examples:
- Song, Z., & Evans, L. (2024). The museum of digital things: extended reality and museum practices. Frontiers in Virtual Reality, 5, 1396280.
- Schlagowski, R., Volanti, M., Weitz, K., Mertes, S., Kuch, J., & André, E. (2024). The feeling of being classified: raising empathy and awareness for AI bias through perspective-taking in VR. Frontiers in Virtual Reality, 5, 1340250.
Application Platforms
Mentioned in 8 papers - Application development frameworks
Representative examples:
- Amm, V., Chandran, K., Engeln, L., & McGinity, M. (2024). Mixed reality strategies for piano education. Frontiers in Virtual Reality, 5, 1397154.
Data Analysis Systems
Mentioned in 7 papers - Analytics and data processing
Representative examples:
- Ban, Y., Inazawa, M., Kato, C., & Warisawa, S. I. (2024). VR communication simulation with scripted dialog elicits HPA axis stress. Frontiers in Virtual Reality, 4, 1302720.
- Ismael, M., McCall, R., McGee, F., Belkacem, I., Stefas, M., Baixauli, J., & Arl, D. (2024). Acceptance of augmented reality for laboratory safety training: methodology and an evaluation study. Frontiers in Virtual Reality, 5, 1322543.
Assessment Tools
Mentioned in 6 papers - Measurement and assessment instruments
Representative examples:
- Renata, A., Guarese, R., Takac, M., & Zambetta, F. (2024). Assessment of embodied visuospatial perspective taking in augmented reality: insights from a reaction time task. Frontiers in Virtual Reality, 5, 1422467.
- Ismael, M., McCall, R., McGee, F., Belkacem, I., Stefas, M., Baixauli, J., & Arl, D. (2024). Acceptance of augmented reality for laboratory safety training: methodology and an evaluation study. Frontiers in Virtual Reality, 5, 1322543.
Input Devices
Mentioned in 5 papers - Interaction and input systems
Representative examples:
- Liu, S., & Lindlbauer, D. (2024). TurnAware: motion-aware Augmented Reality information delivery while walking. Frontiers in virtual reality, 5, 1484280.
Measurement Sensors
Mentioned in 5 papers - Sensor and measurement systems
Representative examples:
- Guillen-Sanz, H., Checa, D., Miguel-Alonso, I., & Bustillo, A. (2024). A systematic review wearable biosensor usage in immersive virtual reality experiences. Virtual Reality, 28(2), 74.
- Michiels, N., Jorissen, L., Put, J., Liesenborgs, J., Vandebroeck, I., Joris, E., & Van Reeth, F. (2024). Tracking and co-location of global point clouds for large-area indoor environments. Virtual Reality, 28(2), 106.
Data Processing Systems
Mentioned in 4 papers - Data processing and analysis
Representative examples:
- Morales-Vega, J. C., Raya, L., Rubio-Sánchez, M., & Sanchez, A. (2024). A virtual reality data visualization tool for dimensionality reduction methods. Virtual Reality, 28(1), 41.
Interface Design Tools
Mentioned in 3 papers - UI/UX design systems
Representative examples:
- Reski, N., Alissandrakis, A., & Kerren, A. (2024). Designing a 3D gestural interface to support user interaction with time-oriented data as immersive 3D radar charts. Virtual Reality, 28(1), 30.
Development Frameworks
Mentioned in 3 papers - Development and prototyping tools
Testing Platforms
Mentioned in 2 papers - Testing and validation systems
Representative examples:
- Ramírez, M., Müller, A., Arend, J. M., Himmelein, H., Rader, T., & Pörschmann, C. (2024). Speech-in-noise testing in virtual reality. Frontiers in Virtual Reality, 5, 1470382.
Control Systems
Mentioned in 2 papers - Control and automation systems
Representative examples:
- Yashin, A. S., Lavrov, D. S., Melnichuk, E. V., Karpov, V. V., Zhao, D. G., & Dubynin, I. A. (2024). Robot remote control using virtual reality headset: studying sense of agency with subjective distance estimates. Virtual Reality, 28(3), 132.
Design Tools
Mentioned in 2 papers - Design and modeling tools
Representative examples:
- Song, Z., & Evans, L. (2024). The museum of digital things: extended reality and museum practices. Frontiers in Virtual Reality, 5, 1396280.
Training Platforms
Mentioned in 2 papers - Training and education systems
Representative examples:
- Ismael, M., McCall, R., McGee, F., Belkacem, I., Stefas, M., Baixauli, J., & Arl, D. (2024). Acceptance of augmented reality for laboratory safety training: methodology and an evaluation study. Frontiers in Virtual Reality, 5, 1322543.
Monitoring Systems
Mentioned in 2 papers - Monitoring and tracking systems
Representative examples:
- Michiels, N., Jorissen, L., Put, J., Liesenborgs, J., Vandebroeck, I., Joris, E., & Van Reeth, F. (2024). Tracking and co-location of global point clouds for large-area indoor environments. Virtual Reality, 28(2), 106.
Communication Tools
Mentioned in 1 paper - Communication systems
Representative example:
- Ban, Y., Inazawa, M., Kato, C., & Warisawa, S. I. (2024). VR communication simulation with scripted dialog elicits HPA axis stress. Frontiers in Virtual Reality, 4, 1302720.
Feedback Systems
Mentioned in 1 paper - Feedback and response systems
Representative example:
- Joolee, J. B., Hashem, M. S., Hassan, W., & Jeon, S. (2024). Deep encoder–decoder network based data-driven method for impact feedback rendering on head during earthquake. Virtual Reality, 28(1), 23.
Haptic Devices
Mentioned in 1 paper - Haptic feedback systems
Representative example:
- Palombo, R., Weber, S., Wyszynski, M., & Niehaves, B. (2024). Glove versus controller: the effect of VR gloves and controllers on presence, embodiment, and cognitive absorption. Frontiers in Virtual Reality, 5, 1337959.
Navigation Systems
Mentioned in 1 paper - Navigation and guidance systems
Representative example:
- Buwaider, A., El-Hajj, V. G., Iop, A., Romero, M., C Jean, W., Edström, E., & Elmi-Terander, A. (2024). Augmented reality navigation in external ventricular drain insertion—a systematic review and meta-analysis. Virtual Reality, 28(3), 141.
Rendering Systems
Mentioned in 1 paper - Graphics rendering systems
Representative example: