Merged XR turns immersive XR creation into an easy three-step process.
With the recent release of the 9.0 update to EON-XR, EON Reality’s main focus has shifted to making the creation and illumination process as smooth as possible. With the introduction of the recently patented Merged XR, we’re taking things a step further.
For those unfamiliar with EON-XR’s latest features, the “illumination” process refers to the ability to add digital data and activities to physical environments — meaning that every inanimate object in a real-life environment can be rich with vast amounts of information. This converts every room, setting, and item into an opportunity to create a thorough lesson or experience, leaving the creator’s imagination as the only limitation.
Now, EON Reality is making that process even simpler and more efficient through Merged XR, a dedicated product that turns “illumination” into a three-step, Artificial Intelligence-based tool that anyone with a smartphone can use.
Step 1: Scan
The first step of Merged XR uses one of EON-XR’s newest features: the ability to scan a physical environment with a smartphone. Once the environmental surroundings are scanned and uploaded, the AI can begin generating the 360° environment in which interactive 3D elements are identified. As always, the better the scan, the better the results, so it pays to lay down a solid foundation on which to build the full XR experience.
Step 2: Photograph
In this AI-assisted version of the EON-XR creation process, users photograph any important objects or aspects they would like to highlight in the finished lesson. From these photos, Merged XR can deduce everything from what each object looks like to where it is physically located within the environment.
From here, Merged XR can create knowledge panels, using technology such as Google Lens to generate annotation names, AI-enhanced web scraping to create annotation text, and Google Translate to make the information available in more than 100 languages with both text and voice support. The user can still benefit from the more advanced creation tools as well, such as adding PDFs, assessments, quizzes, and activities like location- and identity-based challenges.
Step 3: Demonstrate
The “walk and talk” stage of the process sees users simply moving around their physical environment while adding any additional necessary information, both physically and verbally. As users point out objects and elements while mentioning them, Merged XR’s AI does the rest of the work, automatically generating a 3D recording with the option to turn it into a sequence-based assessment as well.
Within this 3D recording, students, trainees, and other viewers will see the creator’s avatar moving around the space while the voiceover description triggers at the proper moments. The recording also uses “magic wand” technology to point out exactly which objects and aspects are being discussed at any given moment, and it shows both the user’s movement pathway and sequence of interactions throughout the entire experience. Finally, individual objects and components can be highlighted to show additional functionality and details, both digitally and physically, in the recording.
Through this simplified process, made possible by the new AI-based technology within Merged XR, EON Reality hopes to encourage users around the world to embrace the immersive possibilities of bringing students and other viewers into a complete environment, rather than relying on 360° images and videos or 3D objects alone.
Image courtesy of John Gaeta.