iQ3Connect 2024.3 expands the power and ease of use of web-based XR training and work instructions. In this update, we have drastically simplified the process of creating user interfaces, whether fully customized or template-based, improved the capabilities for adapting AR work instructions to the physical environment, and automated the export of training performance metrics to other tools, such as a learning management system (LMS). Below are additional details on the new features in iQ3Connect 2024.3.
Physical Measurement and Data Capture for AR – During AR training or AR work instruction, users can enter physical measurements and other data into the virtual experience. These inputs can be stored, manipulated, and reported for later use in user feedback, data analytics, and more. They can also control the flow of the AR experience, for example requiring additional steps when a measurement falls outside its nominal range.
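The branching behavior described above — routing the user through additional steps when a measurement is out of spec — can be sketched as a simple tolerance check. This is a generic illustration only, not iQ3Connect's actual API; all function and value names here are hypothetical:

```typescript
// Hypothetical tolerance check; not part of the iQ3Connect API.
// Returns true when a captured measurement is within the nominal
// value plus or minus an allowed tolerance.
function withinTolerance(measured: number, nominal: number, tolerance: number): boolean {
  return Math.abs(measured - nominal) <= tolerance;
}

// Branch the AR flow: continue normally, or send the user to
// additional remediation steps when the reading is out of spec.
function nextStep(measured: number, nominal: number, tolerance: number): string {
  return withinTolerance(measured, nominal, tolerance)
    ? "proceed"
    : "remediation";
}
```

In an authored experience, the equivalent decision would be configured in the iQ3Connect tooling rather than hand-coded; the sketch only shows the underlying logic.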
Easy, Customizable User Interfaces – Simplify user interface design for XR training and collaboration. Import multimedia such as images, PDFs, or videos to create buttons and interactive objects in seconds. Leverage style templates to ensure a consistent look and feel for text boxes and buttons.
Quick and Feature-rich Labels for Providing Text, Audio, and Web Resources – iQ3Connect training modules and experiences now support built-in labels and descriptions that can be created and customized in seconds. These labels (also called Information Tags) can be attached to virtual objects to display text, play audio files, and/or link to URLs, offering a quick and easy way to present textual information, audio narration, and web-based resources.
Automated Export of Training Results and Metrics – Easily export the results and metrics from an iQ3Connect XR training or experience to any web-based tool using the post-message framework.
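Assuming the "post-message framework" refers to the browser's standard postMessage mechanism (an inference from the name), a page that embeds the XR experience in an iframe could receive results via a message event. The message shape and LMS endpoint below are purely hypothetical:

```typescript
// Hypothetical result payload; the actual data produced by
// iQ3Connect may differ.
interface TrainingResult {
  trainee: string;
  score: number;
  completed: boolean;
}

// Parse an incoming message, ignoring anything that is not a
// training-result payload.
function parseTrainingResult(data: unknown): TrainingResult | null {
  if (typeof data !== "object" || data === null) return null;
  const d = data as Record<string, unknown>;
  if (typeof d.trainee !== "string" || typeof d.score !== "number") return null;
  return { trainee: d.trainee, score: d.score, completed: Boolean(d.completed) };
}

// In the browser, listen for messages from the embedded experience
// and forward valid results to a web-based tool such as an LMS.
if (typeof window !== "undefined") {
  window.addEventListener("message", (event: MessageEvent) => {
    // In production, verify event.origin against the expected
    // iQ3Connect host before trusting the data.
    const result = parseTrainingResult(event.data);
    if (result) {
      // Illustrative endpoint only.
      fetch("/lms/api/results", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(result),
      });
    }
  });
}
```

Because postMessage is a standard browser API, this pattern works with any web-based tool that can host the experience, which matches the "export to any web-based tool" claim above.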
Wayfinding and Positional Triggers – Guide users to the object, task, or location of interest with virtual waypoints or arrows. Trigger actions in the experience based on the user’s position and/or orientation.
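A positional trigger of the kind described above amounts to a proximity test against a waypoint. The sketch below is a generic illustration of that idea, not iQ3Connect's implementation; all names are hypothetical:

```typescript
// Hypothetical 3D point type for the sketch.
type Vec3 = { x: number; y: number; z: number };

// Euclidean distance between two points.
function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

// Returns true when the user is inside the trigger radius of a
// waypoint, at which point the authored action would fire.
function inTriggerZone(user: Vec3, waypoint: Vec3, radius: number): boolean {
  return distance(user, waypoint) <= radius;
}
```

An orientation-based trigger would add a similar test on the angle between the user's view direction and the direction to the target.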
Capture Alphanumeric Input during an XR Training or Experience – Users can now be prompted to enter text and numbers during an XR training or experience. This lets training authors evaluate trainee performance with open-ended questions and capture open-ended feedback from end users directly within the XR experience.
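One simple way an author might score an open-ended answer is a keyword check, sketched below. This is an illustrative assumption, not a description of how iQ3Connect evaluates responses:

```typescript
// Hypothetical keyword-based check for an open-ended answer.
// Returns true when the answer mentions every required keyword,
// case-insensitively.
function answerMentionsAll(answer: string, keywords: string[]): boolean {
  const normalized = answer.toLowerCase();
  return keywords.every((k) => normalized.includes(k.toLowerCase()));
}
```

Numeric inputs captured the same way could instead feed the tolerance-style checks used elsewhere in an experience.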