iQ3Connect Product Release v2025.2

iQ3Connect 2025.2 is here! We’re making XR training and work instruction creation easier, faster, and more powerful with new automation capabilities and AI/IoT integrations – so you can focus on what matters most: upskilling your workforce and improving operational efficiency. Key updates include Automated Experience Creation, a new Step Instruction Menu, and AI and IoT integration for work instructions, task verification, and digital twins. Explore the new capabilities of iQ3Connect 2025.2 below. 

 

Training and Work Instructions

 

  • Automated Experience Creation: Automatically create XR training and experiences using the new Training Wizard, or quickly duplicate existing content and structure using the new copy/paste functionality. The Training Wizard automatically creates a fully functioning XR experience from 2 or more States (i.e., Scenes/Steps) and can also add Next/Back navigation buttons to a step-by-step experience. The copy/paste functionality lets existing content (actions) or structure (timelines) be quickly duplicated.

 

  • Work Instruction Task Verification – AI + IoT Integration: Integrate artificial intelligence outputs and IoT sensor data into XR experiences. A new training action, External Signal Receiver, listens for external data input and updates the XR experience in real time, either reporting the data as-is or adapting the experience dynamically based on its content. These integrations are primarily targeted toward digital twins and work instruction task verification but can be applied to any use case (a minimal integration sketch appears at the end of this section).

 

  • Step Instruction Menu: The new Step Instruction Menu provides an improved UI that combines text information with menus and buttons. The Step Instruction Menu is completely customizable, from the colors to the icons to the number of buttons, and works in 2D, AR, and VR modes, whether on mobile, tablet, PC, or HMD.

 

  • XR Action – Time Tracking: Capture the time it takes to perform a step (or any arbitrary sequence) using the new Time Tracking actions. An unlimited number of durations can be tracked, with each duration stored as a data point within the experience. This data can be automatically passed to an LMS or easily exported to third-party tools.

 

  • XR Action – Flash Object: A new training action that will cause the defined object to flash. The flashing can be customized to change the visibility, transparency, or highlighting of the object.

 

  • Animations in GLB files: The ability to play animations embedded in GLB files has been expanded with new end-of-animation behaviors: Reset, Stop, and Loop. Reset returns the animated objects to their initial position/configuration, Stop leaves them in the position/configuration set at the end of the animation, and Loop plays the animation continuously. These settings are included as part of the GLB Animation action.

 

  • Augmented Reality – Virtual Hands: When entering AR mode on a head-mounted display (HMD), the virtual hands are no longer shown by default. Instead, the virtual menus and guides are now mapped directly to the user’s physical hands, giving the user better visibility of the physical space. As part of this change, a new setting can be used to turn the virtual hands back on and make that the default behavior.

 

  • Text Box Behavior Improvements: Easily display text to end-users in 3D environments with our new Text Box behavior: Scene-Follow. When set to Scene-Follow mode, text boxes will appear in front of the user and will gradually follow them around the 3D environment. 

 

  • XR Performance Improvements: The lag and frame rate drops for animations and state transitions (i.e. LoadState actions) have been greatly reduced, leading to an overall smoother experience for the end-user.

 

  • Improved UI/UX for Experience Authoring: Some important elements of the experience creation process have been updated to provide an improved UI/UX. The Training Add Action menu has been reorganized to provide a more intuitive order, while the Create State and Add Timeline buttons have been moved to the relevant section header, eliminating the need to scroll to access these functions.

 

  • Reduced the Number and Frequency of the Movement Locked and Orientation Locked Messages: When an end-user is in an XR experience with movement or orientation locked, they automatically receive a message when trying to move or rotate, informing them that their controls are locked. These messages were occurring too often, which could disrupt the overall experience, so their number and frequency have been greatly reduced.
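
For teams planning the AI/IoT integrations above, the sketch below shows the general shape of an external sender: a small script that pushes a sensor reading or AI result as a JSON payload for the External Signal Receiver to pick up. The endpoint URL, signal name, and payload fields are illustrative assumptions rather than the documented iQ3Connect interface; the actual contract is defined by how the External Signal Receiver action is configured, so consult the product documentation before integrating.

```typescript
// Hypothetical external sender for the AI/IoT integration described above.
// The endpoint URL, signal name, and payload fields are assumptions for
// illustration only; they are not the documented iQ3Connect interface.

interface ExternalSignal {
  signal: string;          // name the External Signal Receiver is configured to listen for
  value: number | string;  // sensor reading or AI output
  timestamp: string;       // ISO-8601 time of the measurement
}

async function sendSignal(endpoint: string, payload: ExternalSignal): Promise<void> {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!response.ok) {
    throw new Error(`Signal delivery failed with status ${response.status}`);
  }
}

// Example: report a torque reading from a connected wrench so the experience
// can verify a work instruction step in real time.
sendSignal("https://example.invalid/workspace/external-signal", {
  signal: "torque-check",
  value: 42.5,
  timestamp: new Date().toISOString(),
}).catch((err) => console.error(err));
```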

 

 

Virtual Workspaces and Classrooms

 

  • Real-Time Digital Twins – AI + IoT Integration: Virtual workspaces can now be linked to data sources such as IoT sensors and AI outputs to create digital twins that are updated in real time based on the real-world environment. These integrations are primarily targeted toward digital twins and work instruction task verification but can be applied to any use case.

 

  • Save Scene in 2D Menu: The Save Scene button has been added to the standard 2D menu, making it more readily accessible. It can now be found in the upper-left corner menu under My Content > Scenes & Sessions. Note: it is still available in the XR menu as well.

 

 

Enterprise Hub

 

  • Launch Workspace Improvements: Launching a Workspace has been streamlined to provide easier access to XR models and Workspace settings. Launching a Workspace directly from the Project Home screen now automatically selects all of the available XR models in that Project. Alternatively, entire folders can now be selected from the XR Models page when launching a Workspace. Finally, a new quick settings menu appears when launching a Workspace, providing options to change the workspace duration, the public/private setting, and whether the menu is enabled or disabled.

 

  • Project Management and Default Template Editing: The ability to manage and administer Projects has been expanded. The Default Project Template can now be edited. Project Templates can now be used to update Projects (individually or in bulk) even after the Project has been created. The background environment can now be customized as part of the Project Template.

 

  • UI/UX Improvements for Inviting Users to Projects and Accessing Multimedia: The UI/UX for inviting users to Projects has been greatly improved, making it easier to quickly invite users and understand what invitations are still pending. Additionally, the Multimedia page is now the first page shown when the External Assets tab is selected.

 

  • Announcements Readability Improvement: When an announcement is selected, it is now fully expanded as a pop-up to improve readability.

iQ3Connect Product Release v2025.1

iQ3Connect v2025.1 introduces a range of enhancements to expand the capabilities of web-based spatial training and work instructions, while improving the ease of XR content creation. Key updates include intuitive touch-based AR alignment, new training actions and triggers to improve interactivity, navigation and event-responsiveness, enhancements to tracking and logic to simplify performance monitoring and adaptive adjustments, and a streamlined UI/UX to make the training creation process even easier. Explore the new capabilities of iQ3Connect 2025.1 below. 

 

Training and Work Instructions

 

  • Touch-based AR Alignment – Head Mounted Displays (HMDs): Precise alignment of the virtual and physical environments can now be achieved through intuitive touch-based alignment with the controllers or hands. This new feature is compatible with any AR headset and only requires the end-user to touch 2 designated points in the physical environment to achieve alignment. This new process works without markers and can align to any arbitrary surface, regardless of size, shape, or material.

 

  • XR Action – Move Objects: A new training action has been added to the iQ3Connect Training Studio – Move Objects. Creators of XR experiences can now visually define which virtual objects end-users can interact with and manipulate. Whitelist objects to quickly define a small set of interactive objects, leaving everything else locked down, or blacklist objects to lock down (make non-interactable) a few objects while keeping everything else interactable.

 

  • XR Action – Wayfinding: A new training action has been added to the iQ3Connect Training Studio – Wayfinding. The wayfinding action displays a virtual path toward a specific part/location in the environment, enabling end-users to more easily find the part or navigate to the desired location. Wayfinding can be paired with a User Position Trigger to automatically spawn events and actions once the user arrives at the destination.

 

  • XR Trigger – User Position: The new User Position Trigger can spawn actions and events based on the distance of the user to a designated object. Use this trigger to detect once the user has reached within a certain distance of the desired object or location. This trigger is available from the Wayfinding Action and can be used with or without wayfinding’s virtual path.

 

  • XR Trigger – Object-to-Object Proximity: The new Object-to-Object Proximity Trigger can spawn actions and events based on the distance between 2 virtual objects. Use this trigger to detect when the distance between 2 objects falls below a defined threshold, such as when verifying whether an end-user has correctly placed a virtual part or tool (see the distance-check sketch at the end of this section).

 

  • Tracking, Logic, and Variables: Our extensive tracking, logic, and variable system is now accessible directly from the Training Studio, meaning that scripting is no longer required. Our new visual interface makes it easy to record end-user outcomes (such as time to completion, incorrect steps, answers to questions, etc.) through a flexible tracking system. Define and change variables based on the metrics to be tracked and the performance of the end-user. Variables can be used in combination with logic statements to adjust the training dynamically to user performance and user selections.

 

  • Camera and Navigation Control: Controlling the end-user’s camera and navigation has been drastically simplified with improvements to the Training Studio. Locking camera movement and/or orientation is now controlled through toggle switches in the State properties. Navigational boundaries (such as preventing end-users from walking through walls) can be set up visually, including stacked navigational rules (e.g., a user is allowed to move throughout a room but not through the objects within it).

 

  • Download Training Data and Results: Training data and results can now be downloaded directly from the iQ3Connect Hub. In Workspace Templates > Past Workspaces, a new Download Training Results icon is available next to each past workspace.

 

  • Training Creation UI/UX Improvements: Several UI/UX improvements make training creation more efficient. The Training Studio now opens automatically when a training is created, and the starting timeline is premade. The action property viewer automatically opens when an action is added to a timeline. A new start/stop button enables seamless transitions between authoring and training preview modes.
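
To make the geometry behind the new position and proximity triggers concrete, here is a minimal sketch of the distance check such a trigger performs. The object names, positions, and threshold are placeholder values; in iQ3Connect the trigger is configured visually in the Training Studio rather than written in code.

```typescript
// Minimal sketch of the distance check behind a proximity-style trigger.
// Positions and the threshold are placeholders; the real trigger is set up
// visually in the Training Studio, not in code.

interface Vec3 { x: number; y: number; z: number; }

function distance(a: Vec3, b: Vec3): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

function proximityTriggerFired(objectA: Vec3, objectB: Vec3, thresholdMeters: number): boolean {
  // Fires once the two objects are closer than the configured threshold,
  // e.g. to verify that a virtual part or tool has been placed correctly.
  return distance(objectA, objectB) < thresholdMeters;
}

// Example: a part placed within 5 cm of its target socket counts as correctly placed.
const part: Vec3 = { x: 1.02, y: 0.75, z: -0.48 };
const socket: Vec3 = { x: 1.0, y: 0.75, z: -0.5 };
console.log(proximityTriggerFired(part, socket, 0.05)); // true
```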

 

 

Virtual Workspaces and Classrooms

 

  • Improved Object Movement: Object movement has been drastically improved, providing faster and more accurate movement of objects within the 3D environment. Improvements have been made for both PC/tablet and XR movement controls. Movement is now controlled via click-and-drag (replacing the old click-to-start, click-again-to-stop system), while a simple move-speed modifier (such as holding the Shift key or clicking an on-screen button) lets users seamlessly transition between fast movement and precision movement.

 

  • Improvements to Rejoining a Workspace: When attempting to join a meeting multiple times from the same account, preference is given to the most recent connection, removing any older connections from the workspace. To prevent accidental or malicious removal, the older connection is prompted to accept or reject the newer connection. Acceptance of the new connection (or inactivity in responding to the prompt) removes the old connection and allows the new connection into the workspace.

 

  • Information Tag Status Indicator and Render Improvement: When viewing the list of information tags in a workspace, active info tags (tags already added to the workspace) will now be highlighted in the list to provide a visual indication of what’s in the workspace and what’s not. Info tag placement on 3D objects has also been improved, minimizing improper occlusion of the info tag to improve its visibility.

 

  • Default Environment Update: The default background environment for the iQ3Connect Workspace has been updated.

 

 

Enterprise Hub

 

  • Simplified Help Access: Accessing the iQ3Connect Knowledge Base has been greatly simplified. The help icon now redirects users to the knowledge base directly.

 

  • XR Model Organization – Grid View: Grid View (as opposed to the default list view) allows users to view an enlarged snapshot of all their XR models in an organized grid directly from the iQ3Connect Hub.

iQ3Connect Product Release v2024.5

Web-based spatial training and work instructions are now more accessible than ever with the release of iQ3Connect 2024.5. This latest update simplifies AR alignment, model optimization, animation authoring, and point cloud visualization. Discover the exciting new capabilities of iQ3Connect 2024.5 below. 

 

  • AR Alignment – Tablets and Mobile Devices – To make AR experiences and work instructions more user-friendly, iQ3Connect has introduced automated physical/virtual alignment for tablets and mobile devices. With just two taps, your virtual and physical worlds are aligned. No need to rely on markers and no need for flat surfaces. This new automated alignment works for objects of any size and shape.

 

  • Real-Time Model Simplification and Optimization – While our automated optimization often eliminates the need for manual simplification, we’ve now made manual simplification easier than ever when it is needed. Authors can simplify models dynamically while creating immersive experiences, ensuring critical decisions are made at the right time. Concerned about over-simplifying or losing essential details? With iQ3Connect’s robust versioning system, restoring previous versions of your model is seamless and instantaneous, empowering you to create without worry.

 

  • Author Animations with Alignment and Precision – New tools are available to make it even easier to create customized animations. A new alignment mechanism makes it easy to animate objects in relation to other objects, such as when needing to snap pieces together, while a new precision move mechanism allows for fine control of animations at millimeter accuracy.

 

  • Point Cloud Visualization – Import and view large point cloud datasets faster than ever. Our latest update drastically reduces the time it takes to process point cloud files, while our optimized rendering allows any device to seamlessly view point clouds, even with billions of points.

September 2024 Newsletter

iQ3Connect – September Newsletter

Reimagine Workforce Training with iQ3Connect’s Release of XR Visual Authoring

It’s time to reimagine workforce training, and rethink the potential of immersive technology to capture and share corporate knowledge, build critical skills, and empower collaborative learning.

 

iQ3Connect is excited to announce a transformative solution for workforce development with our XR Training and Experience Creator – a visual (no-code) authoring environment for immersive training, work instructions, and knowledge capture. It is accessible via a web browser from any device, whether a laptop, tablet, phone, or AR/VR headset. Empower anyone on your team to create engaging XR experiences as quickly as a slide deck and more cost-effectively than video production. No XR, CAD, or visual design expertise required.

 

By removing the traditional barriers to XR adoption, iQ3Connect enables 10X faster content creation at 10% of the cost. 

 

Key Benefits:

  • No-Code Authoring: Visual tools to empower anyone to build interactive training without XR, CAD, or coding expertise.
  • Cost-Effective: Deploy immersive learning without the prohibitive costs of traditional XR technology, keeping your budget intact.
  • Scalable: Training can be accessed instantly from a web browser on any device, eliminating app downloads and traditional IT and hardware bottlenecks.
  • Fast & Easy Access: Skip the time-consuming travel and launch training programs faster than ever, ensuring your workforce remains up-to-date and competitive from any location.

 

The Future of Training is Here

 

With the iQ3Connect XR Training and Experience Creator, you can finally unlock the full potential of XR technology. This platform makes it easier, faster, and more affordable to create immersive training experiences that drive engagement, retention, and performance.

 

Get Started Today!

 

Request a free trial or schedule a demo.

 

The future of training is immersive, affordable, and scalable—thanks to iQ3Connect.

iQ3Connect Product Release v2024.4

iQ3Connect 2024.4 is streamlining the creation and management of XR training and work instructions. All you need is a web browser! This latest update enhances animation creation, integrates immersive 360-degree media, automates multimedia updates, and introduces new ways to build, visualize, and interact with XR scenes. Discover the exciting new capabilities of iQ3Connect 2024.4 below. 

 

  • Easy Animation Creation and Import – Seamlessly control animations embedded in .glb or .fbx models directly in iQ3 training and work instructions. Alternatively, with iQ3Connect’s Training Creator, authors can create their own animations on any 3D model in seconds. Animations can be applied, repeated, and reused infinitely across multiple different objects.

 

  • 360-Degree Images and Videos – Enhance your XR training and experiences with 360-degree images and videos. These immersive media options are perfect when 3D models aren’t available, or they can be combined with 3D models to create rich, interactive XR experiences.

 

  • Automated Content Management – iQ3Connect can now automatically import and update 2D and 3D content, even if included in XR training and experiences. Assets such as CAD models, videos, and PDFs will be automatically updated in the XR experience when their underlying data is updated, ensuring your XR experiences are always up-to-date.

 

  • Multimedia Movement and Anchoring – Now, multimedia objects can be moved and rotated using the same Coordinate System Controls as 3D objects. This unification simplifies object manipulation and enhances ease of use, allowing multimedia objects to be more accurately placed in the 3D virtual environment. Additionally, multimedia can be anchored to 3D objects to create realistic virtual scenes.

 

  • Dollhouse Mode – Traditionally, iQ3 Virtual Workspaces display 3D models at 1-to-1 scale providing users with a realistic experience. With the new Dollhouse mode, users can switch effortlessly between a 1-to-1 scale and a miniaturized version of the 3D model. This new mode provides an easy way to visualize large models, just like viewing scale models in reality.

July 2024 Newsletter

iQ3Connect – July Newsletter

June 2024 Newsletter

iQ3Connect – Magic Leap and Product Release

iQ3Connect Product Release v2024.3

iQ3Connect 2024.3 is expanding the power and ease of use of web-based XR training and work instructions. In this latest update, we have drastically simplified the process of creating user interfaces, whether completely customized or template-based, improved the capabilities for adapting AR work instructions based on the physical environment, and automated the export of training performance and metrics to other tools, such as an LMS. Below are some additional details on the new features in iQ3Connect 2024.3.

 

  • Physical Measurement and Data Capture for AR – During AR training or AR work instructions, users can input physical measurements and data into the virtual experience. These inputs can be stored, manipulated, and reported for later use in user feedback, data analytics, and more. They can also be used to control the AR experience, such as when additional steps must be taken if measurements aren’t within nominal values.

 

  • Easy, Customizable User Interfaces – Simplify the process of user interface design for XR training and collaboration. Import multimedia such as images, PDFs, or videos to create buttons and interactive objects in seconds. Leverage style templates to ensure a consistent look and feel for text boxes and buttons.

 

  • Quick and Feature-rich Labels for Providing Text, Audio, and Web Resources – iQ3Connect training modules and experiences now support in-built labels and descriptions that can be created and customized in seconds. These labels (also called Information Tags) can be attached to virtual objects and display text, play audio files, and/or provide URL links, offering an easy and quick method for displaying textual information, audio narration, and/or web-based resources.

 

  • Automated Export of Training Results and Metrics – Easily export the results and metrics from an iQ3Connect XR training or experience to any web-based tool using the post-message framework (a minimal listener sketch appears at the end of this section).

 

  • Wayfinding and Positional Triggers – Guide users to the object, task, or location of interest with virtual waypoints or arrows. Trigger actions in the experience based on the user’s position and/or orientation.

 

  • Capture Alphanumeric Input during an XR Training or Experience – Users can now be prompted to input text and numbers during an XR training or experience. This provides training authors with the capability to evaluate trainee performance with open-ended questions and capture open-ended feedback from end-users directly from within the XR experience.
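
Because the export uses the browser’s postMessage mechanism, a host page that embeds an iQ3Connect experience can receive results with a standard message listener, as sketched below. The expected origin and the payload field names are assumptions for illustration; check the iQ3Connect documentation for the actual message format.

```typescript
// Minimal sketch of a host page receiving exported training results via postMessage.
// The expected origin and the payload fields are assumptions, not the documented format.

interface TrainingResultMessage {
  type: "trainingResults";   // hypothetical discriminator for result messages
  completed: boolean;
  durationSeconds: number;
  score?: number;
}

const EXPECTED_ORIGIN = "https://your-iq3connect-host.example"; // replace with your deployment

window.addEventListener("message", (event: MessageEvent) => {
  if (event.origin !== EXPECTED_ORIGIN) return; // ignore messages from other origins

  const data = event.data as Partial<TrainingResultMessage>;
  if (data?.type !== "trainingResults") return; // ignore unrelated messages

  // Forward the results to an LMS, analytics service, or database as needed.
  console.log(`Completed: ${data.completed}, duration: ${data.durationSeconds}s, score: ${data.score ?? "n/a"}`);
});
```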

iQ3Connect Announces ISV Partnership with Magic Leap

iQ3Connect is proud to announce our ISV partnership with Magic Leap! iQ3Connect provides a scalable, end-to-end XR platform that allows Magic Leap enterprise customers to bring their CAD, point cloud scans, PDFs, videos, and other digital assets securely into an immersive environment for training or multi-user collaboration. With iQ3Connect, there is no software installation or individual licensing on the device, greatly simplifying enterprise deployment. Use your Magic Leap 2 out of the box to view and interact with your digital assets in minutes; just use the built-in web browser or QR code app.

 

Check out the Magic Leap website for additional information: https://www.magicleap.com/apps
