The WebXR standard will support hand tracking on devices like the Oculus Quest, which is very exciting news for us. The ability to track your hands directly, with no additional gadgets like gloves or external trackers and no software installations like plugins, is critical for wide adoption. We have focused much of our effort on making our platform frictionless, with a zero IT footprint for all users, and in the near future we envision our users using hand tracking directly in their workflows. Let’s see how this works today and take a quick peek at the future.
Oculus Quest works out of the box for iQ3Connect meetings. Just open the built-in browser and use iQ3 just as you would on a PC. Fire up an iQ3 VR meeting space, load your 3D data, and enter VR.
3D models in iQ3 are intelligent, not just collections of objects as you might find in many VR experiences. This enables our users to perform real production engineering tasks collaboratively in iQ3. You can use your hand controllers to very precisely pick apart a model such as this engine. This could be part of a design review, an ergonomic check, or product training, where a trainee can interactively learn the assembly and disassembly of a complex product.
Now let’s switch over to the WebXR samples page, which hosts some cool examples of using a web browser for VR/AR. I can put down my controllers and the Quest switches to tracking my hands and fingers. It is a completely seamless experience.
The tracking showed some hiccups, but overall it was a great experience to see my virtual hands. As you can see, my fingers are fully tracked, so I can make complex gestures such as grabbing or pinching a virtual object with my fingers. Watch the complete video here and stay tuned for more updates on this topic.
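For readers curious how a web page gets access to this, here is a minimal sketch of what hand tracking looks like in code. It requests a WebXR session with the `hand-tracking` optional feature and reads fingertip joints using the joint names defined by the WebXR Hand Input Module; the pinch-distance helper and its 2.5 cm threshold are illustrative assumptions, not part of any iQ3 API.

```javascript
// Pure helper (illustrative): treat two joint positions, in meters, as a
// "pinch" when they are closer than a small threshold. The 0.025 m value
// is an assumption chosen for this sketch.
function isPinching(thumbTip, indexTip, threshold = 0.025) {
  const dx = thumbTip.x - indexTip.x;
  const dy = thumbTip.y - indexTip.y;
  const dz = thumbTip.z - indexTip.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz) < threshold;
}

// Browser-only portion: request an immersive session with hand tracking.
// Guarded so the sketch is harmless outside a WebXR-capable browser.
if (typeof navigator !== "undefined" && navigator.xr) {
  navigator.xr
    .requestSession("immersive-vr", { optionalFeatures: ["hand-tracking"] })
    .then((session) => {
      session.requestAnimationFrame(function onFrame(time, frame) {
        for (const source of session.inputSources) {
          if (!source.hand) continue; // a controller, not a tracked hand
          // Joint names come from the WebXR Hand Input Module spec.
          const thumb = source.hand.get("thumb-tip");
          const index = source.hand.get("index-finger-tip");
          // frame.getJointPose(joint, referenceSpace) yields each joint's
          // position; reference-space setup is omitted here for brevity.
        }
        session.requestAnimationFrame(onFrame);
      });
    });
}
```

When controllers are set down, the same input-source list simply starts reporting hand objects instead, which is why the switch feels seamless from the application's point of view.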