When we set out to create Fynd CORE, it was with the intention of unifying the features we regularly saw clients needing, requirements shared across industry, education, and defence. Part of this challenge was to remove the barriers that exist between all the different devices we use in our digital lives. VR headsets, AR glasses and mobile phones all have their own strengths and weaknesses that should dictate how and when we use them. Let’s look at what that means!
One shared world
At its simplest, Fynd CORE is a group of people moving around in a shared 3D environment, like an office or a forest. With that as the foundation, we can design our software to run on a multitude of devices and surfaces. By allowing users to participate in a CORE session on the device they know best, we can tear down some of the mental barriers to adopting new technologies like VR and the Metaverse. If you can try out CORE on your phone first, then VR is much less scary when you recognise the menus and places.
Playing to each device’s strengths
Working with all these devices means we must consider what you can do on each one. For example, there is no point in supporting our complex Immersive Learning system on mobile phones, where the controls are awkward and there is too much information for such a small screen. Conversely, writing large amounts of text in VR is slow and frustrating! Instead, perform the training in VR and let the mobile user handle the text input. When the task involves physical objects, VR has the downside of removing you entirely from your surroundings. This is where AR lets you merge the physical and virtual worlds into one, giving you a new layer of information without cutting you off from your environment or the people around you.
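To make the idea concrete, here is a minimal sketch of how this kind of per-device feature gating could look in code. All names here (the feature list, device classes, and the `available_features` helper) are illustrative assumptions for this article, not Fynd CORE's actual API.

```python
# Hypothetical sketch: map each feature to the device classes that suit it,
# following the reasoning above (immersive training in VR, text input on
# mobile/desktop, AR overlays where you stay in your surroundings).
FEATURE_SUPPORT = {
    "immersive_learning": {"vr", "desktop"},
    "text_input":         {"mobile", "desktop"},
    "ar_overlay":         {"ar_headset", "mobile"},
}

def available_features(device: str) -> set[str]:
    """Return the features a session should expose on the given device."""
    return {feature for feature, devices in FEATURE_SUPPORT.items()
            if device in devices}

print(available_features("mobile"))  # text input and AR overlays, no VR training
```

A table like this keeps the decision in one place, so adding a new device class (say, a new headset) means extending the sets rather than scattering platform checks through the codebase.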
Use case: NATO operational support
This concept of sharing experiences across devices was properly put to the test in November 2022, when we ran a series of tests with NATO ACT, the Latvian Ministry of Defence and LMT. The virtual trainer we built for their satellite communications vehicle was implemented for desktop, VR, HoloLens and mobile AR, in this case even running on a 5G network in the field. This experiment showed that an operator can use AR in the field while a remote assistant follows along from their office, able to see the operator moving around the vehicle and guide them along. If needed, the assistant can even jump into VR to stand next to the operator and point!
Cross-platform is the way to go
We have a lot of faith in our approach to our Enterprise Metaverse, meeting our users where they are comfortable and providing new opportunities with VR and AR. Going forward, we will be running a lot of tests to find out what our users prefer to use each platform for, and what they feel is lacking. As Fynd CORE evolves, this constant loop through different platforms will define our development and user experience design.