
AVRT: Taking a Closer Look at the VR Training (Part 1)   

Demand for immersive learning experiences is constantly growing. Virtual reality continues to expand across industries, changing the way training is delivered and accelerating the pace of learning.

We had an in-depth conversation with our partner AVRT, who is pushing the boundaries of how immersive technologies can drastically improve traditional training. Andy Higgs, Sales Director, guided us through the VR-based training platform they provide for emergency services, military, and high-risk operatives.

What is the Adaptive Virtual Reality Training platform?

It's a free-roam, fully immersive VR training system that's aimed primarily at law enforcement and the military but has applications in other areas as well. We've been building the system in collaboration with around 80% of UK police forces, and with some forces beyond the UK as well. We've got a highly functional system that has genuinely been designed for police officers and soldiers, based on their feedback.

Primarily, our system is designed for critical encounters. It's designed for those scenarios where we expect a danger either to members of the public or to the officers or soldiers themselves. The scenarios that we present to people are ones that do have an element of risk to them.

If we compare traditional training to virtual training, what significant differences can you note?

We spent quite a bit of time working with UK police, and some international police forces as well, looking at how they train at the moment. That was key to working out how we could improve it. The areas we focused on initially were firearms, where we looked at two-dimensional screen-based trainers and live training using simulation or paint rounds, and conducted energy weapons, where we looked at the live roleplay they did with instructors in the big padded suits.

What you get, when comparing that to virtual simulation, is a much better way to replay, assess and understand the decision making, because there's no approximation in there. We're not relying on what instructors saw or what the user saw. We record everything in full 3D, so we can see exactly what someone was seeing at the point they were making decisions.

In the virtual world, we can put you anywhere. We can put you into a petrol station, we can put you on a rooftop, and you have to deal with all the additional stressors that come with those settings. In a live roleplay, people are dealing with a human-size target in a real world where they feel comfortable; taking them out of that and into the virtual world makes a real difference.

We've often seen that we get a much more natural, realistic response: people engage more, and tactical communication with the subject increases.

What metrics do you use to evaluate the effectiveness of simulated training?

In terms of the effectiveness of training, what we are looking at is primarily the direct comparison between virtual reality training and traditional training methods, and I suppose the way that we gauge that is through feedback. There are very few metrics for the effectiveness of VR training versus traditional training, because a lot of the way it's done, and this is especially true of training in the UK, is very subjective.

So when you look at something like conducted energy weapons, for example, a scenario is carried out and the debrief to the officer will focus on a framework, the National Decision Model. But the way that framework is raised with the officer will depend very much on the instructor who's doing it. And we built a system that would mirror that.

We wanted to be able to handle that very subjective training, so that you can ask the user what their decision-making process was. Now, unless the scenario has gone horribly, horribly wrong, there's very rarely a specific right or wrong answer to those questions in that debrief. Obviously, we can build in metrics, and that's where it does get interesting, because when you look at traditional training, it's sometimes very difficult to judge where rounds or probes would have hit on a dummy, a target or, typically, an instructor.

Whereas in the virtual world, things are very specific. We can measure distances from subjects, we can measure reaction times, we can measure exact shot placement and the officer's view when they were taking particular actions. When you compare the two types of training, it's really key to remember that we're comparing traditional training with simulated training, with virtual reality and XR training, not comparing XR training with the real world. That's a mistake a lot of people make. The real keys are looking at where we can generate metrics that you can't get from traditional training, and then also getting feedback from users on how it felt compared to traditional training.
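As a concrete illustration of those measurements, the sketch below shows how distance, reaction time and shot placement could be derived from a recorded session, assuming the session is stored as time-stamped officer and subject positions plus shot events. The data model and function names here are illustrative assumptions, not AVRT's actual format.

    from dataclasses import dataclass

    @dataclass
    class Frame:
        t: float              # seconds since scenario start
        officer_pos: tuple    # (x, y, z) in metres, world space
        subject_pos: tuple    # (x, y, z) in metres, world space

    @dataclass
    class Shot:
        t: float
        hit_zone: str         # e.g. "torso", "limb", "miss"

    def dist(a, b):
        # Euclidean distance between two 3D points
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    def session_metrics(frames, shots, threat_cue_t):
        """Derive objective metrics from one recorded VR session."""
        min_distance = min(dist(f.officer_pos, f.subject_pos) for f in frames)
        first_shot = next((s for s in shots if s.t >= threat_cue_t), None)
        reaction_time = first_shot.t - threat_cue_t if first_shot else None
        return {
            "min_distance_m": round(min_distance, 2),
            "reaction_time_s": reaction_time,
            "shot_placement": [s.hit_zone for s in shots],
        }

Because the scenario is fully computer-generated, numbers like these come straight from the recorded positions rather than from an instructor's estimate.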

How are the sensations for a given scenario created? Does a virtual bullet hit feel realistic?

Typically, what we've done is look at the sensations that are included in the developer kit to start with, and then use the creator tools within the suite of software to create our own. A lot of that is done through trial and error. We experiment with the intensity, with the frequency and with the number of electrodes that we're effectively using until we get a sensation that feels like something we would want to include. The fidelity of the sensations created within the TESLASUIT is such that there aren't really any limits to what we can make the user feel.

We are, to a certain extent, creating the sensations based on what we believe they feel like. We are quite lucky in the UK in that we don't have a lot of people getting shot, and we've not come across anyone who's actually been able to describe the experience to us. We have had people who have had conducted energy weapon exposures. It's very difficult to simulate the total sensory overload that you get from something like that, but in terms of the physical placement and so on, we think we've got a relatively good way of simulating it.

It’s never going to feel exactly like the real thing. But again, it’s about that incentive to make sure that your decision making is right. It’s an unpleasant enough sensation to make sure you don’t want to get hit again.
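For illustration, the kind of trial-and-error tuning described above could be captured in a simple parameter set like the one sketched below. The class, channel names and values are hypothetical examples, not the TESLASUIT SDK or AVRT's actual settings.

    from dataclasses import dataclass

    @dataclass
    class HapticEffect:
        """Hypothetical parameter set for one simulated sensation."""
        name: str
        electrodes: list       # suit channels to drive, e.g. ["chest_left"]
        intensity_pct: int     # perceived strength, 0-100
        frequency_hz: float    # pulse frequency
        duration_ms: int

    # Example values of the sort arrived at through playtesting.
    bullet_hit = HapticEffect("bullet_hit", ["chest_left"],
                              intensity_pct=85, frequency_hz=60.0, duration_ms=250)
    cew_exposure = HapticEffect("cew_exposure", ["back_left", "back_right", "thigh_right"],
                                intensity_pct=70, frequency_hz=19.0, duration_ms=5000)

The point is not the exact values, which are tuned by feel, but that intensity, frequency, electrode selection and duration are the levers being adjusted.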

What requirements does your team follow when creating training scenarios, 3D environments and assets? Are you guided by the expertise and experience of the police or the military?

Yeah. One of the reasons we've spent so long taking our system out and putting it in the hands of operational users is that the simulated training needs to be appropriate to what those users are going to want. We're not the experts. We wanted to put it in the hands of experts.

The AVRT system and attached devices such as the TESLASUIT have all gone through a very human-centric design process. The feedback that has come from the officers, soldiers and other users who have been in the system is invaluable. And we're confident in saying that everything that's been developed into it is there because someone who is doing this kind of work, engaging with this kind of situation on a day-to-day basis, has told us it needs to be there. So it's that real-world experience, really, that has shaped what we do.

We provide a series of 3D environments, 3D characters, weapons and other assets that they effectively use as a toolkit to build up the scenario they want to produce. And because our VR system is completely computer-generated, we've got the freedom to play around with that content and to supply anything that anyone wants. Obviously, our main feedback has been from UK police and UK military, so our characters have a UK feel to them, but tweaking the content for overseas clients or for different areas of a business is very easy for us to do.

What places or environments do users find themselves in while going through the scenarios?

We want to provide a series of environments that allow instructors to get across whatever learning outcomes are relevant to them, and these tend to be the kinds of places that police officers would actually experience. We have some urban street settings, including a petrol station, a car park and surrounding vehicles. We have a domestic interior, a basic house that you can go into, where you have to consider factors such as knives and work surfaces. And we have a rooftop in the system, which adds the consideration of how we keep people away from rooftop edges, especially if we've got someone who's vulnerable. We chose all of these environments and 3D assets so that a police instructor can take them and deliver quite a rich scenario, with some decent intelligence and back story, using those building blocks.
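As a rough sketch of that building-block approach, a scenario could be described as data that references the supplied environments, characters and props. The structure and identifiers below are assumptions for illustration only, not AVRT's actual scenario format.

    # Hypothetical scenario definition assembled from the supplied building blocks.
    scenario = {
        "name": "petrol_station_knife_incident",
        "environment": "urban_petrol_station",        # one of the provided 3D environments
        "subjects": [
            {
                "character": "uk_adult_male_01",       # provided 3D character
                "spawn_point": "forecourt_pump_2",
                "props": ["kitchen_knife"],
                "behaviour": "agitated_non_compliant",
            },
        ],
        "trainee_loadout": ["conducted_energy_weapon", "body_worn_video"],
        "briefing": "Report of a man threatening staff at the kiosk.",
        "learning_outcomes": ["tactical_communication", "use_of_force_decision_making"],
    }

An instructor can then swap the environment, characters or props to deliver the same kind of scenario in a different setting.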

Are there any limitations or factors that guide the development of environments from a technical point of view?

In terms of the actual technical development of our environments, there are a couple of guiding factors. One is that we use standalone headsets. We moved away from large PC-driven headsets and tracking systems, which have their own particular technical issues, to the platform we use at the moment. So the actual capacity of the headsets pushes us towards very highly optimized models.


But standalone headsets, as you're probably aware, are coming on in leaps and bounds now, so it's not particularly a limitation. Our environments are designed to create the stresses that an officer or a soldier might encounter, and then we engage you with the VR characters and the actual situation itself. They're very carefully designed environments that give us lots of freedom to site a scenario in different areas, whether that's around a vehicle, by a dwelling, by a shop or by a petrol station. The environments are large and allow you to place the particular activity wherever you want within them. Other than that, we gear those environments towards the kinds of scenarios that we know trained instructors, both military and law enforcement, will want to deliver in them.

Do you collect data for further analysis and to identify behavioral patterns?

The key here is that VR training does give us the ability to identify training patterns, to build a profile of the officer and to understand behaviors. The other side to that, of course, is that the level and kind of data that would be collected for those kinds of use cases can be very, very sensitive and can also tie into investigations and things like that.

So the key is that we will always be guided by the customer, by the police force, by the military unit, as to how much, if any, of that data they want to collect and what they want to do with it. We have potential industry partners that specialize in collecting, processing and interpreting that kind of data, so the option is always there.

So yes, that data collection facility is there, but it's not something we've focused on so far, and we won't unless we're told it's a particularly desirable feature.

Could your VR training platform be used for civilians to gain a better understanding of what police officers face daily?

We do introduce people to slightly different worlds and different experiences, you know, around the recruitment of soldiers and police officers. We have used our system to drive an understanding of these kinds of processes.

Primarily, we did this around police encounter panels, which are where police forces invite members of the public to come in and make their judgments on scenarios and situations that have happened. Traditionally, that's been done through body-worn video and through explaining how the scenario went. But with VR training, we can effectively recreate that scenario, put a civilian into the situation and understand how they react as someone without police training.

What we saw from that was that those civilians gained real insight into the kind of split-second decisions that a police officer has to make. We see that as absolutely key, especially when we're looking at someone like a decision maker who isn't necessarily a police-trained person. Being able to put someone into a scenario, risk free, where they have to make decisions will help them understand the impact of the policies and decisions they make. And haptics, again, I think is a very useful part of that, because you can, to a certain extent, make someone feel the jeopardy and the consequences of being in a very high-stress situation where decisions have to be made instantly.

Are you planning to integrate any further new technologies or devices into the existing platform?

Yeah, we've always wanted to design AVRT as an integration platform. Right from the ground up, it's been designed so that we can take the best technologies out there and work with them. So we are always on the lookout for the next set of haptic technologies. As I said earlier, my desire is to have a good haptic glove that we can use to simulate a variety of different things around critical encounters; form factor is going to be the key there. We obviously integrate real-world weapons, but there are other senses that perhaps VR doesn't touch on yet. So yes, haptics, I think, is going to be a part of our story for quite some time to come.

Do you see VR training being widely used for police or military training in the short term?

We’re reaching this point now where VR training has been around for a while. Certainly in the private sector, it has achieved a significant amount of penetration. And we’ve seen police forces in the UK and beyond that have looked at running experimental projects, that have looked at research projects. There’s a lot of academic work around it. I think we’re reaching that kind of critical mass point now where large scale commercial adoption within the public sector is coming. And there are a number of systems out there. There are people that are investigating those systems and there are programs within the police forces that are starting to look very seriously at this.

So I think we’re at that tipping point now where we’re going to see the leaders start to do this. And very quickly, we will see a fairly large scale adoption of this kind of technology into training programs.

Images and videos are taken from AVRT's official LinkedIn account and YouTube channel.