26 October 2020

Live - Virtual - Constructive - Autonomous? - A Framework for Training Approaches

In military and defence training circles there is a very commonly used acronym, "LVC". It stands for Live, Virtual, Constructive:

  • Live training: Real soldiers exercising against real soldiers on a training ground (and more broadly real soldiers practicing with real kit anywhere physical)
  • Virtual training: Real soldiers exercising against real soldiers in a simulation, probably (but not necessarily) digital (and again more broadly real soldiers practicing with virtual kit anywhere non-physical, would also cover Fortnite!)
  • Constructive training: Real soldiers exercising against so-called "computer generated forces" (CGF) within a simulator (think almost any traditional first person shooter)
When I first came across the model I struggled to remember which was which, and so tried to put it into some sort of 2x2 matrix. The dimensions I eventually settled on were:
  • Whether the training environment was the real physical world, or a digital (or other) simulation of it, and
  • Whether the opposition was being controlled by a human or a computer
One of the reasons I struggled was that nobody ever talked about the fourth space on the matrix - where you train in a physical environment but the opposition is computer controlled - which I've labelled Autonomous, hence LVCA. This is odd, since the mode is not as far-fetched as it sounds: fighter pilots have been practicing against UAVs for a while (although these may be remote-controlled rather than autonomous), and there is some emerging work on mobile semi-autonomous robot targets (see https://www.youtube.com/watch?v=tlqMlPQpeAo)

So a complete matrix might look like this:
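The two dimensions can also be expressed as a simple lookup table. This is just an illustrative sketch - the dimension value names ("physical"/"simulated", "human"/"computer") are labels of my own choosing:

```python
# LVCA classification: training mode as a function of two dimensions.
# Dimension 1 - environment: "physical" (real world) or "simulated" (digital/other).
# Dimension 2 - opposition: controlled by a "human" or a "computer".
LVCA = {
    ("physical",  "human"):    "Live",
    ("simulated", "human"):    "Virtual",
    ("simulated", "computer"): "Constructive",
    ("physical",  "computer"): "Autonomous",
}

def training_mode(environment: str, opposition: str) -> str:
    """Return the LVCA label for a given environment/opposition pairing."""
    return LVCA[(environment, opposition)]

# The often-forgotten fourth quadrant:
print(training_mode("physical", "computer"))  # -> Autonomous
```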

Now whilst it emerged from the military and defence world, it does seem to me that LVCA gives a useful model for thinking about skills and process type training within the civilian world. Of course we need to think about "non-player characters" (actors? - who may represent clients, patients, customers or colleagues) instead of the "opposition", but I think the model holds up pretty well:

  • Live training: Students learning with real people/kit – e.g. role-play, practical hands-on
  • Virtual training: Students learning with real people via a simulator (e.g. Trainingscapes) (or maybe Zoom) (virtual role-play)
  • Constructive training: Students learning by interacting with computer controlled NPCs in a simulation
  • Autonomous training: Students learning by interacting with computer controlled physical entities (e.g. hi-spec mannequins)

As ever the key is that these approaches aren't competing, it's about finding the right blend of each given the subject, students, situation and budget in order to deliver the best possible training - and to help provide follow-up to keep that training fresh.

Hopefully this LVCA matrix will give you some fresh insights into the training you are trying to deliver, and perhaps open up some new ideas as to how it could be delivered for the benefit of your organisation and, of course, your students.

23 October 2020

Daden Newsletter - October 2020


In this latest issue of the Daden Newsletter we cover:

  • COVID19 and 3D Immersive Learning (still!) -  With COVID19 showing no signs of abating delivering corporate training and academic syllabuses out into 2021 remains challenging to say the least. Where do "traditional" approaches such as Zoom, eLearning and video fall short, what issues does a VR route face, and how can 2D/3D immersive learning improve the mix?
  • WebXR - With Facebook introducing stricter account and content controls on Oculus headsets WebXR offers a way to deliver VR content to VR headsets without needing to download anything. The same approach can also deliver 2D/3D content to an ordinary web browser. A promising way forward?
  • Plus snippets of other things we've been up to in the last 6 months - like starting work on a virtual hospital ward and experimenting with VR in the garden!

Download your PDF copy of the newsletter here.

We hope you enjoy the newsletter, and do get in touch if you would like to discuss any of the topics raised in the newsletter, or our products and services, in more detail!

19 October 2020

XR 2x2 Segmentation

(click image for better resolution)

Trying to unpick all the different emerging systems in the "extended reality" space such as Augmented Reality (AR), Mixed Reality (MR), Virtual Reality (VR) and then the "traditional" approaches such as 3D video games and virtual worlds can be a challenge, and we keep trying to find new ways to understand and communicate it.

In the graphic above we've tried to show the difference between the systems in terms of the access device and what they are trying to do to reality.

In one dimension we have whether the system is trying to completely replace what the user is seeing/experiencing - so they only see the digital world - or whether it is augmenting it - adding new content into what the user otherwise sees as the physical world.

In the other dimension we have whether the system is accessed through a flat screen or a headset. The flat screen might be an ordinary PC screen, or a tablet or smartphone, whilst the headset might be a VR one or an MR one. A key point is that for the AR/MR solutions the access device has some way of also showing the physical world which is being manipulated - via the phone camera for AR, or a transparent visor for MR.

These two dimensions then give us our four main use cases:

  • AR - which is overlaying reality but viewed through a (conventional) flat screen
  • MR - which is overlaying reality but viewed through a headset (visor)
  • 2D/3D systems like Second Life/computer games - which are completely replacing reality and viewed through a flat 2D screen
  • HMD VR - which is completely replacing reality and viewed through a headset (screen)
A key point is that the dimensions are about how we access the system and what we are trying to achieve. They are not about the software technology being used to generate the environments, or how it's branded!
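The segmentation boils down to two yes/no questions, so it can be sketched as a tiny classifier (an illustrative sketch only, using the category labels from the graphic):

```python
# XR 2x2 segmentation: what the system does to reality x the access device.
def xr_segment(replaces_reality: bool, headset: bool) -> str:
    """Classify a system by the two dimensions of the segmentation."""
    if replaces_reality:
        # Reality is completely replaced: the user only sees the digital world.
        return "HMD VR" if headset else "2D/3D (games, virtual worlds)"
    # Reality is augmented: digital content overlaid on the physical world.
    return "MR" if headset else "AR"

print(xr_segment(replaces_reality=False, headset=False))  # -> AR
print(xr_segment(replaces_reality=True, headset=True))    # -> HMD VR
```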

Let us know if you've any comments on this segmentation, and/or if you find it of use.


16 October 2020

Video: David's talk on Virtual Personas at AI Tech North

David's talk at AI Tech North on "Enriching Virtual Humans through the Semantic Web and Knowledge Graphs" is now available on video:

It's a 20-minute watch, and is followed by another interesting session on the analysis of language in social media use.

7 October 2020

MS&T Feature Daden Thoughts on the impact of COVID19 on Virtual Training

Military Simulation And Training Magazine Editor Andy Fawkes has published a video drawing on the views of S&T industry leaders to discuss how the pandemic has accelerated existing digital trends, and why this is the time to reimagine the management and delivery of simulation and training. Prompted by some of our posts here on the relative merits and experiences of 3D and VR immersive training, and the impact of COVID on the market, Andy includes some of our thoughts in the video.

You can view the video at: https://www.halldale.com/articles/17613-digital-trends-accelerated

MS&T Editor Andy Fawkes

5 October 2020

World Space Week - WebXR Solar System playground


Normally at this time of the year we'd be down at The Hive in Worcester for the BIS World Space Day event - the largest of its kind in the UK. Of course with COVID19 that's not happening, so here instead is a little "work in progress" of a solar system playground in WebXR. You can't view this in an ordinary web browser, but if you have a VR headset that is WebXR compatible (most of them are now) you can use the headset's web browser to go to:

  • https://www.daden.co.uk/webxr

Then follow the Solar System Playground link from that page to the WebXR page, click on the Enter VR button, and immerse yourself in the Solar System.

This is a quick tour of what you should see and what you can do:

The main features are:

  • Set sizes of planets to linear or log scale
  • Set orbit sizes to linear or log scale
  • Just grab hold of a planet to bring it up close to look at it, and turn it over in your "hands"
  • Display labels on planets and/or audio naming as you click on them
  • Randomise planet positions, sort them into order, and then have your solution scored
  • Hide the floor (not for those with vertigo!)
  • Move with joystick or by clicking on the footsteps
  • Set sizes and orbit sizes to "real" values, scaled to the sun. This makes things VERY small and VERY spread out. There's a guideline to help you find all the planets, and Pluto (and maybe Neptune) might actually be outside of the "star" bubble - feels very otherworldly that far out!
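The linear vs log scale toggle matters because planet sizes span a huge range - on a linear scale Jupiter dwarfs Mercury, while a log scale keeps everything visible. This sketch (not the actual playground code - the scale factor and function names are illustrative) shows the idea:

```python
import math

# Illustrative only: how a linear vs log scale changes relative planet sizes.
# Approximate real equatorial radii in km.
radii_km = {"Mercury": 2440, "Earth": 6371, "Jupiter": 69911}

def display_radius(r_km: float, scale: str, factor: float = 1e-4) -> float:
    """Map a real radius to a display radius under a linear or log scale."""
    if scale == "linear":
        return r_km * factor          # preserves true proportions
    return math.log10(r_km)           # log scale compresses the huge range

for name, r in radii_km.items():
    print(name, round(display_radius(r, "linear"), 2),
                round(display_radius(r, "log"), 2))
```

On the linear scale Jupiter comes out nearly 30 times the size of Mercury; on the log scale the ratio drops to well under 2, which is why the log option makes the smaller planets much easier to find and grab.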

Note that this is still a work-in-progress and may have a few bugs, but with World Space Week happening and lots of us in some sort of lockdown we thought it a good time to get it out!

Please post any bug reports or suggestions for improvements in the comments, or send them by email to wsw@daden.co.uk.

Enjoy, and hopefully we can meet physically for World Space Week next year!

2 October 2020

Daden make the Midlands Tech 50 for the 2nd Year in a row


We're pleased to announce that we've made it onto the Midlands Tech 50 for the 2nd year in a row. The Tech 50 awards, organised by BusinessCloud, celebrate the most innovative new tech companies for consumers, business and society at large.