10 August 2020

Virtual Reality vs Immersive 3D - the Search for the Right Words!



As a company that has been creating immersive experiences for over 15 years, we find that the contemporary obsession with headset-based virtual reality (HMD-VR) often risks a) forgetting the valuable work that has been done in the past in non-HMD immersive 3D environments, and b) failing to highlight to potential clients that many of the benefits of "VR" can be obtained without an HMD - so not having the funds for, or access to, HMDs (especially during COVID) need not stop a VR project in its tracks.


One problem is that we just don't have the right terminology, and what terminology we have is constantly changing.

"VR" has almost always been assumed to mean HMD-based experiences - using headsets like the Quest, Rift or Vive - or even their forerunners like the old Virtuality systems.

But in that fallow period between Virtuality and the Oculus DK1, 3D virtual worlds such as Second Life, There.com and ActiveWorlds were enjoying a boom-time, and often found themselves labelled as "virtual reality".


One problem is that there seems to be no commonly accepted term for the classic Second Life (or even Fortnite) experience, where you can freely roam a 3D environment but have a 3rd (or sometimes 1st) person avatar view of it. It's certainly not 2D. It's sort of 3D - but not as 3D as the stereoscopic experience of a VR-HMD. I've seen "2D/3D" and "3D in 2D", but both are cumbersome. We sometimes refer to it as "first-person-shooter" style (which doesn't go down well with some audiences), or "The Sims-like".

There's also a qualitative difference between, say, a 3D CAD package where you're rotating a 3D model on screen (called an allocentric view) and the experience of running through Fortnite, Grand Theft Auto or Second Life (called an egocentric view). You feel "immersed" in the latter group, not just because of the egocentric viewpoint but also because of the sense of agency and emotional engagement.

At a recent Engage event I attended, I'd guess (from avatar hand positions) that about 50% of attendees were in a VR-HMD and 50% were using the immersive-3D desktop client. So should it be described as a VR or an immersive 3D system? Our Trainingscapes is the same: we can have users on mobile, PC and VR-HMD devices all in-world, all interacting. And Second Life is often "dismissed" as not being "proper VR" - but when the Oculus DK1 was around I went into SL in VR - see below - so did it stop being VR when Oculus went from DK1 to DK2?


So if a system can support both, is it a 2D/3D system or a VR system? That is why we tend to refer to both the 2D/3D approach and the VR-HMD approach as "immersive 3D" - as long as you have a sense of agency and presence, and the egocentric view. It's the experience, not the technology, that counts.

And don't get me started on what "real" and "virtual" mean!

No wonder clients get confused if even we can't sort out what the right terms are, and it's far too late for some de jure pronouncement. But perhaps we could all try and be a little more precise about the terms we do use, and about whether they refer to the means by which you access an experience (e.g. VR-HMD) or to the underlying experience itself (such as a virtual world or a virtual training exercise).

In later posts I'll try and look more closely at the relative affordances of the 2D/3D approach (better name please!) vs the VR approach, at what researchers' experiences of virtual worlds can teach us about VR, and at how "virtual worlds" sit against other immersive 3D experiences.



30 July 2020

Garden VR



OK, why has it taken me 4 months of lockdown to realise that I've got the ideal room-scale VR space out in my garden? Having thought of the idea I did have some doubts about a) whether there were too few straight lines for the tracking to manage, b) whether rough grass would flag as an intruder in the scene, and c) what would happen if the dog walked through - but in the end it all worked swimmingly.

Wifi reaches about half way down - so that may be an issue, although I found it hard to draw out more than the first half of the garden as a space. Oculus kept putting its "draw boundary" panel right where I was looking, and walking while drawing didn't help - but I'll see if I can do better another time. I ended up with a space 15 paces by 7 - far bigger than the attic (and with no slopey ceilings).

The image below shows a rough mapping of the space to the WebXR demo room - so I could walk about half of it. (Oculus hides the warning barriers in photos - annoying in this case, as I'd set myself up to show the exact extent!)



After that everything worked just as though I was indoors - apart from the occasional need to walk back closer to the house to recover the wifi. I certainly lost all sense of where I was in the garden and how I was aligned, the soft grass didn't interfere with the immersion, and the slight slope up to the house end was a useful warning!

Not related to being in the garden, I did notice that I felt more latency/unease with the 3D photospheres (and even more with the stereo photospheres) than with the true 3D spaces, where I felt none at all. Perhaps one reason there have been so many reports of unease with VR is that a lot of people were having photosphere experiences - although admittedly true latency issues remain (and are made worse by doing "crazy" things in VR, like rollercoasters, rather than just walking around in a room!).

One experience which was heightened was the Mozilla vertigo demo - walking on ever smaller blocks over nothing. I suppose because I could move more in the garden I could explore it better and fully immerse myself in it - and it certainly made me check I could feel grass under my feet before I stepped, particularly when I just stepped off the blocks and into space.

Anyway, all the space allowed me to have a good walk around the solar system model without using teleports, and actually get the planets lined up for the first time! Even in the garden they are proving too big, so I need to at least halve the sizes!






13 July 2020

Further Adventures in WebXR - Playing with the Solar System



Having a bit more of a play with WebGL/WebXR, I now have a nice draggable solar system! It could be a neat learning tool once finished - getting the planets in the right order, looking at their globes in more detail, and perhaps accessing further information about them. With World Space Day/Week going virtual, it might be time to set up a gallery of experiences for people to try that week.

Need to sort a few more things first though - like rings for Saturn, a starscape backdrop, and changing the highlight colours. Maybe also the option to check your order and be given the right one - there's a rough sketch of how that check might work below. Also need to add a sun, and shrink the planets even further!
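Purely as an illustration - this is hypothetical, not the actual demo code - the check could be as simple as comparing the learner's left-to-right arrangement with the real order, assuming the planets are Three.js meshes dragged along the x axis:

```javascript
// Hypothetical order check - the planet names and the `planets` array shape
// are assumptions for illustration, not the real demo's data structures.
const correctOrder = ['Mercury', 'Venus', 'Earth', 'Mars',
                      'Jupiter', 'Saturn', 'Uranus', 'Neptune'];

// planets: [{ name, mesh }] where mesh is a THREE.Mesh the learner has dragged
function checkOrder(planets) {
  const placed = [...planets]
    .sort((a, b) => a.mesh.position.x - b.mesh.position.x) // left-to-right order
    .map((p) => p.name);
  // Flag each slot as right or wrong so the learner can see where they went astray
  return placed.map((name, i) => ({ name, correct: name === correctOrder[i] }));
}
```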



The more we play with WebGL/WebXR the more excited we are by it as a tactical solution for quickly creating small but powerful bespoke VR experiences that can be instantly accessed by anyone with a WebXR-compatible VR headset, without any need for an install!


9 July 2020

Virtual Archaeology Review publishes Virtual Avebury Paper




The Virtual Archaeology Review has just published Professor Liz Falconer's paper on the Virtual Avebury project we did last year. The paper looks at the response to the VR experience from visitors to the National Trust's Avebury Visitor Centre - where two people at a time could collaboratively explore the Avebury site as it was, i.e. without the village that has since been built in the middle of it, and with all the missing stones replaced!

You can read the paper at: https://polipapers.upv.es/index.php/var/article/view/12924/12360


Key findings included:

  • More than 1200 members of the public experienced a 3D, fully immersive simulation of Avebury Henge, Wiltshire, UK over a nine-month period.
  • Patterns of use and familiarity with information technology (IT), and of using mobile technologies for gaming, were found that did not follow age and gender stereotypes.
  • There was little correlation between age, gender and IT familiarity with reactions to Virtual Avebury, suggesting that such simulations might have wide appeal for heritage site visitors.

Some of the key data are shown below:


Emotional Responses to Virtual Avebury



Experiences of Virtual Avebury


Responses to the Virtual Avebury Soundscape


Read the full paper at: https://polipapers.upv.es/index.php/var/article/view/12924/12360




6 July 2020

DadenU Day: WebXR


MozVR Hello WebXR Demo Room

For my DadenU Day I decided to get to grips with WebXR. WebXR is a new standard (well, an evolution of WebVR) designed to enable web-based 3D/VR applications to detect and run on any connected VR or AR hardware, and to detect user input controls (both 6DOF and hand controllers). This should mean that:

  • You can write and host VR applications natively on the web and launch them from a VR headset's built-in web browser
  • You don't need to worry whether it's an Oculus, HTC or A.N.Other headset, both for display and for reading controllers
  • Have a 2D/3D view automatically available in the web browser for people without a VR HMD.
What WebXR does NOT do is actually build the scene - you use existing WebGL for that (essentially the web's 3D graphics standard, not to be confused with WebXR or WebVR!) through something like the Three.js or A-Frame frameworks.
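To make the division of labour concrete, here's a minimal sketch (my own illustration, not code from any of the demos) of the bit WebXR itself handles - detecting whether an immersive headset is available before you build anything with Three.js or A-Frame:

```javascript
// Minimal WebXR feature detection - everything else (the scene itself)
// is plain WebGL via Three.js, A-Frame or similar.
if ('xr' in navigator) {
  navigator.xr.isSessionSupported('immersive-vr').then((supported) => {
    if (supported) {
      console.log('VR headset available - offer an "Enter VR" button');
    } else {
      console.log('No immersive VR - fall back to the in-browser 2D/3D view');
    }
  });
} else {
  console.log('No WebXR support in this browser at all');
}
```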


To get a good sense of what web-delivered VR (via WebXR) can do, I headed over to Mozilla's demo at https://blog.mozvr.com/hello-webxr/. This room has a bunch of different demos, and a couple of "doorways" to additional spaces with further demos. If you view it in a 2D browser you just see the room, but can't navigate or interact (though I don't see why WebXR shouldn't pick up WASD the same way it picks up a 6DOF controller). If you go to the page in your Oculus Quest (or other) browser you also see the same 3D scene in 2D. BUT it also offers you an "Enter VR" button - click this and the VR lobby and the 2D browser disappear and you are fully in the VR space, as though you'd loaded a dedicated VR app. Awesome. (There's a sketch of what that button does under the hood below the demo list.) In the space you can:

  • Play a virtual xylophone (2 sticks and sounds)
  • Spray virtual graffiti
  • Zoom in on some art
  • View 360 photospheres - a lovely interface, clicking on a small sphere that replaces the VR room with a full 360/720 photosphere. I'd always been dubious about mixing photospheres and full 3D models in the same app, but this works well
  • View a stereoscopic 360 photosphere - so you can sense depth, pretty awesome
  • Enter a room to chase sound and animation effects
  • View a really nice photogrammetry statue which proves that web VR doesn't need to mean angular low-rez graphics 
MozVR Photogrammetry Demo
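For the curious, this is roughly what that "Enter VR" button is doing under the hood - a sketch of the raw WebXR flow rather than the MozVR demo's actual code (which wraps it in its own framework); the button element here is hypothetical:

```javascript
const button = document.getElementById('enter-vr'); // hypothetical button element

button.addEventListener('click', async () => {
  // requestSession must be triggered by a user gesture,
  // and the page must be served over https
  const session = await navigator.xr.requestSession('immersive-vr', {
    optionalFeatures: ['local-floor'] // room-scale reference space, if available
  });
  // Hand the session to your renderer - with Three.js: renderer.xr.setSession(session)
  session.addEventListener('end', () => console.log('Back to the 2D page'));
});
```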

There's a really good "how we did it" post by the Mozilla team at: https://blog.mozvr.com/visualdev-hello-webxr/

Having seen just what you can do with WebXR, the next step was to learn how it's done. For that I went to the WebXR sample pages at https://immersive-web.github.io/webxr-samples/

Although these are a lot simpler than the MozVR one, each shows how to do a particular task - such as user interaction, photospheres etc. You can also download the code and libraries for each from GitHub at https://github.com/immersive-web/webxr-samples.

Movement demo

Controller demo

The only downside of these seems to be that they use Cottontail - a small WebGL/WebXR library/framework purely developed for these demos and not recommended for general use - so adapting them to your own needs is not as simple as it would be if they were written in Three.js or A-Frame.

Keen to actually start making my own WebXR, I started by copying the GitHub repository to my own server and running the demos up. Issue #1 was that any link from a web page to the WebXR page MUST use https - using http fails!

Starting simply, I took the photosphere demo and replaced the image with one of my own. The image had worked fine in the photosphere display in Tabletop Simulator but refused to work in WebXR. Eventually I found that the image had to be 2048x1024 - higher resolutions (even at the same ratio) fail. Also, the photosphere demo is for stereoscopic photospheres, so you have to remove the displayMode: 'stereoTopBottom' parameter.
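Cottontail isn't meant for general use so I won't reproduce its API here, but for comparison this is roughly what a mono (non-stereo) photosphere looks like in plain Three.js - a sketch that assumes an existing THREE.Scene called scene and a 2048x1024 equirectangular image called pano.jpg on the same https server:

```javascript
import * as THREE from 'three';

// Texture the inside of a large sphere with the equirectangular image
const geometry = new THREE.SphereGeometry(500, 60, 40);
geometry.scale(-1, 1, 1); // invert the sphere so the texture faces inwards

const texture = new THREE.TextureLoader().load('pano.jpg'); // 2048x1024 equirectangular
const material = new THREE.MeshBasicMaterial({ map: texture });

const photosphere = new THREE.Mesh(geometry, material);
scene.add(photosphere); // `scene` is an existing THREE.Scene (assumption)
```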

Hougoumont Farm at Waterloo in WebXR Photosphere

Next up was to try and add my own 3D object. I liked the block room in one of the demos and worked out how to remove their demo blocks from the middle, and how to hide the stats screen. A nice empty room loaded up. Then I hit the bump that I usually write in Three.js or A-Frame, and I couldn't just cut-and-paste into their WebXR/Cottontail template. Then I ran out of time (it was Friday, after all!)

I've now found a page of really basic Three.js WebXR demos at https://threejs.org/examples/?q=webxr so the aim for this week is to get those working and start on my own WebXR spaces.
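For anyone wanting to follow along, the Three.js side boils down to something like this minimal bootstrap - a sketch in the pattern those examples use, not any particular example's code, with a single spinning cube as the scene:

```javascript
import * as THREE from 'three';
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true; // let the renderer hand frames to the headset
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer)); // adds the "Enter VR" button

// Something to look at: a 1m cube, 3m in front of the start position
const cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
cube.position.set(0, 1.5, -3);
scene.add(cube);

// setAnimationLoop (not requestAnimationFrame) so rendering continues inside VR
renderer.setAnimationLoop(() => {
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
});
```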

It's obviously early days for WebXR, but given the MozVR demo this really could be a lovely way of delivering both 2D/3D to ordinary browsers and full VR to headsets, all without any downloads. Joy!




29 June 2020

9 Business as Usual Uses for VR and AR in College Classrooms - Our take!


I saw an interesting-looking article on "9 Amazing Uses for VR and AR in College Classrooms" on Immersive Learning News the other day - although it was actually a retweet of a 2019 article. Reading it, I was struck by how most of the uses they talk about are things that we've been doing for years.

So here are their top 9 uses, and what we've done that's identical or close.

1) Grasping Concepts



When we built a virtual lab for the University of Leicester we also built 3D animations of what happens at a molecular level. Students had found it hard to link the theory of a process with the mechanics of using the kit, and the combination of both really helped them to link and understand the two.

In another example a finance trainer we helped build for the University of Central Florida represented financial flows as tanks of water and piles of virtual money so as to better enable students to grasp more complex financial concepts.


2) Recreating Past Experiences for New Learners



Not one of ours, but there was an awesome recreation of the WW1 trenches, augmented by the poetry of the war, created by the University of Oxford back in the 2000s. We have, though, also used immersive 3D to recreate conversations between analysts and patients so that new learners can revisit these and actually sit in the virtual shoes of the analyst or patient.


3) Stagecraft for Theater Students



One of the first projects we got involved with was helping theatre educators at Coventry University make use of immersive 3D to teach stagecraft and even create new cross-media pieces. There was also the wonderful Theatron project back in the 2000s, which recreated a set of ancient theatres in order to better understand how they were used by staging virtual plays. And in our own Theatrebase project we built Birmingham's Hippodrome Theatre and digitised a set of scenery from their archives, to show how virtual environments could be used to teach stagecraft, to act as an interactive archive, and to help plan and share stage sets between venues.

4) Virtual Reconstruction of History



With Bournemouth University and the National Trust we recreated Avebury Ring as part of an AHRC-funded project and ran it for the summer at the visitor centre, so that visitors could explore the Ring in VR as it was 5,000 years ago - and without the village that has now been built in the middle of it!


5) Going on Space Walks



We've done the Apollo 11 Tranquility Base site 3 times now, in Second Life, Open Sim and now Trainingscapes. We've also done an exploration of the 67P comet and a whole Solar System explorer.


6) Reimagining the Future


Back in 2010 we built the new Library of Birmingham virtually (hence VLOB) for Birmingham City Council so they could use it to plan the new building and to engage with the public and, later, subcontractors. The multi-user space even had a magic carpet ride!

7) Practicing Clinical Care



We have done almost a dozen immersive 3D exercises for health and care workers, ranging from paramedic training and urinalysis to end-of-life pathway care and hospitalised diabetic patients.


8) Hands-on Railroading



OK, hands up, we've never built a virtual railroad - but we have done equipment-operation simulations on things ranging from air conditioners to jet engines!


9) Feeling the Impact of Decisions




In the article this is actually about team-work and collaboration within virtual spaces. Whilst we have had some "fun" builds - for instance virtual snowballs for Christmas parties - we're also really interested in how to use these spaces to discuss issues and approaches through tools like walk-maps and 3D post-it notes. The classic, though, has got to be the fire demo: if you choose the wrong extinguisher the fire blows up in your face - and, as seen in the image above, your body flinches away exactly as it would in real life!


So there you are - 9 business-as-usual use cases for immersive 3D and VR as far as we're concerned!



25 June 2020

Daden joins Team iMAST



We're pleased to announce that Daden has been selected as a member of Team iMAST, the Babcock and QinetiQ-led team which is bidding to support the modernisation of the UK Royal Navy's individual maritime training.

Down-selected to bid earlier this year, the bespoke Team iMAST collaboration - led by Babcock and comprising QinetiQ and Centerprise International along with the Universities of Portsmouth and Strathclyde - has recently been joined by Thales and Learning Technologies Group to further bolster its highly experienced offering. Boasting an Innovation Ecosystem of more than 50 Small to Medium-sized Enterprises (SMEs) - including Daden - Team iMAST is ready to deliver training to the Royal Navy when and where it is required, if selected.

Team iMAST and the Innovation Ecosystem will enable critical technology integration, backed by proven naval training resources, to drive future-ready training solutions for all elements of the Royal Navy. To launch this Ecosystem, two successful events have already been held with the most recent hosted by Team iMAST at the Digital Catapult, the UK’s leading agency for the early adoption of advanced digital technologies.

With its wealth of proven expertise, Team iMAST is uniquely placed to support this training outsource programme through its unrivalled industry know-how. The programme will provide an opportunity to help shape the future of Royal Navy training as a strategic partner and drive efficiencies and new technology. 

Daden is focusing on a variety of use cases of virtual humans in support of the project.