29 June 2020

9 Business as Usual Uses for VR and AR in College Classrooms - Our take!

Saw an interesting-looking article on "9 Amazing Uses for VR and AR in College Classrooms" on Immersive Learning News the other day - although it was actually a retweet of a 2019 article. Reading it, I was struck by how most of the uses it talks about are things that we've been doing for years.

So here are their top 9 uses, and what we've done that's identical or close.

1) Grasping Concepts

When we built a virtual lab for the University of Leicester we also built 3D animations of what happens at a molecular level. Students had found it hard to link the theory of a process with the mechanics of using the kit, and the combination of both really helped them to link and understand the two.

In another example, a finance trainer we helped build for the University of Central Florida represented financial flows as tanks of water and piles of virtual money, so as to help students grasp more complex financial concepts.

2) Recreating Past Experiences for New Learners

Not one of ours, but there was an awesome recreation of the WW1 trenches, augmented by the poetry of the war, created by the University of Oxford back in the 2000s. We have, though, also used immersive 3D to recreate conversations between analysts and patients, so that new learners can revisit these and actually sit in the virtual shoes of the analyst or patient.

3) Stagecraft for Theater Students

One of the first projects we got involved with was helping theatre educators at Coventry University make use of immersive 3D to teach stagecraft and even create new cross-media pieces. There was also the wonderful Theatron project back in the 2000s, which recreated a set of ancient theatres and staged virtual plays in them in order to better understand how they were used. And in our own Theatrebase project we built Birmingham's Hippodrome Theatre and digitised a set of scenery from its archives, showing how virtual environments could be used not only to teach stagecraft but also to act as an interactive archive and to help plan and share stage sets between venues.

4) Virtual Reconstruction of History

With Bournemouth University and the National Trust we recreated Avebury Ring as part of an AHRC funded project and ran it for the summer at the visitors centre so that visitors could explore the Ring as it was 5000 years ago in VR - and without the village that has now been built in the middle of it!

5) Going on Space Walks

We've now built the Apollo 11 Tranquility Base site three times - in Second Life, OpenSim and now Trainingscapes. We've also done an exploration of comet 67P and a whole Solar System explorer.

6) Reimagining the Future


Back in 2010 we built the new Library of Birmingham virtually (hence VLOB) for Birmingham City Council so they could use it to plan the new building and to engage with the public and later subcontractors. The multi-user space even had a magic carpet ride!

7) Practicing Clinical Care

We have done almost a dozen immersive 3D exercises for health and care workers, ranging from paramedics and urinalysis to end of life pathway care and hospitalised diabetic patients.

8) Hands-on Railroading

OK, hands up - we've never built a virtual railroad - but we have done equipment operation simulations on everything from air conditioners to jet engines!

9) Feeling the Impact of Decisions

In the article this is actually about teamwork and collaboration within virtual spaces. Whilst we have had some "fun" builds - for instance virtual snowballs for Christmas parties - we're also really interested in how to use these spaces to discuss issues and approaches through tools like walk-maps and 3D post-it notes. The classic, though, has got to be the fire demo: choose the wrong extinguisher and the fire blows up in your face - and as the image above shows, your body flinches away exactly as it would in real life!

So there you are, 9 business as usual use cases for immersive 3D and VR as far as we're concerned!

25 June 2020

Daden joins Team iMAST

We're pleased to announce that Daden has been selected as a member of Team iMAST, the Babcock and QinetiQ led team which is bidding to support the modernisation of the UK Royal Navy’s individual maritime training.

Down-selected to bid earlier this year, the bespoke Team iMAST collaboration – led by Babcock and comprising QinetiQ and Centerprise International along with the Universities of Portsmouth and Strathclyde – has recently been joined by Thales and Learning Technologies Group to further bolster its highly experienced offering. Boasting an Innovation Ecosystem of more than 50 Small to Medium sized Enterprises (SMEs) – including Daden – Team iMAST is ready to deliver training to the Royal Navy when and where it is required, if selected.

Team iMAST and the Innovation Ecosystem will enable critical technology integration, backed by proven naval training resources, to drive future-ready training solutions for all elements of the Royal Navy. To launch this Ecosystem, two successful events have already been held with the most recent hosted by Team iMAST at the Digital Catapult, the UK’s leading agency for the early adoption of advanced digital technologies.

With its wealth of proven expertise, Team iMAST is uniquely placed to support this training outsource programme through its unrivalled industry know-how. The programme will provide an opportunity to help shape the future of Royal Navy training as a strategic partner and drive efficiencies and new technology. 

Daden is focusing on a variety of use cases of virtual humans in support of the project.

23 June 2020

Intelligent Virtual Personas and Digital Immortality

David's just done a guest post for the VirtualHumans.org site on "Intelligent Virtual Personas and Digital Immortality", pulling together some of our current work on Virtual Personas with David's writings on Digital Immortality and the site's interest in Virtual Influencers.

You can read the full article here: https://www.virtualhumans.org/article/intelligent-virtual-personas-and-digital-immortality

11 June 2020

Daden Newsletter - June 2020

In the latest issue of the Daden Newsletter we cover:

  • COVID19 and 3D Immersive Learning - With corporate training and academic syllabuses and delivery being revised to cope with the challenges of social distancing stretching out until mid 2021 at least, to what extent will trainers and educators look again at the potential of 3D immersive learning and virtual reality - or will they fall back on the more "traditional" approaches of VLEs and Zoom?

  • Virtual Conferences - It's not just in virtual training and learning that immersive 3D can help - several organisations are now using immersive 3D conference and meeting environments to give participants more sense of "being there" and encouraging more serendipitous networking than yet another Zoom webinar. David reports on two recent events he attended.

  • Trainingscapes 2.0 Sneak Peek - We're getting close to the launch of version 2.0 of Trainingscapes - see some screenshots of the new-look application.

  • Plus snippets of other things we've been up to in the last 6 months - like being named one of the West Midlands Top 50 most innovative companies.

We hope you enjoy the newsletter, and do get in touch if you would like to discuss any of the topics raised in the newsletter, or our products and services, in more detail!

8 June 2020

Daden U Day: My Beautiful Soup

From Darrell Smith:

On a recent project we had difficulties in scraping the summary paragraph from Wikipedia article pages, and Beautiful Soup was suggested as a possible tool to help with this. The Beautiful Soup Python library has functions to iterate, search and update the elements in the parsed tree of an HTML (or XML) document.

So after downloading and installing the library, a quick test was to fetch the web page we're interested in, using the 'requests' HTTP library to make things easy. The HTML document is then passed to BeautifulSoup to create a 'soup' object:

import requests
from bs4 import BeautifulSoup

result = requests.get("https://en.wikipedia.org/wiki/HMS_Sheffield_(D80)")
src = result.content
soup = BeautifulSoup(src, 'lxml')



Calling prettify() makes the HTML more readable by indenting it to show the parent and sibling structure.
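A minimal sketch of that step - here a tiny inline document stands in for the full Wikipedia page, parsed with the built-in html.parser so no extra parser library is needed:

```python
from bs4 import BeautifulSoup

# a tiny document stands in for the full Wikipedia page
html = "<html><body><p>Hello</p><p>World</p></body></html>"
soup = BeautifulSoup(html, 'html.parser')

# prettify() returns the parse tree as a string with one tag per
# line, indented to show the nesting
print(soup.prettify())
```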



Searching for tag types (such as 'a' for anchor links) is simple using 'find' (first instance) or 'find_all' (all instances); this lets us list all the internal (Wikimedia) links and the external ("https://") links.
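A sketch of that search step, again on a small inline document rather than the live page:

```python
from bs4 import BeautifulSoup

html = '''<p>See <a href="/wiki/Sheffield">Sheffield</a> and
<a href="https://www.example.com/">an external site</a>.</p>'''
soup = BeautifulSoup(html, 'html.parser')

# find() returns the first matching tag, find_all() every match
first_link = soup.find('a')
all_links = soup.find_all('a')

# split internal (relative /wiki/ paths) from external (https://) links
internal = [a['href'] for a in all_links if a['href'].startswith('/wiki/')]
external = [a['href'] for a in all_links if a['href'].startswith('https://')]
```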


Let's just get the links that refer to "HMS …".
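One way to do that is to filter on the link text, e.g.:

```python
from bs4 import BeautifulSoup

html = '''<p><a href="/wiki/HMS_Sheffield_(D80)">HMS Sheffield (D80)</a>
<a href="/wiki/HMS_Coventry_(D118)">HMS Coventry (D118)</a>
<a href="/wiki/Falklands_War">Falklands War</a></p>'''
soup = BeautifulSoup(html, 'html.parser')

# keep only the anchors whose visible text starts with "HMS"
hms_links = [a for a in soup.find_all('a')
             if a.get_text().startswith('HMS')]
```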


Now let's get the text paragraphs we're interested in - this can be done using the 'p' tag. We then index to the second paragraph in the resulting list to get the summary paragraph we're after (n.b. on the Wikipedia page the first paragraph is blank).
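Put together, the paragraph step looks something like this - the blank first paragraph is a quirk of Wikipedia's page layout, mimicked here in the stand-in HTML:

```python
from bs4 import BeautifulSoup

# cut-down stand-in for the Wikipedia page: like the real article,
# the first <p> is empty and the summary is the second one
html = '''<div><p class="mw-empty-elt"></p>
<p>HMS Sheffield was a Type 42 guided missile destroyer.</p>
<p>She was laid down in 1970.</p></div>'''
soup = BeautifulSoup(html, 'html.parser')

paras = soup.find_all('p')       # list of all <p> tags
summary = paras[1].get_text()    # index 1 = second paragraph
```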

Dedicated Wikipedia Library

While Beautiful Soup is a good generic tool for parsing web pages, it turns out that for Wikipedia there are dedicated Python utilities for dealing with the content, such as the Wikipedia library (https://pypi.org/project/wikipedia/), which provides a simple wrapper around the Wikimedia API.

wp.search("HMS Sheffield") returns the Wikipedia pages for all incarnations of HMS Sheffield, and we can use wp.summary("HMS Sheffield (D80)") to get the summary element from the page we're interested in.

The wp.page("HMS Sheffield (D80)") call also gives the full text content in a readable form with headings.

Again we can select the first paragraph for the summary (excluding URLs), and possibly use the other paragraphs with the headings as index/topic markers.


Smart Quotes! While trying this out I also found a useful function to get rid of those pesky Microsoft smart quotes, which were causing trouble in RDF definitions on the same task. Unicode, Dammit converts Microsoft smart quotes to HTML or XML entities:
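A minimal sketch using the UnicodeDammit class from bs4 - the byte values 0x93/0x94 are the Windows-1252 curly double quotes:

```python
from bs4 import UnicodeDammit

# raw Windows-1252 bytes containing "smart" curly double quotes
markup = b"\x93Hello\x94"

# smart_quotes_to="html" converts the smart quotes to HTML entities
# instead of Unicode curly-quote characters
dammit = UnicodeDammit(markup, ["windows-1252"], smart_quotes_to="html")
converted = dammit.unicode_markup
```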

1 June 2020

Choices in Immersive Learning Design

When designing a new immersive learning experience we find that there are a number of dichotomies or spectra that it is helpful to talk through with a client, in order to ensure that all parties have a good idea of what is driving the immersive learning design and what the experience might feel like. Often a lot is taken for granted, discounted or assumed, and it's not until you start talking through all these options that some of the preconceptions on both sides emerge.

To help us talk these through with clients we've even realised them as cubes within our virtual campus, so that we can go in remotely with clients and move the boxes around as we talk about them - typically laying them out on a cost/effort vs importance floor map. The sheer act of doing that creates visual and spatial cues which aid recall, and even helps to show the thinking that is going on.

So here are what we think are some of the key dichotomies, and you can find a fuller list and discussion of the remaining items in our Immersive Learning White Paper.

- Simulation vs Serious Game

In recent years this has become the big one – to what extent do you want the immersive experience to be a “simulation” of reality (so high on accuracy), and to what extent do you want it to be game-like (and so highly motivating)? The situation gets even further confused when people start talking about “gamification”. Having been involved in games design since before the days of personal computers we know that this really all comes down to game mechanics. To us something becomes a “game” as soon as you start to introduce (or exclude) rules or features that do not exist in the real world. Those things you introduce are called game mechanics – and might range from a simple countdown timer or scoring system to highly artificial features such as power-ups and upgrades.

- Linear vs Freeform

When we first engage with tutors and learning designers who have been used to working on eLearning projects we find that they tend to come with a very linear mindset. The learning is a sequence of actions and tasks, and each screen only provides a few options as you don't want to crowd the screen or confuse the learner. Coming from a virtual worlds background we are far more used to open learning spaces with lots of possibilities – trying to get tutors and designers to “unlearn” can be hard. One of the best approaches we have found is to get them to think of a learning exercise in terms of drama, or even e-drama. In fact, it's not even scripted drama we're often after, it's improvised drama. It's telling the student: this is the scene, here are the props, the actors are going to do something and you need to respond.

- Single vs Multi User

A major design decision is whether an environment is designed to be used by a single user (so they only see themselves) or by multiple users (so everyone sees and can interact with everyone else). Obviously multi-user is essential if you are looking at team and collaborative learning, or you want staff (or actors) to role-play characters in the simulation “live”. But multi-user suggests an element of scheduling, and also requires the users to have a network connection, so doesn't give the individual learner the maximum flexibility (e.g. learning on the underground), or let them practice in private.

- Synchronous vs Asynchronous

This choice is only relevant in multi-user mode – should the environment be designed for asynchronous use – i.e. everyone uses it at their own time and pace, or for synchronous use – more like a physical world team learning session where the team (and the tutor/assessor) are all present at the same time.  In asynchronous mode we are really talking about lots of individual single-user experiences, people using the environment as and when. With synchronous mode we are talking about timetabling and co-ordination, but the benefit is that we get to practice those team tasks that it may just not be feasible to practice and rehearse in the physical world due to limitations of time or distance.

We hope that's made you think through the ideas behind immersive learning in a new way. Don't forget to check out the white paper for more details, or contact us if you'd like to talk them through - or even play around with the box set in our virtual collaborative 3D space.