3 December 2015

Beware the RED5/VIZOR VR Headset

Heading off to an event this week I realised I'd forgotten to pack one of our Google Cardboards for a demo. I remembered seeing a headset in the RED5 shop in the Bullring, so I dived in there and picked one up (a VIZOR) for £25 (down from £30). Not cheap, but OK as I needed it on the spur of the moment. Build-wise the headset is actually quite nice, comfortable and reasonably sturdy, and the slides that hold the phone actually work better than the suckers on the other "plastic" Cardboard we have - essential given the rubber backing of my iPhone. The lens adjustment slides (and twist for focus) work OK, although with the iPhone the image didn't quite fill my view, so there was a bit of edge visible. But all livable with for a quick demo or Christmas present!

However the big problem is that the device has no external button - the phone is completely enclosed (although a foam side lets you plug in your headphones) - so there is no way of manually selecting options on the screen. This is an immediate problem for the basic Cardboard demo app, as it waits for you to click on the icons to launch the various demos. Nothing on the box tells you that it hasn't got a button (why would you admit to a missing feature?), and if you don't know better you won't think to look or ask - unless, like me, you're in a hurry!

When we built our Comet 67P Cardboard app we had a big debate about what to do about "the button", as we knew that some older VR viewers might not have one, and also knew that on our own Cardboard the button was already showing signs of failing from over-use. So in the end we decided to have "stare to activate" on by default, so that you just had to look at something for a few seconds to activate it. The settings also let you switch this off if you preferred the button method - but the crucial thing was that the app would work "out of the box" even if you had no button.
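For anyone wondering what "stare to activate" involves, the sketch below shows the general idea in Unity C#. It's a minimal illustration of the approach rather than the actual Comet 67P code - the class, tag and method names are just examples - but the essence is the same: raycast from the centre of the view and, if the same object stays under the gaze for a few seconds, activate it.

```csharp
using UnityEngine;

// Illustrative sketch only - not the actual Comet 67P app code.
// Attach to the VR camera: raycast along the view direction and, if the
// same object stays under the gaze for dwellSeconds, tell it to activate.
public class GazeActivator : MonoBehaviour
{
    public float dwellSeconds = 3f;    // how long you have to stare
    public float maxDistance = 100f;

    private Transform currentTarget;
    private float gazeTimer;

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(transform.position, transform.forward, out hit, maxDistance))
        {
            if (hit.transform == currentTarget)
            {
                gazeTimer += Time.deltaTime;
                if (gazeTimer >= dwellSeconds)
                {
                    // The target decides what "activate" means (launch a demo, etc.)
                    currentTarget.SendMessage("OnGazeActivate",
                        SendMessageOptions.DontRequireReceiver);
                    gazeTimer = 0f;    // avoid re-firing every frame
                }
            }
            else
            {
                currentTarget = hit.transform;
                gazeTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            gazeTimer = 0f;
        }
    }
}
```

In our app this sits alongside the optional button handling mentioned above, so users who do have a button can click instead of waiting out the dwell timer.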

Having used the VIZOR I'm really glad we took that decision. Given that it's in High Street gadget shops, it's got to be a good bet that the VIZOR will be one of the most-bought VR headsets this Christmas. If people buy it, and kids (or grown-ups) slide their phone in on Christmas Day full of expectation, only to find that they can't use half the games and apps they download because the headset doesn't have a button and the apps haven't been written to work "button-less", they are going to be mightily annoyed.

30 November 2015

DadenU Day: NuGet

By: Nash Mbaya

For the Daden U Day I decided to look into creating a private NuGet feed for Daden.

NuGet is the package manager for the Microsoft development platform. Our main development tool here at Daden is Microsoft Visual Studio, which is part of this platform. As a result we use NuGet and its public feed in our projects as a way of managing external code.

So what about internal code, you ask - how is that managed? The short and simple answer is that, before I wrote this blog post, it was not. If there was a utility class or a script written by any of the developers here and another developer wanted to use the same code, they would firstly have to know of the code's existence. Then secondly they would have to hunt for the latest version of that script, class or DLL, which could be in any number of projects.

So the aim of creating Daden’s own private NuGet feed was to eliminate some of these issues by:

1. Creating a central repository for Daden’s library files.

2. Reducing the fragmentation of said library files by having a NuGet feed which would provide the latest version of any of our library files.

3. Improving awareness of the different libraries and scripts that are available to Daden’s internal development team.

4. Automatically updating library files which are in use.

The Server

There are three main ways of creating your own NuGet feed. You can have a local feed, which is simply a folder of packages on your machine or a shared network drive; you can buy a hosted solution in the cloud; or you can host a remote (or internal) feed on a server that runs IIS. I chose the latter because I didn't want to rely on the Windows file system, but instead wanted to utilise the IIS server that Daden already has, which is used to serve intranet web pages.

Setting up the feed is easy and straightforward.

Step 1: Create a new Empty Web Application in Visual Studio

Step 2: Install the NuGet.Server Package using the NuGet package manager in Visual Studio

Step 3 (Optional): Configure the Packages folder

The default Packages folder is a folder within the web application hierarchy in the solution view.

To change the folder to one of your choice, open the web.config file and change the value of the ‘packagePath’ setting.

Step 4: The final step is to publish the web application.

The Packages

So the purpose of NuGet is to manage packages. But what are packages? What do they look like? And how can you make one?

A NuGet package is a file which can contain .NET assembly files, JavaScript files, HTML/Razor files, CSS files, class files, images, and even files that add configuration to your project. It is very similar to a .zip file.

A package is typically formed of a ‘lib’ folder, which should only contain .dll files, and a ‘content’ folder, which can hold any kind of script or file you want to be part of the project. .NET assembly files inside the ‘lib’ folder will be added as references to the project. Files located in the ‘content’ folder will be added to the root folder of the project, and folders can also be placed within the ‘content’ folder.

At the beginning of the U Day I initially attempted to create NuGet packages using the command line. The commands were simple and straightforward but unfortunately the packages I was creating were not valid, so I decided to use NuGet Package Explorer instead. NuGet Package Explorer is essentially the nuget.exe command line application with a user interface. Though the nuget.exe command line application is useful if you plan to incorporate package creation and publishing into your build process, I would otherwise recommend using NuGet Package Explorer. When you launch the application you are presented with two panels. In the left panel you enter metadata about the package. In the right panel you build up the structure of the package by creating the previously mentioned ‘lib’ folder or ‘content’ folder. You can then add your files to the package. Finally you can publish the package and, of course, save it!

Apart from the problems I had with the nuget command line application, which were probably down to my lack of understanding, I would highly recommend NuGet. The GUI tool was a breeze to use. Daden is now running its own private NuGet feed, enabling us to easily manage and reuse library files across different applications.

16 November 2015

DadenU Day: MongoDB

By: Iain Brazendale, Development Manager

Weather data from OpenWeatherAPI routed to a MongoDB database, and then displayed in Datascape 2

At Daden we all enjoy new challenges, so once in a while we take a day out to do some blue skies research and spend a day working with technologies that we find interesting. These days are known as Daden U days and help the team to take a wider look at technology and encourage innovation.

I’ve always had an interest in Databases but have never had the opportunity to look into NoSQL, so for the Daden U day I spent some time investigating MongoDB.

MongoDB, unlike traditional SQL databases, is a document database and as such stores its data in an unstructured way. I love how easy it is to get JSON data into the database and extract it.

I’ve got to say a big well done to the team at MongoDB, the getting started documentation is some of the best I’ve seen. Being able to dive right into documentation specifically aimed at C# was fantastic. Installing MongoDB, adding data and writing the first program to read the data took less than five minutes. Trust me when I say this was a lot shorter than my first adventures (many years ago) with MSSQL and Perl.
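To give a flavour of just how short that first program can be, here is a minimal sketch using the official MongoDB C# driver (MongoDB.Driver). The database, collection and field names are illustrative rather than our actual weather feed, but the pattern - JSON straight in, documents straight back out - is the same.

```csharp
using System;
using MongoDB.Bson;
using MongoDB.Driver;

// A minimal sketch of a "first program" against a local MongoDB instance.
// Names (weatherDemo, readings, city, tempC) are illustrative only.
class Program
{
    static void Main()
    {
        var client = new MongoClient("mongodb://localhost:27017");
        var db = client.GetDatabase("weatherDemo");
        var readings = db.GetCollection<BsonDocument>("readings");

        // JSON goes straight in - no schema to define up front.
        readings.InsertOne(BsonDocument.Parse(
            "{ \"city\": \"Birmingham\", \"tempC\": 8.5, \"conditions\": \"rain\" }"));

        // ...and comes straight back out again.
        var filter = Builders<BsonDocument>.Filter.Eq("city", "Birmingham");
        foreach (var doc in readings.Find(filter).ToList())
        {
            Console.WriteLine(doc.ToJson());
        }
    }
}
```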

So was the Daden U day worth it? Well I now know more about NoSQL than I did last week, and more importantly it has encouraged me to learn more. Was it worth it to the company? Well funnily enough a few days after the Daden U day a new client asked me “Do you know anything about MongoDB” and I was happy to reply “Yes”.

9 November 2015

Programming Android Wear

By: Joe Robbins, Daden Developer

For our recent Daden U-day, I decided to learn about Android Wear, Google's operating system for smart-watches and other wearable technology. The philosophy behind Wear is to provide users with quick, convenient access to the most important information and functionality from their smart-phone applications. Wear apps should therefore be viewed as extensions of their smart-phone counterparts, not as a replacement for them.

moto360_2sm.png

Having recently got my hands on a Moto 360 (2nd Generation) smart-watch, and given the future potential of the platform for extending Daden's own applications, I thought it would be worthwhile getting to grips with how to develop apps for Wear.

As the day began, I downloaded and installed Google's own development environment, Android Studio, along with all the necessary packages to develop for wearables. Following this I had to configure my devices to allow my own apps to run on them. Android Wear works by installing a companion application on a smartphone or tablet, and this device then communicates with the wearable.

The first task I set myself was to build a simple, “Hello World”, app and get it to run on my Moto 360. This proved difficult, since I needed to find and install a specific driver for my smart-phone in order to install my own applications on there. However, once I found out about this, it was fairly straightforward to launch my “Hello World” program on the watch.

moto360_hw_sm.png

I then tried my hand at creating my own notifications, so I built a simple app on the phone that featured a single button which, once pressed, fired off a notification that could be viewed on the watch.

After this, I tried to build up more complex demo apps, but soon found that the debugging experience provided by Android Studio was somewhat less than optimal (perhaps Visual Studio had spoiled me), and I struggled to pinpoint the reasons my applications weren't running.

I spent the rest of the day watching a series of videos published by Google themselves, describing the design principles that should be followed when making a wearable application. This was very insightful and I learned a lot of important design guidelines that I would follow if I delve further into developing for this platform.

Since wearables make use of much smaller screens than a more traditional smart-phone and have more limited input mechanisms, building an Android Wear app brings with it new design challenges, the foremost of which is having to contend with the limited real-estate on screen. The designer therefore has to make sure that they are only presenting the most important information at any time: these devices are not intended for extended periods of use; rather, the user should be able to glance at them and see the information they need already waiting for them.

roundsquare-world.png

The application can detect the type of wearable face and optimise content accordingly.

In conclusion, despite the teething problems, developing for Android Wear is an area that shows great potential when utilised appropriately. I will continue to follow the development of the operating system as it continues to mature. It was encouraging to see how Google has built a micro-philosophy around how this new technology should be designed. Personally, I look forward to the day that a Daden project gives me the opportunity to create something more substantial on this exciting young platform.

3 November 2015

Coverage: Innovate UK backs Birmingham virtual reality firm with new funding

Birmingham Post covers Phase 2 of our Virtual Field Trips project.

Innovation Birmingham Campus-based Daden secures £230,000 to support the rollout of its programme which aims to prepare students for field trips

Read the full story.

30 October 2015

Getting to grips with High Fidelity

TextPanel_sm.png

Having been reading more and more of the Alpha Tester posts about High Fidelity I thought it was about time to give it a try - and our Daden U day on 30th October provided the ideal opportunity. High Fidelity is the new virtual world development from Second Life founder Philip Rosedale, and is currently in Alpha.

Getting High Fidelity started was a bit hit and miss. I downloaded the Interface client, but although it booted up it kept crashing on my laptop, with weird graphics effects and no webcam control. I tried another office PC and couldn't even get it to run. I went back to the laptop, disconnected the external monitor and webcam, and it booted up like a dream :-)

The "big feature" of High Fidelity has always been the webcam avatar control - your web cam picks up your head movements, and even lips and eyes and animates your avatar in real time. This worked pretty well, although some times your head almost wobbles off and you have to reset, and since your conscious (at least initially) of the tracking you move your head in a very false way - which soon gets pretty tiring. Having seen it work I switched off the webcam control. High Fidelity also gives primacy to voice for avatar to avatar contact - and there are only third party text chat apps. Otherwise avatar control is pretty SL like, although flying has no flying animation and some of the camera/avatar control combinations seemed a bit tricky and I didn't have the hang of them even by the end of the day (added to which lots of places don't appear to have a solid ground!). Changing avatar was simple, although its now in one piece not two, and accessed from the Edit not File menu, both changes since the video tutorial.

Having mastered the avatar it was time to try building. The public sandboxes didn't let me build, so I downloaded the Stack Manager to run up my own server. As the video said, it was just a couple of minutes' work to get your own server running and choose the content pack. The process of creating a domain key has also changed since the video, but I managed to muddle through.

When first loaded into my new world all I had were stars - despite pre-loading the large floating island. I then spotted the island as a small dot miles away, so I had to fly over (something I've also found with other public spaces - poor choice of landing spots). Once there it looked pretty nice, with good rock, grass and tree textures. Accessing the market, choosing an item and rezzing it was dead simple. The rotate, resize and vertical shift controls were easier than SL, but the movement is of the drag variety and I soon found objects disappearing off into the distance as I struggled to get a decent vantage point. I also had to add a ground plane to stop myself falling through the floating rock.

Bringing in my own objects was next. High Fidelity supports FBX (and OBJ, although FBX is the format usually mentioned), and I had to try about 6 or 7 different FBX objects from my collection till I found one that would come in, and even then it arrived without its texture. However the OBJ I brought in was half the City of London skyline, so that looked pretty neat (see below)! High Fidelity only imports from a URL, so you need to upload your objects to a server first.

LondonSkyline_sm.png

Coding was the next task. I found a simple "touch" script in the documentation, added this to a prim (you can have just cube and sphere prims, and no prim torture), and it worked. Like the models, the scripts have to be uploaded to a server and referenced by URL (so at least you have central script editing). There is an entity embed option but this seems very constrained - one-liners only. I then extended the script to include collision. You can't do floaty text, so I rezzed a separate "text" object, and again got the script to control the text on the board ("hello world" above). Again gaps in (and changes to) the documentation were a problem, as nothing told you how to change the text property or what it was called. I followed the other examples and guessed at textContent, as that was the editor label, but then found it was just "text" - and sure enough in one of the older demo videos the editor label is just "text" as well! (Well, it is alpha...)

The final thing to play with on this task was good old media-on-a-prim - having an active web page on a surface in-world that you can click on and navigate. It was dead easy to rez a Web object and add a URL - the Daden web site in this case - see below.

MOAP_sm.png

One last task was to add our Daden sandbox to the public directory, so I paid the $20 for a year's domain name (DadenWorld) and posted it up. Note that since High Fidelity is a fully distributed system (like OpenSim), DadenWorld is only up when my server is up. Interesting that NATO is listed there, along with what also looked like the Swiss Army!

DadenWorld.PNG

So all in all a good day's play with the system, and I achieved all my goals - avatar control, building, importing, basic scripting, server running. Crashes were frequent at first, but after a while it all settled down. It is still in Alpha, and it shows (most of the menu options are very techie), but it is beginning to be usable. The distributed model is nice, and the webcam avatar control great fun, and we'll keep playing with it as the system develops to work out where it sits alongside Unity and WebGL, our current preferred technologies - and whether High Fidelity or Project Sansar (from Linden Lab itself) will be the NEW SECOND LIFE.

30 September 2015

September Newsletter - RosettaVR on Google Play store

In this latest issue of the Daden Newsletter we look at:

  • The release of RosettaVR, our first VR application for Google Cardboard on Android. This lets you view Comet 67P/Churyumov-Gerasimenko and the ESA Rosetta probe and Philae lander in full immersive 3D. The app is a free download from the Android Play store. We can use the same technology to put your products, locations, processes or lessons into the hands of your users.
  • The work we've done with schools, universities and other stakeholders on our Virtual Field Trips project as part of InnovateUK's Design for Impact project. We now have a clear feasibility design for how to build and roll out virtual field trips as a national (or even international) service to allow schools and universities across the globe to create, share and conduct virtual field trips of their own.
+++ STOP PRESS - We have just been told that we now have Phase 2 funding for Virtual Field Trips - more next issue +++

We also provide a quick preview of our forthcoming 2nd Generation Datascape 3D immersive data visualisation application - more next time.

We hope you enjoy the newsletter, and do get in touch if you would like to discuss any of the topics raised in the newsletter, or our products and services, in more detail!

Download your PDF copy of the newsletter 

25 September 2015

Daden releases Rosetta Mission in VR on Google Cardboard

rosettavr.png

We released our first Google Cardboard VR app onto the Google Play store today. RosettaVR lets you look at Comet 67P/Churyumov–Gerasimenko and the Rosetta probe and Philae Lander. For the comet we have both the official ESA 3D shape model, and the textured model developed by Mattias Malmer. For the first release we've reduced the resolution of the models to make for a smaller download, but plan to make a "pro" release shortly with the full size models. The Rosetta and Philae models are from EGPJET3D, and the probe includes an animation mode so that you can watch the huge solar panels expand out.

We've used the release as a chance to try out some different interaction modes, and will continue this theme in future releases. With the limited controls of the standard Cardboard, and the typical use case of standing in a crowded room, how best do you move through a space or look "around" a model? For RosettaVR the main interaction is rotation control: you can set the featured object spinning around the X or Y axis at different speeds, and quickly stop and start it so as to get to a particular point. Then you can zoom in or out. All of this is done just by looking at the controls, with the option of single button-click confirmation. There is also an additional "drag" mode, so that by holding the button down you can "drag" the comet around a bit to optimise the view.
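For the developers among you, the sketch below gives a rough idea of how that rotation control can be wired up in Unity C#. It isn't the RosettaVR code itself - the class and method names are illustrative - but it shows the pattern: the gaze-activated buttons simply call methods that set a spin axis and speed on the featured model, or nudge the camera in and out for zoom.

```csharp
using UnityEngine;

// Illustrative sketch of the rotation-control idea, not the RosettaVR code.
// Attach to the featured model; the gaze-activated UI buttons call these
// methods to spin the object, stop it, or zoom the camera.
public class ModelSpinner : MonoBehaviour
{
    public Transform viewCamera;        // camera moved in/out for zoom
    public float zoomStep = 0.5f;       // metres per zoom click

    private Vector3 spinAxis = Vector3.up;
    private float spinSpeed;            // degrees per second, 0 = stopped

    public void SpinAroundY(float degreesPerSecond)
    {
        spinAxis = Vector3.up;
        spinSpeed = degreesPerSecond;
    }

    public void SpinAroundX(float degreesPerSecond)
    {
        spinAxis = Vector3.right;
        spinSpeed = degreesPerSecond;
    }

    public void Stop()
    {
        spinSpeed = 0f;
    }

    public void Zoom(float direction)   // +1 to zoom in, -1 to zoom out
    {
        viewCamera.position += viewCamera.forward * direction * zoomStep;
    }

    void Update()
    {
        // Apply the current spin every frame.
        transform.Rotate(spinAxis, spinSpeed * Time.deltaTime, Space.World);
    }
}
```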

Rosetta_sm.png

RosettaVR is currently available only on Android, but if there's the interest we'll do an iOS version (which also works with Cardboard) too - let us know. As mentioned we're also planning a "Pro" version with higher rez models. As well as helping to communicate about the wonders of the solar system and of space exploration the aim is also to use it as a demo to show people what can be done with VR, and to experiment with the different interaction modes. 

And of course whilst this app is all about Rosetta, there are no limitations as to what the objects you are viewing can be in an application like this. They could be products, show homes, cars, machinery animations, process animations, historical artifacts - almost anything. The immediacy and accessibility of the Cardboard also make it ideally suited to customer engagement at trade shows or in retail centres, and as something extra for your salesforce to have with them all the time. So if you have objects or places that you would like to put virtually into the hands of your customers just give us a call on 0121 250 5678, or drop us a line at info@daden.co.uk.

Download RosettaVR onto your Android phone today. (You will need a Google Cardboard to get the full experience).

22 September 2015

Daden wins Phase 2 Funding for Virtual Field Trips

We're delighted to announce that we have been successful in gaining Phase 2 development funding for our “Virtual Field Trips as a Service” initiative from Innovate UK, the UK's Innovation Agency, in Phase 2 of the Design for Impact Competition.

Launched in May 2014, Design for Impact aimed to identify and then support innovative technology that had been proven in pilot projects in education but had yet to have a national impact. Working with The Open University (OU) and Birmingham-based Design Thinkers UK we submitted a proposal for Virtual Field Trips as a Service, taking the concepts developed as part of the Virtual Skiddaw project that we developed for the OU in 2013, and looking at how these could be scaled up to a national service for schools and universities.

We were one of 15 projects (out of around 200) selected in September 2014 for Phase 1 funding. From Nov 14 to Apr 15 we worked with teachers and students at Washwood Heath Academy in Birmingham, virtual world educators in Second Life, university lecturers at a Royal Geological Society workshop and a range of other stakeholders to understand the potential, challenges and key features of any virtual field trip service.

Innovate UK has now announced that ours is one of the projects selected to receive Phase 2 "development" funding. For Phase 2 we will be working with our existing partners, the Open University and Design Thinkers UK, and we're delighted to be joined by new partner the Field Studies Council. Phase 2 will see the development of a prototype system, and a full trial and assessment with both universities and schools, the latter facilitated by working closely with one of the Field Studies Council’s own field study centres.

dekstop-mobile.jpg

We are really proud to be one of the projects chosen for Phase 2 funding. There was some stiff competition and we were up against some other very innovative and exciting ideas. This project gives us the opportunity to develop an immersive 3D environment that is optimised for educational use, yet flexible enough to let educators create and customise content. Almost everyone we’ve spoken to has not only seen how virtual field trips can be a natural complement to physical field trips, but also how the technology could be used to provide a wide range of virtual experiences in support of other subjects from history to languages.

The service is intended to support, not replace, physical field trips. It will help students and staff prepare better for a field trip, can provide additional context during the field trip, and gives a focus for post-field-trip data analysis, revision and virtual visits to comparative sites, as well as providing a catch-up for those who may have missed the physical trip.

DowntheValley_sm.jpg

At the end of the 12 month project we should be in a position to start taking the service to market. Whilst the project is focussed on UK education there are also obvious opportunities overseas – particularly letting students have virtual “exchange” field trips. As well as looking at desktop and mobile delivery the project will also be looking at using the latest generation of virtual reality headsets such as Oculus Rift and Google Cardboard.

You can read a bit more about the project on the project page, but we will be revamping that shortly for Phase 2, and we're also looking at producing a separate micro-site or social media site to support the project. More news in due course.

3 September 2015

All about Google Cardboard

We've just added a short introduction to the Google Cardboard VR "headset", and some ideas about how you could use Google Cardboard and its VR experience in your organisation. Visit our Google Cardboard page now.

1 September 2015

VR and AI at the Edinburgh Fringe

Whilst up at the Edinburgh Fringe Festival last week, two shows caught my eye from a Daden perspective; I caught one and my daughter caught the other.

ABACUS (Sundance Version) Trailer from Lars Jan on Vimeo.

ABACUS was a "presentation" by "Japanese cult-icon Paul Abacus" in the style of a visuals rich TED talk, filmed live by two steadicam operators. The visuals were very "Datascape" at times, 3D graphics over the Earth's surface", and some of the graphs were even based on Second Life stats! But as it the style of so many of these things what starts out straight soon begins to slowly collapse as the "presenter" goes off-piste, rages against the world, and goes ever so slightly mad. Apparently "Paul Abacus" is the fictional creation of Los Angeles-based director Lars Jan, and the show (and unveiling) caused quite a storm at the Sundance festival. It was OK, but one can't help feeling it needed to be either more real and straight, or far more off the wall.

The second show was Spillikin, which featured the Robothespian robot as a robot/AI created by a dying husband to keep his wife, who has Alzheimer's, company. As the show notes say: "... the husband, already an obsessive archivist, builds a perfected robotic version of himself, to be deployed after his death: a patient carer, an aide-memoir, a singing partner, able to give order to her confusion, and to bear without complaint the endless repetition required to reassure her." We've seen a number of academic projects using chatbots to support Alzheimer's sufferers in just this way, and we're currently working on a Digital Immortality paper which also looks at some of the issues around achieving agency after death.

10 August 2015

Cardboard vs Rift

oculusrift_centsq.jpg

nashcardboard_sm.JPG

Over the last few weeks we've finally had time to start playing with Google Cardboard - and I must say we are impressed. Whereas the Oculus DK1 was great but the DK2 was such a pain that we dread getting it out of the cupboard, Google Cardboard is a joy to use, and might even drive me to buy an Android phone!

For those that aren't familiar with Cardboard it IS just a cardboard holder (and a couple of lenses) for your smartphone. You can build one yourself (details at https://www.google.com/get/cardboard/get-cardboard/), or buy a kit for about £10 on Amazon. Then fold the eye hood over, slide your mobile phone in and away you go. The phone's sensors are used to do the basic rotational head-tracking, and there's a single button on the box which triggers a single touch to the screen, and that's it.

What stunned us at first was the quality. With a MotoX it's about the same as DK2, and with a Samsung Galaxy S6 it feels better than DK2. You don't get lateral head tracking, which needs an extra sensor anyway on the Oculus, but that's hardly vital. The big downside is that you have to hold the box in place with your hands (the strap that came with ours was useless), so you can't hold a game controller or joystick to control your movement. However the big upside is that you just have to hold the box in place with your hands! Just pick up and go. No complex PC configurations, no leads running everywhere, no trying to peek out from under the headset to see the keyboard. Just hold it to your eyes and you are away! Combine this immediacy with the fact that anyone could have hundreds of these available ($10 each and bring-your-own-phone) in a class or business and all of a sudden VR seems much more feasible right now from a business point of view.

The big technical challenge initially is how to design the UI. You only have gaze and one button, and it's interesting that the most common solution at the moment is actually the same as we developed for the Oculus Rift version of Daden Campus - if you stare at something it becomes active, and then you either stare a bit longer and it activates, or it waits for the button press to activate. Simple. Using this model even movement becomes easy to implement: stare at your feet to start moving, walk in the direction you are looking so you can steer as you go, and look at your feet again to stop - as in the sketch below.
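Here's a minimal Unity C# sketch of that "look at your feet to walk" idea. It's an illustration of the technique rather than the actual Daden Campus code - the names and thresholds are just examples - and it assumes the player object has a CharacterController, with the Cardboard-controlled camera as its head.

```csharp
using UnityEngine;

// Illustrative sketch of gaze-controlled walking, not the Daden Campus code.
// Attach to the player object; "head" is the Cardboard-controlled camera.
[RequireComponent(typeof(CharacterController))]
public class GazeWalker : MonoBehaviour
{
    public Transform head;             // the VR camera rotated by head tracking
    public float walkSpeed = 1.5f;     // metres per second
    public float feetAngle = 30f;      // within this many degrees of straight down

    private CharacterController controller;
    private bool walking;
    private bool wasLookingDown;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // Are we staring (roughly) at our feet?
        bool lookingDown = Vector3.Angle(head.forward, Vector3.down) < feetAngle;

        // Toggle walking the moment the gaze drops to the feet.
        if (lookingDown && !wasLookingDown)
        {
            walking = !walking;
        }
        wasLookingDown = lookingDown;

        if (walking)
        {
            // Walk in the direction you are looking, flattened onto the ground
            // plane, so you steer simply by turning your head.
            Vector3 dir = new Vector3(head.forward.x, 0f, head.forward.z).normalized;
            controller.SimpleMove(dir * walkSpeed);
        }
    }
}
```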

We're now working up our first Cardboard demo app which we should have available in the next few weeks, and then we'll do a Cardboard version of Daden Campus and build Cardboard support into all our demos and projects from then on. I'll then have to have a Cardboard on me whenever I go to events and meetings, and the Android phone to go with it!

All we now need is a simple head-frame to give you proper hands-free use for when you need it - something like a cross-platform Samsung Gear VR, or better still a low cost option like the Durovis or XG VR.

If you'd like to read more about VR then download our white paper on Virtual Reality, and if you'd like to have a demo, or would like us to create a VR experience for you, your students, employees or customers then just get in touch

3 August 2015

Daden New Starters

We'd like to welcome Joe and Ishaq to Team Daden. Both are recent graduates, Joe from the University of Birmingham and Ishaq from Birmingham City University. Both have joined as junior developers. Joe is working on migrating our Discourse chatbot code from Perl to C#/ASP.NET (something that has been on the to-do list for years), and Ishaq is updating our Virtual Reality demos and getting to grips with WebGL ready to start work on the WebGL version of Datascape2 - cunningly named DatascapeGL!

17 July 2015

Work Experience - Abbas

Abbas.JPG

Abbas, a year 10 student from a local Birmingham school, joined us for 8 days this summer. As well as helping out on tasks around the office we also gave him the opportunity to learn some C# and Unity programming, and to play with Google Cardboard. Here is his report.

"Over the eight days that I have been at Daden, I have learnt new skills and have experienced what working life is like.

On the first day that I arrived, I was given a sheet of paper on which David (MD) had listed some possible tasks for me to complete (one of them was to write a blog post). Also included in the list were two coding challenges: in the first I was expected to create a program that reflected the average mathematical capabilities of a human; my second challenge was to write a program that displayed numbers such as 9,999,999 in words, in the same way a human would say them.

On the second day, I decided to undertake a new challenge, and I began learning 'C#'.

It was far more verbose and complex than Python (which was the only programming language I was familiar with at the time). Once I had gained a decent understanding of the language, my colleagues assisted me in writing functional bubble sort and binary chop search programs. I then started working with Unity and created a simple 3D ball game. While I was building the game, I learned fundamental Unity skills which I could build upon.

Overall, I have definitely gained a lot from this experience; my level of ability and understanding of programming has undoubtedly improved from eight days ago. Along with that, this experience has also given me an idea as to what a working environment is like."

Thanks Abbas, it was great to have you with us, and all the best for your future studies.

22 May 2015

Virtual Field Trips at Collaboration Nation

vftaas-team.jpg

On 21st May we took part in Collaboration Nation, InnovateUK's celebration event for the 15 Design for Impact Feasibility projects (out of over 200 applications). Alongside the other projects we had 4 minutes(!) to present our work, as well as running a small stand in the networking area. There were around 100 delegates to the event, mainly drawn from major organisations (public and private sector) with an interest in technology in education. The image above shows our Virtual Field Trips team of David Burden (Daden CEO), Prof. Shailey Minocha from The Open University, and James Rock from Design Thinkers UK.

Our presentation went down well (despite video problems) and several people commented that the point where the avatar jumps up into the air and flies, and then the geological cross-section lifts up, was one of their favourite parts of the morning.

dekstop-mobile.jpg

On our stand we were showing off both the original Virtual Skiddaw and the new Virtual Field Trip test-bed. One of the best examples on the test-bed was a new version of our old Apollo 11 sim running on both the laptop and the MotoX phone, multi-user, with both avatars able to see each other - which just shows that modern mobile phones are more than capable of running virtual world applications.

allprojects.jpg

And here to finish is the "team photo" of all 15 project teams! There was some great stuff being done and some noticeable clusters around 3D virtual worlds/Unity3D type tech, Arduino and maker bots, gestures/haptics and 3D printing. Interesting that there were no AR projects in the final 15. Hopefully all of us will find a way to bring our projects through development and on to market.

30 March 2015

Datascape in InnovateUK Best Practice report on DataViz

innovateuk_report.png

Daden is honoured to be not only included in InnovateUK's new report on design and best practice in data visualisation but also to have a Datascape image as the front cover of the report. You can read the InnovateUK blog post and download a copy of the report.

29 January 2015

Daden Chatbot Significantly Passes Modified Turing Test in a Virtual World

As reported in a peer reviewed paper just published by Loyola Marymount University a chatbot avatar based on our Discourse chatbot engine recently significantly passed a modified Turing Test run in the virtual world of Second Life. The highlights from the paper are: 

  • The first natural language Turing Test conducted in a 3D virtual environment. 
  • 78% of participants incorrectly judged a chatbot to be human. 
  • Agency judgments were due to the quality of the AI engine and psychological factors. 
  • Passage of the Turing Test involves a complex process of human–computer interaction. 

The test itself was based on a test we set out in another paper back in 2009. There we described how most current Turing Tests were very unbalanced, since both judges and "hidden humans" knew they were taking part in a test, and often the only party who didn't was the chatbot! We described how virtual environments such as Second Life offered an ideal level playing field, since everyone, human or chatbot, looked like an avatar, could have access to the same information about the environment, and crucially the expectation of most people in-world was that the other avatars they talked to were other humans - so all the bot had to do was not give away the fact that it was a bot, rather than try to convince an already sceptical judge that it was a human!

In the modified Turing Test that we proposed - a "covert Turing Test" - subjects (aka judges) went into a virtual world for supposedly another purpose, and whilst there had a conversation with a bot (or human). Only after the event were they asked whether they thought the avatar they were talking to was a bot or a human.

This is exactly the test that Loyola Marymount set up. You can read the full details and results in the paper, but the essence was that subjects were invited into Second Life to give comment on a new shop layout and design, and that there would be a shop assistant in there to help them. 50% of the time the assistant was a chatbot, and 50% a human. A nice touch in the double-blind design was that the humans playing the assistant also didn't know that they were taking part in a Turing Test, so they couldn't act "over-human", or even "more robotic" - as often happens in physical Turings.

loyola.png

The very simple avatar used by both the humans and the chatbot.


The bottom line was that 78% of the test subjects did not recognise the avatar as being a chatbot when run by the chatbot - well in excess of the 30% target set in the full Turing Test, or even a 50% random choice. 

So what does this mean? 

Well first, we don't see the Turing Test as a measure of "intelligence" or "sentience" - it is purely a reasonable test of how good a chatbot is at mimicking human conversation. And seeing as having "human-like" conversations with computers could be useful for a whole range of things (training, health-care etc.), it's a reasonable test of the state of the art.

The problem has been that most current Turing Test implementations are VERY artificial - I've been a "hidden human" and I know how unnatural the conversations are on both sides. With the covert Turing we were trying to create a more "fit for purpose" test - can a human tell the difference between a human and a chatbot in a practical, realistic setting with no pre-conceptions - the sort of test we need if we want to deploy chatbots into real world applications. Yes it is a lower bar than a full-on Turing Test (which itself is almost taking on the artificiality of a linguistic chess game), but for us it is a very valid waymark on the route to the full Turing Test.

If you'd like more information on Discourse, or our use of chatbots within our Trainingscapes then please get in touch.

PS. You may be able to get a free copy of the article by following the PDF link against the article's listing on the journal's contents page, rather than from the article page!

28 January 2015

Virtual Field Trip - Second Life Workshops

As part of the Virtual Field Trips project we're running a couple of introductory presentations/discussions/workshops with the Open University in Second Life. The first, on 3rd Feb, is kindly being hosted by The Science Circle, and the second, on 12th Feb, is kindly being hosted by the Virtual Worlds Education Roundtable. Check out their web sites for locations and times. We will then be hosting a more focussed design workshop session on 3rd March on the Open University's Deep Think island - contact us for details.

Remember we also still have the survey running - please take part.

26 January 2015

Virtual Reality for Immersive Learning - Presentation

We've just put our virtual reality presentation on Slideshare. This hopefully takes a balanced view of the potential, and issues, for virtual reality in the training, learning, education and visualisation spaces based on our own first hand experience of Oculus Rift DK1 and DK2 and integrating them into our applications. You can also download the original white paper.

If you'd like a demo of the VR work we've done so far let us know and we'll see if we can arrange a demo.

15 January 2015

Virtual Field Trips Survey - Take Part!

As part of our InnovateUK funded Virtual Field Trips as a Service project we are running a survey to find out how people currently approach physical field trips and how a virtual field trip service might be able to help them. The survey takes only 5-10 minutes. There are separate surveys for teachers/lecturers, for students, and for specialist fieldwork tutors. Please let us have your thoughts, and do let us know if you'd like to see the results.