16 December 2016

Daden on DMB's VR Infographic

The Digital Marketing Bureau has recently completed an infographic on players in the UK VR space. We're glad to see that Daden are on it - although the colour choices and marker size make it hard to see which category we (or anyone else) are in! For reference we think we're in (by their categories):
  • Academia
  • Education
  • Healthcare
  • Public Sector
  • Training
  • Medical
And of course Dataviz!

Graphic is by thedigitalmarketingbureau.com

12 December 2016

DadenU Day: Experimenting with Motion Capture

Custom Animations for Unity using Windows Kinect
by Nash McDonald
For the Daden U Day I decided to investigate how much effort was involved in creating animations for 3D characters using motion capture: what tools were available, and what results the process would yield compared to the traditional way of animating using key frames. For the motion capture I planned to use a Microsoft Kinect v1. I chose this because it was the only motion capture device available to me at the time. Ideally I would have preferred an Xbox One Kinect camera, as the Kinect v1 was first released on June 16 2011, five years ago at the time of this post. I planned to use the Kinect for markerless motion capture. There are other cameras on the market for markerless mocap, but they are specialist equipment which require special rigs.
The morning was spent researching the best way to capture motion from the Kinect camera and turn it into animations usable in Unity with the Mecanim state machine. Many Google searches and countless YouTube videos later, I came to the conclusion that there were only two ways of accomplishing my goal for the day. The first was to record motion directly in Unity using the Microsoft SDK for Unity3D and custom scripts written by the Unity community. The second was to capture the motion using mocap (motion capture) software and then create animation files usable in Unity3D. I chose the latter because I wanted the ability to edit the animations before using them in Unity. I found a very good piece of software for this: iClone 6.
iClone 6 is an application for generating 3D scenes and animations; it can animate objects as well as characters. With iClone 6 I was able to install the iClone Kinect mocap plugin. I imported one of the avatars from Fieldscapes into iClone, after which I was able to use the Kinect mocap plugin to record animations in iClone. See the image below.

I recorded a simple touch animation to simulate touching an object in Fieldscapes. The recorded animation would ideally have needed further refinement, as the recording from the Kinect was not very accurate. I was able to export the character as an FBX file containing the animation data, which can be imported into Unity 3D. The whole process from start to finish was mostly painless. I suspect that if the animation had been more complicated, the mocap result would have needed an animator to spend a good few hours ironing out the rough edges. The process and equipment I used were very good for getting a rough start and speeding up the whole animation process, but could never replace the traditional way of animating each bone in the character's body using key frames.

8 December 2016

Webinar: Virtual Reality - Getting to Caprica

David has just presented another Brighttalk Webinar. This one is more of a personal view about the future development of Virtual Reality, and uses the VR/Virtual Worlds scenario presented by Caprica (the excellent Battlestar Galactica prequel) as a key reference point.

View at BrightTalk
You can also watch David's other recent Brighttalk webinars on VR for Data Visualisation and for eLearning:

14 November 2016

Datascape 2.0.2 released

We've released v2.0.2 of Datascape2. This is primarily a maintenance release, fixing bugs and making some improvements to usability. Key fixes are:

  • If no mappings match then show new mapping screen, not empty match screen
  • Fixed data import crashes with field name titled "Group" (reserved word)
  • Fixed "sequence contains no matching elements" error in Mapping Templates
  • Removed need to use ToLower() when doing some lookups (esp after a Round or other SQL function)
  • Fixed the Global Spherical Mapping template swapping Longitude and Radius values each time the data was re-plotted

Full release notes at: https://dadenwiki.atlassian.net/wiki/display/DAT/R...

Datascape can be downloaded without registration from: http://www.daden.co.uk/conc/datascape/datascape-do...

1 November 2016

Fieldscapes Survey and Video

We're running a short survey to help inform the final stages of the development of Fieldscapes. Please find 5 minutes to fill the survey out here:


In addition we have also just launched our first proper "Introduction to Fieldscapes" video. You can watch it here or on YouTube.

18 October 2016

P53 Protein Model

We found some nice data on the RCSB Protein Data Bank which gives the x/y/z angstrom unit location of every atom in a huge range of proteins. The data was in a fixed width format and didn't take long to convert to CSV and add in some of the meta data contained at the top of the file. The visualisation shows both the whole dataset, about 65,000 points covering 30 different co-located models, and then another mapping shows just a single model. Scrubbing is used to filter through the different models, and then on the single model to filter through the different chemical elements. Shape is used consistently to also show element, and colour to either show strand or element.

This sort of protein visualisation was always something we thought that Datascape wouldn't be particularly brilliant at, and there are several dedicated apps to do it, but we were impressed at how good the results appear to look.

Download the free Datascape trial app now!

17 October 2016

Datascape v2.0.1 Released

Today we released Version 2.0.1 of Datascape. This is primarily a maintenance release and includes:
  • Several small usability improvements
  • Several minor bug fixes
  • The implementation of a new key based licensing system
  • Ability to show multiple hover labels.
We've also now made a simple single click download link - no filling out any forms - so no excuse not to give it a try!
From here the route map is:
  • v2.0.2 in 3-4 weeks to allow for workspace import and export to encourage greater sharing of visualisations
  • v2.1 in 1-2 months which will feature export to WebGL of visualisations, to not only share on the web but also so they can be viewed in VR mode in Google Cardboard.

22 September 2016

Brighttalk Webinars now available

David's two Brighttalk webinars are now available for you to view for free and at your leisure:
Enjoy, and we'd welcome any feedback. David will probably be doing another Brighttalk event looking more broadly at the use of 3D in DataViz in November.

21 September 2016

A 3D Dataviz Taxonomy

Whilst prepping the slides for last week's Brighttalk 3D Dataviz Webinar (watch it now) I started to put together a taxonomy of 3D data visualisation.

The starting point is a 3D plot - we are plotting data against 3 axes, not 2.

There is then a big divide between an allocentric and an egocentric way of viewing the data. Allocentric means that your reference point is not you, it's something else; in egocentric viewing you are the reference point. In practice this means that in an allocentric plot, if you move the viewpoint it feels like it's the data moving, not you; in an egocentric plot, if the viewpoint moves it feels like you're moving and the data is staying still. Since the latter is how the physical world works, it's what our eyes and brains are used to, so we feel more at home, and we can maintain context and orientation as we move through the data. Tests we did a few years ago with Aston University compared allocentric and egocentric ways of exploring 3D data, and showed that performance was generally better for the egocentric view.

Within the allocentric branch the next divide is whether the plot is static (in which case I suppose you could argue it's neither allocentric nor egocentric), as you might get in say Excel, or whether you can rotate and zoom the plot (as in something like Matlab). Are there any further sub-divisions?
On the egocentric branch we think the divide between viewing the data on a 2D screen (as in "3D" computer games) and viewing it through a VR headset in "real" 3D is far more a case of how you view the data than any fundamental change in how it is being plotted. To us the big benefit is going egocentric rather than allocentric, not going from 2D screen to 3D headset. In fact our experiences with the Oculus DK1 and DK2 suggest that the 3D headset is actually a worse way of viewing data in many (most?) cases. Luckily Datascape will be agnostic between 2D and 3D displays once we release V2.1 - you'll be able to do both. 3D wall displays using head-tracking glasses are probably another example of a different view rather than a different method of plotting. But again, are there other more useful/detailed distinctions that can be made?

Let us have your thoughts in the comments, on our Facebook or Linked-In group, or to @datascapevr.

12 September 2016

The 3D Dataviz Escalator

    Back when we did some of the original research work and testing on immersive 3D data visualisation that led to Datascape we developed this "benefits escalator" to show the increasing possible benefits of moving visualisation away from 2D graphs to immersive, multi-user 3D spaces. With the launch of Datascape 2.0 we felt it needed a bit of a facelift. What's interesting is that there is very little there that we thought actually needed changing - the fundamental message of the chart still stands:
    • 2D plots can rapidly become crowded and unreadable, and provide no spatial cues to help remember them - they are all just lines on paper or screen.

    • Adding a 3rd dimension to a 2D plot lets you add an extra dimension of data, but the image gets even busier, and if you can rotate and zoom the chart then you get rapidly disorientated.

    • Moving into an immersive 3D space, where it's the data that stands still but you who has the sense of movement, gets around much of the disorientation. That in turn lets you take up lots of different viewpoints inside and outside the data to better understand it, and the sense of moving in amongst the data gives you a spatial relationship to it which can help with recall and with story-telling within the data.

    • Given that the 3D space is near infinite you can spread your data out far beyond the limits of the page or screen, yet you can zoom in on the smallest part but still have the context of "the whole" in the background. The space also lets you use 3D models to represent the data which can let you communicate far more than a colour key, and with minimal visual interference unlike 2D pictograms. The 3D space can also be a home for multiple charts, all on different axes, but all in the same space or frame of analysis.

    • The final step is when we make the space multi-user, so that you can see where everyone else is in the space, and what they are looking at - opening up whole new possibilities in collaborative data visualisation and visual analytics.

    You can read more about immersive visual analytics in our white paper, or download a trial copy of Datascape to try it out yourself.

    7 September 2016

    Immersive Visual Analytics

    We've updated our Immersive Visual Analytics White paper - first published in 2012. The updates reflect our experiences with visual analytics through Datascape 1 since then, market developments such as the rise of VR, and the new opportunities opened up by tools such as Datascape 2.

    Download the Immersive Visual Analytics White Paper today!

    6 September 2016

    Brighttalk AR/VR Summit - Daden's Two Talks

    Daden have two talks at the forthcoming Brighttalk "Brave New World: Augmented and Virtual Reality" webinar summit taking place on 13th & 14th September.
    Sign up to either (or both!) free live webinar presentations today.

    5 September 2016

    DadenU Day: Vulkan Graphics API

    By: Sean Vieira

    As June was coming to a close there were big decisions to be made. The most important day in a long time was approaching. No, not the EU referendum. The obviously more important DadenU day, on the 1st of July. Once again the decision on what I would do came late but I had a few things that interested me.

    One of those things was the Vulkan API by Khronos Group, a modern graphics and compute API that is cross-platform and has a focus on parallelisation and efficiency. An evolution of the group's OpenGL API, it aims to be an open standard for devices ranging from PCs and smartphones to games consoles and embedded systems. It was first released in February 2016 and currently has very little software support, though notable game engines such as Unreal Engine 4 (Epic) and Source 2 (Valve) have jumped on the bandwagon. More information on Vulkan can be found here. So I took this opportunity to jump on the bandwagon myself and see what it was all about.

    Vulkan’s cross-platform support offered up the first choice of the day, but it wasn’t a particularly difficult decision. The graphics card in my work PC is not supported by the API - it is actually the most recently released AMD GPU architecture that Vulkan does not support, as at the time of writing only AMD cards based on GCN (Graphics Core Next) 1.0 or later are supported. So Android was my platform of choice.

    Despite having very limited Android development experience (outside of Unity) I took it upon myself to try Vulkan out. Having already prepared my phone for use with Android Studio, I went in search of some examples to test, and running these tests would take up the majority of my day. There were two prominent resources that I found for Vulkan on Android; the first was an official Android NDK tutorial, and the other was the official SDK by Qualcomm, which is for their Adreno GPU.

    My attempts to get the former to work were futile. A combination of my development inexperience and what I later discovered was an incompatible version of Android made this a frustrating exercise. Most of the issues came from trying to ensure that the Android target versions were correct, and I spent a lot of time trying to sort them out. In the end I had got nowhere and decided it was best to move on to the latter option.

    This proved to be much easier and more productive. I managed to import the basic 'triangle' project from the SDK into Android Studio with little to no hassle, and compiled it without error. Now that it was compiled I ran it on my device, only to find that the app popped up on screen and then disappeared in a flash. After a little digging I found that it was most likely an issue with the Android version requirement of Vulkan - it required API level 24 (the N-Preview) whereas my device was still only on API level 22 (i.e. Lollipop).

    After failing to update the company phone to the correct version, my final throw of the dice was to forego Android and use the new laptop, which was bought specifically to demo our products. Being able to maintain a high frame rate when running our applications was a priority, and for this reason the laptop contains a dedicated NVIDIA mobile GPU - which is fortunately supported by the Vulkan API. To test this, I downloaded the ‘Chopper’ demo from NVIDIA’s website and ran it. Thankfully the helicopters flew forwards, albeit stuttering fairly frequently, and I could begin.


    Unfortunately the laptop wasn’t set up for development, so I had to install the necessary applications before getting started. Once Visual Studio, CMake, and the LunarXChange SDK were installed, I got to work and attempted to run an example project. I used this handy tutorial video by Niko Kauppi to help me along the way.

    The SDK first needed to be prepared for development by building the necessary libraries. These had to be built in a specific order, to ensure that each library project has the correct dependencies. Each project required a debug and release build as well. The glslang library needed to be built first, followed by the spirv-tools, and finally the samples that I would run.

    Each library required a Visual Studio solution to be made so that it could be compiled, so I used CMake from the command line to generate the glslang solution first. Once the solution had been created I opened it and compiled the 'ALL_BUILD' project within it twice - once as a debug and once as a release build. I did the same for the spirv-tools and samples libraries, in that order.

    Now I could run some samples. The very first sample I decided to run was the 'instance' sample, the most basic example project, so I set it as the startup project of the solution. It creates an instance of Vulkan… and destroys it. To ensure that it was actually running, I added a line of code at the end of sample_main() to wait for a keypress. Having done this, I re-built the project, and the console window that previously flashed now sat waiting for me to press a button, with no errors.

    Looking at the source file for the project, we see that the sample_main() function contains four major steps. The API needs to know some information when creating the instance, and this is stored in a VkInstanceCreateInfo struct. However, this structure requires knowledge of the general application info, so we first fill out a VkApplicationInfo struct with information such as the application name, engine name, and API version.

    // initialize the VkApplicationInfo structure
    VkApplicationInfo app_info = {};
    app_info.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app_info.pNext = NULL;
    app_info.pApplicationName = APP_SHORT_NAME;
    app_info.applicationVersion = 1;
    app_info.pEngineName = APP_SHORT_NAME;
    app_info.engineVersion = 1;
    app_info.apiVersion = VK_API_VERSION_1_0;

    // initialize the VkInstanceCreateInfo structure
    VkInstanceCreateInfo inst_info = {};
    inst_info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    inst_info.pNext = NULL;
    inst_info.flags = 0;
    inst_info.pApplicationInfo = &app_info;
    inst_info.enabledExtensionCount = 0;
    inst_info.ppEnabledExtensionNames = NULL;
    inst_info.enabledLayerCount = 0;
    inst_info.ppEnabledLayerNames = NULL;

    VkInstance inst;
    VkResult res;

    res = vkCreateInstance(&inst_info, NULL, &inst);
    if (res == VK_ERROR_INCOMPATIBLE_DRIVER) {
        std::cout << "cannot find a compatible Vulkan ICD\n";
        exit(-1);
    } else if (res) {
        std::cout << "unknown error\n";
        exit(-1);
    }

    vkDestroyInstance(inst, NULL);

    Once we have the VkInstanceCreateInfo, we create the instance by calling the vkCreateInstance method, passing it the info struct and an empty VkInstance to fill in. Unless there are any issues with drivers and compatibility, this should create the instance. The final step is to destroy the instance so that the application can safely close, which is achieved by calling vkDestroyInstance with the instance handle.

    This was as far as I managed to get on the day. It was further than I expected, but not as far as I had hoped. I learned a few things from this, though unfortunately not much to do with coding using the Vulkan API. It was fairly hard to find somewhere to start when it came to the Android version, though most of that can be put down to my inexperience, and I had to cross-reference a few tutorials to try and figure out exactly what I needed to do. Getting started with the Windows version was much easier, though that could be down to having more experience there.

    Learning Vulkan is something that I would certainly like to pursue in the future and having managed to set it up on the laptop provides me a foundation for future exploration. Perhaps next time I can get round to rendering a triangle!

    19 August 2016

    Fieldscapes Editor - Simple Timer Tutorial Video

    This week's Fieldscapes tutorial shows you how to set up and use timers in Fieldscapes. Timers would typically be used:

    • To limit how long a student can spend on a task.
    • To provide hints part way through a task if not yet completed

    You can use timers in conjunction with variables and rules, so that different messages can be given by the timer based on how far through a task the student is, or a task could finish either when all of a set of locations have been visited (controlled by variables) or when they run out of time (controlled by the timer).
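As a sketch of that logic (not Fieldscapes' actual rule engine - the function and message strings below are invented for illustration), a timer rule that combines elapsed time with a visited-locations variable might look like:

```cpp
#include <cassert>
#include <string>

// Illustrative only: pick a timer message from elapsed seconds, a time limit,
// and a "locations visited" variable, mirroring the rules described above.
std::string timerMessage(int secondsElapsed, int limitSeconds,
                         int visited, int totalLocations) {
    if (visited == totalLocations)
        return "Task complete!";
    if (secondsElapsed >= limitSeconds)
        return "Out of time - task over.";
    if (secondsElapsed >= limitSeconds / 2 && visited < totalLocations / 2)
        return "Hint: check the map for locations you haven't visited yet.";
    return "";  // no timer event fires
}
```

The point is simply that the timer, the variables, and the rules are evaluated together, so the same timer can give different messages depending on the student's progress.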

    Check out the Timer entry on the wiki for more information.

    Note the tutorial is set in Oaker Wood, somewhere on the English/Welsh borders. Rather than using a third party height map to create the landscape we used Paint to "colour in" the contour lines of a map in deepening shades of grey, and then imported this as a height map into Unity3D. We then used Unity's terrain smoothing tool to remove the "stepped" heights in the areas we were going to use, added some Unity trees and grass, and placed a few boxes to show the village centres. As ever, more time could be spent, but it shows how quickly you can create a landscape using very basic tools - and at no cost (although watch the OS "derivative works" licence!).
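The "stepped" heights come from each grey contour band mapping to a single height value. Smoothing averages each terrain cell with its neighbours; the effect can be sketched as a simple box blur (a rough approximation of what a terrain smoothing tool does, not Unity's actual code):

```cpp
#include <cassert>
#include <vector>

// One 3x3 box-blur pass over a height grid: each cell becomes the average of
// itself and its in-bounds neighbours, softening contour "steps".
std::vector<std::vector<float>> smoothHeights(
        const std::vector<std::vector<float>>& h) {
    int rows = (int)h.size(), cols = (int)h[0].size();
    std::vector<std::vector<float>> out(rows, std::vector<float>(cols, 0.0f));
    for (int r = 0; r < rows; ++r)
        for (int c = 0; c < cols; ++c) {
            float sum = 0.0f;
            int n = 0;
            for (int dr = -1; dr <= 1; ++dr)
                for (int dc = -1; dc <= 1; ++dc) {
                    int rr = r + dr, cc = c + dc;
                    if (rr >= 0 && rr < rows && cc >= 0 && cc < cols) {
                        sum += h[rr][cc];
                        ++n;
                    }
                }
            out[r][c] = sum / n;
        }
    return out;
}
```

Running a pass like this a few times over the painted-contour height map is essentially what turns the terraced bands into rolling terrain.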

    18 August 2016

    An Introduction to Datascape slidedeck

    This slide-deck provides a useful introduction to 3D data visualisation and Datascape. It covers the main controls for Datascape and provides examples of some of the main visualisation types that can be created with it.

    Datascape Introduction from Daden Limited

    And don't forget that you can download a FREE 30 Day Trial copy of Datascape, so as to give it a try yourself.

    12 August 2016

    Datascape 2.0 - The Basic Workflow

    In this post we'll show you the basic workflow in Datascape so that you can get a good idea of how it works.

    1. Choose a Workspace

    The workspace defines the backdrop and axes. As well as standard XYZ axes you can also have a world map (or in fact a panel with any texture on it), or a dynamic Open Street Map panel which you can pan and zoom to anywhere in the world. There is also a 3D globe, and again you can change the texture on that (Mars anyone?).

    2. Import Data

    You can import directly from Excel or CSV files. You can also use the SQL readers to hook directly into a SQL database. Hadoop, JSON etc coming soon. You can preview your data, optionally set field types, and Datascape warns you of any errors or gaps in the data as you import. All we ask is that there is one line of data for each point you want to plot on screen.

    3. Choose a Mapping Template

    Datascape comes with a set of mapping templates for the most common chart styles - including geospatial (height=altitude, or other value), geotime (height = time), spherical and cylindrical (to save you the maths!). Or you can choose Manual Mapping and roll-your-own.
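As an aside, the maths the spherical template saves you is just the spherical-to-Cartesian conversion. A minimal sketch (the axis conventions here - Y up, longitude zero on +Z - are assumptions for illustration, not necessarily Datascape's own):

```cpp
#include <cassert>
#include <cmath>

struct Point3 { double x, y, z; };

// Map a (latitude, longitude, radius) record to plot-space XYZ.
Point3 sphericalToXyz(double latDeg, double lonDeg, double radius) {
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    double lat = latDeg * kDegToRad;
    double lon = lonDeg * kDegToRad;
    return { radius * std::cos(lat) * std::sin(lon),   // x
             radius * std::sin(lat),                   // y (up)
             radius * std::cos(lat) * std::cos(lon) }; // z
}
```

The cylindrical case is the same idea with one fewer trig term, which is exactly the sort of per-point arithmetic the templates spare you from writing in the mapping screen.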

    4. Customise the Mapping

    This panel is the absolute heart of Datascape. Here you specify how each attribute of the plotted point is defined, in terms of either fixed values (eg "red") or fields from the data. You can use standard SQL functions, and combine fields in quite complicated ways to define each attribute - but often you'll just go click-click-click to assign individual fields to individual attributes. You can also use Translation Tables to convert numbers or text to colours and shapes.

    5. Visualise, Explore and Iterate

    Then click on Plot! You can then fly around the space, viewing the data from any angle, inside and out. The best bet is to get something basic plotted first (even in 2D, set Y=0), and then add Y, colour, shape etc to enrich the visualisation and get it to start telling you what is going on in the data. You can change the axes scales if you need to spread the data out more, and use a search box or dynamic "scrubbing" filter to pull out subsets of the data. You can hover over a point to read its hover text, click on it to see all the data fields for that record, and if you've defined one in the mapping even click through to a URL in your web browser to examine a related web page (eg an entry in a star catalogue).
    And that's the basic process. Datascape comes with several demo workspaces that you can play around with to see how the mappings work before you import the first of your own data.
    Happy plotting!

    11 August 2016

    Aug 16 Newsletter: Datascape 2 released and Fieldscapes update

    In this latest issue of the Daden Newsletter we are proud to announce:

    We also have:

    • The release on Apple App Store of our iCentrumVR app, an example of how VR can be used to explore locations for real estate, lettings, education, training, or heritage goals
    • A look at the Fieldscapes Editor - coding free creation of 3D immersive learning experiences for PC/MAC, tablets or VR. And news of the forthcoming Fieldscapes Beta.

    Read it now!

    VR in Education: Moving the Classroom to Mars (and other field trips)!

    Bookings are now open for David's talk at the on-line VR/AR Summit on 13th Sept. David will talk in general about the potential benefits, opportunities and challenges of VR in education - and also provide a high level overview of Fieldscapes and how its easy authoring system will open up VR educational content generation to a wider audience.

    Follow the link below to register.

    10 August 2016

    Datascape V2.0 is Released!

    After almost 2 years of work, some of it funded through an InnovateUK project with partner IGI, we have today released Datascape 2.0 into the wild.

    Building on our experience with Datascape 1 (which was written in Unity) Datascape 2 is a completely re-written application which provides the same overall experience and capability as Datascape 1 but with far higher usability and performance. Datascape 2 (known as Datascape2XL while in development) is written in C# and DirectX, and is currently only available on Windows.
    By using native Windows tools we’ve been able to create an application which looks and behaves far more like the sort of business apps that people are used to. You can import data directly from Excel spreadsheets and CSV files (or link it to a database), and use drag-and-drop to assign fields to parameters in the mapping screen. A simple tree view gives you immediate access to all the components of the visualisation, including the axes and graphical background.

    Datascape2 also has power. Depending on your PC configuration you can visualise and interact with over 15 million data points (we’re managing over 2 million just on a laptop!). We already have a Hadoop interface under test, and V1.1 (due Sep 16) will also add Virtual Reality support for Oculus Rift.

    Datascape provides a range of 3D objects to use as markers, including common objects such as man, woman, car, ship, plane etc. You can also import your own 2D images to use on the panel axes, and link points out to URLs which can be viewed in your browser.

    You can download the Trial version of Datascape 2.0 for FREE today. After 30 days the trial version reverts to 10,000 point limited Community Version, but of course we’d like you to upgrade to one of our Solo or Commercial licences – which start from just UKP49+VAT!

    We'll be publishing weekly visualisations onto the blog, and tutorial videos, so that you can get an idea of the wide variety of visualisations which Datascape can create.

    We already have a good set of video tutorials on YouTube and Vimeo, and a well developed Datascape Wiki, but if you need any more help just post on our Forums or email us at datascape at daden.co.uk.

    8 August 2016

    VR in Learning and Development

    We've just written a 2 part blog post for the world of Learning Conference blog trying to demystify and de-hype VR in Learning and Development. You can read the blog post here.

    Also of interest is this recent survey by Kallidus on the attitudes of L&D professionals to VR:

    91% of L&D professionals plan to implement virtual reality (VR) in their learning organisation, with over a third planning to use VR over the next three years.

    95% of 200 L&D professionals said they see VR as being useful for enhancing training, with 81% thinking it has 'real potential' for learning.

    A further 11% dubbed VR the 'next big thing', and just 8% feel VR is 'just hype'.

    Over half (53%) of the respondents have prioritised VR as the next new mode of learning they most want to implement - ahead of virtual classrooms, mobile learning, games-based learning and social learning.

    The biggest benefits of VR were cited to be: aiding in creating a more engaging learning experience (89%), making high-risk or impractical training achievable (84%), and helping the organisation to be more innovative (81%).

    Only 2% of respondents said their organisation is already using VR for training.

    73% perceive cost, lack of knowledge on how to use VR (61%), and lack of cultural appetite (38%) as potential hindrances to adopting VR.

    Tim Drewitt, Product Innovator at Kallidus says: "Although only a third of the L&D professionals we surveyed have had any hands-on experience of VR, the vast majority are very excited about its potential to add something special to the learning mix. Time will tell, but it's possible that this exciting immersive technology could be adopted faster than previous new learning approaches and may prove to be as game-changing in learning as the advent of the PC."

    4 August 2016

    Datascape Tutorial - Enhancing Network Graphs

    Starting from where Tutorial 5 left off, this tutorial shows you how to add shapes, point colour and link colour to improve the visual look of the graph and show more aspects of the data it represents. The video then shows you how to add a web link to each point, so that it can open a URL linked to that point - here we link surnames to a Facebook people search. Finally the video shows you how to create a chord based circular mapping for the data to get an alternative visualisation.
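A chord-style circular mapping is, at its heart, just even angular spacing of the N nodes around a circle, with the links then drawn across the interior. A minimal sketch of the node placement (radius and plot plane are arbitrary choices here, not Datascape's internals):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct NodePos { double x, z; };

// Place nodeCount nodes evenly around a circle of the given radius.
std::vector<NodePos> circularLayout(int nodeCount, double radius) {
    const double kPi = 3.14159265358979323846;
    std::vector<NodePos> pos(nodeCount);
    for (int i = 0; i < nodeCount; ++i) {
        double angle = 2.0 * kPi * i / nodeCount;  // even angular spacing
        pos[i] = { radius * std::cos(angle), radius * std::sin(angle) };
    }
    return pos;
}
```

Ordering the nodes sensibly before placing them (by cluster, or by surname in this tutorial's data) is what makes the chord view readable.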

    Read more about how to do this on the Datascape Wiki.

    1 August 2016

    DadenU Day: Oculus Rift CV1

    By: Nash Mbaya

    Upgrading Oculus Rift Unity SDK to Consumer Version 1 SDK

    For our third Daden U Day I looked into updating some of our products which had Oculus support to the latest version, CV1.

    The Oculus Rift CV1, or Consumer Version 1, is the latest Oculus Rift model, made available to the public on March 28th 2016. I began the day by attempting to install the setup software downloaded from the Oculus website. It was all looking promising until the software prompted me to update my display drivers. I installed the latest drivers for the graphics card in my development machine, an AMD Radeon 6800 series. After installing the drivers I opened the Oculus software installer once again, and once again it asked me to update my display drivers. After some research (googling) I discovered that if your machine does not meet the minimum requirements it displays that message, and it will not allow you to progress with the installation until it is satisfied that the machine has a decent graphics card. I knew my graphics card was below par, but I was hoping I could install the drivers and use the Unity SDK for development - I was not interested in actually using the Rift to play.

    Fortunately for me we are a technology company, so I found another machine to develop on. After downloading the latest version of the Oculus SDK, it was an easy process to upgrade the SDK in my Unity project (Daden Campus).

    I would like to give a big thumbs up to the guys and girls at Oculus who put a lot of thought and effort into making the process simple. They didn't change much of how you add Oculus Rift support to your project, but changed the underlying code. You still use the prefabs, and though their structure has changed slightly, most of the classes still exist with the same names. By the end of the day I was able to run the Unity application and play through it. The only problems I was left with were related to the Xbox controller, which I'm still working on ...

    P.S. We'll be releasing Daden Campus onto the Rift Store once we've finished working on the controller and have got all the interactions working. You can download the non-Rift version of the Campus and have a play to get an idea of what the space is all about.

    25 July 2016

    iCentrum VR app now on Apple App Store

    We have now launched our iCentrum photosphere VR app onto the Apple App Store. Just search for "icentrumvr" to download it to your iPhone. The app has both VR and non-VR modes. In VR mode you can use it with a Google Cardboard headset for the full immersive experience. In non-VR mode you get a single-screen display, and can just move the phone around you in order to see the whole of the spherical image.

    As with the Android version there are 6 images, linked by the red hotspots, and an information zone with more details of the iCentrum building and supporting projects.

    21 July 2016

    Easy Prop Editing in Fieldscapes

    This new feature in Fieldscapes is probably better covered by a blog post rather than a video. With Fieldscapes one of the things that we are trying to do is create a 3D immersive learning environment, player app and authoring app that build on all the good parts of systems like Second Life and Unity, whilst avoiding some of their downsides. One key example with this is prop editing.

    In Second Life scripts are associated with objects/props. The only way that you could edit the script was to find and go to the prop, and then click on it to view and edit the scripts. Lots of time wasted! With Unity you always have a list of props available, and can edit any prop directly from the UI, but you've lost the sense of spatial context as you only have a god-like camera.

    With Fieldscapes we're trying to implement the best of both worlds. As previously shown, you place props by just rezzing them in-world, in front of your avatar, just like Second Life. You can then click on the prop to edit its behaviours (just as in SL, and sort of in Unity). But what we've also now implemented is a Props list, accessible from the left hand Control menu (which might get a better name before release). This builds up a list of the props used in the exercise, which you can open at any time. Against each prop are also a couple of actions:

    • Edit - This immediately opens the behaviour window for that prop, so you can edit any prop from anywhere
    • Teleport (TP) - This immediately jumps your avatar to just in front of the selected prop - so you can remind yourself of its context, and optionally tweak its location or click on it to edit it.

    The screenshot below shows the Behaviours window opened for an off-screen prop.

    Now that we have the props list we can also begin to change our approach to exercise creation. The most effective way now is to start by just laying out your props, not setting any behaviours, but naming them all. Then you can sit your avatar up on some high bluff and just click through each prop in turn to edit it - with all the other prop names which it may want to reference already in place.

    One by-product of this is that we no longer need a separate location bookmarking system to help with editing. You can just use the props list to jump to anywhere in the exercise, and if you want you could even add some invisible, inactive props at locations that you want to get an edit view from (perhaps an aerial view) but which you don't want to be active elements of the exercise.

    We may find other actions to add to the props list in due course (deletion is an obvious one), but even as it stands it significantly speeds the time to develop an exercise.

    19 July 2016

    New Videos: Multiple Choice for Fieldscapes, and Network Data for Datascape

    Several new videos up on our YouTube and Vimeo channels in the last few weeks. The latest ones are:

    Multiple Choice in Fieldscapes

    How to add a multiple choice question in Fieldscapes. Very much work-in-progress, and the final version will be a lot slicker, but the system node approach gives you an incredibly flexible system which can be used for tasks like allowing a student to choose what activity they want to see/do next, as well as for giving answers to formative or summative tests. And since this is a 3D world, you can always use objects, or even images, instead of text questions by embedding the question and answers into the environment, rather than using the simple, but familiar, text-based option.

    Network Data in Datascape

    Network graphs are probably one of the hardest things to deal with in Datascape. This video shows how to import and plot network graph data using two data files, one representing the nodes, and one the links/edges that connect the nodes.
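As a sketch of what such a pair of files might look like (the column names here are purely illustrative, not Datascape's actual schema), the idea is that the links file references the ids defined in the nodes file:

```
nodes.csv - one row per node
id,label
1,ServerA
2,ServerB
3,Workstation

links.csv - one row per edge, referencing the node ids
source,target
1,2
2,3
```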

    18 July 2016

    DadenU Day: Using Resharper

    Joe Robbins, Junior Developer writes:

    For our recent Daden U day, I decided I would look into Resharper, a suite of tools for improving productivity in Visual Studio. The wealth of features on offer is a bit too great for me to cover exhaustively in this post, but I will be talking about some of the ones that stood out for me, along with my general opinion on the whole package.

    I began by watching the video above which gave an overview of the tasks that ReSharper can assist a programmer with. I was impressed immediately, and began to question why some of these features didn’t come with Visual Studio as standard.

    Setting up ReSharper couldn’t be more straightforward: just sign up for a free trial, download the installer, fire up Visual Studio and you’re good to go.

    I then realised that it would be difficult to try out ReSharper in full, as doing so required a pre-existing project. However, I did cobble together some minimalistic C# classes and began playing around with them.

    So here’s my list of the top four tasks that ReSharper makes that much easier, although note that this was only based on a few hours of experimentation and I may well discover even better ones over the remainder of my free trial.

    Generating constructors

    If you’ve spent any significant time doing object oriented programming then you’ll be able to appreciate my first pick. Whenever I create a new class, I tend to start by defining the various properties that need to be included, and then I’ll move onto the constructor. The issue is that most constructors are simply boring, just initialising the properties from the passed arguments.

    But thanks to ReSharper, this boredom is a thing of the past.

    Simply hit Alt+Enter to bring up the ReSharper menu and select "Generate Code".

    Choose “Constructor”.

    Tick off the properties that we wish to initialise in our constructor and hit “Finish”.

    And there we go, our whole constructor created without a single keystroke.
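To make the saving concrete, here's the kind of boilerplate we're talking about - written in Java rather than C#, with a made-up class and made-up field names - that ReSharper's "Generate Code" > "Constructor" writes for you from the field list:

```java
// A typical data class whose constructor just copies arguments to fields.
// This is the boilerplate a tool can generate from the field list alone.
class Customer {
    private final String name;
    private final String email;
    private final int age;

    // The "boring" constructor: one assignment per selected field.
    Customer(String name, String email, int age) {
        this.name = name;
        this.email = email;
        this.age = age;
    }

    String getName() { return name; }
    String getEmail() { return email; }
    int getAge() { return age; }
}
```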

    Surrounding code

    Another small touch that could add up to plenty of time saved is the ability to surround code with things such as while loops, or try/catch blocks. ReSharper’s got you covered on that front.

    Just highlight the code you wish to surround and press Alt+Enter to bring up the ReSharper menu. Go to "Surround with…" and choose the construct that you want to wrap around the highlighted block.

    And ReSharper does the rest, leaving me free to fill in my while condition without having to write any of those pesky braces manually.
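As an illustration (again in Java, with a hypothetical method of our own), the before state is the bare statement, and the after state is that same statement wrapped in the chosen construct, with only the condition or handler left to fill in by hand:

```java
class SurroundExample {
    // Before "Surround with...": the parse call sat on its own and could throw.
    // After wrapping it in try/catch, only the catch body needs writing by hand.
    static int parseOrDefault(String input, int fallback) {
        try {
            return Integer.parseInt(input);   // the originally highlighted line
        } catch (NumberFormatException e) {
            return fallback;                  // filled in manually afterwards
        }
    }
}
```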

    Extracting Class from Parameters

    Many of us have been in the situation where the parameter list for a method we’re writing just keeps growing and growing. Our programmer conscience nags at us to turn the argument list into its own class, but that’s far too much work.

    Let’s say we have a method with a few parameters, like the one here. We can press Ctrl+Shift+R to bring up the ReSharper refactoring menu, and select "Extract Class From Parameters".

    We give a name for our new parameter class and choose which parameters we want to include within it.

    ReSharper does the rest, producing our new class, along with constructors, getters and setters. It also alters the parameters for our original method to take account of this change.
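A before/after sketch of the refactoring (Java again, and all the names here are invented for illustration): a method like `sendInvoice(String name, String address, String email, double amount)` has its first three parameters grouped into a new class, and its signature rewritten to take that class instead:

```java
// The new parameter class the refactoring produces, with constructor and getters.
class InvoiceRecipient {
    private final String name;
    private final String address;
    private final String email;

    InvoiceRecipient(String name, String address, String email) {
        this.name = name;
        this.address = address;
        this.email = email;
    }

    String getName() { return name; }
    String getAddress() { return address; }
    String getEmail() { return email; }
}

class Invoicing {
    // The original method, rewritten to take the extracted class plus
    // whichever parameters were left out of the grouping.
    static String sendInvoice(InvoiceRecipient recipient, double amount) {
        return "Invoice for " + amount + " sent to " + recipient.getEmail();
    }
}
```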

    Move Type to Another File

    Sometimes when we code, we will put a small class in the same file as a pre-existing one, assuming it will stay small and won’t need to be used elsewhere in the project. Often, however, this is not the case, and when that happens it is good organisational practice to move the second class into its own file.

    Simply place the caret next to the name of the type we wish to move and press Ctrl+Shift+R to bring up the refactoring menu, then select “Move To Another File”.

    Give a name to the new file we’re creating.

    Then when we look at our project, we can see that we have a new file which contains the definition of our class.


    On the whole, I was very impressed with the power and ease-of-use provided by ReSharper and still feel that I'm only scratching the surface. And that’s before mentioning that the premium version of the plugin comes with tools for performance and memory profiling, unit testing assistance and more.

    So do I feel that ReSharper is worth the money (£239 for a year as a new business customer, more info here: https://www.jetbrains.com/resharper/buy/#edition=commercial)? At the moment it is too early to tell; come back to me after I’ve finished my 30-day free trial. What I can tell you is that even during my brief time spent with these tools, I could see how they would fit into my workflow, saving me plenty of time over the course of a project. I wouldn’t be surprised if all that saved time began to cover the cost of getting a licence for this very promising toolbox.

    A PDF version of this article with screenshots is available at: http://www.daden.co.uk/docs/ReSharper-DadenUDayBlog.pdf

    11 July 2016

    3D Objects in Datascape

    In the next beta release of Datascape we're adding a wider set of 3D objects, adding a variety of "everyday items" to the base list of platonic solids and other marker variations. The initial list of extra objects looks like this:
    • Bicycle
    • Car
    • Computer
    • Dhow
    • Factory
    • Firewall
    • FishingBoat
    • House
    • Human
    • Liner
    • Lorry
    • Man
    • MobilePhone
    • Monkey
    • Motorbike
    • Office
    • People
    • Plane
    • Raft
    • Server
    • Soldier
    • Speedboat
    • SphereHD
    • Submarine
    • Tank
    • Telephone
    • TreeConical
    • TreeSpherical
    • TreeTriangular
    • Warship
    • Woman
    We'll add more based on user feedback, and in due course allow you to add your own by converting from common 3D modelling formats. Bear in mind though that the more complex an object is then the fewer datapoints your PC may let you plot.

    A full List of Supported Shapes is available on the Wiki.

    One "trick" with Datascape is to have a dataset with a single point at 0,0, set to a base shape (e.g. Man, rotated so it's lying down). Then bring your data in as another dataset, but set the X,Z positions so that the points line up with the base model - which would let you show, for instance, the frequency of trauma or disease locations on a body, and how they vary over time.

    29 June 2016

    Video Tutorials: Variables in Fieldscapes

    We've posted up two new videos to show you how to use variables in Fieldscapes. More to follow as this is an important area to create more complex exercises.

    The first video gives a basic introduction to variables in Fieldscapes and shows how they can be used as both an interaction counter and to drive exercise events, such as bringing an exercise to an end.

    The second video shows you how to use variables to control access to the objects in the scene. For instance a student could have to answer a set of questions correctly, or complete some tasks, before the system reveals a key piece of information or a reward.

    28 June 2016

    Video Tutorial: Dynamic Filtering in Datascape2XL

    This video shows how a dataset can be dynamically filtered (what we call scrubbing) by using a set of sliders to set up a range of values for a field to include, and then also being able to slide that window through the data.
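Independent of Datascape's UI, the underlying idea of scrubbing is just a range filter whose bounds move together - a minimal sketch in Java (the class and method names are our own, and nothing to do with Datascape's internals):

```java
import java.util.List;
import java.util.stream.Collectors;

// A minimal sketch of "scrubbing": keep only the values that fall inside
// a [min, max] window, then "slide" the window by calling the filter again
// with both bounds shifted by the same amount.
class Scrubber {
    static List<Double> filterWindow(List<Double> values, double min, double max) {
        return values.stream()
                .filter(v -> v >= min && v <= max)
                .collect(Collectors.toList());
    }
}
```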

    22 June 2016

    Fieldscapes Video - Introducing the Editor

    We've put together a short video showing how the Fieldscapes Editor can be used to create a very simple "Hello World" type exercise. Even though it's short it introduces you to most of the basic concepts of how the editor works:

    - Everything is tied to a location

    - You choose props from inventory lists, and place them as you want

    - You define what student interactions a prop responds to

    - You define a list of actions that a prop does in response to the student action

    That's it really. We'll do more videos to introduce the more advanced features such as variables, rules and multiple-choice, which together enable you to create quite complex exercises without resorting to a line of code.

    You can read more about the editor on the Fieldscapes Wiki.

    21 June 2016

    Datascape Snippets

    We've started posting 30 second Datascape videos up to Twitter. So as to make them more generally available we're now also posting them up to our Datascape channel on YouTube. There are 4 snippets so far:

    • Closest Stars
    • Refugee Data
    • Guide Lines and rotating camera
    • NSA CDX Cyber-security data

    More to follow at hopefully a rate of about one a week. Make sure you subscribe!

    17 June 2016

    First Fieldscape Video - A Field Trip to the Moon!

    We've posted up our first video of the Explorer app from the Fieldscapes project - showing a field trip to the moon!

    Fieldscapes: Apollo Educate (work in progress) from DadenMedia on Vimeo.

    It's all very much work in progress but hopefully gives you some idea of the very clean and simple user interface that students can have in Fieldscapes, and how some of the basic interactions will work. Over the coming months we'll post up several more Moon based field-trips to show how we can use the editor to create very different feeling experiences from the same location and props. We'll also post up more Earth-bound field trips of course!

    Make sure you follow the Fieldscapes Video Channel to catch further videos.

    The video is also available on YouTube.

    9 June 2016

    Three Virtual Reality Apps in a Month!

    A trio of virtual reality applications have been launched in one month on the Google Play Store by Birmingham based virtual reality and 3D specialist Daden Limited.

    Two of the applications support Daden’s Virtual Field Trips project for Innovate UK, the UK’s Innovation Agency; designed to prepare students going on field trips by allowing them to get to know the site beforehand and supporting classwork and revision after the trip. Both apps are set in Carding Mill Valley on the Long Mynd in Shropshire, a popular fieldtrip site.

    The third application, iCentrumVR, allows users to take a guided tour of Innovation Birmingham’s latest campus building, by moving their phone to look around a series of 360 degree photospheres, as well as accessing information about the building and its facilities, including its impressive event space, start-up support programme and office suites for digital start-ups.

    All three of the mobile applications are free to download, and can be used with or without a Google cardboard Virtual Reality headset.

    Based at Innovation Birmingham Campus, Daden has been providing 3D immersive applications for over a decade and recently expanded its team by 50% in response to increased demand.

    David Burden, Managing Director at Daden, said: “The fact that we can release three virtual reality applications in a month shows how affordable we can now make this technology for organisations looking to promote their own venue or business, or to educate and train their own staff or students. It’s also testament to our staff that they can get to grips with these innovative technologies so quickly."
    “These apps show that VR need not be confined to big brands and Hollywood style budgets, it is a marketing, sales and education tool which is very much here and now.”

    Sean, Nash and Joe - the developers behind the three apps

    8 June 2016

    Cyberdata in Datascape2


    We're working our way through some of our favourite Datascape1 visualisations as a way of testing out Datascape 2. This one shows some cyber data from an NSA CDX exercise. Each line is a data packet going between two computers. The home network is the bottom "layer", and foreign networks are stacked above. Each node is a computer terminating a packet at a particular time, and is shaped by source/destination, and coloured by port. Lines are sized to data packet size and coloured by protocol.

    There are lots of very obvious activity patterns which betray certain types of network behaviour, and anything moving from the home network to another network (any line with a vertical component) is immediately of interest.


    As the video clip on Twitter shows if you hover over any node or edge you can see its details.

    3 June 2016

    Fieldscapes Editor - The First Exercises


    We're making great progress with the Fieldscapes Editor now. With the Player/Explorer effectively done the focus is now very much on the Editor and the Infrastructure.

    The Fieldscapes Wiki already gives an idea of how the Fieldscapes authoring process will work - and it's very much authoring not programming as it's just form filling with no syntax to worry about! In summary to create a Fieldscapes exercise you:

    • Choose a location (we'll provide generic ones as well as models of real world places, and if you've the 3D Unity skills you can contribute your own)
    • Choose a prop and then drag it into a location on the landscape (we provide a set of generic markers - info point, image point, quiz etc, as well as topic specific objects, e.g. geography field trip, biology field trip etc)
    • Click on the prop to set its behaviours - such as:
      • Whether it's visible at start
      • Whether it displays a message/image window, and the content thereof
      • Whether it changes the state of another prop (eg making the next point on a route appear)
      • Whether it changes the value of a variable (eg count of places visited)
      • Whether it stays visible and/or stays active
    • Move on to the next prop.
    • Save and test, and refine until ready

    We'll post more updates as the editor develops, and hopefully put up a video of the editing process in the next couple of weeks.

    1 June 2016

    Carding Mill Valley VR Photosphere App Launched

    Today we've launched our 3D Photosphere VR app for Carding Mill Valley. You can download it from the Android Play Store (just search "cardingmill") to your Android phone, and it works with or without a Google VR Headset.

    This app is intended as a counter-part to our FieldscapesVR app so that people can compare the two approaches to virtual reality field trips. With this app there are three photospheres which give you a photo-realistic view of the valley at 3 key points. However you cannot visit any other part of the valley, or fly around.

    By contrast FieldscapesVR provides a 3D model of the valley which is less realistic looking than the photospheres, but which allows you to move anywhere around (and above) the detailed 3D model. In due course we will also be able to provide far more interactivity in the 3D model version than in the Photosphere.

    The valley is a popular site for school field trips and the application is primarily intended to allow schools to:

    • virtually visit the site whilst planning their field trip
    • to use when back at school as part of their further studies and during revision

    The application is very much intended as a "beta" of the final product, so we'd be delighted to hear your views.

    The application can be used without a VR Headset in a normal "single-screen" mode but we recommend buying an Amazon Cardboard to get the full benefit (and it makes it a lot easier to use out on the ground in bright sunlight!). If you do need a headset, ensure that you get one which has a button or trigger, as that is (currently) required to use the app in VR mode.

    Do try both versions and let us know how they compare.

    25 May 2016

    Datascape2XL Tutorial #2: Visualising GPS Data

    Datascape Tutorial 2 - Visualising GPS Data from DadenMedia on Vimeo.

    This second video tutorial shows you how to import a file with GPS data and display it on Open Street Maps in Datascape2XL. The same process will work for any data which has latitude and longitude fields. The video also shows you how to use the vertical axis as either altitude or time.
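For reference, a minimal GPS file of the kind described might look like the sketch below (the exact field names here are illustrative - the key point is simply that the file has latitude and longitude columns, plus an optional value for the vertical axis such as altitude or a timestamp):

```
timestamp,latitude,longitude,altitude
2016-05-25T09:00:00,52.4862,-1.8904,140
2016-05-25T09:01:00,52.4870,-1.8911,142
2016-05-25T09:02:00,52.4879,-1.8920,145
```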

    Also available on YouTube at https://www.youtube.com/watch?v=q3c0W7rKyw4