- Academia
- Education
- Healthcare
- Public Sector
- Training
- Medical
Graphic is by thedigitalmarketingbureau.com
We've released v2.0.2 of Datascape2. This is primarily a maintenance release, fixing bugs and making some improvements to usability. Key fixes are:
Full release notes at: https://dadenwiki.atlassian.net/wiki/display/DAT/R...
Datascape can be downloaded without registration from: http://www.daden.co.uk/conc/datascape/datascape-do...
We're running a short survey to help inform the final stages of the development of Fieldscapes, please find 5 minutes to fill the survey out here:
https://www.surveymonkey.co.uk/r/TQZG75R
We have also just launched our first proper "Introduction to Fieldscapes" video. You can watch it here or on YouTube.
We found some nice data on the RCSB Protein Data Bank which gives the x/y/z angstrom location of every atom in a huge range of proteins. The data was in a fixed-width format and didn't take long to convert to CSV, adding in some of the metadata contained at the top of the file. The visualisation shows both the whole dataset - about 65,000 points covering 30 different co-located models - and then another mapping shows just a single model. Scrubbing is used to filter through the different models, and then on the single model to filter through the different chemical elements. Shape is used consistently to show element, and colour to show either strand or element.
This sort of protein visualisation was always something we thought that Datascape wouldn't be particularly brilliant at, and there are several dedicated apps to do it, but we were impressed at how good the results look.
We've updated our Immersive Visual Analytics White paper - first published in 2012. The updates reflect our experiences with visual analytics through Datascape 1 since then, market developments such as the rise of VR, and the new opportunities opened up by tools such as Datascape 2. Download the White Paper today!
By: Sean Vieira
As June was coming to a close there were big decisions to be made. The most important day in a long time was approaching. No, not the EU referendum. The obviously more important DadenU day, on the 1st of July. Once again the decision on what I would do came late but I had a few things that interested me.
One of those things was the Vulkan API, by the Khronos Group, which is a modern cross-platform graphics and compute API with a focus on parallelisation and efficiency. An evolution of the group's OpenGL API, it aims to be an open standard for devices ranging from PCs and smartphones to games consoles and embedded systems. It was first released in February 2016 and currently has very little software support, though notable game engines such as Unreal Engine 4 (Epic) and Source 2 (Valve) have jumped on the bandwagon. More information on Vulkan can be found here. So I took this opportunity to jump on the bandwagon myself and see what it was all about.
Vulkan’s cross platform support offered up the first choice of the day, but it wasn’t a particularly difficult decision. The graphics card in my work PC is not supported by the API, and is actually the most recently released AMD GPU architecture that is not supported by Vulkan. At the time of writing only AMD cards that are based on GCN (Graphics Core Next) 1.0 or later are supported, so Android was my platform of choice.
Despite having very limited Android development experience (outside of Unity) I took it upon myself to try Vulkan out. Having already prepared my phone for use with Android Studio, I went in search of some examples to test, and running these tests would take up the majority of my day. There were two prominent resources that I found for Vulkan on Android; the first was an official Android NDK tutorial, and the other was the official SDK by Qualcomm, which is for their Adreno GPU.
My attempts to get the former to work were futile. A combination of my development inexperience and, as I later discovered, an incompatible version of Android made this a frustrating exercise. Most of the issues came from trying to get the Android target versions right, and I spent a lot of time trying to sort it out. In the end I had gotten nowhere and decided it was best to move on to the latter option.
This proved to be much easier and more productive. I managed to import the basic 'triangle' project from the SDK into Android Studio with little to no hassle, and compiled it without error. Now that it was compiled I ran it on my device, only to find that the app popped up on screen and then disappeared in a flash. After a little bit of digging around I found that it was most likely an issue with the Android version requirement of Vulkan - it required API level 24 (the N-Preview) whereas my device was still only on API level 22 (i.e. Lollipop).
After failing to update the company phone to the correct version, my final throw of the dice was to forego Android and use the new laptop, which was bought specifically to demo our products. Being able to maintain a high frame rate when running our applications was a priority, and for this reason the laptop contains a dedicated NVIDIA mobile GPU - which is fortunately supported by the Vulkan API. To test this, I downloaded the 'Chopper' demo from NVIDIA's website and ran it. Thankfully the helicopters flew forwards, albeit stuttering fairly frequently, and I could begin.
Unfortunately the laptop wasn’t set up for development, so I had to install the necessary applications before getting started. Once Visual Studio, CMake, and the LunarXChange SDK were installed, I got to work and attempted to run an example project. I used this handy tutorial video by Niko Kauppi to help me along the way.
The SDK first needed to be prepared for development by building the necessary libraries. These had to be built in a specific order, to ensure that each library project has the correct dependencies. Each project required a debug and release build as well. The glslang library needed to be built first, followed by the spirv-tools, and finally the samples that I would run.
Each library required that a Visual Studio solution be made so that it could be compiled, so I used CMake on the command line to build the glslang library first. Once the solution had been created I opened it and compiled the 'ALL_BUILD' project within the solution twice - once in debug and once in release. I did the same for the spirv-tools and samples libraries, in that order.
Now I could run some samples. The very first sample I decided to run was the 'instance' sample, which is the most basic example project, so I set it as the startup project of the solution. It creates an instance of Vulkan… and destroys it. To ensure that it was actually running, I added a line of code at the end of sample_main() to wait for a keypress. Having done this, I re-built the project and the console window that previously flashed now sat waiting for me to press a button, with no errors.
Looking at the source file for the project, we see that the sample_main() function contains four major steps. The API needs some information to create an instance, and this is stored in a VkInstanceCreateInfo struct. However, this structure requires knowledge of the general application info, so we first fill out a VkApplicationInfo struct with information such as the application name, engine name, and API version.
```cpp
// initialize the VkApplicationInfo structure
VkApplicationInfo app_info = {};
app_info.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
app_info.pNext = NULL;
app_info.pApplicationName = APP_SHORT_NAME;
app_info.applicationVersion = 1;
app_info.pEngineName = APP_SHORT_NAME;
app_info.engineVersion = 1;
app_info.apiVersion = VK_API_VERSION_1_0;

// initialize the VkInstanceCreateInfo structure
VkInstanceCreateInfo inst_info = {};
inst_info.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
inst_info.pNext = NULL;
inst_info.flags = 0;
inst_info.pApplicationInfo = &app_info;
inst_info.enabledExtensionCount = 0;
inst_info.ppEnabledExtensionNames = NULL;
inst_info.enabledLayerCount = 0;
inst_info.ppEnabledLayerNames = NULL;

VkInstance inst;
VkResult res;

res = vkCreateInstance(&inst_info, NULL, &inst);
if (res == VK_ERROR_INCOMPATIBLE_DRIVER) {
    std::cout << "cannot find a compatible Vulkan ICD\n";
    exit(-1);
} else if (res) {
    std::cout << "unknown error\n";
    exit(-1);
}

vkDestroyInstance(inst, NULL);
```
Once we have the VkInstanceCreateInfo, we create the instance by calling vkCreateInstance, passing it the info struct and a pointer to an empty VkInstance handle. Unless there are any driver or compatibility issues, this should create the instance. The final step is to destroy the instance so that the application can close safely, which is achieved by calling vkDestroyInstance and passing it the instance handle.
This was as far as I managed to get on the day. It was farther than I expected to get but not as far as I had hoped. I have learned a few things from this, though unfortunately not much to do with actually coding with the Vulkan API. It was fairly hard to find somewhere to start when it came to the Android version, though most of that can be put down to my inexperience, and I had to cross-reference a few tutorials to try and figure out exactly what I needed to do. Getting started with the Windows version was much easier, though that could be down to having more experience.
Learning Vulkan is something that I would certainly like to pursue in the future and having managed to set it up on the laptop provides me a foundation for future exploration. Perhaps next time I can get round to rendering a triangle!
This week's Fieldscapes tutorial shows you how to set up and use timers in Fieldscapes. Timers would typically be used:
You can use timers in conjunction with variables and rules, so that different messages can be given by the timer based on how far through a task the student is, or a task could finish either when all of a set of locations have been visited (controlled by variables) or when they run out of time (controlled by the timer).
Check out the Timer entry on the wiki for more information.
Note the tutorial is set in Oaker Wood, somewhere on the English/Welsh borders. Rather than using a third-party height map to create the landscape, we used Paint to "colour in" the contour lines of a map in deepening shades of grey, and then imported this as a height map into Unity3D. We then used Unity's terrain smoothing tool to remove the "stepped" heights in the areas we were going to use, added some Unity trees and grass, and also a few boxes to show the village centres. As ever, more time could be spent, but it shows how you can quickly create a landscape using very basic tools - and at no cost (although watch the OS "derivative works" licence!).
This slide-deck provides a useful introduction to 3D data visualisation and Datascape. It covers the main controls for Datascape and provides examples of some of the main visualisation types that can be created with it.
Datascape Introduction from Daden Limited
And don't forget that you can download a FREE 30 Day Trial copy of Datascape, so as to give it a try yourself.
In this latest issue of the Daden Newsletter we are proud to announce:
We also have:
We've just written a 2-part blog post for the World of Learning Conference blog trying to demystify and de-hype VR in Learning and Development. You can read the blog post here.
Also of interest is this recent survey by Kallidus on the attitudes of L&D professionals to VR:
91pc of L&D professionals plan to implement virtual reality (VR) in their learning organisation, with over a third planning to use VR over the next three years.
95pc of 200 L&D professionals said they see VR as being useful for enhancing training, with 81pc thinking it has 'real potential' for learning.
A further 11pc dubbed VR the 'next big thing', and just 8pc feel VR is 'just hype'.
Over half (53pc) of the respondents have prioritised VR as the next new mode of learning they most want to implement - ahead of virtual classrooms, mobile learning, games-based learning and social learning.
The biggest benefits of VR were cited to be: creating a more engaging learning experience (89pc), making high-risk or impractical training achievable (84pc) and helping the organisation to be more innovative (81pc).
Only 2pc of respondents said their organisation is already using VR for training.
Cost (73pc), lack of knowledge on how to use VR (61pc) and lack of cultural appetite (38pc) were perceived as potential hindrances to adopting VR.
Tim Drewitt, Product Innovator at Kallidus says: "Although only a third of the L&D professionals we surveyed have had any hands-on experience of VR, the vast majority are very excited about its potential to add something special to the learning mix. Time will tell, but it's possible that this exciting immersive technology could be adopted faster than previous new learning approaches and may prove to be as game-changing in learning as the advent of the PC."
Starting from where Tutorial 5 left off, this tutorial shows you how to add shapes, point colour and link colour to improve the visual look of the graph and show more aspects of the data it represents. The video then shows you how to add a web link to each point, so that it can open a URL linked to that point - here we link surnames to a Facebook people search. Finally the video shows you how to create a chord based circular mapping for the data to get an alternative visualisation.
Read more about how to do this on the Datascape Wiki.
By: Nash Mbaya
Upgrading Oculus Rift Unity SDK to Consumer Version 1 SDK
For our third Daden U Day I looked into updating some of our products with Oculus support to the latest version, the CV1.
The Oculus Rift CV1, or Consumer Version 1, is the latest Oculus Rift model, made available to the public on March 28th 2016. I began the day by attempting to install the setup software downloaded from the Oculus website. It was all looking promising until the software prompted me to update my display drivers. I installed the latest drivers for the graphics card in my development machine, which is an AMD Radeon 6800 series. After installing the drivers I opened the Oculus software installer again, and once again it asked me to update my display drivers. After some research (googling) I discovered that if your machine does not meet the minimum requirements it displays that message. It will also not allow you to progress with the installation process until it is satisfied that the machine has a decent graphics card. I knew my graphics card was below par, but I was hoping I could install the drivers and use the Unity SDK for development; I was not interested in actually using the Rift to play.
Fortunately for me, we are a technology company, so I found another machine to develop on. After downloading the latest version of the Oculus SDK, it was an easy process to upgrade the SDK in my Unity project (Daden Campus).
I would like to give a big thumbs up to the guys and girls at Oculus who put a lot of thought and effort into making the process simple. They didn't change much of how you add Oculus Rift support to your project, but changed the underlying code. You still use the prefabs, and though the structure has changed slightly, most of the classes still exist with the same names. By the end of the day I was able to run the Unity application and play through it. The only problems I was left with were related to the Xbox controller, which I'm still working on ...
P.S. We'll be releasing Daden Campus onto the Rift Store once we've finished working on the controller and have got all the interactions working. You can download the non-Rift version of the Campus and have a play to get an idea of what the space is all about.
We have now launched our iCentrum photosphere VR app onto the Apple App Store. Just search for "icentrumvr" to download it to your iPhone. The app has both VR and non-VR modes. In VR mode you can use it with a Google Cardboard headset for the full immersive experience. In non-VR mode you get a single-screen display, and can just move the phone around you in order to see the whole of the spherical image.
As with the Android version there are 6 images, linked by the red hotspots, and an information zone with more details of the iCentrum building and supporting projects.
This new feature in Fieldscapes is probably better covered by a blog post rather than a video. With Fieldscapes one of the things that we are trying to do is create a 3D immersive learning environment, player app and authoring app that build on all the good parts of systems like Second Life and Unity, whilst avoiding some of their downsides. One key example of this is prop editing.
In Second Life scripts are associated with objects/props. The only way that you could edit the script was to find and go to the prop, and then click on it to view and edit the scripts. Lots of time wasted! With Unity you always have a list of props available, and can edit any prop directly from the UI, but you've lost the sense of spatial context as you only have a god-like camera.
With Fieldscapes we're trying to implement the best of both worlds. As previously shown, you place props by just rezzing them in-world, in front of your avatar, just like Second Life. You can then click on the prop to edit its behaviours (just as in SL, and sort of in Unity). But what we've also now implemented is a Props list, accessible from the left-hand Control menu (which might get a better name before release). This builds up a list of props used in the exercise, and at any time you can open the window to see the props list. Against each prop there are also a couple of actions:
The screenshot below shows the Behaviours window opened for an off-screen prop.
Now that we have the props list we can also begin to change our approach to exercise creation. The most effective way now is to start by just laying out your props, not setting any behaviours, but naming them all. Then you can sit your avatar up on some high bluff and just click through each prop in turn to edit it - with all the other prop names it may want to reference already in place.
One by-product of this is that we no longer need a separate location bookmarking system to help with editing. You can just use the props list to jump to anywhere in the exercise, and if you want you could even add some invisible, inactive props at locations that you want an edit view from (perhaps an aerial view) but which you don't want to be active elements of the exercise.
We may find other actions to add to the props list in due course (deletion is an obvious one), but even as it stands it significantly reduces the time needed to develop an exercise.
Several new videos up on our YouTube and Vimeo channels in the last few weeks. The latest ones are:
Multiple Choice in Fieldscapes
How to add a multiple choice question in Fieldscapes. This is very much work-in-progress and the final version will be a lot slicker, but the system node approach gives you an incredibly flexible system which can be used for tasks like allowing a student to choose what activity they want to see/do next, as well as for giving answers to formative or summative tests. Of course, since this is a 3D world you can always use objects, or even images, instead of text questions - embedding the question and answer into the environment rather than using the simple, but familiar, text-based option.
Network Data in Datascape
Network graphs are probably one of the hardest things to deal with in Datascape. This video shows how to import and plot network graph data using two data files, one representing the nodes, and one the links/edges that connect them.
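To make the two-file scheme concrete, here's a hypothetical sketch of what the node and link files could look like - the file names and column headers are illustrative only (check the Datascape wiki for the exact headers it expects), and the # lines are annotations, not part of the files:

```
# nodes.csv - one row per node
id,label
1,serverA
2,serverB
3,workstation

# links.csv - one row per edge, referencing node ids
source,target,weight
1,2,5
1,3,2
```

The key idea is that each edge row simply names the two node ids it connects, so the links file can be generated or filtered independently of the nodes file.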
Joe Robbins, Junior Developer writes:
For our recent Daden U day, I decided I would look into ReSharper, a suite of tools for improving productivity in Visual Studio. The wealth of features on offer is a bit too great for me to cover exhaustively in this post, but I will be talking about some of the ones that stood out for me, along with my general opinion on the whole package.
I began by watching the video above which gave an overview of the tasks that ReSharper can assist a programmer with. I was impressed immediately, and began to question why some of these features didn’t come with Visual Studio as standard.
Setting up ReSharper couldn't be more straightforward: just sign up for a free trial, download the installer, fire up Visual Studio and you're good to go.
I then realised that it would be difficult to try out ReSharper in full, as doing so required a pre-existing project. However, I did cobble together some minimalistic C# classes and began playing around with them.
So here’s my list of the top four tasks that ReSharper makes that much easier, although note that this was only based on a few hours of experimentation and I may well discover even better ones over the remainder of my free trial.
Generating constructors
If you’ve spent any significant time doing object oriented programming then you’ll be able to appreciate my first pick. Whenever I create a new class, I tend to start by defining the various properties that need to be included, and then I’ll move onto the constructor. The issue being that most constructors are simply boring, just initialising the properties from the passed arguments.
But thanks to ReSharper, this boredom is a thing of the past.
Simply hit Alt+Enter to bring up the ReSharper menu and select "Generate Code".
Choose “Constructor”.
Tick off the properties that we wish to initialise in our constructor and hit “Finish”.
And there we go, our whole constructor created without a single keystroke.
Surrounding code
Another small touch that could add up to plenty of time saved is the ability to surround code with things such as while loops, or try/catch blocks. ReSharper’s got you covered on that front.
Just highlight the code you wish to surround and press Alt+Enter to bring up the ReSharper menu. Go to "Surround with…" and choose the construct that you want to wrap around the highlighted block.
And ReSharper does the rest, leaving me free to fill in my while condition without having to write any of those pesky braces manually.
Extracting Class from Parameters
Many of us have been in the situation where the parameter list for a method we’re writing just keeps growing and growing. Our programmer conscience nags at us to turn the argument list into its own class, but that’s far too much work.
Let’s say we have a method with a few parameters, like the one here. We can press Ctrl+Shift+R to bring up the ReSharper refactoring menu, and select "Extract Class From Parameters".
We give a name for our new parameter class and describe which parameters we want to include within it.
ReSharper does the rest, producing our new class, along with constructors, getters and setters. It also alters the parameters for our original method to take account of this change.
Move Type to Another File
Sometimes when we code, we will put a small class in the same file as a pre-existing one, assuming it will stay small and won't need to be used elsewhere in the project. Often, however, this is not the case, and when this happens it is good organisational practice to move the second class into its own file.
Simply place the caret next to the name of the type we wish to move and press Ctrl+Shift+R to bring up the refactoring menu, then select “Move To Another File”.
Give a name to the new file we’re creating.
Then when we look at our project, we can see that we have a new file which contains the definition of our class.
Conclusion
On the whole, I was very impressed with the power and ease-of-use provided by ReSharper and still feel that I'm only scratching the surface. And that’s before mentioning that the premium version of the plugin comes with tools for performance and memory profiling, unit testing assistance and more.
So do I feel that ReSharper is worth the money (£239 for a year as a new business customer, more info here: https://www.jetbrains.com/resharper/buy/#edition=commercial)? At the moment it is too early to tell; come back to me after I've finished my 30-day free trial. What I can tell you is that even during my brief time spent with these tools, I could see how they would fit into my workflow, saving me plenty of time over the course of a project. I wouldn't be surprised if all that saved time began to cover the cost of getting a licence for this very promising toolbox.
A PDF version of this article with screenshots is available at: http://www.daden.co.uk/docs/ReSharper-DadenUDayBlog.pdf
We've posted up two new videos to show you how to use variables in Fieldscapes. More to follow as this is an important area to create more complex exercises.
The first video gives a basic introduction to variables in Fieldscapes and shows how they can be used as both an interaction counter and to drive exercise events, such as bringing an exercise to an end.
The second video shows you how to use variables to control access to the objects in scene. For instance a student could have to answer a set of questions correctly, or complete some tasks before the system reveals a key piece of information or a reward
This video shows how a dataset can be dynamically filtered (what we call scrubbing) by using a set of sliders to set up a range of values for a field to include, and then also being able to slide that window through the data.
We've put together a short video showing how the Fieldscapes Editor can be used to create a very simple "Hello World" type exercise. Even though it's short it introduces you to most of the basic concepts of how the editor works:
- Everything is tied to a location
- You choose props from inventory lists, and place them as you want
- You define what student interactions a prop responds to
- You define a list of actions that a prop does in response to the student action
That's it really. We'll do more videos to introduce the more advanced features such as variables, rules and multi-choice which, added together, enable you to create quite complex exercises without resorting to a line of code.
You can read more about the editor on the Fieldscapes Wiki.
We've started posting 30-second Datascape videos to Twitter. To make them more generally available we're now also posting them to our Datascape channel on YouTube. There are 4 snippets so far:
More to follow at hopefully a rate of about one a week. Make sure you subscribe!
We've posted up our first video of the Explorer app from the Fieldscapes project - showing a field trip to the moon!
Fieldscapes: Apollo Educate (work in progress) from DadenMedia on Vimeo.
It's all very much work in progress but hopefully gives you some idea of the very clean and simple user interface that students can have in Fieldscapes, and how some of the basic interactions will work. Over the coming months we'll post up several more Moon based field-trips to show how we can use the editor to create very different feeling experiences from the same location and props. We'll also post up more Earth-bound field trips of course!
Make sure you follow the Fieldscapes Video Channel to catch further videos.
The video is also available on YouTube.
We're working our way through some of our favourite Datascape1 visualisations as a way of testing out Datascape 2. This one shows some cyber data from an NSA CDX exercise. Each line is a data packet going between two computers. The home network is the bottom "layer", and foreign networks are stacked above. Each node is a computer terminating a packet at a particular time, and is shaped by source/destination, and coloured by port. Lines are sized to data packet size and coloured by protocol.
There are lots of very obvious activity patterns which reveal certain types of network behaviour, and anything moving from the home network to another network (any line with a vertical component) is immediately of interest.
As the video clip on Twitter shows if you hover over any node or edge you can see its details.
We're making great progress with the Fieldscapes Editor now. With the Player/Explorer effectively done the focus is now very much on the Editor and the Infrastructure.
The Fieldscapes Wiki already gives an idea of how the Fieldscapes authoring process will work - and it's very much authoring, not programming, as it's just form-filling with no syntax to worry about! In summary, to create a Fieldscapes exercise you:
We'll post more updates as the editor develops, and hopefully put up a video of the editing process in the next couple of weeks.
Today we've launched our 3D Photosphere VR app for Carding Mill Valley. You can download it from the Android Play Store (just search "cardingmill") to your Android phone, and it works with or without a Google VR Headset.
This app is intended as a counterpart to our FieldscapesVR app so that people can compare the two approaches to virtual reality field trips. With this app there are three photospheres which give you a photo-realistic view of the valley at 3 key points. However you cannot visit any other part of the valley, or fly around.
By contrast FieldscapesVR provides a 3D model of the valley which is less realistic looking than the photospheres, but which allows you to move anywhere around (and above) the detailed 3D model. In due course we will also be able to provide far more interactivity in the 3D model version than in the Photosphere.
The valley is a popular site for school field trips and the application is primarily intended to allow schools to:
The application is very much intended as a "beta" of the final product, so we'd be delighted to hear your views.
The application can be used without a VR Headset in a normal "single-screen" mode but we recommend buying an Amazon Cardboard to get the full benefit (and it makes it a lot easier to use out on the ground in bright sunlight!). If you do need a headset, ensure that you get one which has a button or trigger, as that is (currently) required to use the app in VR mode.
Do try both versions and let us know how they compare either:
Datascape Tutorial 2 - Visualising GPS Data from DadenMedia on Vimeo.
This second video tutorial shows you how to import a file with GPS data and display it on Open Street Maps in Datascape2XL. The same process will work for any data which has latitude and longitude fields. The video also shows you how to use the vertical axis as either altitude or time.
Also available on YouTube at https://www.youtube.com/watch?v=q3c0W7rKyw4