10 January 2022

Reith Lectures on AI

Thoroughly enjoyed Stuart Russell's Reith Lectures on AI. He just seemed to keep mentioning things related to what we've done in various projects!

In particular I liked this exchange from the first episode (https://www.bbc.co.uk/programmes/m001216j) which touched on the approach we've been taking with our Reflexive Agent and our consideration of Wisdom and AI in our DASA/Royal Navy Intelligent Ship project.


"ROLY KEATING: Roly Keating from British Library. It’s wonderful to have

you here. Thank you for the lecture. I was interested in the language and

vocabulary of human intellectual life that seems to run around AI, and I’m

hearing data gathering, pattern recognition, knowledge, even problem solving,

but I think an earlier question used the word “wisdom,” which I’ve not heard so

much around this debate, and I suppose I’m trying to get a sense of where you

feel that fits into the equation. Is AI going to help us as a species gradually

become wiser or is wisdom exactly the thing that we have to keep a monopoly

on? Is that a purely human characteristic, do you think?


STUART RUSSELL: Or the third possibility would be that AI helps us

achieve wisdom without actually acquiring wisdom of its own, and I think, for

example, my children have helped me acquire wisdom without necessarily having

wisdom of their own. They certainly help me achieve humility. So, AI could help,

actually, by asking the questions, right, because in some ways AI needs us to be

explicit about what we think the future should be, that just the process of that

interrogation could bring some wisdom to us."


Spot on!


16 December 2021

Early Data Visualisation Experiments in Second Life




David Wortley, once the Director of the Serious Games Institute, has done a nice write-up of my early experiments on Data Visualisation in Second Life. You can read it here:

https://www.linkedin.com/pulse/ancient-tales-from-metaverse-2-david-wortley/?trackingId=rC4zchpQSve9LEZkMhj5%2Fw%3D%3D


11 November 2021

An AI Landscape — What Do You Mean by AI?



I've updated my AI Landscape piece, as I find myself using it a lot, and posted it to Medium at:

https://davidjhburden.medium.com/an-ai-landscape-what-do-you-mean-by-ai-b338908cce99

I think I'll post all my long-form pieces to Medium as it's quite a nice platform to write and read things in, and probably gets more reach than the blog! But I'll always post links here as things get published.



10 November 2021

3D Immersive Visual Analytics White Paper - Just Updated



We last updated our 3D Immersive Visual Analytics White Paper in 2017! A lot has happened since then, so it was well overdue an update - even though 80% of the paper covers the fundamentals, which a) haven't changed and b) still seem to need to be learnt by many people.

Although we stopped selling and using Datascape a couple of years ago, the use of VR for data visualisation is still something that is very close to my heart and that I'd like to do more work on. My first steps in using WebXR are a move in this direction, and the current Traveller Map WebXR demonstrator is actually a data-driven app which has many of the key features needed for a WebXR IVA system. I'll hopefully get some development time over the next few months to pull a proper demo together.
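As a taster of what the WebXR plumbing looks like, here's a minimal sketch in TypeScript using three.js (a common way to build WebXR apps - I'm not claiming the Traveller Map demonstrator works exactly like this, and the data points below are just made up for illustration):

    import * as THREE from 'three';
    import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

    // Basic scene, camera and a WebXR-enabled renderer.
    const scene = new THREE.Scene();
    const camera = new THREE.PerspectiveCamera(
      70, window.innerWidth / window.innerHeight, 0.1, 100);
    const renderer = new THREE.WebGLRenderer({ antialias: true });
    renderer.setSize(window.innerWidth, window.innerHeight);
    renderer.xr.enabled = true; // turn on WebXR support
    document.body.appendChild(renderer.domElement);
    document.body.appendChild(VRButton.createButton(renderer)); // "Enter VR" button

    // Illustrative data: [x, y, z] triples which in a real IVA app would come
    // from a CSV/JSON feed via some mapping of data attributes to space.
    const data: [number, number, number][] = [
      [0, 1.2, -2], [0.5, 1.5, -2], [1, 1.1, -2.5],
    ];
    const geometry = new THREE.SphereGeometry(0.05);
    const material = new THREE.MeshNormalMaterial();
    for (const [x, y, z] of data) {
      const point = new THREE.Mesh(geometry, material);
      point.position.set(x, y, z);
      scene.add(point);
    }

    // Use the XR-aware render loop rather than requestAnimationFrame.
    renderer.setAnimationLoop(() => renderer.render(scene, camera));

The real work in an IVA system then lies in how you map data attributes to position, size, shape and colour, and in letting the user fly through and interrogate the points.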

In the meantime you can revisit some of our immersive 3D visualisations of the past at:

Read the updated 3D Immersive Visual Analytics White Paper now!

8 November 2021

David speaking at CogX on Virtual Humans and Digital Immortality



Somehow I forgot to blog this when I did the talk at CogX back in 2019, but it's still highly relevant and I've been linking to it from the updated website, so I thought I might as well post it up here as well.


1 November 2021

Jon Radoff's 7 Layer Metaverse Model and Market Map



Jon Radoff has an interesting "7 layer model of the Metaverse" at https://medium.com/building-the-metaverse/the-metaverse-value-chain-afcf9e09e3a7 and a mapping of the recent Facebook/Meta announcements at https://medium.com/building-the-metaverse/facebook-as-a-metaverse-company-d5712198b22d.

Needless to say we all have our own variations/comments on this. For me, I think that Infrastructure/Decentralisation/Spatial should be the bottom 3 layers, and the Human Interface should really be the outer one - you don't (or shouldn't) build your virtual experience on the human interface; you should build the experience and then open it up to as many interface types as you can. Discovery could perhaps be split between internal discovery (how you find out about stuff in the world) and external discovery (how you find out about experiences, and particularly how one experience links to another). And should the Creator Economy sit on top of the Experience? In a true virtual world people are using the experience to create the economy, which then enhances the experience (so it's circular!). Perhaps something more like a network model would be more realistic (if less eye-catching) given the interconnectivities involved - it's not quite the ISO 7 layer model diagram.
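To make my preferred ordering concrete, here's Jon's stack as I'd redraw it, sketched as a simple structure (bottom-up; the layer names are Jon's, but the reordering and the Discovery split are just my suggested tweaks):

    // My variation on Jon Radoff's 7 layer Metaverse model, listed bottom-up.
    const layerStack = [
      'Infrastructure',      // networks, chips, cloud - the true bottom
      'Decentralisation',    // a property of the plumbing, so near the bottom
      'Spatial Computing',
      'Internal Discovery',  // how you find out about stuff within a world
      'External Discovery',  // how you find experiences, and how they link together
      'Experience',
      'Creator Economy',     // built from the Experience, and feeding back into it (circular!)
      'Human Interface',     // outermost - build the experience, then open it to many interfaces
    ] as const;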

I'll have to dig out one of my old models.

Jon also has a Market Map of the Metaverse at https://medium.com/building-the-metaverse/market-map-of-the-metaverse-8ae0cde89696 - which is likely to be quite dynamic!


Jon also has a Metaverse Canon Reading Guide at https://medium.com/building-the-metaverse/the-metaverse-canon-reading-guide-9eb1b371b505



29 October 2021

Facebook (aka Meta) Connect2021


Me at Connect21 in Venues*

*although you could see the video in-world, it doesn't show in the photos you take


I attended the Facebook Connect2021 event in Oculus (now Horizon) Venues yesterday afternoon. It struck me that there were a lot more people in VR than last time - around 1,200 were meant to be there - although we were all sharded into pods of ~20 people (and they were on 2 levels so you really only saw ~10).

Needless to say the whole name change thing and the relentless focus on the Metaverse made it a bit more weighty than the last one I went to, which just introduced the Quest 2, but apart from Mark Zuckerberg finally "getting" virtual worlds there was really little there that wasn't being talked about 10 years ago. Sure, the tech is moving on, but there are still lots of areas which need to be sorted.


So here are my highlights/comments on the talk, in roughly timeline order.

  • Yep - pretty much everything talked about in terms of what you could do in the VW element of the Metaverse, you could do in SL 15 years ago
  • The one big difference, and one which would be cool, is the better integration of your desktop into VR and bringing your phone into VR, so that you can stay in-world whilst checking "RL" information or having chats with people in "RL"
  • Zuckerberg talked about the metaverse as being the "replacement" for the mobile internet, but I think despite the fancy nature of the "metaverse", even when implemented through MR glasses we'll still find the unobtrusive nature of the smartphone (along with battery life etc) to be valuable, so I think they'll just be parallel streams into the meta-metaverse!
  • He talked about interoperability in terms of being able to bring avatars and virtual objects between worlds - but it wasn't clear if this was between FB/Meta ecosystem applications only, or out to third-party virtual worlds. I fear the former. At the ReLive conference in 2011 we highlighted interoperability as the key medium-term need - and we're still not there!
  • Some nice quick videos of the various Horizon platforms, with Rooms going from single to multi-user, and it looks like Spaces will have a Scratch-type scripting system, which will be great news as long as it's powerful enough for decent work.
  • Also some nice hints about better tools to enable 2D progressive web applications to display within the 3D/VR environment, further breaking down the barrier between the 3D/VR and 2D/RL environments!
  • The whole presentation kept switching from RL to greenscreen to Horizon to smoke and mirrors; by the end I wasn't sure if Mark was really an avatar or not, and you certainly couldn't tell what was real tech and what was marketingware.
  • There seemed to be a lot of emphasis on "holograms" and going to real concerts/meetings as holograms with real friends/colleagues. It wasn't at all clear how that was ever going to work, although the AR glasses would sort of make it feasible. Mind you, having tried to stream a bit of video out of a concert recently I can't see how the bandwidth will ever be there! Oh, and virtual after-parties - tick, done that; a great one after The Wall show in SL "back in the day", actually on the stage set.
  • The lack of a VR keyboard in Venues itself (and most VR apps) was keenly felt, as it meant that the dozen or so of us there couldn't really interact whilst the session was going on. Meta may have a solution - see below.
  • A big announcement for the short term was that Oculus products will no longer mandate FB accounts, so the concerns that many corporate/academic clients had might go away. They will also continue to allow sideloading onto the Quest, so it looks like the Quest will remain a more open platform and not be locked down to FB/Meta - hurrah!
  • Lots of stuff on their "Presence" SDKs to help developers better integrate controllers, MR/pass-through features and voice control. "Presence" was emphasised a lot, more than immersion (just like we've been telling a recent client!)
  • Project Cambria will be their next-generation high-end headset, non-tethered but more expensive than the Quest, although they hope features will gradually drop down to the Quests. Hi-rez colour pass-through to better enable MR applications seems to be the key difference, plus the sensors to detect facial expression and eye movement for more natural avatar interaction - nice.
  • Project Nazare is their AR/MR glasses project, heading for full HoloLens-type MR but in a spectacles form-factor. This is what would make their "holograms" believable - as long as you put glasses on all the RL people.
  • A list of future "breakthrough" areas for VW tech looked very generic - everything talked about was evolutionary - and was interestingly missing neuro-interface tech.
  • Then he talked about neurotech (!), or specifically electromyography (EMG), and a device which looked suspiciously like the old Thalmic Labs MYO - I still have mine (see photo below). This senses what your fingers are doing from the electro-neural impulses picked up at the wrist. Some nice demos (?) of subtle gesture control, including typing; in theory you hardly need to move your fingers at all. One of the more interesting things shown. Does it work for feet and locomotion too?
  • The Codec photo-realistic avatars looked quite good - volumetric and based on face/body scans, but with the ability to then change hair, clothes etc. The clothing demo and talk about hair modelling reminded me of how much we wrote about those areas in our Virtual Humans book.


My old MYO


Needless to say all the bad press side of Facebook - privacy and corporate greed - didn't get a look-in (although there was a snide comment about high taxes (!) stifling growth and innovation), but he did use the words "humble" or "humbling" more times than I could count, and Nick Clegg popped up for about 30 seconds to say not very much. As some of our recent work for a client has highlighted, VR offers unparalleled insights into personal behaviour, and it's vital that our explicit and implicit data is secure and not being exploited, and that the "metaverse" is not under the control of one company. If Facebook was really serious about the metaverse they'd open-source the whole lot right now.

You can watch the presentation for yourself at https://www.youtube.com/watch?v=Uvufun6xer8&ab_channel=Meta

Full set of Connect21 presentations on the more technical side at https://www.facebook.com/watch/113556640449808/901458330492319/

Update: Looking at the World Building in Horizon video, it really is very low-rez at the moment - possibly even lower than Hubs - and there's talk of strict performance limits. So good for some stuff, but a whole load of use cases wiped right out (for now) :-(