On Tuesday & Wednesday this week I went along to the Augmented Planet conference in London.
I have to say I wasn’t quite sure what to expect, but the event was nowhere near as big as I had imagined. As it turned out it was a relatively small affair, with maybe 150 people attending each day.
The conference was split over the two days, with Tuesday being the developers’ day, featuring presentations from developers and the chance to work alongside a developer to create an Augmented Reality app if you wished.
Wednesday was the general day, with more presentations from AR providers covering hardware as well as software.
On Tuesday the conference started with the Augmented Reality Cool Wall, showing the latest trends and what is no longer seen as cool. Feature recognition is the big cool thing at the moment: using your AR viewer to scan an area and have it recognise objects within it. These objects could be anything from a soft drinks can to a building, with content then delivered based on the recognised object.
Feature recognition is hot on the cool wall because it can be monetised fairly easily. Geo-location, after some years of trying, clearly isn’t quite as easy to monetise and wasn’t at the top of the cool wall for the tech guys. It’s now a fairly mature aspect of AR and there’s huge untapped potential in grassroots content about places. There was talk of AR glasses through which you will be able to see your nearest Starbucks, but no mention of seeing your local independent retailers. I pointed this out in the panel discussion and, from the developers’ perspective, the response was ‘well, who owns cyberspace?’ along with talk of guerrilla marketing, where people would overlay their brand icon over that of a competitor. That is all very interesting, but it sidesteps the point that there is lots and lots of geolocated content out there that wasn’t being picked up at Augmented Planet.
The work we have been doing with geolocated content from hyperlocal blogs is laying the foundation for easy use of AR by mainstream media. But sometimes in technology markets the shiny technology (branded feature recognition apps and the like) is what’s needed before more prosaic uses can break through.
On Wednesday the presentations covered all manner of things, from AR glasses to an AR studio set-up from Russia and a DIY AR glasses project, amongst many others. Here are links and information for just a few of the day’s presenters.
Total Immersion showed their Try Live product, built with their D’Fusion Studio software. Try Live lets viewers ‘try on’ different products using AR; here is a video of the Try Live Eyewear.
Vuzix gave a very powerful presentation on their AR products, and in particular an AR monocle which is due to be launched to industry customers in early 2013, with a consumer product coming soon after. The monocle can be head or helmet mounted.
The Vuzix monocle will be able to stream content from your smart device to the eyepiece. David from Vuzix made it clear that they are not software developers; they are interested in the hardware and aim to make it as compatible as possible for developers.
Will Powell presented his DIY Glass real-time translation project and gave a very insightful account of how he sees AR hardware working.
One thing that Will brought up, which was seemingly missed by all the other presenters, was security. If you are going to use AR for things like translation, what about the security of your conversation? For translation to work the conversation needs to be recorded, probably sent off into the cloud, translated and returned to you. Will pointed out that it could then be possible to search your conversations to find out exactly what you said to someone on a particular day and time. OS X and iOS already have speech-to-text built in, so this security risk is a lot closer than you may think.
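To make that point concrete, here is a minimal sketch of the kind of cloud round trip Will described. This is not his implementation: the endpoint, payload fields and language parameters are all assumptions for illustration, but it shows how every utterance, plus a timestamp, ends up in someone else’s hands.

```python
import datetime
import requests  # third-party HTTP library

# Hypothetical cloud translation endpoint -- purely illustrative.
TRANSLATE_URL = "https://example-translation-service.com/api/translate"


def translate_utterance(transcribed_text, source_lang="en", target_lang="es"):
    """Send one transcribed utterance to a (hypothetical) cloud service
    and return the translated text.

    Note: the service now holds the raw text of what was said and when it
    was said -- exactly the searchable record Will warned about.
    """
    payload = {
        "text": transcribed_text,
        "source": source_lang,
        "target": target_lang,
        "timestamp": datetime.datetime.utcnow().isoformat(),
    }
    response = requests.post(TRANSLATE_URL, json=payload, timeout=5)
    response.raise_for_status()
    return response.json()["translation"]


if __name__ == "__main__":
    # Each call hands a fragment of a private conversation to the cloud.
    print(translate_utterance("Where is the nearest station?"))
```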
Finally, EligoVision (http://www.eligovision.com) gave a demonstration of their Augmented Reality Studio, which allows a presenter to use AR on stage while presenting.
Using a series of cameras, a feed of the presenter is sent to projection screens around the stage, like the screens used in arenas. Another set of cameras looks for markers held by the presenter. When a camera picks up a marker it tracks it, so the presenter can move around, and the software sends the AR projections to the screens, synced to the live cameras showing the presenter, so you see the presenter interacting with the AR.
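As a rough illustration of the marker-tracking half of that set-up, here is a short sketch using OpenCV’s ArUco module. That choice of toolkit is my assumption, not anything EligoVision said about their stack, and drawing marker outlines stands in for the real step of rendering 3D content and compositing it onto the feed going to the screens.

```python
import cv2  # requires opencv-contrib-python with the classic ArUco API

# A dictionary of printed markers the presenter could hold up on stage.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
parameters = cv2.aruco.DetectorParameters_create()

cap = cv2.VideoCapture(0)  # the camera watching the presenter for markers

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Find any markers visible in this frame and where their corners are.
    corners, ids, _ = cv2.aruco.detectMarkers(gray, aruco_dict, parameters=parameters)
    if ids is not None:
        # In the real rig the AR content would be rendered at these
        # positions and composited onto the live arena feed; here we
        # just outline the detected markers.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
    cv2.imshow("stage feed", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```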
Think election night on the BBC and you get the idea; very smart stuff. This is just an overview of what I saw. There were other presentations on research projects and other AR platforms, and I’ll try to get hold of a full list of all the presentations and post it here.
All in all I enjoyed the conference, and there are some great things coming to market soon. With the current top-end innovation and development I think we will see AR become far more mainstream in the next year or so, and this will start to drive demand for content from people other than developers.
I think there could be a place for some kind of joint event between Talk About Local and the Augmented Planet guys, where we bring some of our hyperlocal community and interested friends along to meet the developers that Augmented Planet know, to tease out how to make the best use of user-generated and geolocated content.