Of the many emerging mobile technologies that libraries are looking at, one that has always appealed to me is augmented reality (AR). Compared with other technologies under discussion, AR:
- has fewer introductory barriers to overcome
- is virtually cost-free
- does not require specialised technical staff
- will be increasingly familiar to the general public
- can also be a lot of fun.
So I committed myself to turning some of these ideas into practical demonstrations for a group of interested colleagues.
I used the Aurasma platform as it's free, straightforward to use, and has considerable market penetration. It works by having a pre-prepared image – a trigger – uploaded to their servers. When a device using the Aurasma browser focuses on one of these triggers, information in the form of images and movies is overlaid onto it in a predetermined way: digital information is 'superimposed' onto what you are seeing through the device's camera. The big advantage of this optical approach over location-based AR is that you can be precise about position, and it can be used across multiple floors without interference. There was a steep learning curve initially, learning what worked well (formats, sizes, scales) as a trigger and overlay, but after some trial and error the software is actually quick and easy to use. Developer forums provided some useful advice, but a thorough introductory 'best practice' guide would have been welcome.
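Conceptually, this trigger-and-overlay model is a lookup from a recognised image to some predetermined content. The sketch below illustrates the idea only – it is not Aurasma's actual API, and matching by a plain string identifier is a stand-in for the image-feature recognition a real optical-AR platform performs:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Overlay:
    """Content superimposed when a trigger image is recognised."""
    media_url: str   # video or image shown over the trigger
    tap_action: str  # e.g. a web page opened when the overlay is tapped

# Registry mapping trigger identifiers to overlays. In a real platform the
# key would be an image-feature fingerprint computed from the uploaded image.
triggers: dict = {}

def register_trigger(trigger_id: str, overlay: Overlay) -> None:
    """'Upload' a pre-prepared trigger image and its overlay."""
    triggers[trigger_id] = overlay

def recognise(camera_view: str) -> Optional[Overlay]:
    """Return the overlay for whatever the camera is pointed at, if known."""
    return triggers.get(camera_view)

# Example: the self-service issue machine demo (names are invented).
register_trigger(
    "issue-machine-screen",
    Overlay(media_url="how-to-issue.mp4", tap_action="pin-help-page"),
)
```

The point of the sketch is simply that the trigger image, not the user's location, selects the content – which is why the same approach works precisely across multiple floors.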
I came up with nine possible categories of use for AR and put together a demonstration for each. The focus was on provoking ideas rather than fleshed-out practical applications:
- Video demonstration Pointing a mobile device at the screen of the self-service issue machines automatically plays a video guiding the user on how the machine operates. Beneath this video is a 'Need PIN?' button which, when tapped, takes the user to a website with further information.
- Enhanced publicity/directional map Pointing a mobile device at a floor plan map (either on a plinth at the library entrance or in hand-held form) overlays a re-coloured map indicating areas that can be tapped. Tapping an area pops up a photo of that location, giving users a 'virtual tour' and more information on that area.
- Help on a screen-based service Pointing a mobile device at the Summon discovery tool overlays guidance arrows and notes onto the screen – pointing out where to enter the search, where to refine filters and where to view results.
- Virtual bay-ends Pointing a mobile device at a particular image (perhaps located near catalogue PCs) overlays directional arrows to where resources are located – giving users an initial idea of where to find what they are looking for.
- Enhanced instructional guide Pointing a mobile device at a leaflet about accessing our online resources automatically plays a video with screenshots showing the stages users need to go through. To the right are buttons that can be tapped to call, email or complete a form directly if further help is needed.
- Induction/Treasure Hunt Students could scan a 'frame' placed in an area of the library. Once scanned, a video would play introducing them to that area and how to use it – alongside the video a new question would appear, guiding them to another area to continue the 'game'.
- Enhanced publicity material Pointing a mobile device at our main library introduction guide enhances it with pictures, videos and extra information beyond what could be included on a physical copy. All telephone numbers, email addresses and hyperlinks also become tappable live links.
- Staff assistance/reminder Pointing a mobile device at the borrower registration screen of the LMS that we use overlays extra information showing the various fields that need completing. It is designed as a quick check for staff to ensure that registration is completed accurately.
- ‘Book Locator’/directional video Using a mobile device to scan an image near a catalogue PC brings up a virtual table of Dewey ranges, e.g. 000–070. Tapping one of these makes a simple video pop up, directing the user from that location to the approximate shelving run. Technically this does not use AR at all, but it was an interesting use of the software.
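The 'Book Locator' is essentially a range lookup: given a Dewey number, find the shelving run that covers it and the directional video for that run. A minimal sketch of the idea – the ranges, locations and filenames here are invented examples, not our actual layout:

```python
from typing import Optional, Tuple

# Each entry maps an inclusive Dewey range to a shelving run and its
# directional video. All values below are illustrative placeholders.
DEWEY_RUNS = [
    ((0, 70), "Run A, ground floor", "run-a.mp4"),
    ((71, 300), "Run B, ground floor", "run-b.mp4"),
    ((301, 999), "Run C, first floor", "run-c.mp4"),
]

def locate(dewey: float) -> Optional[Tuple[str, str]]:
    """Return (shelving run, directional video) for a Dewey number."""
    for (low, high), run, video in DEWEY_RUNS:
        if low <= dewey <= high:
            return run, video
    return None
```

In the demo, tapping a cell in the virtual table plays the corresponding video; here that step is just reduced to the lookup that selects it.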
The demonstrations went well and generated some interesting debate amongst my library colleagues. Some brief thoughts after the demonstrations:
- Point-of-need content – The way that triggers work allows them to be highly context-specific: you are essentially just 'looking' at the thing that you want help with, i.e. a room, a screen or a leaflet. Could there be a future where users just get used to pointing their device at things to get assistance and extended content?
- AR vs QR codes – AR feels a lot more immediate than QR codes. Whereas scanning a code sometimes feels like an additional step that takes you away from what you are doing, the extra information from AR is more integrated into your activity. Aurasma allows extra functionality too.
- Getting library users on board – This is an issue whenever something new is introduced, and some level of training would be required. People have to download the app, subscribe to a particular channel and then know where to scan. Technological improvements may mitigate some of this – for example, Aurasma allows the possibility of integrating their software into an existing app, meaning that users would not need anything new or have to subscribe to channels.
- Ease of development – As described above, the platform is not as intuitive as it might be initially, but after a brief explanation I could see colleagues from across the service creating content; all it takes is some very basic image manipulation. I was creating these rough demos in about 15 minutes each. The technical barrier is very low.
- Range of devices – The demos all worked equally well on the iOS and Android smartphones that I tested, and they looked great on larger tablet devices.