
Research Bazaar Presentation 2017

8/2/2017

Jeff and I spoke at Res Baz 2017, in Victoria University’s Kelburn Hub, about this project thus far. Our presentation and transcript can be found below.


Slide 1  

Introduction

I’m Jeff and I’m Jono; we are both postgraduate students from the Victoria University School of Design. We are in the process of creating a mobile application for the Wai-Te-Ata Press that we like to call a Literary Atlas of Wellington.

Our brief was to create an application that augments the Victoria University Library Collections into physical space, making their content interactive for public consumption. So far in our development process we have completed an augmented reality mobile platform which will host these interactions.

Our Literary Atlas app will allow users to walk to locations along the Wellington waterfront in order to gather information about well-known Victoria University writers from their sculptures on the Wellington Writers Walk. This is achieved through a combination of geolocation tracking and augmented reality.

 

Slide 2 

Why Augmented Reality?

We chose to use augmented reality technology as it is a powerful new way of communicating ideas effectively. Think of technology like the holograms from Star Wars.

For those who don’t know, augmented reality, or AR, is defined as having the following three properties:

– Combines real and virtual objects in a real, physical environment

– Runs in real time

– Aligns real and virtual objects with each other.

In our application we are using video AR, in which a video feed of the real world on a mobile device is overlaid with virtual images. This is the same concept that Pokémon Go uses.

However, what sets us apart is outdoor natural feature tracking (NFT) through ARToolKit 5, which uses natural feature points to position virtual objects onto real-world surfaces, combined with GPS geolocation to understand which writer the user is near and therefore initiate the corresponding AR experience. A rough sketch of that trigger logic is below.
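As a minimal sketch of how such a GPS trigger could work in Unity, the script below compares the phone’s location fix against each sculpture’s coordinates using a haversine distance and logs when the user is within range. The class, field names, and 25 m radius are our own illustrative assumptions, not code from the actual app.

```csharp
// Hypothetical sketch: deciding which writer's AR experience to offer,
// based on the phone's GPS position. Names and radius are illustrative.
using UnityEngine;

public class WriterProximityTrigger : MonoBehaviour
{
    [System.Serializable]
    public class WriterLocation
    {
        public string writerName;   // e.g. "Katherine Mansfield"
        public double latitude;
        public double longitude;
    }

    public WriterLocation[] writers;        // filled in the Unity inspector
    public float triggerRadiusMetres = 25f; // assumed trigger distance

    void Start()
    {
        // Location services must be enabled by the user; started here for the sketch.
        Input.location.Start();
    }

    void Update()
    {
        if (Input.location.status != LocationServiceStatus.Running) return;

        double lat = Input.location.lastData.latitude;
        double lon = Input.location.lastData.longitude;

        foreach (var w in writers)
        {
            if (HaversineMetres(lat, lon, w.latitude, w.longitude) < triggerRadiusMetres)
            {
                Debug.Log("Near " + w.writerName + " – offer the AR experience here.");
            }
        }
    }

    // Great-circle distance between two lat/lon points, in metres.
    static double HaversineMetres(double lat1, double lon1, double lat2, double lon2)
    {
        const double R = 6371000.0; // Earth radius in metres
        double dLat = Mathf.Deg2Rad * (lat2 - lat1);
        double dLon = Mathf.Deg2Rad * (lon2 - lon1);
        double a = System.Math.Sin(dLat / 2) * System.Math.Sin(dLat / 2) +
                   System.Math.Cos(Mathf.Deg2Rad * lat1) * System.Math.Cos(Mathf.Deg2Rad * lat2) *
                   System.Math.Sin(dLon / 2) * System.Math.Sin(dLon / 2);
        return R * 2 * System.Math.Atan2(System.Math.Sqrt(a), System.Math.Sqrt(1 - a));
    }
}
```

In the real app the Debug.Log call would be replaced by whatever opens the corresponding writer’s augmented view.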

 

Slide 3 60 sec

What is ARToolKit 5? What are NFTs?

How are we making it?

  • ARToolKit 5
  • Unity
  • OpenStreetMap

ARToolKit 5 is an open-source project that can run in Unity’s development environment. It gives developers a multitude of functionality for developing AR applications right out of the box and, because it is open source, allows you to examine the code and add or customise your own features as well.

Unity 5 is a game engine that allows you to write code once and port it to many different devices. It also comes with a range of functionality to help speed up development and is pretty user friendly.

And we use OpenStreetMap for pulling in real-world geometry data and constructing a mesh that the user can then navigate around based on their geolocation. We have also implemented functionality that allows us to position objects on that map based on their geolocation in the real world; a sketch of that conversion is shown below.
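For illustration, the conversion from a real-world latitude/longitude to a position on a flat map mesh can be done with a simple equirectangular approximation around a chosen origin. This is only a sketch under our own assumptions (the origin coordinates, scale, and names are hypothetical), not the project’s actual implementation.

```csharp
// Hypothetical sketch: placing an object on an OSM-derived map mesh from a
// real-world latitude/longitude, using an equirectangular approximation
// around a chosen map origin. Field names and scale are illustrative only.
using UnityEngine;

public class GeoToMapPosition : MonoBehaviour
{
    public double originLatitude  = -41.2889; // assumed map origin (Wellington waterfront)
    public double originLongitude = 174.7772;
    public float  unitsPerMetre   = 1f;       // how many Unity units represent one metre

    const double MetresPerDegreeLat = 111320.0; // approximate; fine at city scale

    // Converts a lat/lon into a position on the flat map mesh (x = east, z = north).
    public Vector3 ToMapPosition(double latitude, double longitude)
    {
        double metresPerDegreeLon =
            MetresPerDegreeLat * System.Math.Cos(originLatitude * Mathf.Deg2Rad);

        float x = (float)((longitude - originLongitude) * metresPerDegreeLon) * unitsPerMetre;
        float z = (float)((latitude  - originLatitude)  * MetresPerDegreeLat) * unitsPerMetre;

        return new Vector3(x, 0f, z);
    }
}
```

Feeding both the player’s GPS fix and each sculpture’s coordinates through the same conversion keeps the avatar and the icons in a shared map space.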

One aspect of ARToolKit 5’s functionality, as Jono has already touched on, is NFTs: the ability to turn images of real-world things into natural feature points and then recognise these points in real time through the mobile device’s video feed. We can then display virtual scenes that appear to be tracked to physical objects.

As well as scenes tracked in space, we are using the gyro and accelerometer to give users instanced AR (a scene that surrounds a static point). This allows the user to have a full 360° experience but does not allow virtual objects to “stick” or be tracked in physical space. A minimal sketch of this gyro-driven view follows.
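A minimal sketch of such a gyro-driven 360° view in Unity is below: the camera sits at a fixed point and simply mirrors the phone’s orientation, so the scene surrounds the user without being anchored to any surface. The remapping is the commonly used correction between the device and Unity coordinate frames; the script name is our own.

```csharp
// Hypothetical sketch of the instanced (360°) AR view: the camera stays at a
// fixed point and rotates with the phone's gyroscope, so the virtual scene
// surrounds the user without being tracked to any physical surface.
using UnityEngine;

public class GyroCameraController : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // a gyroscope must be available on the device
    }

    void Update()
    {
        // Remap the gyro attitude from the device's coordinate frame into
        // Unity's left-handed frame (a common correction for gyro cameras).
        Quaternion attitude = Input.gyro.attitude;
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) *
            new Quaternion(attitude.x, attitude.y, -attitude.z, -attitude.w);
    }
}
```

Attached to the AR camera, this gives the “look around” behaviour without any marker tracking.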

 

Slide 4 

Cultivate a habit of mistakes and become a connoisseur of them – this is our design ethos

It’s important to consider app development as an evolutionary process. It’s full of trial and error. This is why frequent testing becomes an essential part of development – testing whether the app still compiles and runs on your mobile device, testing whether the UI is best suited to a mobile screen size, testing how interactions work.

Testing usually highlights problems or errors, and if you test frequently you are able to isolate these issues rather than having to sift through a multitude of changes to find the cause.

It allows your development team to find mistakes, understand them and then implement improved changes that are informed by them.

You should also consider and use other apps that relate to yours. Analyze them and carefully critique them. Ask yourself what makes them good and what makes them bad and how you can apply these insights to your app.

 

Slide 5

Our next steps are as follows:

We are about to develop augmented reality experiences in the application based on our writers’ poems. These will be virtual representations of the poems and will be constructed to embody the main themes and ideas of each poem.

We have also begun the process of incorporating a function which allows users to write their own poems and add them to a database for other users to enjoy.

If you would like to follow our development process or see in more detail what we have accomplished thus far please visit our development blog by following the link on the slides.
Thank you


Site Visit Feb 10 NFT Overlay Tests

These tests were based around the creation of an overlay which shows the user what to focus the camera on to generate the NFT content. The overlay disappears once the NFT has been tracked.

Once the tracking works, a small square is programmed to appear; this will be replaced with interactive content now that we know it works. The stingray NFT was successful except for the sizing of the overlay on the UI, which will be amended. A rough sketch of the overlay behaviour is included below.
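The overlay behaviour described above could be handled by a small component like the sketch below, which hides the guide image and shows the tracked content once the marker is found. The OnMarkerFound/OnMarkerLost hooks and field names are hypothetical stand-ins for however the tracking component reports its state; this is not the ARToolKit API.

```csharp
// Hypothetical sketch of the overlay logic: a guide image is shown until the
// NFT marker is found, then hidden and replaced with the tracked content.
using UnityEngine;

public class NftOverlayToggle : MonoBehaviour
{
    public GameObject overlayImage;   // semi-transparent guide shown before tracking
    public GameObject trackedContent; // placeholder cube for now; AR scene later

    void Start()
    {
        overlayImage.SetActive(true);
        trackedContent.SetActive(false);
    }

    // Called by whatever component detects the NFT marker (hypothetical hook).
    public void OnMarkerFound()
    {
        overlayImage.SetActive(false);
        trackedContent.SetActive(true);
    }

    public void OnMarkerLost()
    {
        overlayImage.SetActive(true);
        trackedContent.SetActive(false);
    }
}
```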

The text NFT was unsuccessful, however, although its overlay’s sizing was better than the stingray’s.

We photographed the closest match we could get between the text NFT and the marker.


 

SITE VISIT 20 JAN: ICON REFERENCE AND MANHIRE NFT

For this site visit we aimed to test the functionality of the geolocation tracking, as well as live-testing the natural feature tracking at each of the three Writers Walk sculptures (except Bill Manhire’s).


At Vincent O’Sullivan’s sculpture we tested the NFT, with a small cube appearing at the bottom of the NFT marker to show it was working. It did take a while for the tracking to work, however, so we took another high-res image of the stingray to make into a better NFT marker image for tracking.


We then proceeded to Bill Manhire’s sculpture to test the location tracking; this worked without any problems (the 3D object spun faster the closer we were to the sculpture).


Our next stop was Katherine Mansfield, and the geolocation tracking worked well with this sculpture too. The pop-up that says “augmented view” is what the user will tap to initiate the AR views (360° and NFT).


The NFT AR also worked once initialised, but this still needs work, so as before we have retaken some NFT images to revisit and refine this tracking. Overall this site visit yielded some successful testing and has given us information to build on from this point onwards.

Initial Augmented View UI Design


360 View: an onscreen interactive 360° AR instance.

  • Back to Map icon in top left
  • Information Screen icon in top right
  • Swap to NFT Mode icon in bottom right

NFT View: a transparent image overlay so the user can get into the right position to initiate the AR view.

  • Back to Map icon in top left (same as 360)
  • Information Screen icon in top right (same as 360)
  • Swap to 360 Mode icon in bottom right (same position as Swap to NFT)

Initial UI Way-finding Design


This is the whiteboard sketch Jeff and I worked on today concerning the base UI of the map/way-finding, including what all the buttons do and where they link to.

  • 3D AR views are initiated by tapping on speech bubbles that appear over icons when these fall within the animated radar surrounding the player (see the sketch after this list).
  • All main menu icons can be found along the right-hand side of the map so the phone can be held and operated with one hand.
  • GPS strength is also included in the top right.
  • We came up with rough categories for the Information Screen/CMS for the users.
  • Finally, there is a music icon; however, this is a placeholder for now until we figure out how this will work.
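As a rough sketch of the radar mechanic described in the first bullet, the component below shows or hides a tappable speech bubble depending on whether its icon sits inside the radar radius around the player avatar. All names and the radius value are illustrative assumptions rather than the app’s actual code.

```csharp
// Hypothetical sketch of the way-finding radar: a speech bubble over a
// sculpture icon is only visible (and tappable) while the icon is inside
// the radar radius around the player avatar on the map mesh.
using UnityEngine;

public class RadarBubble : MonoBehaviour
{
    public Transform playerAvatar;  // the player's marker on the map
    public GameObject speechBubble; // tappable bubble that launches the AR view
    public float radarRadius = 50f; // radius in map units (assumed)

    void Update()
    {
        float distance = Vector3.Distance(playerAvatar.position, transform.position);
        speechBubble.SetActive(distance <= radarRadius);
    }

    // Wired to the bubble's tap handler (e.g. a UI Button or OnMouseDown).
    public void OnBubbleTapped()
    {
        Debug.Log("Open the augmented view for this writer.");
    }
}
```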

Third Meeting 20th December

Back End

Link to Slideshow included:

https://docs.google.com/presentation/d/18ZH8SSkTCBNYMoy3YmAe1uga2C0y_MWAuNIFAaeCqrE/edit?usp=sharing

Moving forward with our CMS for the writers’ data, we discussed refining our data fields/categories for writers.

  • The “short story” field should be worded as “featured work/s”, as “short story” implies a genre.

 

Relationship Network for CMS

Could we create a relationship network visualisation to show the relationships between different pieces of information (for example known associates/influencers, linking the writers with each other or with other prominent figures)? This would give the user/researcher context.

 

Fields

As we are not completely familiar with classifying literature, we should look at other models that categorise literature, such as:

  • Book council
  • Wikipedia?
  • other sites?
  • NZ etc cross reference
  • Cultivating creative capital/ cultural legacy

 

NFT Tracking

  • Typeface for the etched Wellington Writers Walk sculptures (Katherine Mansfield): Sydney will follow up with the Auckland typographer so we can source the exact typeface.

 

Sound for App

Start to source the sound effects for the app sooner rather than later, so we can integrate them into the app.

  • We should create a list of places where we would need sound, especially for feedback.

 

Victoria Marketing Department

Victoria University’s Marketing Department will need to see the app around March for branding and the like. We need to keep this in mind and discuss it at each meeting.

 

Domain names URL

For our static website we need to think of a domain name (or a list of domain names) to lock down; it needs to appeal to the teenage demographic and be memorable.

 

Send link to Sydney and Matt

The link to the online sandstorm blog needs to be sent to Matt and Sydney for revision.

 

Next Steps 

  • Angle-test NFTs on site at the waterfront to make sure the initialisation of NFT markers is fluid
  • Begin research on Mansfield, Manhire and O’Sullivan
  • Paper concepts for AR experiences completed by 9 Jan 2017, ready for discussion and digital development
  • UI concepts
  • General bug fixes and refinement of the app
  • Further integration of back end and front end
  • Map development

NFT/APP Field Test 2

I was out for a walk on Friday the 16th and decided to test the application at the Katherine Mansfield sculpture. The location services worked well and mapped my position accurately.

The initialisation of an NFT on the sculpture itself worked but took too long, and I needed to adjust my viewing angle to make it work. Once it did, though, the tracking was great and worked seamlessly.

For our next test we should work on the initialisation of the AR.

Precedent – Pokemon Go

What is Pokémon GO?

Travel between the real world and the virtual world of Pokémon with Pokémon GO for iPhone and Android devices! With Pokémon GO, you’ll discover Pokémon in a whole new world—your own! Pokémon GO uses real location information to encourage players to search far and wide in the real world to discover Pokémon.

The Pokémon video game series has used real-world locations such as the Hokkaido and Kanto regions of Japan, New York, and Paris as inspiration for the fantasy settings in which its games take place. Now the real world is the setting!

The Pokémon video game series has always valued open and social experiences, such as connecting with other players to enjoy trading and battling Pokémon. Pokémon GO’s gameplay experience goes beyond what appears on screen, as players explore their neighbourhoods, communities, and the world they live in to discover Pokémon alongside friends and other players.

Pokémon GO is developed by Niantic, Inc. Originally founded by Google Earth co-creator John Hanke as a start-up within Google, Niantic is known for creating Ingress, the augmented reality mobile game that utilizes GPS technology to fuel a sci-fi story encompassing the entire world. Ingress currently has 12 million downloads worldwide.

Source: https://pkmngowiki.com/wiki/Main_Page

How does it work, from our perspective?

Way-finding

Way-finding works by using the phone’s location services to know the user’s position in space. A radar is then used to activate points of interest around the user, such as PokéStops and Pokémon encounters. The player is represented by an avatar on the map itself.

AR  View

In my opinion Pokémon Go is not true AR. It uses the camera view with a UI overlay that is positioned using the phone’s gyro/accelerometer. This creates the illusion of a virtual object in physical reality; however, this can be debunked by moving the phone around in space. When this is done, virtual objects retain the same distance from the phone and move position in space, whereas if they were true AR they would retain a “fixed” position in space.
Another thing to understand is that not many people use this view, as it complicates catching Pokémon. Many just use the default 3D view that keeps the Pokémon in view at all times.

What we will use

This precedent will be used as a basis and inspiration for way-finding and for the AR instance that uses the phone’s gyro sensor.

However, for our map we are leaning towards a 2D style, as opposed to the 3D style of Pokémon Go. We also need to be careful not to make the visuals too much like Pokémon Go, as it is a well-known application, but we can use it to inform the mechanics of the way-finding, as these are already well known.

Precedent – WallaMe

What is it

WallaMe is a free iOS and Android app that allows users to hide and share messages in the real world using augmented reality.

Users can take a picture of a surface around them and write, draw and add stickers and photos on them. Once the message (called Wall) is completed, it will be geolocalized and will remain visible through WallaMe’s AR viewer by everyone passing by. A Wall can also be made private, thus becoming visible only to specific people.

All the Walls created worldwide can be seen in a feed similar to those of social networks like Facebook and Instagram, and can be liked, commented on, and shared outside the app.

WallaMe is mostly used to create digital graffiti and for proximity messaging.

Source: https://en.wikipedia.org/wiki/WallaMe

 

How it works

WallaMe allows you to create your own markers based on photos and map AR instances to them.

Content can be in the form of:

  • Images
  • Doodles (drawn using the app)
  • Text

An example video can be found at the source link below.

Source: http://sites.gsu.edu/cetl/2016/07/28/cool-tools-wallame/

 

What we will use

The ability to have user-created content is exciting, as users could leave their own messages/marks on the overall experience.

Other interactions, such as the ability to doodle or place content in space, could be very powerful.

One limitation is that the app can only see one AR instance at a time, which is chosen by the user.