Category Archives: Site Visit

NFT/APP Field Test 2

I was out for a walk on Friday the 16th and decided to test the application at the Katherine Mansfield Sculpture. Location services worked well and mapped my position accurately (screenshot below).

[Image: IMG_0265]

The initialisation of an NFT on the sculpture itself worked, but it took too long and I needed to adjust my viewing angle to make it work. Once it did, the tracking was great and worked seamlessly.

For our next test we should work on the initialisation of the AR.

Unable to Dynamically Add Markers (Limitation of ARToolKit 5)

When developing mobile applications it's important to be aware of how intensive your code is and to optimise it wherever possible. One of the optimisations I have been looking into is dynamically loading NFT markers, so as not to create a huge wait when the application first loads.

From testing I have deduced that each NFT marker added to the scene adds around 1.5-2.5 seconds of load time. This may not seem like much, but we are planning on having 3 NFTs for each writer, and we may end up with more than 3 writers in total, so those numbers quickly add up.
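As a rough worked example of how those load times accumulate (the per-marker figures are from my testing above; the writer counts are just illustrative):

```python
# Estimated startup cost of loading all NFT markers up front,
# using the measured 1.5-2.5 s per marker.
PER_MARKER_MIN, PER_MARKER_MAX = 1.5, 2.5
MARKERS_PER_WRITER = 3

for writers in (1, 3, 5):
    markers = writers * MARKERS_PER_WRITER
    lo = markers * PER_MARKER_MIN
    hi = markers * PER_MARKER_MAX
    print(f"{writers} writer(s): {markers} markers -> {lo:.1f}-{hi:.1f} s at startup")
```

Even at 3 writers that is already 13.5-22.5 seconds of waiting before the app is usable, which is what motivates loading markers per writer instead.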

Unfortunately ARToolKit 5 doesn't allow for dynamic NFT loading, so after some testing I have come up with a clever way to get around the problem. Instead of loading a new scene each time the player switches from MapView to ARView (or vice versa), I am instancing an ARController for each writer and only stopping and starting it when the user needs that writer's specific NFTs. Put simply, this means the software only loads the NFTs relevant to the writer at the user's location. It also allows for very quick transitions between MapView and ARView by utilising camera culling masks.
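The logic can be sketched roughly as below. This is not the actual Unity/C# code — the real project uses ARToolKit's ARController component, and all class and method names here are hypothetical — it just illustrates the one-controller-per-writer stop/start idea.

```python
# Sketch of the per-writer controller workaround: one controller per writer,
# only the controller for the user's current location is running.

class WriterARController:
    """Holds one writer's NFT markers; started only when needed."""

    def __init__(self, writer, nft_markers):
        self.writer = writer
        self.nft_markers = nft_markers
        self.running = False

    def start(self):
        # Loading this writer's NFTs (1.5-2.5 s each) happens here, so the
        # cost is only paid when the user is actually at this location.
        self.running = True

    def stop(self):
        self.running = False


class ARSession:
    """Keeps one controller per writer; at most one runs at a time."""

    def __init__(self, controllers):
        self.controllers = {c.writer: c for c in controllers}
        self.active = None

    def enter_ar_view(self, writer):
        if self.active is not None and self.active.writer != writer:
            self.active.stop()
        self.active = self.controllers[writer]
        self.active.start()
```

In the Unity version the MapView/ARView switch itself is handled with camera culling masks rather than scene loads, which is what makes the transition near-instant.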

Precedent – Pokémon GO

What is Pokémon GO?

Travel between the real world and the virtual world of Pokémon with Pokémon GO for iPhone and Android devices! With Pokémon GO, you’ll discover Pokémon in a whole new world—your own! Pokémon GO uses real location information to encourage players to search far and wide in the real world to discover Pokémon.

The Pokémon video game series has used real-world locations such as the Hokkaido and Kanto regions of Japan, New York, and Paris as inspiration for the fantasy settings in which its games take place. Now the real world is the setting!

The Pokémon video game series has always valued open and social experiences, such as connecting with other players to enjoy trading and battling Pokémon. Pokémon GO’s gameplay experience goes beyond what appears on screen, as players explore their neighbourhoods, communities, and the world they live in to discover Pokémon alongside friends and other players.

Pokémon GO is developed by Niantic, Inc. Originally founded by Google Earth co-creator John Hanke as a start-up within Google, Niantic is known for creating Ingress, the augmented reality mobile game that utilizes GPS technology to fuel a sci-fi story encompassing the entire world. Ingress currently has 12 million downloads worldwide.


How does it work from our perspective?


[Image: IMG_0248]

Way-finding works by using the phone's location services to determine the user's position in space. A radar is then used to activate points of interest around the user, such as PokéStops and Pokémon encounters. The player is represented by an avatar on the map itself.
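The activation logic amounts to a simple proximity check against the user's GPS position. A minimal sketch (the 40 m radius and the POI data are illustrative assumptions, not values from either app):

```python
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine)."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def active_pois(user_lat, user_lon, pois, radius_m=40.0):
    """Return the points of interest within the activation radius."""
    return [p for p in pois
            if distance_m(user_lat, user_lon, p["lat"], p["lon"]) <= radius_m]
```

For our app the same check would decide which writer's location (and hence which ARController) is relevant to the user.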

AR View

[Image: IMG_0249]

In my opinion Pokémon GO is not true AR. It uses the camera view with a UI overlay that is positioned using the phone's gyro / accelerometer. This creates the illusion of a virtual object in physical reality, but the illusion can be debunked by moving the phone around in space: the virtual objects retain the same distance from the phone and move position in space, whereas true AR objects would retain a "fixed" position in space.
Another thing to understand is that not many people use this view, as it complicates catching Pokémon. Many just use the default 3D view, which keeps the Pokémon in view at all times.
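The difference between the two behaviours can be shown with a tiny numeric sketch (positions reduced to one dimension for clarity; the 2 m offset and 5 m anchor are illustrative):

```python
# Gyro overlay vs true AR, in 1D. The overlay pins the object at a fixed
# offset from the phone; true AR pins it to a world coordinate.

OFFSET = 2.0        # overlay: object is always 2 m "in front of" the phone
WORLD_ANCHOR = 5.0  # true AR: object fixed at world position 5 m

def overlay_object_pos(phone_pos):
    # Moves whenever the phone moves -> distance to phone never changes.
    return phone_pos + OFFSET

def true_ar_object_pos(phone_pos):
    # Ignores the phone entirely -> apparent distance changes as you walk.
    return WORLD_ANCHOR
```

Walk the phone forward 3 m and the overlay object moves from 2.0 to 5.0 with it, while the true AR object stays put at 5.0 — which is exactly the test described above for debunking the illusion.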

What we will use

This precedent will be used as a basis and inspiration for way-finding and for the AR instance using the phone's gyro sensor.

However, for our map we are leaning towards a 2D style, as opposed to the 3D style of Pokémon GO. We also need to be careful not to make the visuals too much like Pokémon GO, as it is a well-known application, but we can use it to inform the mechanics of our way-finding, as these are already familiar to users.

Developer Meeting

Today Jeff and I caught up with Seb in Evans Bay to discuss the CMS in relation to the app we are developing. We mainly wanted to gauge whether our Unity game engine could talk to his server, which he thinks it can. We plan to get a test of this working by next week.

We also may not be using XML files, as they do not work well with his SilverStripe CMS. The file type we may be using instead is SilverStripe's .ss format.


NFT Field Test 1

Jeff and I went for a walk to test out his fixed NFT marker. To create it he adjusted the feature points that the software tracks so they cover only the text (and not the texture of the stone).

We developed a simple scene in which we could test the tracking and see if it worked on the actual typographic sculpture.

The result, seen below, worked well and tracked nicely using three lines of text on the sculpture.

[Image: UNADJUSTEDNONRAW_thumb_a8]

We also had a look for other natural features we could track. One of the points of interest we found was a set of hidden QR codes, which we later scanned and found to be markers for orienteering on the Waterfront.


We will also need to check the typographic sculptures in different conditions if possible; below is the James K Baxter sculpture at high tide, which would not be ideal for NFT tracking at all.


NFT Marker Clean Up

[Image: Screen Shot 2016-12-12 at 5.01.54 PM]

This top photo shows a marker before clean-up. You can see that the track points are chaotic and, although they follow the text, contain a lot of "noise". To fix this I decided to digitally paint out the noise and focus only on the text. This way I would "help" the software realise what was important to track.

[Images: Screen Shot 2016-12-06 at 4.28.01 PM, Screen Shot 2016-12-06 at 4.27.39 PM]

Here is my first attempt at that, using only one line. It worked well, but I realised that one line of text didn't offer enough tracking data to get a clean track, so I went up to three lines. It's important to note that I chose the bottom three lines, the three that would be closest to the user. This worked very well and is definitely usable with some further refinement. Those refinements are:

  • changing the angle of the initialisation photo to better fit the point of view of a person looking at the poem
  • reducing the number of words (width) to something that fits into the FOV (field of view) of a phone camera when turned horizontally

The image below shows the marker halfway through the clean-up process.

[Images: KM_Cleanup, Screen Shot 2016-12-12 at 5.03.13 PM]

Creating Feature Sets – What's Best?

A few technical tests were carried out to find the appropriate number of track points to extract when converting an image into a natural feature set.

ARToolKit allows you to specify a few different parameters to customise the output. The first parameter is DPI. It's important to note that the higher the DPI, the larger the feature set and the slower it is to load. It should also be noted that mobile cameras have a maximum effective resolution (DPI), so it is redundant to go above it. With all of these factors considered, and after testing, the best DPI was 150.

The second parameter is the initialisation threshold, which has a range of 0-4: 0 means only a few points have to be detected for the NFT to load and track, while 4 means a high number of points must be detected. I have found that a value of 1 works best here, as it allows users a little bit of give in getting the scene initialised.

The third parameter is the number of track points, which also ranges from 0-4. The best option varies with the image. As a general rule of thumb, if an image has a lot of "noise" then a lower number of track points is recommended, whereas if an image is clean or has been digitally created then a setting of 3 is recommended. For the best results with real-world NFT, I have found that digitally cleaning up the image and then extracting at level 3 works best (see the NFT Marker Clean Up post for more info).
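For reference, the settings that tested best across these posts can be summarised as below. The dict keys are just labels for this write-up, not ARToolKit API names; ARToolKit 5's feature-set generation tooling takes the equivalent values.

```python
# Feature-set generation settings that tested best for our markers.
BEST_NFT_SETTINGS = {
    "dpi": 150,               # higher DPI = bigger, slower feature sets; going
                              # past the phone camera's resolution is wasted
    "init_threshold": 1,      # range 0-4; a low value gives users some leeway
                              # when initialising the scene
    "track_level_clean": 3,   # range 0-4; for clean / digitally painted images
                              # (noisy images want a lower, per-image value)
}
```

The overall recipe, then: digitally clean up the source photo first, generate at 150 DPI, and extract track points at level 3.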