Today I visited the sites of each AR interaction to bug-test aspects of the map design and see what improvements and fixes were needed. Each paragraph refers to the image above it.
Some of the building shapes are misaligned, so this will need to be fixed in the overlay, or we will need to rethink our implementation of buildings via the OpenStreetMap API.
The Solace in the Wind title is pixelated. The icon could also be changed: made 3D, or increased in size.
Small 3D trees were added to this build. Jeff and I like how they add depth to the map, so we will redesign them to suit the aesthetic and revisit them later.
The text in the sea/water looks great and complements the look of the map. The fade effect that Jeff has added on the horizon is also a nice touch.
These tests centred on an overlay that shows the user what to point the camera at to generate the NFT content. The overlay disappears once the NFT has been tracked.
Once tracking works, a small square was programmed to appear; this will be replaced with interactive content now that we know it works. The stingray NFT was successful except for the sizing of the NFT on the UI, which will be amended.
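The overlay-then-content flow above is easy to express as a tiny piece of state. This is only an illustrative Python sketch of the logic (the actual app is built in Unity); the class and method names are hypothetical, not from the project's code.

```python
class NFTOverlay:
    """Minimal state for the camera-guide overlay: the overlay stays
    visible until the NFT marker is tracked, then it is hidden and the
    placeholder AR content (currently a small square) is shown."""

    def __init__(self):
        self.overlay_visible = True
        self.content_visible = False

    def on_tracking(self, marker_found):
        # Called each frame with the tracker's result; the switch is
        # one-way, matching the behaviour described above.
        if marker_found:
            self.overlay_visible = False
            self.content_visible = True
```

The one-way switch means a momentary loss of tracking does not bring the guide overlay back, which matches the described behaviour of the overlay disappearing once the NFT has been tracked.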
The text NFT was unsuccessful, however, although its overlay sizing was better than the stingray's.
Below is the closest we could get to matching the Text NFT to the marker.
For this site visit we aimed to test the functionality of the geolocation tracking, as well as live-test the natural feature tracking at each of the three Writers Walk sculptures (except Bill Manhire's).
At Vincent O’Sullivan’s sculpture we tested the NFT, with a small cube appearing at the bottom of the NFT marker to show it was working. The tracking took a while to initialise, however, so we took another hi-res image of the stingray to make a better NFT marker image for tracking.
We then proceeded to Bill Manhire’s sculpture to test the location tracking. This worked without any problems: the 3D object spun faster the closer we were to the sculpture.
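The distance-to-spin-speed behaviour can be sketched as a simple mapping: compute the great-circle distance from the phone's GPS fix to the sculpture, then interpolate a rotation speed that is highest at the sculpture. This is a hedged Python illustration only; the function names, range, and speed values are assumptions, not taken from the Unity implementation.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def spin_speed(distance_m, max_range_m=100.0, max_speed=360.0, min_speed=30.0):
    """Map user-to-sculpture distance to a rotation speed in deg/s:
    fastest at the sculpture, slowest at (or beyond) the range edge.
    The 100 m range and the speed limits are illustrative values."""
    t = min(max(distance_m / max_range_m, 0.0), 1.0)  # 0 at sculpture, 1 at edge
    return max_speed - t * (max_speed - min_speed)
```

Clamping the interpolation factor keeps the speed well defined when the user is outside the tuned range, so the object never stops or spins backwards.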
Our next stop was Katherine Mansfield, and the geolocation tracking worked well with this sculpture too. The pop-up that says "augmented view" is what the user will tap to initiate the AR views (360 and NFT).
The NFT AR also worked once initialised, but this still needs work, so as before we have retaken some NFT images to revisit and refine this tracking. Overall this site visit yielded some successful testing and has given us information to build on from this point onwards.
Here is a screenshot of the dynamic rope script with the 360 AR view.
The anchor box is a placeholder and will be swapped for a knob-like model that can rotate left or right to "reel" the rope, bringing the content closer to the user or pushing it further away. This allows for an interactive zoom and spatial-organisation mechanic.
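The reel mechanic described above boils down to converting the knob's cumulative rotation into a clamped rope length. Below is a hedged Python sketch of that mapping; the parameter names and the metres-per-turn, minimum, and maximum values are illustrative assumptions, not values from the actual rope script.

```python
def rope_length(knob_angle_deg, min_len=0.5, max_len=5.0, metres_per_turn=1.0):
    """Convert cumulative knob rotation into a rope length in metres.

    Turning one way (positive angle) reels the content in; turning the
    other way lets it back out. The result is clamped so the content
    can never be reeled through the user or drift out of reach."""
    length = max_len - (knob_angle_deg / 360.0) * metres_per_turn
    return min(max(length, min_len), max_len)
```

Driving the content's distance from a single accumulated angle keeps the mechanic stateless per frame, which would also make it easy to later add an offset from a wind-direction source for the "nice to have" behaviour.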
The "nice to have" for this element would be tying it into a wind-direction API so that the ropes would be pushed in the direction of the wind.
I was out for a walk on Friday 16th and decided to test the application at the Katherine Mansfield sculpture. The location services worked well and mapped my position accurately (screenshot below).
The initialisation of an NFT on the sculpture itself worked, but took too long, and I needed to adjust my viewing angle to make it work. Once it did, however, the tracking was great and worked seamlessly.
For our next test we should work on the initialisation of the AR.
Today Jeff and I caught up with Seb, in Evans Bay, to discuss the CMS in relation to the app we are developing. We mainly wanted to gauge whether our Unity game engine could talk to his server, which he thinks it can. We plan to get a test of this working by next week.
We also may not be using XML files, as they do not work well with his SilverStripe CMS. The file type we may be using is called ssh.
Jeff and I went for a walk to test out his fixed NFT. To do this he adjusted the feature points the software tracks so they cover only the text (and not the texture of the stone).
We then developed a simple scene in which we could test the tracking and see if it worked on the actual typographic sculpture.
The result, seen below, worked well and tracked nicely using three lines of text on the sculpture.
We also had a look for other natural features we could track. One of the points of interest we found was a set of hidden QR codes, which we later scanned and found were markers for orienteering on the waterfront.
Furthermore, we will need to check the typographic sculptures in different conditions if possible, as below is the James K Baxter sculpture at high tide. This would not be ideal for NFT tracking at all.
This top photo demonstrates an uncleaned-up marker. You can see that the track points are chaotic and, although they follow the text, have a lot of "noise". To fix this I decided to digitally paint out the noise and focus only on the text. This way I would "help" the software realise what was important to track.
Here is my first attempt at that, using only one line. It worked well, but I realised that one line of text didn't offer enough tracking data to get a clean track, so I went up to three lines. Important to note here is that I chose the bottom three lines – the three that would be closest to the user. This worked very well and is definitely usable with some further refinement. Those refinements are:
- changing the angle of the initialising photo to better fit the point of view of a person looking at the poem
- reducing the number of words (width) to something that fits into the FOV (field of view) of a cellphone camera when turned horizontal
The image at the top shows the halfway point of the cleanup process.
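The "paint out the noise" step above is, in effect, masking every pixel outside the text region so the feature detector has nothing else to latch onto. A minimal Python sketch of the idea, treating the marker as a greyscale pixel grid; the function name and the row-based mask are illustrative assumptions (the real cleanup was done by hand in an image editor).

```python
def isolate_text(image, text_rows):
    """Paint out everything except the given rows (the lines of text):
    pixels outside the text region are set to white so the NFT trainer
    finds feature points only on the text, not the stone texture.

    `image` is a list of rows of greyscale values (0-255);
    `text_rows` is the set of row indices that contain text."""
    white = 255
    return [
        row if r in text_rows else [white] * len(row)
        for r, row in enumerate(image)
    ]
```

Blanking the background to a flat value removes its gradients entirely, which is why the cleaned marker produces far fewer stray track points than the raw photo.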
Precedent- Pokemon Go
Augmented Reality 360 Testing