Category Archives: Documentation


-Generates a poem as they walk? The closer they get, the more the found poem builds up.

-User testing: who can we test with? We should look into this.

-Construct user-generated poems by jumbling up the original words of the poems they are near
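The jumbling idea above can be sketched quickly. This is a minimal Python mock-up (the app itself is built in Unity, so this is illustrative only, and the function name `jumble_poem` is ours, not from the app):

```python
import random

def jumble_poem(poem_text, line_length=5, seed=None):
    """Build a new 'found poem' by shuffling the words of a source poem
    and regrouping them into lines of a fixed number of words."""
    rng = random.Random(seed)  # seeded so a result can be reproduced
    words = poem_text.split()
    rng.shuffle(words)
    # Regroup the shuffled words into lines of `line_length` words each.
    lines = [" ".join(words[i:i + line_length])
             for i in range(0, len(words), line_length)]
    return "\n".join(lines)
```

The same words survive the jumble, so the new poem stays tied to the original; only the order (and therefore the meaning) changes.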

AR Concepts

We should aim to populate the world with new virtual objects that are not possible in the physical world, creating a new experience. This experience should be interactable. We should also play with typography.

-Words become the landscape

-Words interactable: able to jumble them up / rearrange them, create your own poem

-Poetry forming into new poems, shimmering

-Concrete poetry: forming words into visual shapes.

Poetry in which the meaning or effect is conveyed partly or wholly by visual means, using patterns of words or letters and other typographical devices.

We need to remember that difficult technology is not always easy to understand, but we shouldn't treat people like they don't know anything just to make it easier.


UI Redesign

Thanks to Seb for this. The new icons are bolder and more appropriate for mobile.


The final app could be put forward as a Digital Futures Proposal



Jeff’s plan to give the user some way finding is as follows:

There is an arrow at the base of the user icon; it points towards the writer the user wants to go and see (based on their location data, longitude and latitude).

Users can pick which writers they want to get directions to via a tab on the main map screen; all writers will be able to be selected at all times.

We believe this will allow for easy navigation both far away from and in close proximity to the writer locations. This solution is also scalable, so we can easily add more writers (right now we just have three).
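The maths behind the arrow is just the initial great-circle bearing from the user to the writer's sculpture. A small Python sketch of that calculation (the app is in Unity, so this is a stand-in; `bearing_to` is our name for it):

```python
import math

def bearing_to(user_lat, user_lon, target_lat, target_lon):
    """Initial compass bearing, in degrees clockwise from north, from the
    user's position to a target position (standard great-circle formula)."""
    phi1 = math.radians(user_lat)
    phi2 = math.radians(target_lat)
    dlon = math.radians(target_lon - user_lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    # atan2 gives -180..180; normalise to a 0..360 compass bearing.
    return (math.degrees(math.atan2(x, y)) + 360) % 360
```

Subtracting the device's compass heading from this bearing would give the angle to rotate the on-screen arrow.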

AR Content Concept

This AR content would be generated 360 degrees around the user.

A road would circle around the user in the centre, and a car would drive around that road. Like a pop-up book, 3D assets such as trees and houses would appear next to this car (1). Then, as the car moved into the city, denser 3D assets and buildings would appear (2).

This is to simulate the car driving into Wellington. Hills and landmarks such as the Beehive could be created to symbolise this.


These are notes taken during the meeting and are published here to keep a record.

Icons – Icons should be added to the map, designed to suit the target audience of teenagers to young adults. This could include coffee shops, skate parks, and other activities/locations that would appeal to this audience.

Signal Icon – Could we define this better? We are using a recognised icon (in terms of cellphone reception), but we will need to test whether this translates for the user in the future.

Wayfinding – If users are not at the waterfront they will need directions. We discussed the options of an arrow or guiding lines that snake through the streets. How accurate do we need it to be? Side note: do we fill the user's time with audio recordings as a playlist? Could we do this when they are far away, to fill their journey time?

Radar Ping – Adding a radar around the user / the points of interest could help the user know whether they are in range of the experience (like Pokémon Go).
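The "in range" check behind a radar ping like this is just a distance threshold on the geolocation data. A Python sketch using the haversine formula (illustrative only; `distance_m`, `in_range`, and the 30 m radius are our assumptions, not values from the app):

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/long points
    (haversine formula)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_range(user, poi, radius_m=30):
    """True if the user is within radius_m metres of a point of interest."""
    return distance_m(user[0], user[1], poi[0], poi[1]) <= radius_m
```

The same distance value could drive the radar's visual feedback, e.g. pinging faster as the user closes in on a sculpture.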

Information fields for each writer?


AR Content

-Visual representations of the poem?

-Transforming typography: the words of the poem animate around the user?

-Creating a library of poems based off one poem, like a book that the user can page through, based on the sculpture / in the same style as the sculpture?





Site Visit Thursday 9th

Today I visited the sites of each AR interaction to bug-test aspects of the map design and see what improvements and fixes needed to be made. Each paragraph refers to the image above it.


Some of the building shapes are misaligned, so this will need to be fixed in the overlay, or we will need to rethink our implementation of buildings from the OpenStreetMap API.


The Solace in the Wind title is pixelated. The icon could also be changed: made 3D, or increased in size.


Small 3D trees were added to this build. Jeff and I like how they add depth to the map, so we will redesign them to suit the aesthetic and revisit them later.


The text in the sea/water looks great and complements the look of the map. The fade effect that Jeff has added on the horizon is also a nice touch.


Research Bazaar Presentation 2017


Jeff and I spoke at ResBaz 2017, in Victoria University's Kelburn Hub, about this project thus far. Our presentation and transcript can be found below.


Slide 1  

Introduction

I’m Jeff, and I’m Jono. We are both postgraduate students from the Victoria University School of Design. We are in the process of creating a mobile application for the Wai-Te-Ata Press that we like to call a Literary Atlas of Wellington.

Our brief was to create an application that augments the Victoria University Library Collections into physical space, making their content interactive for public consumption. So far through our development process we have completed an augmented reality mobile platform which will host these interactions.

Our Literary Atlas app will allow users to walk to locations along the Wellington Waterfront in order to gather information about well-known Victoria University Writers from their sculptures on the Wellington Writers Walk.  This is achieved through a combination of Geo Location Tracking and Augmented Reality.


Slide 2 

Why Augmented Reality?

We chose to use augmented reality technology as it is a powerful new way of communicating ideas effectively. Think technology like the holograms from Star Wars.

For those who don’t know, augmented reality, or AR, is defined as having the following three properties:

– Combines real and virtual objects in a real, physical environment

– Runs in real time

– Aligns real and virtual objects with each other.

In our application we are using video AR, in which virtual images are overlaid on a video feed of the real world on a mobile device. This is the same concept that Pokémon Go uses.

However, what sets us apart is outdoor natural feature tracking (NFT) through ARToolKit 5, which uses natural feature points to position virtual objects onto real-world surfaces, combined with geolocation positioning (GPS) to understand which writer the user is near and therefore initiate the corresponding AR experiences.


Slide 3 (60 sec)

What is ARToolKit 5? NFTs

How are we making it?

AR toolkit 5


Open Street Maps

ARToolKit 5 is an open source project that can run in Unity’s development environment. It gives developers a multitude of functionality for developing AR applications right out of the box, and because it is open source it allows you to examine the code and customise your own features as well.

Unity 5 is a game engine that allows you to write code once and port it to many different devices. It also comes with a range of functionality to help speed up development and is pretty user-friendly.

And we use OpenStreetMap for pulling in real-world geometry data and constructing a mesh that the user can then navigate around based on their geolocation. We have also implemented functionality that allows us to position things on that map based on their geolocation in the real world.
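Positioning things on the map mesh by geolocation boils down to projecting latitude/longitude into flat local coordinates around a map origin. A Python sketch of one simple way to do this (our app does it in Unity; the equirectangular approximation and the name `geo_to_local` are our assumptions here):

```python
import math

EARTH_RADIUS_M = 6371000  # mean Earth radius in metres

def geo_to_local(lat, lon, origin_lat, origin_lon):
    """Project a lat/long point into flat x/z metres relative to a map
    origin, using an equirectangular approximation. Accurate enough over
    the few kilometres a city map mesh covers."""
    # Longitude lines converge toward the poles, so scale x by cos(lat).
    x = (math.radians(lon - origin_lon)
         * EARTH_RADIUS_M * math.cos(math.radians(origin_lat)))
    z = math.radians(lat - origin_lat) * EARTH_RADIUS_M
    return x, z
```

Both the building geometry pulled from OpenStreetMap and the user's live GPS fix can be run through the same projection, so everything lands in one consistent coordinate space on the mesh.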

One aspect of ARToolKit 5’s functionality, as Jono has already touched on, is NFTs: the ability to turn images of real-world things into natural feature points and then recognise those points in real time through the mobile device’s video feed. We can then display virtual scenes that appear to be tracked to physical objects.

As well as scenes tracked in space, we are using the gyro and accelerometer to give users instanced AR (a scene that surrounds a static point). This allows the user to have a full 360° experience, but does not allow virtual objects to “stick” or be tracked in physical space.


Slide 4 

Cultivate a habit of mistakes and become a connoisseur of them – this is our design ethos

It’s important to consider app development in terms of an evolutionary process. It’s full of trial and error. This is why frequent testing becomes an essential part of development: testing whether it still compiles and runs on your mobile device, testing whether the UI is best suited to a mobile screen size, testing how interactions work.

Testing usually highlights problems or errors, and if you test frequently you are able to isolate these issues rather than having to sift through a multitude of changes to find the cause.

It allows your development team to find mistakes, understand them and then implement improved changes that are informed by them.

You should also consider and use other apps that relate to yours. Analyze them and carefully critique them. Ask yourself what makes them good and what makes them bad and how you can apply these insights to your app.


Slide 5

Our next steps are as follows:

We are about to develop augmented reality experiences into the application based on our writers’ poems. These will be virtual representations of the poems and will be constructed to embody the main themes and ideas of each poem.

We have also begun the process of incorporating a function which allows users to write their own poems and add them to a database for other users to enjoy.

If you would like to follow our development process or see in more detail what we have accomplished thus far please visit our development blog by following the link on the slides.
Thank you


Site Visit Feb 10 NFT Overlay Tests

These tests were based around the creation of an overlay which specifies to the user what to focus the camera on to generate the NFT content. The overlay disappears once the NFT has been tracked.

Once the tracking works, a small square was programmed to appear; this will be replaced with interactive content now that we know it works. This stingray NFT was successful, except for the sizing of the NFT overlay on the UI. This will be amended.

The text NFT was unsuccessful, however, although the overlay’s sizing was better than the stingray’s.

Below is the closest we could get to matching the Text NFT to the marker.




For this site visit we aimed to test the functionality of the geolocation tracking, as well as live testing the natural feature tracking at each of the three Writers Walk sculptures (except Bill Manhire).


At Vincent O’Sullivan’s sculpture we tested the NFT, with a small cube appearing at the bottom of the NFT marker to show it was working. It did take a while for the tracking to work, however, so we took another hi-res image of the stingray to make a better NFT marker image for tracking.


We then proceeded to Bill Manhire’s sculpture to test the location tracking. This worked without any problems (the 3D object spun faster depending on how close we were to the sculpture).


Our next stop was Katherine Mansfield, and the geolocation tracking worked well with this sculpture too. The pop-up that says “augmented view” is what the user will tap to initiate the AR views (360 and NFT).


The NFT AR also worked once initialised, but this still needs work, so as before we have retaken some NFT images to revisit and refine this tracking. Overall this site visit yielded some successful testing and has given us information to build on from this point onwards.