After developing the experience in Unity 3D, I began user testing it with students and faculty on-site at the library, using a talk-aloud protocol and rigorous note-taking.
Observation: Users didn't intuitively understand the walkway concept. Most didn't know to stand on it at first; some interpreted it as a wall they shouldn't step on, and others became distracted by it and missed important content that appeared on the exhibit case.
Iteration: We removed the walkway and replaced it with instructions at the start explaining that each side had different content. The instructions were text-based and used a miniature model of the exhibit case to show where people needed to stand.
Observation: Some users thought the floating footprint signs meant something was happening and waited to be shown something.
Iteration: The floating signs were removed and replaced with arrows that pointed to the special stepping spots only if users hadn't stepped on them after 15 seconds. The teleportation metaphor didn't communicate either; it was replaced with an immediate reaction, similar to stepping on a button. In addition, audio cues were added to signal getting close to a button, stepping on a button, and stepping off a button.
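The delayed-hint behavior described above boils down to a small per-spot timer. A minimal sketch of that logic, written in Python for readability (the app itself was built in Unity/C#; the 15-second delay comes from the design, but the `SteppingSpot` name and the commented `play_audio` hooks are illustrative assumptions):

```python
HINT_DELAY = 15.0  # seconds a spot sits unused before its arrow hint appears


class SteppingSpot:
    """Tracks one stepping spot: idle timer, arrow hint, and occupancy."""

    def __init__(self):
        self.idle_time = 0.0
        self.arrow_visible = False
        self.occupied = False

    def update(self, dt, user_on_spot):
        """Call once per frame with the frame time and whether a user is on the spot."""
        if user_on_spot:
            if not self.occupied:
                self.occupied = True
                self.arrow_visible = False  # hint no longer needed
                # play_audio("step_on")    # audio cue: stepping on the button
            self.idle_time = 0.0
        else:
            if self.occupied:
                self.occupied = False
                # play_audio("step_off")   # audio cue: stepping off the button
            self.idle_time += dt
            if self.idle_time >= HINT_DELAY:
                self.arrow_visible = True  # arrow appears after 15 idle seconds
```

In the Unity version this per-frame update would live in a `MonoBehaviour.Update`, but the timing logic is the same.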
Observation: Users didn't understand why they needed to look at a tracking marker to initially align the virtual and real worlds.
Iteration: This observation was expected and confirmed our predictions. Because this was an early iteration, and it's best to test early and often, it was acceptable for some features not to be fully implemented. In the next iteration we replaced the traditional marker with an image of a portal labeled "Start", aiming to convey that users are entering the experience.
Observation: The text instructions still didn't communicate well.
Iteration: Upon seeing the instructions at the start of the exhibit, most users immediately looked for the AR content before it actually appeared. In the next iteration the instructions appear during the experience instead of being frontloaded. This also helps direct users from side to side in the intended order.
Observation: Users saw the arrows but didn't look down to see the footprint circles.
Iteration: I expected the arrows above each circle to lead users to notice them, but it didn't work consistently. The main issue with the footprint circles is that the HoloLens's field of view cuts them off because they sit so low to the ground, making them hard to notice. I revisited the findings from the observational study and developed a new interaction system around them: rather than buttons, the new iteration simply detects whether a user is close enough to a specific edge of the exhibit case and triggers the content for that side.
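The proximity trigger that replaced the floor buttons can be sketched as a point-to-segment distance check on the floor plane. A language-neutral Python sketch (the real implementation is Unity/C#, likely using colliders; the 0.6 m threshold and the edge names are illustrative assumptions):

```python
import math

TRIGGER_DISTANCE = 0.6  # meters from a case edge that activates its content


def distance_to_edge(p, a, b):
    """Shortest distance from floor point p to the segment from a to b (2D)."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    # Project p onto the segment and clamp to its endpoints.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)


def active_side(user_pos, edges):
    """Return the name of the first case edge the user is close enough to, or None."""
    for name, (a, b) in edges.items():
        if distance_to_edge(user_pos, a, b) <= TRIGGER_DISTANCE:
            return name
    return None
```

In Unity this could instead be a trigger collider per side of the case, but the underlying idea is the same: each side's content activates purely from the user's position, with nothing to look down at.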
Observation: The environment could change and make it challenging to demo the app.
Iteration: When it's too sunny, the library's large windows can make it difficult for users to see AR content on the HoloLens's additive display. This was handled by angling some of the content lower to counteract glare. Another challenge was quickly restarting the app when testing it repeatedly with users. This was solved by resetting the experience whenever the HoloLens was set down and not moved at all for a few seconds.
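The set-down reset amounts to watching the head pose for a stretch of near-zero movement. A minimal sketch of that check, again in Python for clarity (the actual app reads the HoloLens head pose each frame in Unity; the movement epsilon and the reset delay are illustrative assumptions):

```python
import math

MOVE_EPSILON = 0.005  # meters of frame-to-frame movement still counted as "still"
RESET_AFTER = 3.0     # seconds of stillness before the experience resets


class IdleResetMonitor:
    """Signals a reset when the headset position stays still long enough."""

    def __init__(self):
        self.last_pos = None
        self.still_time = 0.0

    def update(self, pos, dt):
        """Feed the current head position each frame; returns True when it's time to reset."""
        if self.last_pos is not None:
            moved = math.dist(pos, self.last_pos)
            if moved < MOVE_EPSILON:
                self.still_time += dt
            else:
                self.still_time = 0.0  # any real movement cancels the countdown
        self.last_pos = pos
        return self.still_time >= RESET_AFTER
```

A worn headset never holds perfectly still, so only a headset sitting on a table accumulates enough still time to trigger the reset, which is what makes this a convenient hands-off restart between test sessions.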
Observation: Users really enjoyed the 3D models and additional media.
Iteration: In the next iteration I replaced some of the plain informational text with audio clips from a voice actor impersonating Washington Roebling. I also updated the models and lighting in the scene, since so many users enjoyed walking around to look at them.