My capstone project involves the development of a dynamic multisensory interface that provides accessible biological diagrams for blind and low-vision students.
TorchLight© — “An Assistive Tool to Locate People and Objects with a Multimodal Thermogram Interface”
SizeFu – Sizing For You
A mobile application proposal, interaction diagram, and commercial.
In the late summer of 2016, I helped run a study in which both sighted and blind participants evaluated a real-time audio route-description interface supplemented with location-specific iBeacon information.
Mixin’ Pixels is an exhibit designed as a group project for a New Media Project Design Workshop.
The Cognitive Map Transfer Study aims to explore how technology can help blind or visually impaired people navigate an indoor space.
I was part of a group challenged to create a mobile navigation interface for travelers with non-functional vision. We focused on indoor navigation within hotels and designed a mobile application built around Bluetooth beacons placed at decision points throughout the hotel.
I collaborated with VEMI co-worker Sam Gates to develop a haptic scene-access interface and determine a usable design for locating objects within a scene relative to the user’s position. The interface used a head-mounted tracking system and an external server to provide real-time updates on object locations relative to the user’s position and head rotation.
A site designed to represent the community homepage of the future.