My capstone project involves the development of a dynamic multisensory interface to provide accessible biological diagrams for blind and low-vision students.
TorchLight©: "An Assistive Tool to Locate People and Objects with a Multimodal Thermogram Interface"
In the late summer of 2016, I helped run a study with both sighted and blind participants evaluating a real-time audio route-description interface supplemented with location-specific iBeacon information.
Experimental Evaluation of Virtual Reality Experiences As a Supplemental Treatment in Patients with Seasonal Affective or Generalized Anxiety Disorder
The Cognitive Map Transfer Study aims to explore how technology can help blind or visually impaired people navigate an indoor space.
I collaborated with VEMI co-worker Sam Gates to develop a haptic scene-access interface, with the goal of determining a usable design for locating objects in a scene relative to the user's position. The interface used a head-mounted tracking system and an external server to provide real-time updates on object locations relative to the user's position and head rotation.
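The core computation such an interface needs is the bearing of each object relative to the user's current head direction, so the system knows whether a target is ahead, to the left, or to the right. The sketch below is purely illustrative (the actual VEMI implementation is not described here); it assumes a flat 2D coordinate frame, yaw measured in degrees clockwise from the +y axis, and positions supplied by a hypothetical tracking server.

```python
import math

def relative_bearing(user_pos, head_yaw_deg, obj_pos):
    """Bearing of an object relative to the user's head direction.

    user_pos, obj_pos: (x, y) coordinates in the same planar frame
    (assumed to come from the head tracker and object database).
    head_yaw_deg: head rotation in degrees, 0 = facing +y, clockwise positive.
    Returns degrees in (-180, 180]; negative means the object is to the left.
    """
    dx = obj_pos[0] - user_pos[0]
    dy = obj_pos[1] - user_pos[1]
    # Absolute bearing of the object, measured clockwise from the +y axis.
    absolute = math.degrees(math.atan2(dx, dy))
    # Wrap the difference into (-180, 180] so "slightly left" stays negative.
    rel = (absolute - head_yaw_deg + 180.0) % 360.0 - 180.0
    return 180.0 if rel == -180.0 else rel

# Example: user at the origin facing +y, object directly to the right.
print(relative_bearing((0, 0), 0.0, (1, 0)))   # 90.0 (to the right)
print(relative_bearing((0, 0), 90.0, (1, 0)))  # 0.0 (straight ahead)
```

A real-time loop would recompute this on every head-pose update from the tracker and map the result to haptic or audio cues.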