This project, developed with art historian Glenn Gunhouse, was a follow-up to a previous experiment that tested whether virtual spaces could be made accessible to the blind using audio cues keyed to the position of a head-mounted tracker. One outcome of that earlier experiment was the discovery that the blind test subjects would have preferred to interact with the virtual space through a hand-held device comparable to a physical cane. With the release of the Vive and Oculus hand controllers, it became possible to construct such a device. The project involved writing the code that lets a Vive or Oculus hand controller cast a ray (like a hand-held laser beam) until it hits an object, then play a spatial sound from the point of intersection with that object. Clicking a button triggers the spoken name of any clickable object the virtual cane happens to be touching. SIF fellows who worked on the project gained competence in Unity scripting and in applying coding skills to real-world problems.
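The core of the virtual cane is ray-object intersection: find where the controller's ray first hits an object, and emit the spatial sound from that point. In the actual project this would be handled by Unity's physics raycasting; the sketch below is a hypothetical, self-contained illustration of the underlying geometry in Python, using a sphere as a stand-in for a scene object (the function name and tuple-based vectors are assumptions, not the project's code).

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the nearest point where a ray hits a sphere, or None.

    origin/direction model the cane's controller pose; the returned
    point is where a spatial sound source would be placed.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = center
    # Vector from sphere center to ray origin
    fx, fy, fz = ox - cx, oy - cy, oz - cz
    # Coefficients of the quadratic |origin + t*direction - center|^2 = r^2
    a = dx * dx + dy * dy + dz * dz
    b = 2 * (fx * dx + fy * dy + fz * dz)
    c = fx * fx + fy * fy + fz * fz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the object entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    if t < 0:
        return None  # object is behind the controller
    return (ox + t * dx, oy + t * dy, oz + t * dz)

# Cane pointing along +z at a unit sphere centered 5 units away:
hit = ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)
# hit is the sphere's near surface point, (0.0, 0.0, 4.0)
```

In Unity itself the equivalent step would be a single `Physics.Raycast` call whose `RaycastHit.point` gives the sound position, so this arithmetic never has to be written by hand; the sketch only shows what that call computes.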

The Archive is being developed by Elon Lang, Robin Wharton, and other Hoccleve scholars, with help from the Texas Digital Library, Texas ScholarWorks (the University of Texas digital repository), UT LAITS and the LAITS Media Development Lab, and our Student Innovation Fellows.

Project Team

Thomas Breideband
Sydney Mathis Adams
Zane Blalock
Wasfi Momen