
#consumer electronics #gaming #human-machine interaction #learning #storytelling #interfaces #social science #machine learning #affective computing #ethics #industry #interactive #computer science #entertainment #art #artificial intelligence #design #creativity #history #technology #archives
Vision Simulation, 2025
Image of a full custom room installation. Components: Herbert Simon’s chairs, Allen Newell’s “standing desk” re-created with cement blocks, multiple embedded audio speakers, 8 empty standard-issue filing cabinets, multiple bulletin boards (with removable facsimile documents from the Newell Simon archives), a PERQ workstation, a Mac Plus, a Commodore PET, an iPad and camera system running custom augmented reality software, two Tektronix oscilloscopes with a live feed of speech-to-text transcripts of artificial intelligence research read by standard text-to-speech software, various period-specific accouterments, plants, a vintage TV monitor playing a video cassette of Herbert Simon lecturing, and a found 1970s chalkboard with writing re-created from a photograph of Newell’s original chalkboard.
Augmenting or expanding sensory experience requires a combination of sophisticated hardware and a deep understanding of the neuroscience underlying human perception. Recent advances in XR pass-through technology have proven to be powerful tools for exploring how the brain perceives the world around us. Such tools hold great potential to shape how cognitive scientists approach vision rehabilitation through the introduction of vision-augmentation systems.
Working with a transdisciplinary team at the MIT Media Lab and the MIT McGovern Institute, I designed five real-time spatial frequency filters (Pierce et al., 2025), each of which removes or enhances certain spatio-temporal frequencies of visual information, enabling wearers to experience new dimensions of perception. A tangible input mechanism let users toggle between the vision simulations, altering their environment and their perception of objects.
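The published filters themselves are not detailed here, so the following is only a minimal sketch of how one real-time spatial frequency filter could work on a single grayscale frame. The function name, the band-pass formulation, and the `ppd` (pixels per degree of visual angle) calibration parameter are illustrative assumptions, not the implementation from the paper.

```python
import numpy as np

def bandpass_filter(image, low_cpd, high_cpd, ppd):
    """Keep only spatial frequencies in [low_cpd, high_cpd] cycles/degree.

    image: 2D grayscale array.
    ppd: display calibration in pixels per degree of visual angle
         (cycles/pixel * pixels/degree = cycles/degree).
    """
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    # Per-axis frequencies of each FFT bin, converted to cycles/degree.
    fy = np.fft.fftshift(np.fft.fftfreq(h)) * ppd
    fx = np.fft.fftshift(np.fft.fftfreq(w)) * ppd
    radius = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)
    # Zero out every frequency component outside the pass band.
    mask = (radius >= low_cpd) & (radius <= high_cpd)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))
```

Applied per frame to an XR pass-through feed, a low band passes only coarse structure (blurred scene) while a high band isolates edges and fine texture.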




Vision Simulations
The limestone of Iron Mountain was re-created in the installation as a relief backdrop for the space, extruded from an image of an actual office at Iron Mountain.
The image was used to generate a 3D digital model, which was cut with a CNC router into a large multi-paneled relief and painted silver to line the back wall.
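As an illustration of the image-to-relief step, here is a minimal sketch that extrudes a grayscale heightmap into a quad-mesh Wavefront OBJ file of the kind CNC toolpath software can consume. The function and its parameters are hypothetical, not the actual production pipeline.

```python
import numpy as np

def heightmap_to_obj(height, path, scale_xy=1.0, depth=10.0):
    """Extrude a 2D heightmap (values in 0..1) into a relief mesh.

    Each pixel becomes a vertex whose z is its brightness times `depth`;
    neighboring pixels are joined into quad faces.
    """
    h, w = height.shape
    with open(path, "w") as f:
        for y in range(h):
            for x in range(w):
                f.write(f"v {x * scale_xy} {y * scale_xy} {height[y, x] * depth}\n")
        for y in range(h - 1):
            for x in range(w - 1):
                i = y * w + x + 1  # OBJ vertex indices are 1-based
                f.write(f"f {i} {i + 1} {i + w + 1} {i + w}\n")
```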

Museum Exhibitions
This project, showcased to hundreds of visitors at the MIT Museum, demonstrated the implications of vision simulations for well-being and vision rehabilitation, exploring the integration of multisensory AI within vision-simulation systems.
Future work will expand this approach toward adaptive, personalized vision-augmentation systems that integrate physiological and affective feedback for clinical and creative applications.

Applications in XR
Viewed through a tablet or the cameras embedded in the room, each document acted as an “image target,” triggering an augmented reality simulation of papers filling the office.
The application was coded to recognize each document and trigger physics-based animations based on the angle and movement of the tablet in the environment.
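The installation’s application most likely used an AR SDK’s built-in image-target tracking; as a toy illustration of the recognition step only, here is a normalized cross-correlation matcher that decides which archival document a pre-aligned, same-size camera frame shows. All names and the threshold are assumptions for illustration.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two same-size grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def recognize_target(frame, templates, threshold=0.8):
    """Return the name of the best-matching document template, or None.

    frame: grayscale array; templates: dict mapping document name to a
    same-size grayscale reference image.
    """
    best_name, best_score = None, threshold
    for name, tmpl in templates.items():
        score = ncc(frame, tmpl)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

A production tracker would additionally estimate the tablet’s pose relative to the matched document (e.g. via feature matching and a homography) to drive the physics-based animations.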







