
220 Jefferson Drive Simulation

#consumer electronics #gaming #human-machine interaction #learning #storytelling #interfaces #machine learning #affective computing #ethics #industry #interactive #computer science #entertainment #art #artificial intelligence #design #creativity #history #technology #archives

PAGE UNDER CONSTRUCTION


This project created a sensory simulation designed to conjure feelings of presence in past places.

 

This research project investigates how immersive systems can preserve autobiographical memory by integrating tangible objects, 3D scans of real environments, and virtual artifacts grounded in lived experience. The project aims to create emotionally resonant linkages that evoke vivid recollection. Designed to support intergenerational memory sharing, these systems show promise for enhancing emotional well-being, strengthening familial and social bonds, and fostering a sense of telepresence, enabling communication and connection across time.

 

We designed a study exploring interaction with memory reconstructions and complex personal spaces using Gaussian splatting paired with sensory stimuli. Using a dataset of object-specific 3D scans, home movies rendered with depth, and personal anecdotal data, we built digital twins to explore the therapeutic potential of AI-driven autobiographical simulations, with the goal of aligning an individual's sensory experience with generative models by fusing multi-modal data. Our user study of n=16 family members showed that combining sensory stimuli with Gaussian splats enhanced memory vividness. Future work will explore dynamically generated world models that provide users with personalized multi-sensory experiences on demand.
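The fusion of multi-modal data described above can be sketched as a simple late fusion over per-modality feature vectors. This is only an illustrative sketch, not the project's actual pipeline: the function name, the modality names, the uniform weighting, and the toy vectors are all assumptions.

```python
import numpy as np

def fuse_modalities(embeddings, weights=None):
    """Late-fuse per-modality feature vectors into one unit-length vector.

    embeddings: dict mapping modality name -> 1-D feature vector
    weights:    optional dict of per-modality weights (defaults to uniform)
    """
    names = sorted(embeddings)
    if weights is None:
        weights = {n: 1.0 for n in names}
    total = sum(weights[n] for n in names)
    fused = np.zeros(len(embeddings[names[0]]), dtype=float)
    for n in names:
        v = np.asarray(embeddings[n], dtype=float)
        v = v / (np.linalg.norm(v) + 1e-8)        # normalize each modality
        fused += (weights[n] / total) * v          # weighted average
    return fused / (np.linalg.norm(fused) + 1e-8)  # unit-length result

# Toy example: a visual frame from a splat render, paired audio and scent cues
fused = fuse_modalities({
    "visual": [1.0, 0.0, 0.0],
    "audio":  [0.0, 1.0, 0.0],
    "scent":  [0.0, 0.0, 1.0],
})
```

Weighted late fusion like this is the simplest way to align heterogeneous cues in a shared space; a real system would learn the per-modality encoders and weights rather than fix them by hand.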


3D Scanning Personal World Models

I collected public domain 2D animations comprising a genealogy of CGI from 1963 to 1987, and used fragments of the animations to create a 3D environment for storytelling.

 

Interacting with Multi-Modal AI

Users explored a chronology of CGI history by retargeting their hand gestures, tracked via a Leap Motion Controller, onto Catmull's "hand".
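A minimal version of this retargeting step maps each tracked joint from the user's hand into the rigged model's space. The sketch below is a simplification under stated assumptions: the function and its palm-width scaling are hypothetical, and a real pipeline would retarget per-bone rotations rather than raw positions.

```python
import numpy as np

def retarget_joints(src_joints, src_palm, src_palm_width,
                    dst_palm, dst_palm_width):
    """Map tracked joint positions onto a target rig's hand.

    Express each joint relative to the tracked palm, rescale by the
    ratio of palm widths, then re-anchor at the rig's palm position.
    """
    src = np.asarray(src_joints, dtype=float)
    scale = dst_palm_width / src_palm_width
    return (src - np.asarray(src_palm, float)) * scale + np.asarray(dst_palm, float)

# Toy example: two fingertip positions (mm) mapped onto a rig half the size
out = retarget_joints(
    src_joints=[[10.0, 0.0, 0.0], [0.0, 20.0, 0.0]],
    src_palm=[0.0, 0.0, 0.0], src_palm_width=80.0,
    dst_palm=[1.0, 1.0, 1.0], dst_palm_width=40.0,
)
# out[0] -> [6.0, 1.0, 1.0]; out[1] -> [1.0, 11.0, 1.0]
```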


The story's dialogue was delivered by two iMac G3 monitors bookending the scene, each displaying Parke's animated faces, re-voiced with Apple's original text-to-speech engine, MacinTalk.


Multi-modal fusion of Sensory Stimuli


Experience on Demand

3D Home Movies

When displayed on a stereoscopic 3D display, the interactive animation itself is rendered in "3D".

Interestingly, this process extracts 2D footage from projects that were originally 3D animations, places it in 3D space, and uses modern stereoscopy to let it become "three-dimensional" once again.
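Once each frame carries depth, a stereoscopic pair can be synthesized by shifting pixels horizontally in proportion to inverse depth. A sketch of the standard pinhole disparity relation follows; the focal length and interaxial baseline values are illustrative, not the ones used in the installation.

```python
import numpy as np

def disparity_from_depth(depth, focal_px, baseline):
    """Per-pixel horizontal disparity (in pixels) for a stereo pair.

    Classic pinhole relation: disparity = focal * baseline / depth.
    Nearer pixels get larger disparity, so they "pop" toward the viewer.
    """
    depth = np.asarray(depth, dtype=float)
    return focal_px * baseline / np.maximum(depth, 1e-6)  # avoid divide-by-zero

# Toy 2x2 depth map (meters): nearer pixels produce larger shifts
d = disparity_from_depth([[1.0, 2.0], [4.0, 8.0]],
                         focal_px=800.0, baseline=0.065)
# d[0, 0] -> 52.0 px (near), d[1, 1] -> 6.5 px (far)
```

Shifting each pixel of the left-eye image by half this disparity (and the right-eye image by the opposite half) yields the two views a stereoscopic display interleaves.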
