
Live Cinema Simulator

#consumer electronics #gaming #human-machine interaction #learning #storytelling #interfaces #machine learning #affective computing #ethics #industry #interactive #computer science #entertainment #art #artificial intelligence #design #creativity #history #technology #archives

Real-time Stable Diffusion for Virtual Production: Room Scale Generative AI towards the Ultimate Display
Daniel Pillis, Phillip Cerner, Martin Sawtell
Emerson College / MIT — Dell EMC — Generative AI for Design Workshop (2025), Extended Abstract


This project integrates real-time Stable Diffusion into TouchDesigner for dynamic virtual production. Using a high-resolution LED wall, we explore how generative AI enhances set design, props, and storytelling. Case studies show how AI streamlines workflows, enabling rapid prototyping and immersive environments for in-camera VFX and filmmaking.

 

Diffusion models will enable filmmakers to shift from manually intensive design processes to AI-assisted creative workflows, facilitating the development of virtual production environments that enhance storytelling. This in-progress case study contributes to the growing discourse around AI-enhanced, performance-based design by demonstrating that real-time generative AI can support creative workflows in an active virtual production studio. Innovations in generative AI for virtual production enable better resource allocation, reduced costs, and efficient prototyping and visualization, significantly streamlining the filmmaking process.

Immersive Artificial Intelligence

Through a series of case studies on a 2240 × 1440, 1.9 mm pixel-pitch virtual production LED display, we are exploring the performance benefits of real-time Stable Diffusion for generating immersive virtual environments, showcasing how generative AI enhances artificial set design, prop creation, and rapid prototyping for storytelling.

Room Scale Generative AI

Room-scale generative artificial intelligence enables interactions at human-scale proportions. The richness of the display, coupled with generative AI's real-time transformation of the environment, has a marked impact on users' sense of physical presence and sensory experience.

Full Project Overview

Our in-progress results demonstrate the promising potential of these technologies to transform virtual production. Using real-time Stable Diffusion, a TouchDesigner-based system enables live prompting to dynamically change objects, environments, and actors in an immersive, state-of-the-art virtual production environment.
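The abstract does not publish implementation details, but the live-prompting behavior it describes can be sketched as a small control loop: re-render only when the operator's prompt changes, and hold the random seed fixed so successive frames stay visually coherent. Everything below (the `PromptLoop` class, the pluggable `generate` callable, and the fixed-seed strategy) is our illustrative assumption, not the authors' code; in practice `generate` would wrap a few-step diffusion pipeline feeding the LED wall from TouchDesigner.

```python
class PromptLoop:
    """Hypothetical real-time prompting loop for a virtual production wall.

    `generate` is any callable(prompt, seed) -> frame; in a real system it
    would invoke a few-step Stable Diffusion pipeline on the GPU.
    """

    def __init__(self, generate, seed=42):
        self.generate = generate
        self.seed = seed        # fixed seed keeps frames coherent across prompts
        self.prompt = None
        self.frame = None

    def set_prompt(self, prompt):
        # Debounce: only pay for a render when the prompt actually changes.
        if prompt != self.prompt:
            self.prompt = prompt
            self.frame = self.generate(prompt, self.seed)
        return self.frame


# Usage with a stub generator standing in for the diffusion pipeline:
loop = PromptLoop(generate=lambda p, s: f"frame({p}, seed={s})", seed=7)
print(loop.set_prompt("medieval castle set"))
print(loop.set_prompt("medieval castle set"))  # cached, no re-render
```

The debounce-plus-fixed-seed pattern is one simple way to balance interactivity against temporal flicker; production systems typically add latent blending between prompts for smoother transitions.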

 

This project integrates real-time diffusion models into virtual production, enhancing in-camera VFX and set design through generative AI. By streamlining environment creation and prototyping, it reduces costs and improves efficiency. Our case studies show how AI-driven workflows can transform filmmaking, enabling adaptive, performance-based design in immersive storytelling.
