An Immersive, Interactive Exhibit
An exhibit that uses audio, projection, physical computing, and virtual reality to guide participants through an emotionally compelling story about visiting our inner child and healing childhood trauma.
Duration: 4 weeks, completed January 2023
Tools: A-Frame VR, HTML/CSS/JS, P5.js, Projection Mapping, Physical Computing, Blender
Focus: Empathy-Driven Storytelling, Exhibition Design, and Technical Implementation Across Multiple Media
Context: Final project from Gray Area's Creative Code Intensive education program
Featured: Gray Area Artist Showcase Winter 2023
I wrote the original concept for the story in a college creative writing course and adapted it for this project, which incorporates a little bit of everything we learned in the Creative Code Intensive.
Modeled in Blender.
Participants open this website on their phone to begin the experience.
Try it for yourself!
I used a force-sensitive resistor (FSR) to detect pressure by measuring increases in output voltage: pressing on an FSR lowers its resistance, which raises the voltage across the divider circuit, so a rising reading meant a person was standing on the sensor. The ESP32 sent this reading to a p5.js sketch via OSC messaging, triggering the image fade-in. To align the projection with the installation space, I calibrated the image using the p5.mapper library and projected my screen onto the ground.
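For reference, a minimal p5.js sketch of the pressure-triggered fade-in might look like the following. Since OSC can't reach the browser directly, this version assumes the ESP32's messages are relayed as JSON over a WebSocket bridge; the bridge URL, message shape, threshold, and image path are all hypothetical placeholders, and the p5.mapper calibration step is omitted.

```js
// Pressure-triggered fade-in (sketch). Assumes a bridge server relays
// the ESP32's OSC data to the browser as JSON over a WebSocket.
let img;
let opacity = 0;
let pressed = false;
const PRESSURE_THRESHOLD = 2000; // hypothetical raw reading; tune per sensor

function preload() {
  img = loadImage('assets/inner-child.png'); // hypothetical asset path
}

function setup() {
  createCanvas(windowWidth, windowHeight);
  const socket = new WebSocket('ws://localhost:8081'); // hypothetical bridge
  socket.onmessage = (event) => {
    const { pressure } = JSON.parse(event.data);
    // Higher reading = lower FSR resistance = someone standing on the sensor
    pressed = pressure > PRESSURE_THRESHOLD;
  };
}

function draw() {
  background(0);
  // Ease opacity toward its target so the image fades in and out smoothly
  const target = pressed ? 255 : 0;
  opacity = lerp(opacity, target, 0.05);
  tint(255, opacity);
  image(img, 0, 0, width, height);
}
```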
Since this was a self-guided public exhibition, I prioritized making the interactions as intuitive and user-friendly as possible. I eliminated the need for VR controllers by implementing gaze detection for interactions (sketched below). The WebVR scene was created using A-Frame, with environments modeled in Blender.
Click and drag the mouse to look around.
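A minimal sketch of how gaze-based interaction works in A-Frame: the cursor's fuse option fires a click on whatever entity the participant stares at for the timeout, so no controller is needed. The scene contents, IDs, and color change here are placeholders, not the exhibit's actual environment.

```html
<script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>

<a-scene>
  <a-camera>
    <!-- fuse: true makes the gaze cursor auto-click after 1.5 s of staring -->
    <a-cursor fuse="true" fuse-timeout="1500"></a-cursor>
  </a-camera>

  <a-box id="door" position="0 1.5 -3" color="#8B5E3C"></a-box>
  <a-sky color="#1a1a2e"></a-sky>
</a-scene>

<script>
  // The fused cursor emits a normal 'click' on the gazed-at entity,
  // so interaction logic attaches like any DOM event listener.
  document.querySelector('#door').addEventListener('click', (e) => {
    e.target.setAttribute('color', '#D9A05B'); // placeholder reaction
  });
</script>
```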