Immersive Art Experience - Mood Cinema

PROJECT OVERVIEW

This project is an installation in which LED strips are driven by the viewer's emotions. While showing a video of memories shared by the 2022 IxD class, the installation collects emotion data and visualizes it through both fabrication and graphics.

MY ROLE

· Implemented physical computing using LED strips with Arduino.
· Wrote JavaScript code to interactively display user activity.
· Connected the Arduino, the P5.js web editor, and Teachable Machine data via serial communication.

TEAM

Jinhee Jung, Jihye Kim, Sunwoo Park

ADVISOR

Carrie Kengle, Bruno Kruse, Eric Forman

Timeline

Dec 2022 (4 weeks)

Tool

Arduino, P5.js web editor (JavaScript library), Meteor, Teachable Machine, Premiere Pro
Introduction

Recapturing happy memories with fast-reacting LED lights driven by facial expressions

Looking back on the first semester at SVA, those memories had grown rosier with time. We definitely felt joy and excitement then, and it carried over into small moments of happiness in everyday life this semester. Our team wanted to recapture those fleeting moments of happiness, past and present, using fast-reacting LED lights that change based on the viewer's facial expression while they watch the video.

Material

Software: Arduino, P5.js, Meteor
Languages: JavaScript
Materials for fabrication: LED strips, jumper wires, black foam boards, webcam

Draft concept

Our team initially wanted to make an LED wall that reacts to emotions; however, we soon realized that the concept lacked a clear narrative. In response, we pivoted to a small movie theater where viewers could watch a video about memories from the first semester while LED colors changed in response to the viewer's emotions.

After receiving feedback, we further refined the project to focus on capturing even the smallest moments of happiness, using machine learning to detect subtle facial movements.
Process

Teachable Machine

We divided the emotions into five categories: happy, sad, upset, surprised, and neutral. We collected people's facial movements using Teachable Machine and trained a model to detect facial emotions.
Collecting facial data
Training machine learning model
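In p5.js, a Teachable Machine image model reports an array of label/confidence predictions (this is the shape ml5-style classifiers return). A minimal sketch of how the strongest emotion can be read from such a result — the helper name `topEmotion` and the sample values are ours for illustration:

```javascript
// Pick the most confident label from a Teachable Machine-style
// prediction array. `topEmotion` is an illustrative helper,
// not part of any library.
function topEmotion(predictions) {
  return predictions.reduce(
    (best, p) => (p.confidence > best.confidence ? p : best)
  ).label;
}

// Example prediction array shaped like Teachable Machine output:
const sample = [
  { label: 'happy',     confidence: 0.81 },
  { label: 'sad',       confidence: 0.04 },
  { label: 'upset',     confidence: 0.03 },
  { label: 'surprised', confidence: 0.07 },
  { label: 'neutral',   confidence: 0.05 },
];
```

In the real sketch this would run inside the classification callback, once per webcam frame.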

Arduino

We re-evaluated our approach to implementing the five emotions, with a focus on clarity and ease of use. Ultimately, we determined that using a single emotion was the most efficient and effective design, and we simplified the code accordingly. Once we confirmed that the fabrication and the LED array were functioning properly, we tested the Arduino code to ensure that it would correctly light up the LED strips.
LED strip test
LED strip test
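Narrowing the interaction down to a single emotion means the Arduino only needs a simple on/off-style command. A sketch of how the p5.js side could encode it — the threshold value and the command characters are assumptions, not our exact code:

```javascript
// Map a 'happy' confidence (0..1) to a one-byte command for the
// Arduino: '1' lights the LED strip, '0' turns it off.
// The 0.6 threshold is an illustrative value.
const HAPPY_THRESHOLD = 0.6;

function ledCommand(happyConfidence) {
  return happyConfidence >= HAPPY_THRESHOLD ? '1' : '0';
}
```

On the Arduino side, a matching loop would read the incoming byte and drive the LED strip accordingly.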

P5.js

I loaded the Teachable Machine model into P5.js and expressed the 'happy' value as a simple bar graph. I confirmed that the bar graph moved in step with the viewer's smile.
Sketch
P5.js test
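The bar graph is essentially the 'happy' confidence mapped onto pixel height. A minimal sketch of that mapping (the function name and canvas height are illustrative; in p5.js the actual drawing would happen in `draw()` with `rect()`):

```javascript
// Convert a confidence in [0, 1] into a bar height in pixels,
// clamped so noisy model output can't draw outside the canvas.
function barHeight(confidence, canvasHeight) {
  const clamped = Math.min(1, Math.max(0, confidence));
  return Math.round(clamped * canvasHeight);
}
```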

Connecting Arduino, P5.js & Teachable Machine

To change the LED colors based on the P5.js data, I needed to connect P5.js, Teachable Machine, and the Arduino. I fed the P5.js values into the Arduino code so the LED lights adjusted as the values changed. I shed tears of joy when every connection worked and the LED lights changed with each expression.
Sketch
P5.js test
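One simple way to frame data on a serial link like this is a newline-terminated text message carrying the happy value. The sketch below shows the idea; the `H:` prefix and helper names are assumptions for illustration, not our exact protocol:

```javascript
// Encode the happy value as a newline-terminated line, e.g. "H:0.87\n",
// and parse it back on the receiving end. With a real serial library
// (e.g. p5.serialport), the encoded string would be passed to write().
function encodeHappy(value) {
  return `H:${value.toFixed(2)}\n`;
}

function decodeHappy(line) {
  const match = /^H:([\d.]+)/.exec(line.trim());
  return match ? parseFloat(match[1]) : null;
}
```

Framing each value as a full line makes it easy for the receiver to resynchronize after a dropped byte, which matters on a flaky serial link.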
After encountering unstable serial communication and increasingly messy code, we switched to Meteor, a JavaScript web framework, as our communication tool.
P5.JS
Arduino
Meteor

Fabrication

We tested both serial and parallel wiring to see which displayed our output better. After testing both connections with code, we found that the serial connection was far more elegant and clear.
Serial connection
Parallel connection
We also made a video consisting of images and short clips of classes and classmates from this semester. We wanted people to recall their blurred memories and feel a small happiness while watching it.
Mood cinema with video
IxD mood cinema guideline

Final

As our final project, we made three outputs: the LED fabrication, the graph via P5.js, and the video. The user scenario is as follows.

1. The user watches the video.
2. The LED lights on the cinema box turn on and off based on the user's face, which is captured by the webcam above the iPad (video).
3. Along with the LED lights on the cinema box, the graph on the monitor also changes depending on the user's facial expression.

By triggering the user's emotions with the video (input), we wanted to detect and display those emotions visually through the LED lights and the graph.
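Putting the scenario together, each webcam frame flows through one pipeline: classify the face, then drive both outputs at once. A compact sketch of that tick, reusing the ideas above; the helper name, threshold, and output shape are illustrative:

```javascript
// One tick of the installation: turn a Teachable Machine-style
// prediction array into the two visual outputs.
function tick(predictions, canvasHeight) {
  const happy = predictions.find((p) => p.label === 'happy');
  const confidence = happy ? happy.confidence : 0;
  return {
    // cinema box LEDs (threshold is an assumed value)
    led: confidence >= 0.6 ? 'on' : 'off',
    // monitor bar graph, clamped to the canvas
    bar: Math.round(Math.min(1, Math.max(0, confidence)) * canvasHeight),
  };
}
```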

Up next