Using sensors and AI to transform inputs into outputs
For this part of the task, we connected Wekinator, an interactive machine learning tool, with FaceOSC, a face-tracking application that streams facial data over OSC. Our task consisted of training Wekinator on a wide variety of facial expressions captured through computer vision so that it could recognise whether or not we were surprised. The output was mapped to a numerical value (0 = not surprised, 1 = surprised), which we then used to link our facial expressions to a certain kind of output.
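To give a concrete sense of the receiving end of this pipeline, here is a minimal Processing sketch using the oscP5 library that listens for a Wekinator classifier output. This is a sketch rather than our exact project code, and it assumes Wekinator's default output port (12000) and output address (/wek/outputs):

```java
// Minimal Processing sketch: receive a Wekinator classifier output over OSC.
// Assumes Wekinator's default output port (12000) and address (/wek/outputs).
import oscP5.*;
import netP5.*;

OscP5 oscP5;
float surprised = 0; // 0 = not surprised, 1 = surprised

void setup() {
  size(400, 400);
  oscP5 = new OscP5(this, 12000); // listen on Wekinator's default output port
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/wek/outputs")) {
    surprised = msg.get(0).floatValue(); // first output of the trained model
  }
}

void draw() {
  background(surprised >= 0.5 ? color(255, 200, 0) : color(30));
  fill(surprised >= 0.5 ? 0 : 255);
  textAlign(CENTER, CENTER);
  text(surprised >= 0.5 ? "surprised!" : "neutral", width / 2, height / 2);
}
```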
We developed a wearable prototype that detected pressure through touch. The wearable was a sandwich of conductive textile folded in two around a layer of velostat, a material whose resistance changes with pressure. It was important to ensure that the two sides of the conductive textile did not touch directly, which would short-circuit the sensor. By training the Wekinator model with two distinct states (pressure or no pressure), we aimed to translate these into visual changes, such as lighting an LED on an Arduino or changing the size of a digital ball in the software Processing, as in the sketch below.
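The following Processing sketch illustrates the ball-resizing idea. Rather than going through Wekinator, it reads the raw pressure value directly from the Arduino over serial; it assumes the Arduino prints one analogRead value (0-1023) per line at 9600 baud, and the serial port index is an assumption that may need adjusting for your machine:

```java
// Sketch: resize a ball from a pressure reading sent by an Arduino over
// serial. Assumes the Arduino prints one analogRead value (0-1023) per
// line at 9600 baud; Serial.list()[0] may need changing on your machine.
import processing.serial.*;

Serial port;
float pressure = 0; // latest raw reading, 0-1023

void setup() {
  size(400, 400);
  port = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // Consume any complete lines waiting in the serial buffer
  while (port.available() > 0) {
    String line = port.readStringUntil('\n');
    if (line != null) {
      float v = float(trim(line));
      if (!Float.isNaN(v)) pressure = v;
    }
  }
  background(30);
  float diameter = map(pressure, 0, 1023, 20, width); // harder press, bigger ball
  ellipse(width / 2, height / 2, diameter, diameter);
}
```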
In the second module of interactive design, we also experimented with the app ZigSim, which exposes a phone's sensors (accelerometer, gyroscope, and so on) and sends their readings over OSC messaging, among other protocols, to another device for processing. We linked ZigSim to Max to map movement onto visual changes in a digital ball, which would grow or shrink depending on the acceleration of our movements as detected by the phone.
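A Max patch is visual and hard to reproduce in text, so here is a rough equivalent of the idea in Processing: receiving ZigSim's accelerometer messages over OSC and scaling a ball by the magnitude of the acceleration. The address /ZIGSIM/myPhone/accel is a placeholder, since ZigSim prefixes its messages with the device UUID set in its settings, and port 50000 is an assumption that must match the destination port configured in ZigSim:

```java
// Sketch: scale a ball by phone acceleration received from ZigSim over OSC.
// /ZIGSIM/myPhone/accel is a placeholder address (ZigSim uses the device
// UUID from its settings); port 50000 must match ZigSim's destination port.
import oscP5.*;
import netP5.*;

OscP5 oscP5;
float magnitude = 0; // magnitude of the acceleration vector

void setup() {
  size(400, 400);
  oscP5 = new OscP5(this, 50000);
}

void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/ZIGSIM/myPhone/accel")) {
    float ax = msg.get(0).floatValue();
    float ay = msg.get(1).floatValue();
    float az = msg.get(2).floatValue();
    magnitude = sqrt(ax * ax + ay * ay + az * az);
  }
}

void draw() {
  background(30);
  // At rest the magnitude sits around 1 g; movement pushes it higher.
  float diameter = map(constrain(magnitude, 0, 3), 0, 3, 20, width);
  ellipse(width / 2, height / 2, diameter, diameter);
}
```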
Exploring interaction design and translating various inputs into different outputs through technology opened up a new world of possibilities. It felt like creating new languages between the unseen and the seen, new forms of communication and interaction, allowing me to visualise and interpret data in innovative ways.
Understanding how to transform physical movements into digital signals using sensors and AI tools showed me how to bridge the gap between the physical and digital realms, creating meaningful interactions and demonstrating how technology can enhance human experiences.
This task provided me with the technical knowledge and resources to delve deeper into these concepts in the future, building on my previous explorations of making the imperceptible perceptible.