Embodied Impressions | Prototype 5
I went through many trains of thought in the development of this project - interactivity, artistry, engagement, visuals, and sound. The project ended up being an exploration of creation and the feeling we get from engaging in the act of creating.
I want this project to illustrate how the most important aspects of creation lie in the act of engaging in it, not the production of a measurable output; in other words - it’s about the process and the journey, not the product at the destination. This project is an art piece that only exists when a user is interacting with it. There is no finished product that the user can record and reflect on - the art that users create when interacting with it is only present when the interaction is happening; it is unique and cannot be replicated.
The system uses a Microsoft Azure Kinect to capture the user in the space. The video feed is passed to ml5.js, which processes the images and detects the body parts that make up a human pose. Individual parts - shoulders, nose, arms, face - then alter the parameters of a generative art piece in real time.
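A minimal sketch of how that pipeline can be wired up, assuming the Kinect feed is exposed to the browser as an ordinary video capture device and that ml5's PoseNet model handles pose detection - the model choice, keypoint names, and score threshold here are illustrative assumptions, not the prototype's exact code:

```js
// Requires p5.js and ml5.js loaded via <script> tags.
let video;
let poses = [];

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO); // Kinect assumed to appear as a webcam-style source
  video.size(width, height);
  video.hide();

  // Load PoseNet on the video and listen for pose estimates on every frame.
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (results) => { poses = results; });
}

function draw() {
  background(0);
  if (poses.length === 0) return;

  // Pull out the body parts the piece reacts to, e.g. the nose and shoulders.
  const keypoints = poses[0].pose.keypoints;
  const nose = keypoints.find((k) => k.part === 'nose');
  const leftShoulder = keypoints.find((k) => k.part === 'leftShoulder');

  // Their positions become the parameters that drive the generative piece;
  // here they are simply drawn as markers.
  for (const k of [nose, leftShoulder]) {
    if (k && k.score > 0.3) circle(k.position.x, k.position.y, 16);
  }
}
```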
The art itself is built with p5.js (for visuals) and Tone.js (for audio). The moving visuals are controlled by the user's position, speed, and pose, and different musical notes play based on where and how the user moves in the space.
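Here is one hedged way that mapping could look: the nose's horizontal position is quantized to a small note set played through a Tone.js synth, while movement speed scales the size and brightness of the visuals. The note set, thresholds, and choice of keypoint are illustrative assumptions, not the prototype's actual mapping:

```js
// Requires p5.js, ml5.js, and Tone.js loaded via <script> tags.
let video;
let poses = [];
let prevX = null;
let lastNote = null;

const synth = new Tone.Synth().toDestination();
const notes = ['C4', 'D4', 'E4', 'G4', 'A4']; // assumed pentatonic note set

function setup() {
  createCanvas(640, 480);
  video = createCapture(VIDEO);
  video.size(width, height);
  video.hide();
  const poseNet = ml5.poseNet(video, () => console.log('PoseNet ready'));
  poseNet.on('pose', (results) => { poses = results; });
}

function mousePressed() {
  Tone.start(); // browsers require a user gesture before audio can play
}

function draw() {
  background(0, 40); // translucent background leaves motion trails
  if (poses.length === 0) return;

  const nose = poses[0].pose.keypoints.find((k) => k.part === 'nose');
  if (!nose || nose.score < 0.3) return;

  const { x, y } = nose.position;
  const speed = prevX === null ? 0 : abs(x - prevX);
  prevX = x;

  // Visuals: position places the mark, speed sets its size and brightness.
  noStroke();
  fill(map(speed, 0, 50, 60, 255, true));
  circle(x, y, map(speed, 0, 50, 10, 80, true));

  // Audio: quantize horizontal position to a note, retrigger only on change.
  const idx = constrain(floor(map(x, 0, width, 0, notes.length)), 0, notes.length - 1);
  if (notes[idx] !== lastNote) {
    lastNote = notes[idx];
    synth.triggerAttackRelease(lastNote, '8n');
  }
}
```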