Concept Statement | Part 2

I am exploring how I can affect and direct people’s gestures and behaviour as they interact with each other in a public space. The method for exploring this is a model that can recognise and capture key focus points in a human action or interaction and convert them into a live dataset, which is then used as a set of data points to control an interactive abstract sketch.

Users can play with the model and with each other to discover how they can affect it, and how it affects their behaviour in return. The system will use motion tracking and/or machine learning to recognise data points and convert them into a usable real-time dataset that alters the parameters of a dynamic art piece; a minimal sketch of this pipeline follows.
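The sketch below is one possible way to prototype that pipeline, assuming MediaPipe Pose for motion tracking and OpenCV for webcam capture. The landmark choices and the parameter mapping (wrist spread and wrist height) are illustrative assumptions, not the project's actual implementation.

```python
# A minimal sketch of the intended pipeline, assuming MediaPipe Pose for motion
# tracking and OpenCV for webcam capture. The landmarks tracked and the
# parameter mapping below are illustrative, not the project's final design.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose


def track_parameters():
    """Yield a small real-time dataset (dict of sketch parameters) per frame."""
    cap = cv2.VideoCapture(0)  # default webcam
    with mp_pose.Pose() as pose:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR.
            results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks is None:
                continue
            lm = results.pose_landmarks.landmark
            left = lm[mp_pose.PoseLandmark.LEFT_WRIST]
            right = lm[mp_pose.PoseLandmark.RIGHT_WRIST]
            # Hypothetical mapping: wrist spread drives one sketch parameter,
            # average wrist height drives another (both normalised 0..1).
            yield {
                "spread": abs(left.x - right.x),
                "height": 1.0 - (left.y + right.y) / 2.0,
            }
    cap.release()


if __name__ == "__main__":
    for params in track_parameters():
        print(params)  # a drawing loop would consume these values instead
```

In a full piece, the drawing loop would read these parameters each frame and use them to steer the abstract sketch, so participants see their gestures reflected back and can adjust their behaviour in response.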
