Primary Research Documentation | Part 2

Full tabulation on Google Sheets (in the Primary Research Plan tab)

Observing and recording interactions

Relation to project: Input

Issue: What kinds of behaviours/actions do people instinctively perform when interacting with each other?

Aim: Observe and record interactions of people in a public space.

Findings:

  • Mapping more complex interactions might be hard for people to follow, understand, and reproduce while interacting with the project.

  • Simpler gestures would lead to better outcomes.

Xbox Kinect

Relation to project: Technology

Issue: How can I record and extract data points from the chosen action?

Aim: Use a Kinect to test whether it can recognise patterns and postures and extract data from them.

Findings:

  • The Xbox 360 Kinect was not compatible with the system on my laptop.

  • This model also lacks the more advanced skeletal tracking of later Kinect versions.

Leap Motion Sensor

Relation to project: Technology

Issue: How can I record and extract data points from the chosen action?

Aim: Use a Leap Motion Sensor to track finger and hand movements and generate a real-time dataset from them.

Findings:

  • Can track hands and fingers very well.

  • Since I am leaning towards simpler, full-body gestures, this might not be suitable for this project (see the tracking sketch below).
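For reference, a minimal sketch of the kind of tracking loop tested here, assuming the leapjs Node package and a connected controller; the logging is illustrative, not the actual test code:

```js
// Minimal sketch, assuming the leapjs package and a connected controller.
const Leap = require('leapjs');

// Leap.loop fires once per tracking frame.
Leap.loop(function (frame) {
  frame.hands.forEach(function (hand) {
    // palmPosition is [x, y, z] in millimetres relative to the device.
    const [x, y, z] = hand.palmPosition;
    console.log(`${hand.type} palm at (${x.toFixed(1)}, ${y.toFixed(1)}, ${z.toFixed(1)})`);
  });

  // Each fingertip is also available as an [x, y, z] position.
  frame.fingers.forEach(function (finger) {
    console.log(finger.tipPosition);
  });
});
```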


OpenCV

Relation to project: Technology

Issue: How can I record and extract data points from the chosen action?

Aim: Explore the viability of Computer Vision through OpenCV for tracking data points.

Findings:

  • Computer vision through OpenCV works well for facial recognition and for tracking facial expressions.

  • Since I am leaning towards simpler, full-body gestures, this might not be suitable for this project.

  • Additionally, OpenCV is best supported in Python, although a browser build (opencv.js) exists (see the sketch below).
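For comparison, a minimal face-detection sketch using the browser build (opencv.js), following the standard Haar-cascade approach; the canvas IDs are placeholders, and the cascade XML file is assumed to have been written into OpenCV's virtual filesystem beforehand:

```js
// Sketch using opencv.js, assuming opencv.js is loaded on the page and the
// Haar cascade XML has been written into the Emscripten virtual filesystem
// (e.g. via cv.FS_createDataFile). Canvas IDs are placeholders.
const src = cv.imread('canvasInput');          // read a frame from a canvas
const gray = new cv.Mat();
cv.cvtColor(src, gray, cv.COLOR_RGBA2GRAY, 0); // detection runs on greyscale

const faces = new cv.RectVector();
const classifier = new cv.CascadeClassifier();
classifier.load('haarcascade_frontalface_default.xml');

const minSize = new cv.Size(0, 0);
classifier.detectMultiScale(gray, faces, 1.1, 3, 0, minSize, minSize);

// Draw a box around each detected face.
for (let i = 0; i < faces.size(); i++) {
  const f = faces.get(i);
  cv.rectangle(src, new cv.Point(f.x, f.y),
               new cv.Point(f.x + f.width, f.y + f.height),
               [255, 0, 0, 255], 2);
}
cv.imshow('canvasOutput', src);

// Free the Mats; opencv.js does not garbage-collect them.
src.delete(); gray.delete(); faces.delete(); classifier.delete();
```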


Tone.js

Relation to project: Output

Issue: What kind of output do I want to create?

Aim: Map sound/music to user actions.

Findings:

  • Music is an intuitive, natural medium of reaction and response.

  • Including music in the output gives users something instinctive to play with; a minimal Tone.js sketch follows below.
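A minimal sketch of this idea, assuming Tone.js is loaded on the page; the note and duration are placeholder values, and browsers require a user gesture before audio can start:

```js
// Minimal sketch, assuming Tone.js is loaded via a <script> tag.
const synth = new Tone.Synth().toDestination();

// Browsers block audio until a user gesture, so the AudioContext
// is started from a click handler.
document.addEventListener('click', async () => {
  await Tone.start();
  // Play a C4 for an eighth note; in the prototype this trigger
  // would come from a tracked user action instead of a click.
  synth.triggerAttackRelease('C4', '8n');
});
```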

Try the prototype here. Prototype test below:


Tone.js + ml5.js

Relation to project: Output

Issue: What kind of output do I want to create?

Aim: Prototype a series of musical notes that can be controlled by the body-tracking data of the user interacting with it.

Findings:

  • Musical notes mapped to user positions are a promising start.

  • Could be expanded into a larger 12-note set depending on the space and notation used; a sketch of the mapping follows below.
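A sketch of the mapping this prototype tested, assuming the ml5.js PoseNet wrapper (v0.x API), Tone.js, and a webcam video element; the five-note scale and the choice of the nose keypoint are illustrative, and Tone.start() would still need a user gesture as in the previous sketch:

```js
// Sketch combining ml5.js PoseNet (v0.x API) with Tone.js.
// Assumes a <video id="webcam"> element streaming the camera;
// the scale and the tracked keypoint are illustrative choices.
const synth = new Tone.Synth().toDestination();
const scale = ['C4', 'D4', 'E4', 'G4', 'A4']; // could grow to a 12-note set

const video = document.getElementById('webcam');
const poseNet = ml5.poseNet(video, () => console.log('model ready'));

let lastNote = null;
poseNet.on('pose', (results) => {
  if (results.length === 0) return;

  // Use the nose keypoint as a proxy for the user's position.
  const nose = results[0].pose.keypoints.find((k) => k.part === 'nose');
  if (!nose || nose.score < 0.5) return;

  // Map the horizontal position across the frame onto the scale.
  const index = Math.floor((nose.position.x / video.videoWidth) * scale.length);
  const note = scale[Math.min(index, scale.length - 1)];

  // Only retrigger when the user crosses into a new zone.
  if (note !== lastNote) {
    synth.triggerAttackRelease(note, '8n');
    lastNote = note;
  }
});
```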

Try the prototype here. Prototype test below:
