Rachel – Final Project

The Interactive Development Environment: bringing the creative process to the art of coding.

TIDE is an attempt to integrate the celebrated traditions of the artistic process with the often sterile environment of programming. In coding, the dialogue of the creative process shows up as frustration: debugging, fixing, and fidgeting over details until programmer and program are unified. TIDE attempts to turn this frustration into the exhilaration of a dialogue between artist and art.
The environment uses a Kinect to detect and track gestures that correspond to values. Processing code is written from these values and saved to a .txt file; the resulting code draws a simple shape on screen. I want to expand this project to recognize more gestures and patterns, allowing much more complicated systems to be implemented. Ironically, I found that implementing and testing this project generated the very frustration and sterility I was trying to eradicate with the intuitive, free-flowing motions I could get from the Kinect.
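The pipeline described above (gesture values in, generated Processing source out, saved to a .txt file) could be sketched roughly like this in plain Java. This is a hypothetical illustration, not TIDE's actual code: the class name, the mapping from hand position and spread to an ellipse, and the output filename are all assumptions.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Locale;

public class GestureToSketch {
    // Turn tracked hand coordinates into Processing source that draws
    // a single ellipse. The mapping (hand position -> ellipse center,
    // hand spread -> diameter) is an assumption for illustration.
    static String generateSketch(float handX, float handY, float spread) {
        return String.join("\n",
            "void setup() {",
            "  size(640, 480);",
            "}",
            "void draw() {",
            "  background(255);",
            String.format(Locale.ROOT, "  ellipse(%.1f, %.1f, %.1f, %.1f);",
                          handX, handY, spread, spread),
            "}");
    }

    public static void main(String[] args) throws IOException {
        // Stand-in values; in TIDE these would come from the Kinect tracker.
        String sketch = generateSketch(320f, 240f, 100f);
        // Saved as .txt, as in the project; renaming it to .pde would let
        // the Processing IDE open it directly.
        Files.writeString(Path.of("generated.txt"), sketch);
        System.out.println(sketch);
    }
}
```

Writing the generated source to a separate file keeps the tracker and the drawing sketch decoupled: the .txt file is just text until a person (or a watcher script) hands it to Processing.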
[Photo: Sketches9a_zps66e2853c.jpg]

[Photo: FinalTxtFileScreen_zpsc25421bb.jpg]
[Photo: FinalProcessingFileScreen_zps52cd8bed.jpg]

One comment

  1. admin

    Uses a Kinect to produce …code?
    Definitely see this example, “bodyfuck” by nik hanselmann.
    http://nikhanselmann.com/?/projects/bodyfuck/
    https://vimeo.com/7133810
    I recommend taking the failure, the misdirection of using your body as a keyboard, to an absurd degree.

    Ali: see the work of Jesse Sugarman

    About the documentation, I’d like to see more movements and maybe just a minute of solid interaction with the art piece.

    So good that you call it The Interactive Development Environment 😛

    I’d like to see this working with a dancer; it would be interesting to see how their movements affect the code.

    You say your project addresses the frustrating process of debugging – how does it do this?

    ^ process looks like writing code, not debugging

    The video is too short; I can’t tell much about what’s going on.

    Why save to a .txt file? Why not save to a .pde file to feed into Processing directly? How does the .txt file get fed into Processing?

    I’d like to see the objects coded to match your movements.

    I would like to see more video documentation and a demo. It’s a great idea. You could even grab the color of the shirt to use as a color in the shape you draw.

    It could be interesting conceptually to look at why it is so important to have movement generate code. RSI is definitely something to be aware of as a programmer. Could this be some kind of motivational video, where the message is “If you don’t want to have to try and program like this in your day-to-day life, get outside and exercise!”? (Imagine trying to write Photoshop or something by waving your arms; it’s an impressive idea, just not necessarily easy to translate into “real-world”, “practical” programming.) Just a thought.

    Interesting, because programming is an activity that requires more mental than physical involvement (if that makes sense) – and by using a Kinect you’re adding a physical component to coding.

    What Ali said reminds me of hackertyper (http://hackertyper.com/).