
Arousal vs. Time

Arousal vs. Time from Miles Peyton on Vimeo.

Arousal vs. Time: a seismometer for arousal, as measured by facial expressions.

Overview

One way to infer inner emotional states without access to a person’s thoughts is to observe their facial expressions. As the name suggests, Arousal vs. Time is a visualization of excitement levels over time. The more you deviate from your resting expression, the more excited you are presumed to be. An interesting context for this tool is in everyday social interactions. Watching the seismometer while talking to a friend can generate insights into the nature of that relationship. It might reveal which person tends to lead the conversation, or who is the more introverted of the two. Watching a conversation unfold in this visual manner is both soothing and unsettling.
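To make the idea concrete, here is a minimal sketch of the arousal metric in Python (the piece itself is written in openFrameworks): treat the tracked face landmarks as one long vector, calibrate a resting baseline, and report how far each frame deviates from it. The get_landmarks() helper is hypothetical, standing in for whatever the face tracker returns.

import numpy as np

def arousal(landmarks, baseline):
    # arousal proxy: distance between the current landmark
    # vector and the stored resting expression
    return np.linalg.norm(np.asarray(landmarks) - np.asarray(baseline))

# calibrate on ~30 frames of a neutral face, then stream scores:
# baseline = np.mean([get_landmarks() for _ in range(30)], axis=0)
# while tracking:
#     plot(arousal(get_landmarks(), baseline))  # feed the seismometer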

Inspiration

Arousal vs. Time is the latest iteration in a series of studies. After receiving useful feedback on my last foray into face tracking, I decided to rework the piece to include sound, two styrofoam heads, and text for clarity. Daito Manabe’s and Kyle McDonald’s face-related projects – “Face Instrument”, “Happy Things” – informed the sensibility of this work.

“Face Instrument” – Daito Manabe

 

“Happy Things” – Kyle McDonald

Implementation 

A casual conversation between a friend and me was recorded on video and, as face-tracking data, in XML files. I wrote the two software components of this artwork – the seismometer and the playback mechanism – in openFrameworks 0.8. I used the following three addons (a sketch of the record-and-replay idea follows the list):

  1. ofxXmlSettings – for recording and playing back face data
  2. ofxMtlMapping2D – for projection mapping
  3. ofxFaceTracker – for tracking facial expressions
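For illustration, a minimal Python sketch of the record-and-replay idea (the piece does this in C++ via ofxXmlSettings; the frame layout below is an assumption, not the actual file format):

import xml.etree.ElementTree as ET

def save_frames(frames, path):
    # frames: list of (timestamp_seconds, [(x, y), ...]) tuples
    root = ET.Element("session")
    for t, points in frames:
        f = ET.SubElement(root, "frame", time=str(t))
        for x, y in points:
            ET.SubElement(f, "pt", x=str(x), y=str(y))
    ET.ElementTree(root).write(path)

def load_frames(path):
    root = ET.parse(path).getroot()
    return [(float(f.get("time")),
             [(float(p.get("x")), float(p.get("y"))) for p in f])
            for f in root]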

 

The set

The projection mapping on the styrofoam heads was carried out on two laptops with two pico projectors. I stored facial data in XML files, and recorded video and audio with an HD video camera and an audio recorder.

The audio file was manipulated in Ableton Live to obscure the content of the conversation. I used chroma keying in Adobe Premiere to remove the background of the video, such that the graphs would seem to emerge from behind the heads, and not from some unseen bounding box. Finally, the materials – a video file, two XML files, and an audio file – were brought together in a second “player” application, also built in openFrameworks.
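Most of the player’s work is synchronization: stepping through the two XML streams on the same clock as the video and audio. A rough sketch of that loop, reusing the load_frames() format assumed above:

import time

def play(frames, draw):
    # replay recorded frames at their original timing;
    # draw() is whatever renders one frame of the graph
    start = time.time()
    for t, points in frames:
        delay = t - (time.time() - start)
        if delay > 0:
            time.sleep(delay)  # wait for this frame's timestamp
        draw(points)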

Reflection

Regarding a conceptual impetus for this project, I keep thinking back to a point professor Ali Momeni made when I showed an earlier version of this project during critique. He questioned not my craft, but my language: the fact that I used the word “disingenuous” to describe my project. I still don’t have a satisfying response to this, just more speculation.

Am I trying to critique self-quantification by proposing an alienating use of face tracking? Or am I making a sincere attempt to learn something about social interaction through technology? The ambivalence I feel toward the idea of self-quantification leads me to believe that it is worthwhile territory for me to continue to explore.

Rachel-Interactive Development Environment

 

 
 photo TIDE_zps1ac62372.jpg
The Interactive Development Environment: bringing the creative process to the art of coding.
TIDE is an attempt to integrate the celebrated traditions of the artistic process with the often sterile environment of programming. In coding, the creative dialogue shows up as frustration: debugging, fixing, and fidgeting over details until the programmer and program are unified. TIDE attempts to turn this frustration into the exhilaration of a dialogue between artist and art.
The environment uses a Kinect to find and track gestures that correspond to values. Processing code is written with these values and saved in a .txt file. The resulting code creates a simple shape on screen. I want to expand this project to recognize more gestures and patterns, allowing much more complicated systems to be implemented. Ironically, the process of implementing and testing this project generated the very frustration and sterility I was trying to eradicate with the intuitive, free-flowing motions I could get from the Kinect.
 

 photo Sketches9a_zps66e2853c.jpg

 photo FinalTxtFileScreen_zpsc25421bb.jpg
 photo FinalProcessingFileScreen_zps52cd8bed.jpg

"""
Rachel Moeller
EMS2 Assignment 9
The Interactive Development Environment
"""
from pykinect import nui
from pykinect.nui import JointId, SkeletonTrackingState

import pygame
from pygame.color import THECOLORS
from pygame.locals import *
import os
import random

KINECTEVENT = pygame.USEREVENT



def writeFile(filename, contents, mode="wt"):
    #Got this from Kosbie's 15-112 Class
    fout = None
    try:
        fout = open(filename, mode)
        fout.write(contents)
    finally:
        if (fout != None): fout.close()
    return True

def post_frame(frame):
    """Get skeleton events from the Kinect device and post them into the PyGame event queue"""
    try:
        pygame.event.post(pygame.event.Event(KINECTEVENT, skeletons = frame.SkeletonData))
    except:
        # event queue full
        pass

def commitWidth(data):
    """This function adds the width to the file contents."""
    width=data.sizeWidth
    data.contents+=str(width)+","

def commitLength(data):
    """This function adds the width to the file contents."""
    length=data.sizeLength
    data.contents+=str(length)+");\n}\nvoid draw()\n{"

def commitShape(data):
    """This function adds the type of shape to the file contents."""
    data.contents+="\n"
    if(data.shape=="ellipse"):
        data.contents+="ellipse("
        data.shape="ellipse"
    elif(data.shape=="rect"):
        data.contents+="rect("
        data.shape="rect"

def commitShapeLoc(data):
    data.contents+=str((data.sizeWidth/2)-data.radius)+","+str((data.sizeLength/2)-data.radius)+","

def commitRadius(data):
    """This function adds the radius in to the shape definition."""
    radius=data.radius
    data.contents+=str(radius)+","+str(radius)+");\n}"
    data.isComplete=True

def computeShapeLoc(data,r):
    """This function figures out where to begin drawing the shape away from the center
       of the screen."""
    c=data.shapeColor
    x=400-r
    y=300-r
    data.shapeX=x
    data.shapeY=y

def drawShape(data):
    """This function draws the shape into the interface."""
    c=data.shapeColor
    if(not data.hasRadius):
        r=getRadius(data)
        computeShapeLoc(data,r)
    else:r=data.radius
    if(data.shape=="ellipse"):
        pygame.draw.ellipse(data.screen,c,[data.shapeX,data.shapeY,r,r])
    if(data.shape=="rect"):
        pygame.draw.rect(data.screen,c,[data.shapeX,data.shapeY,r,r])

def commitColor(data):
    """Sets the color in the file contents."""
    if(data.color=="red"):
        data.contents+="\nfill(255,0,0);\n"
    if(data.color=="green"):
        data.contents+="\nfill(0,255,0);\n"
    else:
        data.contents+="\nfill(0,0,255);\n"

def commitBG(data):
    """Writes the background command to the file contents."""
    data.contents+="\nbackground(255);\n"

def initBools(data):
    """This funtion inits the boolean variables controlling when
       code pieces are written."""
    data.hasWidth=False
    data.hasLength=False
    data.hasSetup=False
    data.hasBackground=False
    data.hasColor=False
    data.hasShape=False
    data.hasLocation=False
    data.hasRadius=False
    data.isComplete=False

def initJoints(data,skeleton):
    """Defines the Kinect Joints."""
    data.head=skeleton.SkeletonPositions[JointId.Head]
    data.rightHand=skeleton.SkeletonPositions[JointId.HandRight]
    data.leftHand=skeleton.SkeletonPositions[JointId.HandLeft]
    data.hip=skeleton.SkeletonPositions[JointId.HipCenter]

def init(data):
    data.contents="/*TIDE shape*/\nvoid setup()\n{\nsize("
    data.x=10
    data.y=10
    data.space=20
    data.font=pygame.font.Font(None,20)
    data.typeWords=["void"]
    blue=0,0,255
    green=0,255,0
    data.typeColors=[blue,green]
    data.plainTextColor=0,0,0
    data.lineNums=2
    data.sizeWidth=500
    data.sizeLength=500
    data.shapeColor=0,0,0
    data.shape=None
    data.backgroundColor="white"
    data.tracked=False
    data.frameCount=0
    data.headThresh=0.8
    data.displayText="User not detected"
    data.radius=100
    initBools(data)

def redrawAll(data):
    """This function handles display screens"""
    c=255,255,255
    pygame.draw.rect(data.screen,c,[0,0,800,600])
    c=0,0,0
    msg=data.displayText
    font=pygame.font.Font(None,28)
    text=font.render(msg,True,c)
    data.screen.blit(text,[20,20])
    if(data.hasShape):
        drawShape(data)
    pygame.display.flip()

def checkForComplete(data):
    """This function checks to see if every checkpoint in the code has been reached."""
    return (data.hasWidth and data.hasLength and data.hasSetup and
            data.hasBackground and data.hasColor and data.hasShape and
            data.hasLocation and data.hasRadius)


def getBGColor(data):
    """This function sets a background color"""
    data.backgroundColor="white"


def getRadius(data):
    """This function gathers radius information from the kinect."""
    if(not data.hasRadius):
        data.radius=200*abs(data.hip.y-data.head.y)
        data.hasRadius=True
        return 200*abs(data.hip.y-data.head.y)

def getColor(data):
    """Randomly picks the shape color (elif keeps red and green
       from being overwritten by the else branch)."""
    picker=random.randint(0,4)
    print picker
    if(picker==1):
        data.hasColor=True
        data.shapeColor=255,0,0
        data.color="red"
    elif(picker==2):
        data.hasColor=True
        data.shapeColor=0,255,0
        data.color="green"
    else:
        data.hasColor=True
        data.shapeColor=0,0,255
        data.color="blue"

if __name__ == '__main__':
    WINSIZE = 800, 600
    pygame.init()
    class Struct: pass
    data = Struct()
    init(data)
    data.screen = pygame.display.set_mode(WINSIZE,0,16)    
    pygame.display.set_caption('Interactive Environment.')
    data.screen.fill(THECOLORS["white"])

    with nui.Runtime() as kinect:
        kinect.skeleton_engine.enabled = True
        kinect.skeleton_frame_ready += post_frame
        # Main game loop    
        while True:
            e = pygame.event.wait()
            frame = kinect.skeleton_engine.get_next_frame()
            for skeleton in frame.SkeletonData:
                if skeleton.eTrackingState == nui.SkeletonTrackingState.TRACKED:
                    data.tracked=True
                    initJoints(data,skeleton)
                    data.displayText="Need a Shape"
                    if(not data.hasShape):
                        getColor(data)
                        # assumed reconstruction: the posted listing is truncated
                        # here; a raised head selects an ellipse, otherwise a
                        # rect, and the frame is redrawn
                        if(data.head.y>data.headThresh):
                            data.shape="ellipse"
                        else:
                            data.shape="rect"
                        data.hasShape=True
            redrawAll(data)

Ralph-Assignment-09-Sketches

This is a piece by the kinetic sculptor Zimoun. What I find enjoyable about this piece is how accessible it is: it carries no baggage of social commentary or historical references. Any viewer of any background can jump in and appreciate the piece for what it is, something silly and fun, valued purely for its formal qualities like the cacophonous sounds and simple motions. New media art feels somewhat saturated with works where the form of the piece serves the content or the technology, so it’s refreshing for me to see a piece where the form itself is the focus.

[REDACTED]

This is a work by Nils Volker. It is an installation placed on a wall that reacts to the movements of a viewer. It’s interesting how the artist was able to take an object as mundane as plastic bags and create something beautiful and intriguing. The plastic bags have an inherent property of flexibility and transience — that is, they wrinkle easily and don’t keep definite form — so it’s striking how the artist managed to make the bags move in an orderly fashion.

Ideas:

20131122_193543

The Rainbox is a compact cube with a sensor that extends under the mattress of one’s sleeping space. When it senses that someone is lying on the bed, the motor inside turns a rain stick, simulating the sound of rain.
I used to have a bit of a sleeping problem: I would lie in bed for hours on end trying to fall asleep, but my fear of the dark, the complete silence, and random thoughts about death would keep me awake. During those times, the gentle sound of rain would relax me and distract my senses enough to let me fall asleep. Other times, I would just leave the television on and muted to keep the room dimly lit; the television made me feel like others were awake at that hour, and not as alone.
I want to create a piece that hearkens back to those times and provides me, and hopefully others, with an immense sense of comfort before sleep. The motor will stop turning after about 30 minutes, and it may even be used as a gentle alarm clock. If possible, I would also like to attach a small LCD screen with a looping video of raindrops on a glass pane to keep the room dimly lit.
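The behavior is simple enough to sketch in code. Here is one hypothetical wiring in Python on a Raspberry Pi with RPi.GPIO; the pin numbers and the idea of a digital pressure switch under the mattress are assumptions, not a description of the actual build:

import time
import RPi.GPIO as GPIO

SENSOR_PIN = 17          # pressure switch under the mattress (assumed wiring)
MOTOR_PIN = 18           # motor that turns the rain stick
RUN_SECONDS = 30 * 60    # stop after about 30 minutes

GPIO.setmode(GPIO.BCM)
GPIO.setup(SENSOR_PIN, GPIO.IN)
GPIO.setup(MOTOR_PIN, GPIO.OUT)

try:
    while True:
        if GPIO.input(SENSOR_PIN):      # someone is lying down
            started = time.time()
            GPIO.output(MOTOR_PIN, GPIO.HIGH)
            while GPIO.input(SENSOR_PIN) and time.time() - started < RUN_SECONDS:
                time.sleep(1)           # keep the rain stick turning
            GPIO.output(MOTOR_PIN, GPIO.LOW)
        time.sleep(0.5)
finally:
    GPIO.cleanup()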

20131122_193553

The New Media Art is a realization of Jim Campbell’s “Formula for Computer Art”. This abomination will have a wide variety of cheap sensors and data-collecting algorithms hooked up to the computer inside the piece, as well as a variety of output media. When the user presses the “Make Art” button, the computer will choose a random method of input and a random form of output and link them together. For example, pressing the button may activate the proximity sensor, and its data may be routed to a motor that makes the joints of the piece dance. I want this to be an experiment in whether meaning can come out of random generation.
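A toy sketch of the “Make Art” button’s logic: every press links one randomly chosen input to one randomly chosen output. All of the sensor and output names here are hypothetical stubs.

import random

# stand-ins for real sensor reads and actuator commands
def read_proximity(): return random.random()
def read_light():     return random.random()
def read_sound():     return random.random()

def drive_motor(v): print("motor:", v)
def flash_leds(v):  print("leds:", v)
def play_tone(v):   print("tone:", v)

INPUTS = [read_proximity, read_light, read_sound]
OUTPUTS = [drive_motor, flash_leds, play_tone]

def make_art():
    # pick a random sensor, pipe its reading to a random output
    sense = random.choice(INPUTS)
    act = random.choice(OUTPUTS)
    act(sense())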

Get Out of My Space!

[vimeo 79357294]

by Chloe and Kristina,

with many thanks to Michelle Ma for acting for us 🙂

The following is our Fritzing diagram:

 

Clearly the epitome of style, this personal space sensor is all you’ll ever need to let those people who’ve been in your bubble too long know it’s time for them to go — or at least to take a step back.

Screen Shot 2013-11-14 at 2.30.19 PM

We used the piezo buzzer from our Arduino kits, the 7-segment display provided for the assignment, and an infrared distance sensor to sense the distance between a person and the machine.

The following images are excerpts from our planning:

20131114_155939 20131114_155928

And here is our code:



#include <Wire.h> // Enable this line if using Arduino Uno, Mega, etc.
#include "Adafruit_LEDBackpack.h"
#include "Adafruit_GFX.h"

int irReader = 2;              // analog pin for the IR distance sensor
int irVal = 0;
int buzzerPin = 9;
int songLength = 8;
int frequencies[] = {330, 262, 330, 440, 330, 262, 330, 440};
int tempo = 150;
unsigned long lastCountDownTime = 0;
boolean play = false;
int displayNum = 10;

Adafruit_7segment matrix = Adafruit_7segment();
int myNum;

void playBuzzer(boolean play) {
  // play the first two notes of the tune at the global tempo
  if (play == true) {
    for (int i = 0; i < 2; i++) {
      tone(buzzerPin, frequencies[i], tempo);
      delay(tempo);
    }
  }
}

void setup() {
  pinMode(buzzerPin, OUTPUT);
  myNum = 10;
  Serial.begin(9600);
  Serial.println("7 Segment Backpack Test");
  matrix.begin(0x70);
}

void loop() {
  unsigned long now = millis();
  irVal = analogRead(irReader);
  Serial.println(irVal);

  if (irVal > 250) {                     // someone is inside the bubble
    unsigned long elapsed = now - lastCountDownTime;
    if (elapsed >= 1000) {               // tick once per second
      myNum = (myNum + 1) % 11;
      int displayNum = 10 - myNum;       // count down from 10 to 0
      lastCountDownTime = now;
      matrix.print(displayNum, DEC);
      matrix.writeDisplay();
      if (displayNum == 0) {             // still too close at zero: sound the alarm
        tone(buzzerPin, 1000, tempo);
        delay(200);
        tone(buzzerPin, 1000, tempo);
        delay(200);
        tone(buzzerPin, 1000, tempo);
      }
    }
  }
  else {
    myNum = -1;                          // reset the countdown once they back off
  }
}

 

 

Ticha-Arduino Exercises

EDIT: NEVER MIND about the issue I had with #12 – it was really dumb…

In general, I found these exercises to be relatively enjoyable (albeit a little dull) and instructive in allowing me to get acquainted with the Arduino. I actually felt that the circuit diagrams were more useful than the pictorial representations because they clearly showed how the components related to each other.

Also I couldn’t find a way to properly document #12, so please assume that since it moved somehow, it must have worked…

Finally to prevent too many people from getting Rickrolled, I decided to use my own tune for #11. 😀

Arduino Tutorials

Circuit 1:
2013-10-31 18.27.37

Circuit 2:
2013-10-31 18.26.20

Circuit 3:
2013-10-31 18.22.03

Circuit 4:
2013-10-31 15.01.12

Circuit 5:
2013-10-31 15.16.51

Circuit 6:
2013-10-31 15.44.01

Circuit 7:
2013-10-31 16.17.48

Circuit 8:
2013-10-31 16.30.11

Circuit 9:
2013-10-31 17.20.54

Circuit 10:
2013-10-31 17.29.35

Circuit 11:
2013-10-31 17.38.09

Circuit 12:
2013-10-31 17.59.00

Arduino Circuit Exercises

Yay! Arduinos!

Circuit 1
IMG_2909

Circuit 2
IMG_2910

Circuit 3 – Combined the RGB thing with the Potentiometer thing and practiced making Fritzing diagrams
circuit rainbow_bb

Circuit 4
IMG_2912

Circuit 5
IMG_2913

Circuit 6
IMG_2915

Circuit 7 – Learned not to confuse the transistor with the temperature sensor
IMG_2916

Circuit 8
IMG_2917

Circuit 9
IMG_2918

Circuit 10
IMG_2920

Circuit 11
IMG_2921

Circuit 12
IMG_2922

*Forgot to document 13 and 14

First Soldering attempt… Collaborated with Rachel!
IMG_2926

Rachel-Arduino Practice Circuits

Circuit 1
 photo Circuit1_zpsf3317656.jpg

Circuit 2
 photo Circuit2_zps35bb141a.jpg

Circuit 3
 photo Circuit3_zpsbd158798.jpg

Circuit 4
 photo Circuit4_zps2645b768.jpg

Circuit 5
 photo Circuit5_zpsd32cffdc.jpg

Circuit 6
 photo Circuit6_zpsf1a3ded4.jpg

Circuit 7
 photo Circuit7_zps787308de.jpg

Circuit 8
 photo Circuit8_zpsd08703ab.jpg

Circuit 9
 photo Circuit9_zpsc84a4496.jpg

Circuit 10
 photo Circuit10_zpsbea352de.jpg

Circuit 11
 photo Circuit11_zpsebb663ae.jpg

Circuit 12
 photo Circuit12_zps07ea80bb.jpg

Circuit 13
 photo 100_2251_zpsd344002f.jpg

Circuit 14
 photo 100_2252_zpsba3a2d64.jpg