
Elizabeth Agyemang – Final Project: SubTweets

 

 

 


As I discussed in my last post, seen here, the culture of ‘subtweeting’ on Twitter has been the source of many public arguments, feuds, and tensions between users, from everyday people to celebrities, organizations, and even politicians. The project SubTweets looks to comment on this widespread practice by parodying the act of subtweeting through the voice of submarine sandwiches. In the project, submarine sandwiches are weighed on a scale, and a tweet, or ‘subtweet’, is sent to Twitter through a Processing program that uses Temboo to post it.

SubTweets: The Project

The piece seeks to generate a conversation about how people communicate in the modern digital world. More and more, it seems, when social and personal tensions arise, people turn to forums like Twitter to vent their frustration and anger. Rather than resolving such issues, however, subtweeting often heightens them, resulting in rage wars and Twitter battles. SubTweets looks to comment on this culture through humor and food.

 

How it works:

The sandwiches are weighed on a digital scale and, based on the weight and a list of properties stored in the computer, the program recognizes the type of sub and generates a subtweet from a set of other parameters.
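Conceptually, the recognition step boils down to a weight check. Here is a minimal sketch of that logic, using the gram ranges assumed in the full code further down (roughly 100-290 g for a six-inch sub, 291-500 g for a footlong); it is an illustration, not the complete program:

// classify a reading from the scale into a sub type, using the weight
// ranges noted in the comments of the full sketch below
String classifySub(float grams) {
  if (grams >= 100 && grams <= 290) return "six-inch";
  if (grams >= 291 && grams <= 500) return "footlong";
  return "no recognizable sub on the scale";
}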


 

Implementation

Surprisingly, the process of creating this program was a lot more complicated than I initially anticipated.

The Scale(s)

When I first began to prepare for this project, I knew I needed to purchase a scale that could communicate with a computer over a USB port, sending and receiving bytes of information. I browsed through various catalog ads and looked at programmers who had also used a scale to interact with a computer. I purchased a DYMO 10 lb scale with a USB port, intending to get the device to talk to my computer the way those other sources suggested: as a USB HID (human interface device), read through libusb, an open-source library that lets a program read simple USB devices directly. As things turned out, getting the computer to read the scale wasn't my only problem; getting the scale to report back to the computer consistently was just as hard. Though I was able to get some communication between the computer and the scale, and after a lot of consultation and help from my professor, in the end another scale needed to be purchased. In the meantime, I went to work on creating the code.

 

The other scale

This scale, an older, discontinued model, connected to my computer the way I had initially assumed any USB scale would: through serial communication rather than HID. After simply installing the driver, the scale was able to communicate with my computer reliably and completely.

Failures and Success

In the end, I remained slightly disappointed that I wasn't able to get the DYMO scale working with my computer; however, the issue may in fact have stemmed from the scale itself rather than from my computer.

Code-wise, when I imagined this program I envisioned something that would be better able to distinguish between different types of sandwiches. However, because it relies on a scale, I quickly came to realize that the information I could gain from a sandwich was much more limited than I had assumed. At the moment, the program only differentiates between six-inch subs and footlongs, and the following code and comments reflect this. Also, because I wrote the comments the subs could tweet myself, there isn't as much variety, at least in the hashtags, as I would like. Even so, I am still very proud of the way the program turned out.

During critiques I got a lot of really positive feedback on my program, and I was really enthused to see people reacting so well to the humorous nature of the piece. I really feel that this was one of my most successful projects, in that I was able to create a program that, in some way, had its own personality.

The project can be found on Twitter here.

 

Code

//Importing the libraries

// Processing code for a 7010sb-scale

import processing.serial.*;
Serial myPort;  // Create object from Serial class
import com.temboo.core.*;
import com.temboo.Library.Twitter.Tweets.*;

int myData[];
int byteCount = 0; 

float oldsixtweets=0;
float oldfoottweets=0;


// Create a session using your Temboo account application details
TembooSession session = new TembooSession("marinesub", "myFirstApp", "8d605622f29f4b77b4e783db227d880a");




//checking sixinch tweets

float checktweetssix=0;
float checktweetsfoot=0;


////
//tracking changes
float minimumAcceptableSandwichWeight = 10; 
float timeAveragedAbsoluteChangeInWeight = 0; 
float prevWeight = 0;
float subWeight = 0; 
float weightAtLastTweet; 
int lastTimeITweeted = -10000; 
float currWeight = 0; 

//if the sub is a sixinch or a footlong
boolean isSixInch= false;
boolean isFootlong=false; 
float sizeSixInchSub;
float  footLongSUB;
//StringList insult; 
String [] insult=  new String [31];
String[] boasting = new String[10];
String[] hashtag = new String[7];
String[] people = new String[4];
String [] finaltweet= new String[2];
String[] both = new String[3];

//what to tweet
String [] maintweet = new String[3];
//



//generating tweets
int [] sixTweets = new int[5];
int [] footTweets = new int[5];

//making sure not the same tweet is said
int currentInsult;
int oldInsult;
int changeInsult; 
//changing whether to tweet about footlongs or six-inch subs
int update=0;

//------------------------------------
void setup() {
  size (1000, 500);
  // Run the StatusesUpdate Choreo function

  runStatusesUpdateChoreo();
  //looking at ports, choose 0
  String portName = Serial.list()[0];
  myPort = new Serial(this, portName, 2400);

  myData = new int[10]; 
  byteCount = 0;

  //-----------------------------------

  //  insult = new StringList();
  //six inch general 
  insult [0] = " ";
  insult [1] = "Every sub has to be okay with the fact that some subs taste better than others";
  insult [2] = "Man these wiches be delicious";
  insult [3] = "It's not a subtweet if you're not a sandwich";
  insult [4] = "You subtweet like a bro. And by bro, I mean burger";
  insult [5] = "nothing annoys me more than when people assume a sandwich is for them";
  insult [6] = "Eat your sub right or someone else will";
  insult [7] = "Don't assume a sub is just for you . They can literally be eaten by anyone, then you look HUNGRY.";
  insult [8] = "Did you know that you can actually execute a brutal subtweet through the ancient art of sharing?";
  insult [9] = "I never tasted something so imperfectly perfect could exist until I tried you.";
  insult [10] = "You're so hungry, you probably think this sub is for you...";
  insult [11] = "Big bites come in little wrappers";
  insult [12] = "Idk about you, but i think it's pretty stupid if you don't buy a sub just because it's smaller";
  insult [13] = "I love when people bite me or eat other things to try and make me mad. I'm an sub, but keep going because it's amusing.";
  insult [14] = "if you're really  going to eat  a sub for its size  then you need to get your priorities straight.";
  insult[15] ="I can't stand when people  share subs like why don't you just eat your own";
  //foot long general insults
  insult[16] = "Sometimes I type out a subtweet about how unheahlty u are then I delete it bc I'm like what's the point I'm gonna eat you anyway";
  insult[17] = "Bigger is always better";
  insult[18] =  "Keep munching, in the end it's always the bigger sub that comes out stronger";
  insult[19] =  "You wish you could make them full like I do";
  insult[20] = "Two of you can't measure up to me" ;
  insult[21] =  "Some Subs are bigger than others";
  insult[22] =  "If you're gonna subtweet me at least get your ingredients about me straight first ";
  insult[23] =  "Sandwiches only rain on your paddy because their jealous of your crust and tired of their crumbs";
  insult[24] =  "I'm a sandwich that never ends, you ";
  insult[25] =  "OMG get over it, they buy me cus they have stomachs that you can't fill" ;
  insult[26] =  "If you're gunna throw out your sixinch or footlong it better be sublovin. There's no time for subhaters in a relationship.";
  insult[27] =  "I am a tasty sub & don't need to subtweet I am a tasty sub & don't need to subtweet I am a tasty sub & don't need to subtweet";
  insult[28] =  "people will tell you how you taste in a subtweet before they take a bite or have a taste";
  insult[29] =  "No I don't wanna munch @you that's the point of a sandwich, to eat it";
  insult[30] = "";
  //-----------------------------------
  //String[] boasting = new String[10];

  boasting [0]="";
  boasting[1] ="Girls will subtweet and put little music emojis after it to make it look like a song "  ; 
  boasting[2] ="I can't stand when couples subtweet each other like why don't you just text them the whole world doesn't need to see your drama";

  //-----------------------------------
  //String[] hashtag = new String[6];

  hashtag[0]="";
  hashtag[1]="#yeahiateit";
  hashtag[2]="#saucystuff";
  hashtag[3]= "#subterfuge";
  hashtag[4]= "#mineisbiggerthanyours";
  hashtag[5]= "#itsmysandwichandiwantitnow";
  hashtag[6]= "#BiteMe";

  //-----------------------------------
  //String[] people = new String[4];

  people[0]= "";
  people[1]="@subway";
  people[2]= "@quickcheck";
  people[3]= "@suboneelse";
  //String[] combinations = new String[30];

  //-----------------------------------
  //String both a hashtag or/and at people
  //using concat() function 
  //  Concatenates two arrays. For example, concatenating the array { 1, 2, 3 } 
  //and the array { 4, 5, 6 } yields { 1, 2, 3, 4, 5, 6 }. Both parameters must be arrays of the same datatype. 
  both = concat(hashtag, people) ;


  ///
  size(640, 360);

  noStroke();

  //
}

int testing(int rest) {
  rest=0;
  if ((sixTweets[1])!=0) {
    sixTweets[2]=rest;
  } else {
    sixTweets[2]=sixTweets[2];
  }

  return rest;
}


//-----------------------
void draw() {
  background(0);
  //-----------------------------------
  //six INCH sub Tweets
  //a six inch sub weighs from 100 to 290 grams

  //say nothing
  sixTweets[0] = 0;

  //insults
  sixTweets[1] = (int)random(1, 16);
  //boasting
  sixTweets[2] = (int)random(2);

  //#hashtag
  sixTweets[3] = (int)random(6);

  //@aperson
  sixTweets[4] = (int)random(3);

  //----------------------------------
  //a footlong is from 291 to 500grams

  //int [] footTweets = new int [4];
  footTweets [0] = 0;
  //insults
  footTweets[1]= (int)random(16, 30);
  //boasting
  footTweets[2] = (int)random(2);
  //#hashtag
  footTweets[3] = (int)random(6);
  //@aperson
  footTweets[4]= (int)random(3);

  ///



  // Read data from the scale until it stops reporting data. 
  while ( myPort.available () > 0) {  
    int val = myPort.read();         
    if (val == 2) {
      // Figure out when the message starts
      byteCount = 0;
    }
    // Fill up the myData with the data bytes in the right order
    myData[byteCount] = val; 
    byteCount = (byteCount+1)%10;
  }


  int myWeight = getWeight();//most recent value of the weight
  subWeight= myWeight;
  float changeInWeight = (subWeight - prevWeight);
  float absoluteChangeInWeight = abs(changeInWeight); 

  float A = 0.97;
  float B = 1.0-A; 
  timeAveragedAbsoluteChangeInWeight = A*timeAveragedAbsoluteChangeInWeight + B*absoluteChangeInWeight; 


  //saying that you can't boast and insult at the same time for sixinchtweets
  if ((sixTweets[1])!=0) {
    sixTweets[2]=0;
  } else {
    sixTweets[2]=sixTweets[2];
  }
  if ((sixTweets[3]!=0)) {
    sixTweets[4]=0;
  } else {
    sixTweets[4]=sixTweets[4];
  }


  //saying that you can't boast and insult at the same time for footlong
  if ((footTweets[1])!=0) {
    footTweets[2]=0;
  } else {
    footTweets[2]=footTweets[2];
  }
  if ((footTweets[3]!=0)) {
    footTweets[4]=0;
  } else {
    footTweets[4]=footTweets[4];
  }

  //----------------------------------------------

  //checking if the same tweets are being said
  oldInsult= sixTweets[1];
  changeInsult = (sixTweets[1]-oldInsult);

  //telling it to tweet if it is in this area
  sizeSixInchSub= constrain(subWeight, 100, 290);
  footLongSUB = constrain(subWeight, 291, 600);
  //sixInch();
  //  } else {
  println(" ");
  // }


  fill(255);
  strokeWeight(5); 




  //If the sandwich weighs more than some negligible amount, then bother tweeting;
  //  // Don't bother tweeting about an empty scale, even if the weight has settled down (to nothing)
  if (subWeight > minimumAcceptableSandwichWeight) {
    //
    //    // If the change in weight has settled down (the weight is no longer changing)
    if (timeAveragedAbsoluteChangeInWeight < .02) {//was.02

        oldsixtweets= oldsixtweets;
      oldfoottweets= oldfoottweets;
      //if the weight of the sub is in the range that a six inch is, tweet what a six inch sub is
      checkconstraints();

      int now = millis(); 
      int elapsedMillisSinceILastTweeted = now - lastTimeITweeted; 
      //if weight change has past the oldtweets value changes

      //      // and if it has been at least 10 seconds since the time I tweeted
      if (elapsedMillisSinceILastTweeted > 10000) {//was 10000

          float weightDifference = abs(subWeight - weightAtLastTweet);
        if (weightDifference > 10) {
          //          // Then send the tweet
          if (isSixInch) {

            //if it is a six inch sub, check by making checktweet the value of the tweet
            if (sixTweets[1]!=oldsixtweets) {
              update=1;


              sixInch();
              runStatusesUpdateChoreo();


              checktweetssix=sixTweets[1];
            }
            println("maintweet", maintweet, "sixinchtwee/", sixTweets, "foottwees:", footTweets);
            checktweetsfoot=0;
            isSixInch=false;
          } 



          //if the sub is a footlong call the function for footlong tweets
          if (isFootlong) {
            update=2;
            //if the old footlong tweet is not new to the new footlong tweet, call the function
            if (oldfoottweets!=footTweets[1]) {

              footlong();
              println("They arent equal");
              runStatusesUpdateChoreo();
            } else {

              println("They are equal");
            }
          }
          isFootlong= false;
          //
        }
      }
    }
  }

  prevWeight = subWeight;


  println("checktweetsix:", checktweetssix, "checktweetsfoot:", checktweetsfoot, maintweet[update] );
}

//function that parses the weight string reported by the scale and returns it as an int
int getWeight() {

  /*
  // https://learn.adafruit.com/digital-shipping-scales/using-a-7010sb-scale
   STX character - Hex 0x02, indicates "Start of TeXt"
   Scale indicator - Hex 0xB0 for lb/oz and 0xC0 for grams
   Hex 0x80 (placeholder)
   Hex 0x80 (placeholder)
   First character of weight, ascii format
   Second character of weight, ascii format
   Third character of weight, ascii format
   Fourth character of weight, ascii format - single Ounces in Lb/Oz weight
   Fifth character of weight, ascii format - 1/10th Ounces in Lb/Oz weight
   Finishing carriage return - Hex 0x0D
   */

  String weightString = ""; 
  if (myData[1] == 192) { // grams yo
    for (int i=4; i<=8; i++) {
      if ((myData[i] >= 48) && (myData[i] <= 57)) { // only want characters between '0' and '9'
        weightString += (char)myData[i];
      }
    }
  } else {
    println ("Scale is using pounds/ounces, yuck");
  }

  int aWeight = 0; 
  if (weightString.length() > 0) {
    aWeight = Integer.parseInt(weightString);
  }
  return aWeight;
}




//function that tweets when the sixinch sub is called
void sixInch() {
  println("SubWeight:", subWeight, "sizeSixInchSub", abs(sizeSixInchSub));
  fill(255, 180);

  text((insult [sixTweets[1]]) + (boasting[sixTweets[2]])+(hashtag[sixTweets[3]])+(people[sixTweets[4]]), 50, 50);
  maintweet[1]= (insult [sixTweets[1]]) + (boasting[sixTweets[2]])+(hashtag[sixTweets[3]])+(people[sixTweets[4]]);
  //  println("changeInsult:", changeInsult, "oldInsult:", oldInsult,"subWeight:", subWeight, "mouseX:", mouseX);

  lastTimeITweeted = millis();
  weightAtLastTweet = subWeight;
  oldsixtweets=sixTweets[1];


  //making it so that a six inch is no longer true
}

void footlong() {
  println("SubWeight:", subWeight, "sizeSixInchSub", abs(sizeSixInchSub));

  fill(255, 180);
  text((insult [footTweets[1]]) + (boasting[footTweets[2]])+(hashtag[footTweets[3]])+(people[footTweets[4]]), 50, 50);

  maintweet[2]=(insult [footTweets[1]]) + (boasting[footTweets[2]])+(hashtag[footTweets[3]])+(people[footTweets[4]]);
  //  println("changeInsult:", changeInsult, "oldInsult:", oldInsult,"subWeight:", subWeight, "mouseX:", mouseX);

  lastTimeITweeted = millis();
  weightAtLastTweet = subWeight;
  oldfoottweets=footTweets[1];
}

void checkconstraints() {

  if (subWeight==sizeSixInchSub) {

    isSixInch=true;
    isFootlong=false;
  } else if (subWeight==footLongSUB) {
    //if the weight of the sub is in the range that a footlong is, tweet what a footlong sub is
    isFootlong=true;
    isSixInch=false;
  } else if (subWeight==0) {
    isSixInch=false;
    isFootlong=false;
  }
  if (subWeight==0) {
    println("it's zero");
  }
}
void noWeight() {
  if (subWeight==0) {
    isSixInch=false;
    isFootlong=false;
  }
}




void runStatusesUpdateChoreo() {
  //  // Create the Choreo object using your Temboo session
  StatusesUpdate statusesUpdateChoreo = new StatusesUpdate(session);
  //
  //  // Set credential
  statusesUpdateChoreo.setCredential("subtweetingprogram");
  //
  //  // Set inputs
  statusesUpdateChoreo.setAccessToken("2874650849-kXG8QbNDe68kp1eFJFzTmQ70n7QN0RgH7txPTCL");
  statusesUpdateChoreo.setAccessTokenSecret("wlyBaIu491BgvjH18uHG1ZeDptl6jvrfCg02NwhZ5EwI9");
  statusesUpdateChoreo.setConsumerSecret("g8KCtyJjKDI5MciTvYfRLWhO8kYfP7o6UhRPM7HZkvNrT2KrFo");
  statusesUpdateChoreo.setStatusUpdate(maintweet[update]);
  statusesUpdateChoreo.setConsumerKey("t0BheAwlLdNESJqwmVJD2mHaK");
  //  // Run the Choreo and store the results
  StatusesUpdateResultSet statusesUpdateResults = statusesUpdateChoreo.run();
  //
  // Print results
//  println(statusesUpdateResults.getResponse());
  }

//

 

Sources of other programmers communicating with scales and other HID devices

http://tim.cexx.org/?p=470

https://code.google.com/p/java-simple-serial-connector/

http://coffeefueledcreations.com/blog/?p=131

Reading a Dymo USB scale using Python

New Media Manifesto – Will Taylor

The verses below were created with the help of a program, written in Python, that generates statements of intent for the “New Media” artistic movement.

They filter lasting proximities for one that is frontier.

Filter that sensation beneath next data.

Immersive light past thine silence,

Positively filtering physical senses.

Any and all frontiers personify through metaphysics.

Us trolls detect all arts and culture within thine sound.

Processing silence behind fewest lights,

New Media senses the cultural silence.

New Media senses the ethereal arts and culture alongside evaluating generations.

Filtering apparatuses, it senses light amongst every moment.

New Media uncovers intimacies.

New Media denotes the evocative vision and modernity,

Past personifying sensations.

Straddling post-humanism, it will one day uncover glimpses near lasting simulation.

New Media hopes to substantiate divine materialists.

 

[Screenshots of two generated manifesto images]

 

1. A background for each statement is randomly chosen from a selection of long-exposure photographs I shot in early December 2014.

2. The title, ‘New Media’, is placed at the top of the image.

3. The statement of intent is now generated. Three lines of text are created using various sentence-structure templates. The computer selects a string for each syntactic component of a template and combines them to create a line of text. Each selection is taken from a list of strings containing words of the indicated part of speech. These lines of text are combined into a verse and placed onto the image.

4. The image is now saved as a .jpg file to a folder in Dropbox titled “New Media Manifesto”. The name of the file is determined by the month, day, hour, minute, and second the program is run, in order to avoid duplicate filenames.

5. When a new image is saved into the “New Media Manifesto” folder, IFTTT (If This Then That) automatically uploads the image to my Twitter (@WTVisual) and a Tumblr account I created for the assignment (NewMediaManifesto.tumblr.com).

 


 

 

'''
Will Taylor
New Media Manifesto
EMS Final
'''
import random
backdrops = [ "bg1.jpg", "bg2.jpg", "bg3.jpg", "bg4.jpg",
"bg5.jpg", "bg6.jpg", "bg7.jpg", "bg8.jpg",
"bg9.jpg", "bg10.jpg"]

def getNewPhoto():
    photo = backdrops[random.randint(0, len(backdrops)-1)]
    return photo

def setup():
    global img
    photo = getNewPhoto()
    print photo
    img = loadImage(photo)
    size(img.width, img.height)


def draw():
    h, m, sec = hour(), minute(), second()
    mon, d = month(), day()
    smooth()
    image(img, 0, 0)
    drawTitle()
    drawLetterhead()
    save("/Users/WillT/Dropbox/NewMediaManifesto/NM%d%d%d%d%d.jpg" % (mon, d, h, m, sec))
    noLoop()

def drawTitle():
    strokeWeight(2)
    stroke(255)
    f = createFont('Superclarendon-BlackItalic', 115)
    f2 = createFont('Superclarendon-BlackItalic', 115)
    textFont(f2)
    fill(255)
    textAlign(CENTER)
    text("NEW MEDIA", width/2 - 2, width/5, 100)
    textFont(f)
    fill(0)
    text("NEW MEDIA", width/2, width/5, 50)

def drawLetterhead():
    letterhead = newLetterhead(pronouns, nouns, adjectives, verbs, adverbs, prepositions, determiners)
    f = createFont('Futura', 28)
    f2 = createFont('Futura', 28)
    textFont(f2)
    fill(0)
    textAlign(CENTER)
    text(letterhead, width/2 - 2, 3*width/9, 100)
    textFont(f)
    fill(255)
    text(letterhead, width/2, 3*width/9, 50)

singular, plural = 0, 1
active, capitalized = 1, 2

# nouns[0] = singular, nouns [1] = plural
nouns = [ ["silence", "light", "sound", "gear",
"mechanic", "internet", "technology",
"network", "sensor", "aesthetic", "frontier",
"sensation", "magic", "interaction",
"glimpse", "moment", "immersion", "space",
"Digital Space", "interconnectivity", "telepresence", "transmission",
"simulation", "metaphysics", "data", "post-humanism",
"Concept", "machine", "metaphor", "intimacy",
"proximity", "approach", "arts and culture", 'the embodiment',
'optimism', 'body', 'fanatics',

'ontology','material','materiality','environment',
'simulated environment','media-realism','temporality','phenomenological approach',
'complexity','paradox','apparatus','aurality',
'evolution','progress','vision','modernity',
'vision and modernity','inner speech','essence','ideology',
'objectivity','sense','senses','future',
'present','technoculture','invocation','meme',
'transubstantiation','divine transubstantiation','normativity','occularcentrism',
'singularity','sonority','impermanence','elite',
'synesthesia','amplitude','fluidity','evanescence',
'artificial intelligence','artificial life','generativity','imagination',
'visionary','objectivity','object','materialism', 'elitism',
'foundation','root','originality','the Digital Age', 'the Digital Era',
'society', 'experience', 'fanatic', 'oxymoronic concept of "inner speech"',
'uniformity'],

[ "silences", "lights", "sounds", "gears",
"mechanics", "internet", "technologies",
"networks", "sensors", "aesthetics", "frontiers",
"sensations", "magics", "interactions",
"glimpses", "moments", "immersions", "spaces",
"Digital Spaces", "interconnectivities", "telepresences", "transmissions",
"simulation", "metaphysics", "data", "post-humanism",
"concepts", "machines","metaphors", "intimacies",
"proximities", 'approaches', 'arts and culture', 'embodiments',
'optimisms', 'bodies', 'fanatics',

'materials','materialities','environments','simulated environment',
'media-realisms','temporalities','phenomenological approaches','complexities',
'paradoxes','apparatuses','auralities','evolutions',
'progressions','visions','visionaries','modernities',
'inner monologues','ideologies','essences','objects',
'objectivity','objectives','senses','futures',
'technocultures','invocations','memes','transubstantiations',
'divine transubstantiations','normativities','occularcentrisms','concepts of singularity',
'sonorities','impermanences','elites','elitists',
'synesthetics','amplitudes','fluidities','artificial intelligences',
'generations','imaginations','dreams','visionaries',
'materialists','materialisms','foundations','roots',
'societies', 'cultures', 'experiences', 'uniformities']
]

pronouns = ["New Media", "New Media", "New Media", "New Media", # (New media == life)
"New Media", "New Media", "New Media", "New Media",
"New Media", "New Media", "New Media", "New Media"]

adjectives = [ "new", "cold", "interactive", "magical",
"ubiquitous", "mystical", "immersive", "digital",
"auditory", "total", "metaphysical", "conceptual",
"further", "beyond", "disembodying", "simulated",
"physical", "emotional", "aural", "cosmic",
"ethereal", "magical", "connected", "dynamic",
"intimate", "cultural", "significant", "virtual",
"synthetic", "philisophical", "intellectual", "personified",
"hacktivist",

'synesthetic','technocultural','phenomenological','paridoxical',
'ontological','transcendental','technological','progressive',
'impermanent','dominant','ideal','sensitive',
'present','modern day','evocative','ephemeral',
'embodied','metaphysical','imaginative','fluid',
'artificial','presented','amplified','exemplified',
'exclamative','occularcentric','societal','cultural',
'divine','sonorous','elitist','conceptual',
'environmental','digital','spatial','electric',
'energetic','abstract','implied','associative',
'inquisitive','','','',]

verbs = [
[ "senses", "connects", "catches", "immerses",
"envelops", "points", "clicks", "transmits",
"transcends", "points towards", "manuevers", "paves",
"moves", "processes", "restores", "aborts",
"deletes", "detects", "disembodies", "filters",
"straddles", "derives", "trolls", "",
"hacks", "personifies", "operates", "destroys",
'reapproaches', 'massacres', 'redefines', 'defines',

'questions','ponders','approaches','justifies',
'seeks to find','discovers','finds','denotes',
'searches for','evaluates','rediscovers','uncovers',
'develops','derives from','is','was',
'will be','hopes to find','hopes to maintain','maintains',
'objectifies','glorifies','complicates','reimagine',
'imagines','will one day uncover','hopes to substantiate','portrays'],

#active verbs
["sensing", "connecting", "catching", "immersing",
"enveloping", "pointing",'pointing at', 'pointing towards', "clicking", "transmitting",
"transcending", "manuevering through", 'manuevering past', "paving",
"moving", "processing", "restoring", "aborting",
"deleting", "detecting", "disembodying", "filtering",
"straddling", "deriving", "trolling", "reassessing",
"hacking", "personifying", "operating", "destroying",
'reapproaching', 'the massacre of', 'redefining', 'defining',

'questioning','pondering','approaching','justifying',
'seeking to find','discovering','finding','denoting',
'searching for','evaluating','rediscovering','uncovering',
'developing','deriving','being',
'hoping to find','hoping to maintain','maintaining',
'objectifying','glorifying','complicating','reimagining',
'imagining','the uncovering of','the hope of substantiating','portraying'],

##capitalized
["Sensing", "Connecting", "Catching", "Immersing",
"Enveloping",'Pointing at', 'Pointing towards', "clicking", "transmitting",
"Transcending", "Manuevering through", 'manuevering past', "paving",
"Moving", "Processing", "restoring", "aborting",
"Deleting", "Detecting", "Disembodying", "Filtering",
"Straddling", "Deriving", "Trolling", "Reassessing",
"Hacking", "Personifying", "Operating", "Destroying",
'Reapproaching', 'The massacre of', 'Redefining', 'Defining',

'Questioning','Pondering','Approaching','justifying',
'Seeking to find','Discovering','Finding','Denoting',
'Searching for','Evaluating','Rediscovering','Uncovering',
'Developing','Deriving','Being',
'Hoping to find','Hoping to maintain','Maintaining',
'Objectifying','Glorifying','Complicating','Reimagining',
'Imagining','The uncovering of','The hope of substantiating','Portraying']

]

prepositions = ["aboard", "about", "above", "across",
"below", "beneath", "beside", "from",
"inside", "into", "like", "near",
"over", "past", "within", "among",
"amongst", "interior", "on", "til",
"toward", "under", "underneath", "upon",
"after", "before", "behind", "except",
'through','throughout','contained in','held within',
'defined by','in association with','in its relation to','by questioning',
'in the questioning of','in its explorations','while exploring',
'in its exploration of']

adverbs = [ "quickly", "bravely", "promptly", "silently",
"slowly", "stylishly", "abnormally", "non-conventionally",
"boldly", "astonishingly", "currently", "mechanically",
"materially", "literally", "magically", "negatively",
"positively", "vivaciously", "vigilantly", "unemotionally",
"scientifically", "structurally", "spectacularly", "visually",
"yearningly", "truthfully", "figuratively", "metaphorically",
"usefully", "uselessly", "systematically", "poetically",
"ominously", "overconfidently", "negatively", "positively",

'capriciously','inpermanently','permanently','sonically',
'socially','culturally','transcendentally','phenomenally',
'intelligently','progressively','synesthetically','abstractly',
'occularcentrically','temporarily','materially','conceptually',
'technologically','simply','easily','swiftly',
'exponentially','deeply','thoroughly','briefly',
'completely','entirely','uniformly','meticulously',
'broadly','specifically','efficiently','productively']

determiners = ["hella", "my", "certain", "any and all",
"enough", "next", "neither", "other",
"thine", "many", "each and every", "fewer",
"fewest", "most", "some kind of", "some sort of",
"last", "lasting", "every",
"every", "this and that", "that and this", "this of a",
"one that is", "yonder", "several", "that",
'this','each','all that is','everything that is',
'everything that contains','worldly','an','all',
'the entirety of','whichever','whatever','our',
'any','the most pronounced','obvious cases of','cases of',
'situations involving','involvements in','pertaining to',
'savory','our',"the world's",'American', 'does not',
'aims to']



def newLetterhead(pronouns, nouns, adjectives, verbs, adverbs, prepositions, determiners):
    singular, plural, active = 0, 1, 1
    # v - adj - n(singular) - prep - ve - n(plural)
    line1 = "\n %s %s the %s %s \n %s %s %s..." % (pronouns[random.randint(0, len(pronouns)-1)],
        verbs[0][random.randint(0, len(verbs[0])-1)],
        adjectives[random.randint(0, len(adjectives)-1)],
        nouns[singular][random.randint(0, len(nouns[singular])-1)],
        prepositions[random.randint(0, len(prepositions)-1)],
        verbs[active][random.randint(0, len(verbs[active])-1)],
        nouns[plural][random.randint(0, len(nouns[plural])-1)])

    # v - adj - n(plural) - v - n(plural) - prep - det - n(plural)
    line2 = "%s %s %s, \n it %s %s %s %s %s." % (verbs[capitalized][random.randint(0, len(verbs[capitalized])-1)],
        adjectives[random.randint(0, len(adjectives)-1)],
        nouns[plural][random.randint(0, len(nouns[plural])-1)],
        verbs[0][random.randint(0, len(verbs[0])-1)],
        nouns[plural][random.randint(0, len(nouns[plural])-1)],
        prepositions[random.randint(0, len(prepositions)-1)],
        # nouns[singular][random.randint(0, len(nouns[singular])-1)],
        determiners[random.randint(0, len(determiners)-1)],
        nouns[plural][random.randint(0, len(nouns[plural])-1)])

    # pn - v - adj - n(plural)
    line3 = " %s %s %s %s" % (pronouns[random.randint(0, len(pronouns)-1)],
        verbs[0][random.randint(0, len(verbs[0])-1)],
        adjectives[random.randint(0, len(adjectives)-1)],
        nouns[plural][random.randint(0, len(nouns[plural])-1)])

    letterhead = "%s \n %s \n %s." % (line1, line2, line3)
    return letterhead

newLetterhead(pronouns, nouns, adjectives, verbs, adverbs, prepositions, determiners)

 

Final Project : Voices In Your Head Beginnings

My final project consists of a chunk of my larger project, Voices In Your Head. This project utilizes a Soundlazer to project sound to unsuspecting passers-by without revealing its location. The properties of parametric speakers allow one to ‘bounce’ sound off of surfaces to the ears of the intended recipient. Using a Kinect and site-specific trigonometric calculations, the speakers position themselves to target specific people within their range. The recipient's ears can only track the sound to the last surface it bounced off of, giving the illusion that a wall or some other inanimate object is talking to them. A diagram of this phenomenon is displayed below:

[Diagram: sound bounced off a surface toward an unsuspecting listener]

The sound will be controlled via a Pure Data sketch run on the Raspberry Pi. The subject matter of the audio will differ depending on the number of people within audible proximity of the installation. Even when there are people nearby, the speakers will only emit sound for a small fraction of the total number of people walking by. Also, if anyone approaches the perceived origin of the sound, the speaker will stop emitting sound, so as not to be discovered. If a group of two or more people is near the speaker, the Kinect will detect them and the speaker will stay silent. A person by themselves, however, will have a voice whisper in their ear. The voice will whisper messages intended to cause paranoia, shame, or megalomania (e.g. "(s)he's cheating on you", "you are God", "I hate you", etc.). If someone brings someone else to hear it, they will hear nothing. Not only this, but even when people are alone, the speaker will only trigger a very small percentage of the time that its conditions are met.
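The triggering rules above reduce to a small decision function. This is only a sketch of that logic written Processing-style, not the actual Pure Data / Raspberry Pi implementation; the person count and the distance of the nearest person to the sound's perceived origin are assumed to come from the Kinect tracking (both parameters here are hypothetical):

// decide whether the parametric speaker should whisper right now
boolean shouldWhisper(int peopleInRange, float metersToPerceivedOrigin) {
  if (peopleInRange != 1) return false;             // groups (or nobody) hear nothing
  if (metersToPerceivedOrigin < 1.0) return false;  // someone is close enough to discover the speaker
  return random(1) < 0.05;                          // even for a lone person, trigger only a small fraction of the time
}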

Final Project Alex Walker

Finger Painting: The Painter’s Technological Aid

For this final project, I wanted to create something that I could potentially use in my painting practice. The first day we talked about sensors, I was fascinated by the idea of being able to control color with the movements of one's body. The original intent was a library of color that the painter had at his or her disposal, in case they could not think of the right color to use. It then progressed into a light source that causes the pigmentation in the painting to change color as you control the color with your hand. This hopefully creates interesting compositions while the artist is painting, because the shapes and colors being shown are constantly changing. I used the Leap Motion hand-tracking sensor as my primary tool. Using the code that translates the information from the sensor into Processing, I connected the coordinates of the palm to the levels of red, green, and blue: red is controlled by movement on the x axis, green by movement on the y axis, and blue by movement on the z axis.

import de.voidplus.leapmotion.*;
boolean sketchFullScreen() {
  return true;
}

LeapMotion leap;

void setup(){
  size(displayWidth, displayHeight, OPENGL);
  background(255);
  // ...

  leap = new LeapMotion(this);
}

void draw(){
  background (255);
  // ...
  int fps = leap.getFrameRate();


  // ========= HANDS =========
    
  for(Hand hand : leap.getHands()){


    // ----- BASICS -----
        
    int     hand_id          = hand.getId();
    PVector hand_position    = hand.getPosition();
    PVector hand_stabilized  = hand.getStabilizedPosition();
    PVector hand_direction   = hand.getDirection();
    PVector hand_dynamics    = hand.getDynamics();
    float   hand_roll        = hand.getRoll();
    float   hand_pitch       = hand.getPitch();
    float   hand_yaw         = hand.getYaw();
    boolean hand_is_left     = hand.isLeft();
    boolean hand_is_right    = hand.isRight();
    float   hand_grab        = hand.getGrabStrength();
    float   hand_pinch       = hand.getPinchStrength();
    float   hand_time        = hand.getTimeVisible();
    PVector sphere_position  = hand.getSpherePosition();
    float   sphere_radius    = hand.getSphereRadius();

  
    // ----- SPECIFIC FINGER -----
        
    Finger  finger_thumb     = hand.getThumb();
    // or                      hand.getFinger("thumb");
    // or                      hand.getFinger(0);

    Finger  finger_index     = hand.getIndexFinger();
    // or                      hand.getFinger("index");
    // or                      hand.getFinger(1);

    Finger  finger_middle    = hand.getMiddleFinger();
    // or                      hand.getFinger("middle");
    // or                      hand.getFinger(2);

    Finger  finger_ring      = hand.getRingFinger();
    // or                      hand.getFinger("ring");
    // or                      hand.getFinger(3);

    Finger  finger_pink      = hand.getPinkyFinger();
    // or                      hand.getFinger("pinky");
    // or                      hand.getFinger(4);        


    // ----- DRAWING -----
        
    hand.draw();
    // hand.drawSphere();
    // println(hand.getPosition());

    float myhandx = hand.getPosition().x;
    float myhandy = hand.getPosition().y;
    float myhandz = hand.getPosition().z;
    //println ("myhandx = " + myhandx); 
    //println ("myhandy="+myhandy);
    println ("myhandz="+myhandz);
    float myred = map(myhandx, 0,displayWidth, 0,255); 
    float mygreen= map(myhandy, 0,displayHeight, 0,255);
    float myblue= map(myhandz, 110,-25, 0,255);
    background (myred, mygreen, myblue);
    //fill (myred, mygreen,myblue); 
    //rect(0,0, displayWidth, displayHeight); 
    //textSize(displayHeight/2);
    //text("color", 100, displayHeight/2); 
    //fill(mygreen,myblue,myred);

    // ========= ARM =========
        
    if(hand.hasArm()){
      Arm     arm               = hand.getArm();
      float   arm_width         = arm.getWidth();
      PVector arm_wrist_pos     = arm.getWristPosition();
      PVector arm_elbow_pos     = arm.getElbowPosition();
    }
    


    // ========= FINGERS =========
        
    for(Finger finger : hand.getFingers()){
      
      
      // ----- BASICS -----
            
      int     finger_id         = finger.getId();
      PVector finger_position   = finger.getPosition();
      PVector finger_stabilized = finger.getStabilizedPosition();
      PVector finger_velocity   = finger.getVelocity();
      PVector finger_direction  = finger.getDirection();
      float   finger_time       = finger.getTimeVisible();


      // ----- SPECIFIC FINGER -----
            
      switch(finger.getType()){
        case 0:
          // System.out.println("thumb");
          break;
        case 1:
          // System.out.println("index");
          break;
        case 2:
          // System.out.println("middle");
          break;
        case 3:
          // System.out.println("ring");
          break;
        case 4:
          // System.out.println("pinky");
          break;
      }


      // ----- SPECIFIC BONE -----
            
      Bone    bone_distal       = finger.getDistalBone();
      // or                       finger.get("distal");
      // or                       finger.getBone(0);
            
      Bone    bone_intermediate = finger.getIntermediateBone();
      // or                       finger.get("intermediate");
      // or                       finger.getBone(1);
            
      Bone    bone_proximal     = finger.getProximalBone();
      // or                       finger.get("proximal");
      // or                       finger.getBone(2);

      Bone    bone_metacarpal   = finger.getMetacarpalBone();
      // or                       finger.get("metacarpal");
      // or                       finger.getBone(3);
            
            
      // ----- DRAWING -----
            
      // finger.draw(); // = drawLines()+drawJoints()
      // finger.drawLines();
      // finger.drawJoints();


      // ----- TOUCH EMULATION -----
            
      int     touch_zone        = finger.getTouchZone();
      float   touch_distance    = finger.getTouchDistance();
      
      switch(touch_zone){
        case -1: // None
          break;
        case 0: // Hovering
          // println("Hovering (#"+finger_id+"): "+touch_distance);
          break;
        case 1: // Touching
          // println("Touching (#"+finger_id+")");
          break;
        }
      }
    
    
      // ========= TOOLS =========
        
      for(Tool tool : hand.getTools()){
      
      
        // ----- BASICS -----
            
        int     tool_id           = tool.getId();
        PVector tool_position     = tool.getPosition();
        PVector tool_stabilized   = tool.getStabilizedPosition();
        PVector tool_velocity     = tool.getVelocity();
        PVector tool_direction    = tool.getDirection();
        float   tool_time         = tool.getTimeVisible();
      
      
        // ----- DRAWING -----
            
        // tool.draw();
      
      
        // ----- TOUCH EMULATION -----
            
        int     touch_zone        = tool.getTouchZone();
        float   touch_distance    = tool.getTouchDistance();
      
        switch(touch_zone){
          case -1: // None
            break;
          case 0: // Hovering
            // println("Hovering (#"+tool_id+"): "+touch_distance);
            break;
          case 1: // Touching
            // println("Touching (#"+tool_id+")");
            break;
        }
      }
  }
  
  
  // ========= DEVICES =========
    
  for(Device device : leap.getDevices()){
    float device_horizontal_view_angle = device.getHorizontalViewAngle();
    float device_verical_view_angle = device.getVerticalViewAngle();
    float device_range = device.getRange();
  }
    
}

// ========= CALLBACKS =========

void leapOnInit(){
  // println("Leap Motion Init");
}
void leapOnConnect(){
  // println("Leap Motion Connect");
}
void leapOnFrame(){
  // println("Leap Motion Frame");
}
void leapOnDisconnect(){
  // println("Leap Motion Disconnect");
}
void leapOnExit(){
  // println("Leap Motion Exit");
}

Looking Outward Final Project

“Nomis is a musical instrument created with the aim of making loop based music more expressive and transparent through gesture and light.” For this project, the artist, Jonathan Sparks, uses Max/MSP to allow a viewer to become a music creator by using their hands to activate certain positions on a circle and two vertical columns.

“Voice Lessons is an electronic audio device that interrogates the popular myth that every musical instrument imitates the human voice. Touching the screen allows the participant to manipulate the visuals and vocalizations of the “voice teacher” as he recites vocal warm-up exercises.” This piece, even though we have already viewed it in class, is important to my final project because it engages the viewer's hand movement in controlling different aspects of the sound of the video. These aspects include speed, pitch, and vowel sound. However, it is not as successful for me because the viewer is forced to engage with a screen and does not have quite as free a range of motion as I would like my piece to have.

“Move is a technology garment designed by Electricfoxy that guides you toward optimal performance and precision in movement in an ambient, precise, and beautiful way.” I chose this piece because I enjoyed how it is a wearable object, similar to the glove I'm planning on using, that coordinates the performer's movements and visually displays their movement. Through these displays, the performer is also able to see the improvements the clothes calculate they could make in order to add precision to their form.

All of these other projects are interesting to me because they actively engage the viewer and require a gesture or movement of the body in order to create a sound. This translation from a tactile sense to an auditory/visual sense is a theme I am looking for in my final project.

 

 

Looking Outwards : Mostly Virtual Networked Objects

INSPIRING

Orange Soapstone : Dark Souls
While this networked object exists solely within the virtual realm of the video game Dark Souls, its core functionality is very applicable to the real world. The description of the object is as follows:

“In Lordran, the flow of time is distorted, and messages allow Undead to assist (or deceive) one another.”

The item allows players to mark messages onto a specific location in the game environment and leave them for other players. Mind you, the game is primarily single-player, as each character exists in their own instance of the Dark Souls universe. However, the messages cross the border between these instances, allowing players to communicate via these marks.

In game, these amount to messages like “watch out for the falling boulder ahead” when one is about to encounter a booby trap, or something along those lines. However, real life lends itself to much more variety in terms of location specific data (“this guy is a creep” marked in front of someone’s house, or “the wings are bomb” in front of a diner).

DISAPPOINTING

Red and Blue Teleporter Pyramids : Divinity : Original Sin

This object exists in pairs. Activating one pyramid teleports the user to the other. One can leave one pyramid in a safe position, venture out into the wilderness, and use the other one to teleport back to safety when needed. My only disappointment with this object is that it does not exist in real life, but the simple way that the pair of objects relate inspires an array of different strategies and uses. Perhaps teleportation of the body back and forth between objects is impossible, but other forms of exchange are fair game.

SURPRISING

Songdo International Business District

The Songdo International Business District is the first actual smart city; that is, a city built from scratch mostly out of networked components. I was surprised that a networked project of this scale is nearing completion at this very moment. The city features voice and facial recognition sensors for security, among other city-spanning networked systems.

final sketches


Here are some very crude sketches of what I've been thinking about making. I want to make a rectangular grid of LED lights that will wave washes of gradient colors across it depending on the sounds around it. Upon completing said grid, I wish to use mirrors to transform the space the light travels between.
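To give a rough idea of the color washes I have in mind, here is a small on-screen mock-up in Processing. It is only an illustration: the loudness value is faked with the mouse position, whereas the real grid would react to a microphone or other sound input.

// a grid of virtual "LEDs" whose hues wash across the grid;
// louder input speeds up the wave and brightens the cells
int cols = 16;
int rows = 10;

void setup() {
  size(640, 400);
  noStroke();
  colorMode(HSB, 255);
}

void draw() {
  float loudness = map(mouseX, 0, width, 0, 1);  // stand-in for the surrounding sound level
  float cellW = width / float(cols);
  float cellH = height / float(rows);
  for (int x = 0; x < cols; x++) {
    for (int y = 0; y < rows; y++) {
      float phase = millis() * 0.001 * (0.5 + 2 * loudness);
      float hue = (x * 8 + y * 4 + phase * 40) % 255;
      fill(hue, 200, 100 + 155 * loudness);
      rect(x * cellW, y * cellH, cellW - 2, cellH - 2);
    }
  }
}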

:)

Scribbles for the Final Project: The Synesthesia Machine*

*or at least, that’s the name until I come up with something cooler


 

The assignment page still doesn't have details at the moment, so I'm just going to play it safe and chatter a bit about my plan. Essentially, I want to take audio input from any music source (your computer, an MP3 player, an iPhone, YouTube, etc.) and analyze the sound in some way. I then want to take the data from the audio analysis and map it to the RGB values of a bunch of LEDs, which I'll attach to a pair of headphones, thus dramatically increasing their awesomeness factor. If I somehow find it in me to unexpectedly ascend to god tier in programming skills, I'd ideally want to map the pitch of the audio to the color it outputs. As it stands, however, I can barely code my way out of a wet paper bag, so I'm probably going to map volume to color somehow, or something on the same level of complexity. Unless I can find code for literally exactly what I already want to do somewhere online, but that's probably not going to happen.

Anyway, my handwriting is nigh illegible, and I wrote all that down before I talked to anyone about what I'd actually need to execute the project, so the MP3 shield is not actually part of the plan; I'm going to use a headphone splitter to attach the Arduino to the audio device of my choice.
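As a starting point for the "map volume to color somehow" plan, here is a rough sketch using the Minim audio library that ships with Processing; it just colors the window according to the live input level. This is only an approximation of the idea (the real version would push the color out to LEDs via the Arduino), and the mapping numbers are arbitrary placeholders:

import ddf.minim.*;

Minim minim;
AudioInput in;

void setup() {
  size(400, 400);
  minim = new Minim(this);
  in = minim.getLineIn();            // audio coming in through the headphone splitter / line-in
}

void draw() {
  float level = in.mix.level();      // overall volume of the current buffer, roughly 0..1
  float warmth = constrain(map(level, 0, 0.5, 0, 255), 0, 255);  // louder = warmer (illustrative mapping)
  background(warmth, 60, 255 - warmth);
}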

HEYY it works now

So with thanks to Matt, this works now:

Arduino:

const int buttonPin = 2;     // the number of the pushbutton pin

int buttonState = 0;         // variable for reading the pushbutton status

void setup() 
{
  // initialize serial communications at a 9600 baud rate
  Serial.begin(9600);

  // initialize the pushbutton pin as an input:
  pinMode(buttonPin, INPUT); 
}

void loop(){
  // read the state of the pushbutton value:
  buttonState = digitalRead(buttonPin);
  Serial.print(buttonState);
  
  delay(10);
}

Processing:

////with help from Miranda; thank you muchly I am still bad at serial stuff

import com.temboo.core.*;
import com.temboo.Library.Twitter.Tweets.*;

import processing.serial.*;


Serial myPort;
int currentvalue;
int previousvalue;

// Create a session using your Temboo account application details
TembooSession session = new TembooSession("natroze", "myFirstApp", "05001793540547fab231aefa984084c0");

void setup() {
  String portName =Serial.list()[2];
  myPort =new Serial(this, portName,9600);
}

void draw(){
  previousvalue = currentvalue;
  if (myPort.available() > 0){    // only read when the Arduino has actually sent something
    currentvalue = myPort.read(); // store the most recent byte
    myPort.clear();
  }
  
  if (currentvalue == '0'){       // the Arduino prints the button state as text, so compare against the character '0'
    sendtweet();
    
  }
}

void sendtweet(){
  runStatusesUpdateChoreo();
}
  

void runStatusesUpdateChoreo() {
  // Create the Choreo object using your Temboo session
  StatusesUpdate statusesUpdateChoreo = new StatusesUpdate(session);
  
  //creds
  statusesUpdateChoreo.setCredential("twitterapp");

  // Set inputs
 
  statusesUpdateChoreo.setStatusUpdate("CEASE");


  // Run the Choreo and store the results
  StatusesUpdateResultSet statusesUpdateResults = statusesUpdateChoreo.run();
  

}

An object that tweets.

Well, for this assignment I really had a block. I didn't know what to tweet from my Arduino, so I... well... I set up a simple button which completes a circuit, telling my Arduino to send a serial message, which Processing reads and then uses to run my Temboo choreo. I updated my choreo to take any string and tweet it. I then created a list of words to explain my project. I apologize for the sorry nature of it, truly.

A messy image of a simple button.


Here’s a fritzing diagram.


My Twitter Feed.


My Arduino Code.

int digPort = 8;
int buttonPushed = 0;
int buttonDown = 0;
int count = 0;

void setup() {
  Serial.begin(9600);
  pinMode(digPort, INPUT);
}

void loop() {
    buttonPushed = digitalRead(digPort);
    
    if (buttonPushed == HIGH && buttonDown == 0) {
      Serial.println("A");
      Serial.println(count);
      buttonDown = 1;
      count += 1;
    } else if (buttonPushed == LOW && buttonDown == 1) {
      buttonDown = 0;
    }
    delay(100);

} 

My Processing Code.

import processing.serial.*;
import com.temboo.core.*;
import com.temboo.Library.Twitter.Tweets.*;

String not[] = {"nothing",
                "I have absolutely nothing",
                "to make"};
                
Serial myPort;

// Create a session using your Temboo account application details
TembooSession session = new TembooSession("luxapodular", "myFirstApp", "d731d22a81a14393a5be783117d2a8ce");

ArrayList serialChars;      // Temporary storage for received serial data
int whichValueToAccum = 0;  // Which piece of data am I currently collecting? 
boolean bJustBuilt = false; // Did I just finish collecting a datum?

int notIndex = 0;

void setup() {
  int nPorts = Serial.list().length; 
  for (int i=0; i < nPorts; i++) {
    println("Port " + i + ": " + Serial.list()[i]);
  }
  String portName = Serial.list()[0]; 
  myPort = new Serial(this, portName, 9600);
  serialChars = new ArrayList();
}

void draw() {
  doEt();
}

void doEt() {
   while (myPort.available () > 0) {
    char aChar = (char) myPort.read();
 
    if (aChar == 'A') {
      String thingToSay = not[notIndex];
      notIndex = (notIndex + 1) % 3;
      println(thingToSay);
      runStatusesUpdateChoreo(thingToSay);
    }
  }
}

void runStatusesUpdateChoreo(String string) {
  // Create the Choreo object using your Temboo session
  StatusesUpdate statusesUpdateChoreo = new StatusesUpdate(session);

  // Set inputs
  statusesUpdateChoreo.setAccessToken("2768964284-CkF5gMrUdhc07hXBJybfH06MVVvgUrgJfDuncfd");
  statusesUpdateChoreo.setAccessTokenSecret("5pm0mlgBuhdx2LVn5Gcj9ehldJtj8AhsI5jyffv25mphb");
  statusesUpdateChoreo.setConsumerSecret("yKVbJn9fnP2oArvBVhE94yiiyoj8FgpiqYy0NKZew3McpxACLe");
  statusesUpdateChoreo.setStatusUpdate(string);
  statusesUpdateChoreo.setConsumerKey("AlVDRqLH5OIL6XWrHyPi9Ikjf");

  // Run the Choreo and store the results
  StatusesUpdateResultSet statusesUpdateResults = statusesUpdateChoreo.run();
  
  // Print results
  println(statusesUpdateResults.getResponse());
}