## White is a color.

That’s right, folks – we’ve been lied to this entire time by our art professors and the like that white is not a color but a tone (or maybe it was just me). I recently received an email from Golan informing me that, despite how suitable it appeared for cutting, my laser cut design had failed. Because white is a color.

I am assuming what happened with my design was that, because I constructed the arcs with the ‘arc’ primitive and used strokeWeights / color manipulation to create the illusion of an outlined curve, the laser cutter misinterpreted the white strokes – which were meant to be holes – as filled-in shapes.
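In other words, the cutter needs real outline geometry rather than a white stroke painted over a fill: an outlined arc is really four cut paths – the inner arc, the outer arc, and two semicircular endcaps whose centers sit on the arc's average radius. Here is a minimal plain-Java sketch of that geometry, using the same numbers as Golan's reference code (the `ArcOutline` class and `endcapCenter` method names are my own):

```java
// Computes the endcap geometry for a laser-cuttable outlined arc:
// the endcap centers lie on the circle of average radius, at the arc's
// two end angles, and each endcap spans the gap between the two arcs.
public class ArcOutline {
    // center of an endcap: a point on the circle of average radius
    static double[] endcapCenter(double cx, double cy,
                                 double averageRadius, double angle) {
        return new double[] {
            cx + averageRadius * Math.cos(angle),
            cy + averageRadius * Math.sin(angle)
        };
    }

    public static void main(String[] args) {
        double innerDiam = 160, outerDiam = 200;
        // the stroke's centerline sits at the average radius
        double averageRadius = (innerDiam + outerDiam) / 4;  // avg diameter / 2 = 90
        // each endcap is a semicircle spanning the gap between the arcs
        double littleArcDiam = (outerDiam - innerDiam) / 2;

        double[] p1 = endcapCenter(300, 200, averageRadius, 0);
        System.out.println(p1[0] + "," + p1[1]);  // 390.0,200.0
        System.out.println(littleArcDiam);        // 20.0
    }
}
```

Cutting along those four paths leaves an actual hole, instead of a white shape that the cutter happily interprets as material to engrave or fill.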

My code will have to undergo some revisions, and the code for creating the arcs will not be as short and sweet as I hoped it would be. Thankfully, Golan gave me some reference code (below) to assist me in making the necessary changes. Regardless of the additional work that must be done to make the design compatible with the laser cutter, this is certainly a valuable learning experience that will be useful for my future laser cutting endeavors.

I find this incident to be funny, because while we can make judgments on how a program or machine behaves based on what we see visually, things can be interpreted in a completely different way. My high school computer science teacher once told us that our programs are only as smart as we are, but I think there are some cases where they can be just a little bit dumber.

I hope none of you guys ran into the same issue that I had. 😛

```
/*
Ticha
White is a color. Your lasercut failed.
Run this program to understand the solution.
Please write a blog post explaining why this is so.
*/

// the two angles that bound the arc
// (these were not defined in the snippet as posted; example values)
float angleA = PI * 0.25;
float angleB = PI * 1.25;

void setup() {
  size(450, 400);
}

void draw() {
  background(220, 255, 220);
  noFill();

  // Properties of the arc
  float arcCenterX = 300;
  float arcCenterY = 200;
  float startAngle = min(angleA, angleB);
  float endAngle   = max(angleA, angleB);
  float innerDiam  = 160;
  float outerDiam  = 200;

  float averageDiam   = (innerDiam + outerDiam) / 2;
  float averageRadius = averageDiam / 2;
  float littleArcDiam = (outerDiam - innerDiam) / 2;

  // draw the center points of the endcap arcs
  float p1x = arcCenterX + averageRadius * cos(startAngle);
  float p1y = arcCenterY + averageRadius * sin(startAngle);
  float p2x = arcCenterX + averageRadius * cos(endAngle);
  float p2y = arcCenterY + averageRadius * sin(endAngle);
  stroke(255, 120, 0, 120);
  ellipse(p1x, p1y, 3, 3);
  ellipse(p2x, p2y, 3, 3);

  // draw the spine, which is just for reference
  stroke(255, 120, 0, 120);
  arc(arcCenterX, arcCenterY, averageDiam, averageDiam, startAngle, endAngle);

  // draw the main arcs (inner and outer), in blue
  stroke(0, 0, 255);
  arc(arcCenterX, arcCenterY, innerDiam, innerDiam, startAngle, endAngle);
  arc(arcCenterX, arcCenterY, outerDiam, outerDiam, startAngle, endAngle);

  // draw the endcap arcs, in purple
  stroke(255, 0, 255);
  arc(p1x, p1y, littleArcDiam, littleArcDiam, startAngle - PI, startAngle);
  arc(p2x, p2y, littleArcDiam, littleArcDiam, endAngle, endAngle + PI);
}
```

## Ticha-Sound Repulsion (updated)

UPDATE: Hopefully the first one will lasercut now.

When I approached the project initially, I really wanted to create a cellular simulation of sorts – I got as far as creating hideous ‘blobs’ formed by attaching various Spring objects to a centroid particle… but they looked so terrible (and defied too much physics) that I decided to scrap the idea completely and try something fresh. I was interested in making another semi-interactive piece, and as I had recently discovered that Processing supported audio, I decided to scour the internet for Java libraries that supported audio input. Fortunately, there was an AudioInput class that was easy to understand and suitable for my purposes (kudos to CreativeCoding.org for introducing it to me).

Because I was creating a system that was affected by sound, I wanted to make a structure that visually cohered to the concept of sound, audio, or music. The final design ended up resembling a stylized vinyl record, which I am quite satisfied with. Additionally, when the arcs are propagated outwards they somewhat resemble radio waves, which is another interesting effect. But although I am satisfied with how it turned out visually, I have mixed feelings about its functionality. To maintain a certain degree of order, I initially intended to create springs connecting each arc to the centroid, so that after stretching out to a certain extent they would regain their initial positions. Perhaps that is an adjustment I will work on in the future.
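For the record, the shelved spring idea would amount to a damped Hooke's-law spring per arc: sound shoves the arc's radius outward, and the spring pulls it back toward its rest radius. A hypothetical plain-Java sketch of that behavior – the `ArcSpring` class, the constants, and the numbers are all illustrative, not from my actual sketch:

```java
// A damped Hooke's-law spring on an arc's radius: push() displaces it
// (e.g. on a loud audio frame), and update() relaxes it back to rest.
public class ArcSpring {
    double radius, restRadius, velocity;
    final double k = 0.1;        // spring stiffness (illustrative)
    final double damping = 0.8;  // keeps the arc from oscillating forever

    ArcSpring(double restRadius) {
        this.radius = restRadius;
        this.restRadius = restRadius;
    }

    void push(double amount) {   // e.g. driven by the audio input level
        velocity += amount;
    }

    void update() {              // call once per frame
        double force = -k * (radius - restRadius);  // restoring force
        velocity = (velocity + force) * damping;
        radius += velocity;
    }

    public static void main(String[] args) {
        ArcSpring arc = new ArcSpring(100);
        arc.push(20);                                // a loud moment
        for (int i = 0; i < 200; i++) arc.update();
        System.out.println(Math.round(arc.radius));  // prints 100 (back at rest)
    }
}
```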

The code is here, but the application itself must be downloaded to work properly. I should also add that the PDF recording option was adapted from Michelle’s clever recording trick, which comes in handy for ‘screenshotting’ Processing output windows. 😀

## Ticha-LookingOutwards-3

Pixelate by Sures Kumar

Well, I must say I have seen a fair share of games involving food and eating, but never have I seen one that involves eating food the way Pixelate by Sures Kumar does. Albeit simple in concept, Pixelate can be a surprisingly thought-provoking piece of work; in our current culture, dominated by cheap fast foods and deliciously unhealthy meals, we tend to be less mindful of eating a balanced diet. As the more wholesome foods tend to be costly and not as easily accessible as the non-wholesome ones, it is understandable that many individuals do not feel inclined to make healthier choices. Pixelate is a simple statement encouraging people to eat nutritious meals, but what I find most endearing about it is that, since it is a game, it makes the act of eating healthy foods feel rewarding. This characteristic is very much like the Wii Fit games.

The way the game uses pressure to detect which foods are being eaten is a particularly useful feature. However, this may be prone to error or limit the variety of foods. Additionally, the gameplay can get tedious, so the developers will have to add more features to make the game more engaging.

Listen Carefully by Jonas Breme

Listen Carefully is an interesting project by Jonas Breme that makes us reflect on how we experience music today. It speaks to me on a personal level in particular, as I listen to music on a regular basis – but only as a secondary task. I cannot recall the last time I truly focused on listening to a song without simultaneously doing something else.

This project also touches upon the concept of multitasking, its prevalence in our daily routine, and how it prevents us from giving a single activity our full attention. What is poetic about Listen Carefully is that it forces the user to pause and unwind, commanding their full attention – just so that they may experience a moment of slow listening in a fast-paced life.

A Beat of Your Heart by Jason Mingay

Jason Mingay’s A Beat of Your Heart uses an Arduino and a pulse sensor to evoke certain emotional reactions from the viewer as they watch a sequence of video clips. What interests me about A Beat of Your Heart is that, while conventional art passively induces emotional responses in the observer, this project almost forcefully injects reactions into the observer by provoking changes in heart rate. It can be seen as a complement to Listen Carefully: both aim to hold the user’s attention, but they evoke different emotional reactions.

To me, A Beat of Your Heart still feels more like a ‘detailed sketch’ than a final product. Regardless, I feel that this idea has a lot of potential, and I would like to see how it can be applied in different contexts.

## Ticha-Como se llama

Initially, I wanted to create a full-bodied, trippy-looking, fantastical creature whose motions reacted to the user’s facial expressions. I researched different character designs found on Pictoplasma and deviantART, but mostly drew inspiration from the Faun in Guillermo del Toro’s Pan’s Labyrinth. The Faun was a particularly interesting subject for me because not only was its design a well-crafted fusion of plant and animal, but also its movements and gestures strongly complemented its plantlike characteristics. My ‘final’ creature design, with its wooden ram horns, frog body, bushy hair, human hands, and venus flytrap for a tail, was inspired by the Faun and a bizarre legendary creature called the chimera.

Soon I realized that I was getting so entangled in the design and aesthetics that I did not give much thought to how the creature would move, or how it would interact with its environment. When I was unable to formulate an algorithm that would allow it to walk in a convincing manner, I decided to scrap my ram-frog-tree-bush-human-flytrap creature entirely and go with something simpler.

The ‘final’ final creature design is a slightly cuter, llama-esque creature (because as I was painting it on Tuesday, Jun noted that it resembled a llama). Its movements are controlled by the eyebrows, mouth, and head rotation – I wanted it to be affected by head tilt as well, but ran into technical difficulties. I am more or less satisfied with the result, and found the extra ladybug feature to be an effective way to get the user more engaged with the character. If I had more time, I would make the ladybug fly more convincingly and try to work out the technical issues I had with head tilting.

```
// a template for receiving face tracking osc messages from
// Kyle McDonald's FaceOSC https://github.com/kylemcdonald/ofxFaceTracker
//
// 2012 Dan Wilcox danomatika.com
// for the IACD Spring 2012 class at the CMU School of Art
//
// adapted from Greg Borenstein's 2011 example
// http://www.gregborenstein.com/
// https://gist.github.com/1603230
//
import oscP5.*;
OscP5 oscP5;

// num faces found
int found;

// pose
float poseScale;
PVector posePosition = new PVector();
PVector poseOrientation = new PVector();

// gesture
float mouthHeight;
float mouthWidth;
float eyeLeft;
float eyeRight;
float eyebrowLeft;
float eyebrowRight;
float jaw;
float nostrils;

// llama images
PImage face;
PImage body;
PImage lefteye;
PImage righteye;
PImage leftlid;
PImage rightlid;
PImage upperlip;
PImage llamajaw;

boolean moving = false;
boolean flying = true;

Ladybug ladybug;

void setup() {
  size(600, 525);

  oscP5 = new OscP5(this, 8338);
  oscP5.plug(this, "found", "/found");
  oscP5.plug(this, "poseScale", "/pose/scale");
  oscP5.plug(this, "posePosition", "/pose/position");
  oscP5.plug(this, "poseOrientation", "/pose/orientation");
  oscP5.plug(this, "mouthWidthReceived", "/gesture/mouth/width");
  oscP5.plug(this, "mouthHeightReceived", "/gesture/mouth/height");
  oscP5.plug(this, "eyeLeftReceived", "/gesture/eye/left");
  oscP5.plug(this, "eyeRightReceived", "/gesture/eye/right");
  oscP5.plug(this, "eyebrowLeftReceived", "/gesture/eyebrow/left");
  oscP5.plug(this, "eyebrowRightReceived", "/gesture/eyebrow/right");
  oscP5.plug(this, "jawReceived", "/gesture/jaw");
  oscP5.plug(this, "nostrilsReceived", "/gesture/nostrils");

  // filenames below are placeholders -- swap in the actual asset names
  face     = loadImage("face.png");
  body     = loadImage("body.png");
  lefteye  = loadImage("lefteye.png");
  righteye = loadImage("righteye.png");
  leftlid  = loadImage("leftlid.png");
  rightlid = loadImage("rightlid.png");
  upperlip = loadImage("upperlip.png");
  llamajaw = loadImage("llamajaw.png");

  ladybug = new Ladybug();

  noStroke();
  fill(0);
}

void draw() {
  background(255);

  // I tried to allow the character to tilt its head,
  // but due to high memory usage Processing just decided not to work. :/
  //translate(314, 514);
  //rotate(poseOrientation.z);

  if (found > 0) {
    //pushMatrix();
    //translate(-314, -514);

    image(lefteye, 0, 0);
    image(righteye, 0, 0);

    float eyeDir = map(poseOrientation.y, -0.16, 0.04, -5, 5);
    ellipse(238 + eyeDir, 337, 30, 30);
    ellipse(371 + eyeDir, 337, 30, 30);

    // eyebrow-driven blinking (reconstructed -- this section was garbled
    // in the original post): the lids slide with the eyebrow values
    float blinkleft  = map(eyebrowLeft, 7.3, 8.350, -5, 45);
    float blinkright = map(eyebrowRight, 7.3, 8.350, -5, 45);
    image(leftlid, 0, blinkleft);
    image(rightlid, 0, blinkright);

    image(face, 0, 0);
    image(body, 0, 0);

    float mouthopen = map(mouthHeight, 1.0, 7.0, 0, 30);
    moving = mouthopen > 4.1;

    image(llamajaw, 0, mouthopen);
    image(upperlip, 0, 0);

    // animate the ladybug (these calls are reconstructed)
    ladybug.fly();
    ladybug.display();

    //popMatrix();
  }
}

// OSC CALLBACK FUNCTIONS

public void found(int i) {
  println("found: " + i);
  found = i;
}

public void poseScale(float s) {
  println("scale: " + s);
  poseScale = s;
}

public void posePosition(float x, float y) {
  println("pose position\tX: " + x + " Y: " + y );
  posePosition.set(x, y, 0);
}

public void poseOrientation(float x, float y, float z) {
  println("pose orientation\tX: " + x + " Y: " + y + " Z: " + z);
  poseOrientation.set(x, y, z);
}

public void mouthWidthReceived(float w) {
  println("mouth Width: " + w);
  mouthWidth = w;
}

public void mouthHeightReceived(float h) {
  println("mouth height: " + h);
  mouthHeight = h;
}

public void eyeLeftReceived(float f) {
  println("eye left: " + f);
  eyeLeft = f;
}

public void eyeRightReceived(float f) {
  println("eye right: " + f);
  eyeRight = f;
}

public void eyebrowLeftReceived(float f) {
  println("eyebrow left: " + f);
  eyebrowLeft = f;
}

public void eyebrowRightReceived(float f) {
  println("eyebrow right: " + f);
  eyebrowRight = f;
}

public void jawReceived(float f) {
  println("jaw: " + f);
  jaw = f;
}

public void nostrilsReceived(float f) {
  println("nostrils: " + f);
  nostrils = f;
}

// all other OSC messages end up here
void oscEvent(OscMessage m) {
  if (m.isPlugged() == false) {
    println("UNPLUGGED: " + m);
  }
}

// the ladybug the llama can catch in its mouth
// (the class declaration was cut off in the original post, so the name is reconstructed)
class Ladybug {
  float posX;
  float posY;
  float spdX;
  float spdY;

  Ladybug() {
    posX = random(40, 400);
    posY = random(60, 400);
    spdX = random(-2, 2);
    spdY = random(-2, 2);
  }

  void fly() {
    // land when it reaches the mouth region and the mouth is closed
    if (!moving && 264 <= posX && posX <= 303
        && 300 <= posY && posY <= 342) {
      flying = false;
      spdX = 0;
      spdY = 0;
    }
    else {
      if (spdX == 0) {
        //it'll move away from the mouth
        spdX = random(-2, 2);
        spdY = random(-3, 0);
      }

      posX += spdX;
      posY += spdY;
      flying = true;

      // bounce off the edges of the window
      if (posX < 40 || posX > 400) {
        spdX *= -1;
      }
      if (posY < 60 || posY > 400) {
        spdY *= -1;
      }
    }
  }

  void display() {
    if (flying) {
      // draw the ladybug in flight
    }
    else {
      // draw the ladybug at rest
    }
  }
}
```

## Ticha-Creature designs

Sorry for the photo spam – I just wanted to create a separate post for documenting the initial designs for my FaceOSC creature/character. While these aren’t exactly faces, they can still be viewed as ‘avatars’ whose movements correspond to the user’s facial expressions.

A disgruntled llama:

## Ticha-Organclock

While the human body itself has made a number of useful contributions to humanity (e.g. procreation), it has never been regarded as a tool for common, day-to-day purposes (e.g. telling time). Although for many people ‘tool’ is not the most desirable word to use in reference to the human body, it could be argued that it is not justified to be so offended by the notion while readily accepting the capitalization on other organisms’ bodies to make utilitarian and/or decorative objects. Bearskin rugs, cowhide shoes, and alligator leather watches are some examples of this.

Nevertheless, this clock is not meant to be an animal rights statement – but simply a means to offer a non-conventional perspective on the function of a human body. Additionally, it provides the juxtapositional image of using something as volatile and unpredictable as a body to construct a device that requires utmost precision.

I believe that this simple clock communicates my idea to some extent – but more attention should be given to the design/functionality for it to be even more effective. I do plan to possibly tweak the clock further to add more useful features (such as making the stomach glow during mealtimes) and do some touch-ups on the drawings.

EDIT: Shortened the code so that it’s not as repetitive (thanks for the suggestions Dave!) and played with sine waves to make the heart beat less regularly after talking to Golan. It’s still not behaving the way I would like it to, so if I were to make any future revisions I would definitely try to find some reliable equations that model heart beats. It would also be interesting to extend the project further by creating a ‘human clock’ of sorts, as a means to more strongly represent the biorhythmic nature of our bodies.

(A side note: It appears that the screen flashes white occasionally for some reason – not completely sure why. Also, part of the image will be cut off unless you zoom in.)

```
PImage heart;
PImage lungs;
PImage bg;

ArrayList oxyleft;
ArrayList oxyright;

void setup() {
  size(692, 804);

  // image filenames are placeholders -- swap in the actual asset names
  heart = loadImage("heart.png");
  lungs = loadImage("lungs.png");
  bg    = loadImage("bg.png");

  oxyleft = new ArrayList();
  oxyright = new ArrayList();

  // one molecule per possible hour / minute
  // (this step was missing from the snippet as posted)
  for (int i = 0; i < 24; i++) {
    oxyleft.add(new Oxy(56, 288, 171, 407));
  }
  for (int i = 0; i < 60; i++) {
    oxyright.add(new Oxy(424, 630, 160, 430));
  }

  smooth();
}

void draw() {
  background(0);
  image(bg, 0, 0);

  //beats once per second
  float x = millis() / 1000.0;
  float c01 = cos(2 * x);
  float c02 = cos(1 + 5 * x);
  float c03 = 1 + ((c01 + c02) / 6);

  float heartPulse = pow(c03, 5.0);

  float heartH = map(heartPulse, 0, 3, 320, 345);

  image(heart, 200, 130, 279, heartH);

  // one molecule in the left lung per hour
  for (int i = 0; i < hour(); i++) {
    Oxy l = (Oxy) oxyleft.get(i);
    l.run(288, 56, 407, 171);
    l.display();
  }

  // one molecule in the right lung per minute
  for (int i = 0; i < minute(); i++) {
    Oxy l = (Oxy) oxyright.get(i);
    l.run(630, 424, 430, 160);
    l.display();
  }

  image(lungs, 0, 0);
}

//oxygen molecules
class Oxy {
  float posX;
  float posY;
  float xspeed;
  float yspeed;

  Oxy(float loX, float hiX, float loY, float hiY) {
    posX = random(loX, hiX);
    posY = random(loY, hiY);
    xspeed = random(-1, 1);
    yspeed = random(-2, 0);
  }

  void run(float hiX, float loX, float hiY, float loY) {
    posX = posX + xspeed;
    if (hiX < posX || posX < loX) {
      xspeed *= -1;
    }

    posY = posY + yspeed;
    if (hiY < posY || posY < loY) {
      yspeed *= -1;
    }
  }

  void display() {
    noStroke();
    fill(#b10c0c, 220);
    ellipse(posX, posY, 5, 5);
  }
}
```

So, after talking to Golan (thanks!), I decided to have another go at the GIF assignment. I admit that animating in Processing was initially a difficult concept for me to grasp because I was so accustomed to highly graphical animation software – more specifically, After Effects – in which ‘clicking and dragging’ is a sufficient tactic for creating movement (well, there is some programming involved, but only to a minimal extent). This second attempt at the animation was both a large step outside of my comfort zone and a significant mental leap for me – and I found it to be a very rewarding and enjoyable experience.

I stumbled across some code on OpenProcessing.org that beautifully simulated moving particles and used a bit of it to make these animations. The code was heavily tweaked, though – the original used vectors in 3-dimensional space and did some weird isometric formula conversions (???) that I barely understood. So I simplified it so that it would both be easier for me to understand and behave the way I wanted it to.

There is a quote by Virgil that goes, ‘sic itur ad astra’, which translates to ‘thus you shall go to the stars’. I hope these simple, yet strangely soothing GIFs will make people feel a little closer to them. (:

Here is the source code for the third animation, as it is my favorite one. The code for the other two is very similar, except that they use different trigonometric functions.

```
/* Credit to http://www.openprocessing.org/sketch/5582 for amazing particle reference
Credit to Golan Levin for the recording option
*/

int nFramesInLoop = 160;
int nElapsedFrames;
int particleCount = 1000;
Particle[] particles = new Particle[particleCount];
boolean bRecording;

void setup() {
  size(500, 500);
  noStroke();

  //initializing particles
  for (int i = 0; i < particleCount; i++) {
    particles[i] = new Particle();
  }
}

void keyPressed() {
  bRecording = true;
  nElapsedFrames = 0;
}

void draw() {
  // Compute a percentage (0...1) representing where we are in the loop.
  float percentCompleteFraction = 0;
  if (bRecording) {
    percentCompleteFraction = (float) nElapsedFrames / (float) nFramesInLoop;
  }
  else {
    percentCompleteFraction = (float) (frameCount % nFramesInLoop) / (float) nFramesInLoop;
  }

  // Render the design, based on that percentage.
  renderMyDesign(percentCompleteFraction);

  // If we're recording the output, save the frame to a file.
  if (bRecording) {
    saveFrame("output/myname-loop-" + nf(nElapsedFrames, 4) + ".png");
    nElapsedFrames++;
    if (nElapsedFrames >= nFramesInLoop) {
      bRecording = false;
    }
  }
}

void renderMyDesign(float percent) {
  translate(width/2, height/2);

  background(10);

  /* Loop through the particles */
  for (int i = 0; i < particleCount; i++) {
    particles[i].display();
  }
}

class Particle {
  float angle;
  float dec;
  float rad;  // orbit radius; this field was missing from the snippet as posted

  int size;

  //particle constructor
  Particle() {
    angle = random(-10, 10);
    rad   = random(10, 140);  // assumed range
    dec   = (150 - rad) * 0.00004;

    size = (int) random(2, 4);
  }

  void display() {
    fill((int) random(80, 200)); //twinkling
    ellipse(rad * sin(angle), 180 * cos(angle), size, size);

    fill((int) random(150, 255)); //twinkling
    ellipse(rad * cos(angle), 180 * sin(angle), size, size);

    angle += dec; //direction of particle movement
  }
}
```

I also made this thing for giggles:

## Ticha-Bots and Bitrain

Sketch:

Result:

Coding this was awkward. I never imagined that drawing without a stylus and mouse would be so tedious and time-consuming. Thankfully, I was able to make most of the measurements and find the pixel locations using my Photoshop file and the surprisingly handy Ruler Tool. As far as the animation goes, I am generally more satisfied with the final composition than with the initial draft. The use of binary bits instead of conventional rain makes for a more interesting composition, and the ‘bit splashes’ in the foreground are a nice touch. Additionally, the too-small umbrella gives the animation a more comical feel. The gif itself is actually a little slower than the real one; I tried removing some frames manually but decided not to mess with it too much to avoid losing the flow of the animation (here’s what it’s supposed to look like). I think I somewhat succeeded in giving the animation some dimensionality despite being limited to simpler tools rather than the special effects available in typical animation programs: I attempted to make the robot look round by adding some simple shading, and tried to give the bitrain depth by making its color range from medium gray to white.

Still, I wish that *more* animation could be involved and that the piece could be more compositionally interesting. I added a ‘lightning’ feature briefly but decided to remove it when I felt like I was ready to have a seizure. Perhaps what it needs the most is a more interesting environment, as gray is not very exciting to look at. However, I would also have to learn how to strike a balance between minimalism and detail – as the best animations / works of art are able to use both to strengthen the overall piece.

Here is the code, which is probably longer than it should be (I have no idea why WordPress adds more lines at the bottom):

```
/* Thanks Golan for the reference code!
Credit to LearningProcessing for oscillation reference.

*/

int     nFramesInLoop = 160;
int     nElapsedFrames;
boolean bRecording;
float theta = 0;
float theta2 = 0;
float x;

//===================================================
void setup() {
size(500, 500);

bRecording = false;
nElapsedFrames = 0;

x = width/2-100;
}
//===================================================
void keyPressed() {
bRecording = true;
nElapsedFrames = 0;
}

//===================================================
void draw() {

// Compute a percentage (0...1) representing where we are in the loop.
float percentCompleteFraction = 0;
if (bRecording) {
percentCompleteFraction = (float) nElapsedFrames / (float)nFramesInLoop;
}
else {
percentCompleteFraction = (float) (frameCount % nFramesInLoop) / (float)nFramesInLoop;
}

// Render the design, based on that percentage.
renderMyDesign (percentCompleteFraction);

// If we're recording the output, save the frame to a file.
if (bRecording) {
saveFrame("output/myname-loop-" + nf(nElapsedFrames, 4) + ".png");
nElapsedFrames++;
if (nElapsedFrames >= nFramesInLoop) {
bRecording = false;
}
}
}

void renderMyDesign(float percent) {
background(#464646);
smooth();
noStroke();
strokeWeight(2);
//----------------------
// Here, I assign some handy variables.
float cx = 100;
float cy = 100;

robotLegs();
robotBody();
robotHead();
robotArms();

textSize(40);
fill(255);
text("Error 404;;", x, 100);

textSize((int)random(10, 22));
pushMatrix();
translate(-75, 30);

//rain effect
for (int i = 0; i < 80; i++) {
int w = int(random(width+40));
int h = int(random(height+40));
fill(random(70, 255));

int rand = (int)random(2);
String bit = (rand == 0)? "0" : "1";

text(bit, w, h);
}

popMatrix();

pushMatrix();
translate(-60, 480);

textSize(18);
//rain effect
for (int i = 0; i < 100; i++) {
int w = int(random(width+40));
int h = int(random(height/8));
fill(random(70, 255));

int rand = (int)random(2);
String bit = (rand == 0)? "0" : "1";

text(bit, w, h);
}
popMatrix();
/*
// If we're recording, I include some visual feedback.
if (bRecording) {
fill (255, 0, 0);
textAlign (CENTER);
String percentDisplayString = nf(percent, 1, 3);
text (percentDisplayString, cx, cy-15);
}*/
}

//the shaking head (this function signature was missing from the post)
void robotHead() {
float a = map(sin(theta2), -1, 1, -5, 5); //shaking
theta2 += 0.6; //lower values slow down the movement

fill(#e24b00);
ellipse(257 + a, 250, 124, 84);
fill(#f25d13);
ellipse(257 + a, 243, 115, 70);
fill(#f67535);
ellipse(257 + a, 235, 110, 50);
//shiny
pushMatrix();
translate(270 + a, 220);
fill(#ffb793);
ellipse(0, 0, 30, 10);
popMatrix();

//left eyebrow
stroke(#1c1c1c);
strokeWeight(2.6);
line(223 + a, 245, 223, 238);
noStroke();
fill(#1c1c1c);
pushMatrix();
translate(213 + a, 236);
rect(0, 0, 20, 6);
popMatrix();

//left eye
fill(#1c1c1c);
ellipse(224 + a, 255, 20, 20);
fill(#ebcb88);
stroke(50);
strokeWeight(2);
ellipse(224 + a, 255, 13, 13);
noStroke();

//right eyebrow
stroke(#1c1c1c);
strokeWeight(2.6);
line(283 + a, 235, 283, 248);
noStroke();
fill(#1c1c1c);
pushMatrix();
translate(273 + a, 230);
rect(0, 0, 25, 6);
popMatrix();

//right eye
fill(#1c1c1c);
ellipse(284 + a, 255, 25, 25);
fill(#ebcb88);
stroke(50);
strokeWeight(2.6);
ellipse(284 + a, 255, 16, 16);
noStroke();

//left ear
fill(#1c1c1c);
rect(186 + a, 230, 15, 44, 7);
ellipse(185 + a, 252, 10, 30);
stroke(#1c1c1c);
strokeWeight(3);
line(181 + a, 253, 165 + a, 253);
line(165 + a, 253, 160 + a, 247);
line(160 + a, 247, 155 + a, 253);
line(155 + a, 253, 152 + a, 253);

//right ear
rect(310 + a, 230, 15, 44, 7);
ellipse(326 + a, 252, 10, 30);
stroke(#1c1c1c);
strokeWeight(3);
line(323 + a, 253, 343 + a, 253);
line(343 + a, 253, 348 + a, 247);
line(348 + a, 247, 353 + a, 253);
line(353 + a, 253, 360 + a, 247);

//line across middle
noFill();
stroke(#9e3b0a);
strokeWeight(1.5);
line(257 + a, 209, 257 + a, 291);

//bolts
int b=0;
for (int i = 1; i <= 5; i++) {
ellipse(250 + a, 218+b, 4, 4);
b+=12;
}
noStroke();
}

void robotBody() {
//body
fill(#e24b00);
ellipse(241, 308, 114, 26);

//neck
fill(#1c1c1c);
rect(230, 275, 26, 37, 10);
}

void robotLegs() {
//left leg
fill(#1c1c1c);
pushMatrix();
translate(212, 420);
ellipse(0, 0, 35, 20);
popMatrix();

rect(205, 418, 15, 90);

//left foot

//right leg
pushMatrix();
translate(280, 410);
ellipse(0, 0, 35, 20);
popMatrix();

rect(273, 418, 15, 90);

//right foot
}

void robotArms() {
float a = map(sin(theta), -1, 1, 230, 240);
float b = map(sin(theta), -1, 1, 220, 230);
float c = map(sin(theta), -1, 1, 225, 235);
float d = map(sin(theta), -1, 1, 230, 240);
theta += 0.03; //lower values slow down the movement

//umbrella
fill(#0076b3);
rect(c, 160, 8, 200, 3);

//left arm
fill(#1c1c1c);
ellipse(200, 333, 18, 23);

fill(#1c1c1c);
rect(195, 333, 10, 60, 5);

stroke(0);
strokeWeight(10);
line(200, 390, a, 330);
noStroke();

//left hand
fill(0);
rect(b, 320, 20, 25, 3);
}
```

## Ticha-LookingOutwards-2

Silk by Yuri Vishnevsky

I was looking through some generative artwork for inspiration the other day and came across this beautiful piece called Silk. As the name suggests, the user selects a color and uses the mouse to ‘weave’ the silklike neon threads. The user also has the option to choose the degree of rotational symmetry, whether they would like the figure to be mirrored across the center, and whether they would like the figure to spiral towards the center.

While Silk is not anything revolutionary, it certainly holds a lot of aesthetic appeal, and its use of musical ambience makes it both a visual and auditory experience. However, what I find most appealing about the work is not the visuals or the sound, but the way users can share their creations. In more conventional ‘art apps’, sharing results simply involves saving the image and uploading it to some social network site. In Silk, people can actually view the process involved in making the images – giving it an almost livestream-like quality. I only wish that the playback of the user-created images did not run so quickly, as I feel that this detracts from the meditative aspect of the program.

Content is Queen by Sergio Albiac

‘Content is Queen’ is an interesting mosaic of internet videos that combine to form a unique portrait of the Queen. I believe the charm of this project comes from the fact that, instead of using images to create just a picture of someone’s face, Albiac uses moving images to create a dynamic form of portraiture.

One aspect I particularly like about this project is that each of the generated portraits has some semblance of the Queen’s face, but enough randomness and ‘artistic liberties’ for the viewer to see the content of the videos. There are some images that show the Queen’s face in near-perfect clarity, and others that just barely resemble a face and are dominated by the videos. This, I think, epitomizes the tension between monarchy and democracy – which is Albiac’s intent. The piece would be even more effective if he placed it in a certain location (like a place with many traditional portraits of the Queen) to further enhance its effect.

(on a side note – I also enjoyed Albiac’s Videorative Portrait of Randall Okita)

Fine Collection of Curious Sound Objects by Georg Reil

Found this little gem in the Exhibition section of Processing’s site. The project is very true to its name; the sounds produced by the objects are by no means conventional, and each gives its object a unique history. Reil effectively creates a narrative for each of these commonplace items by giving them distinct – and unexpected – ways of interacting with the user. In doing so, he challenges the belief that objects are purely utilitarian tools and entices the viewer to rediscover each object’s function. While I think this work is already very poetic as it is, I only wish his descriptions of the items were even more cryptic – so as to allow the viewer to construct their own interpretation of an item’s history.

All aspects considered, this is quite an inspirational piece that makes me want to experiment with sound objects in the future.