Keali-Last Project

Staying loyal to my usual aesthetic and everlasting motivation of making a beautiful virtual environment… I’d like to thank Golan for putting up with the constant general theme, though I hope some of my other projects were different and experimental enough as well (book, data viz, mocap, etc.). For starters, it was unfortunate that I didn’t have the confidence to pick up and learn OpenFrameworks within the timeframe of this project; it is definitely something I want to learn and utilize in the future, especially since this project really hammered into me what the limits of Processing are. With an interactive environment carrying this many assets, Processing inevitably starts running slowly; that limited the interactivity and aesthetics I could include, and I had to prioritize what to keep without sacrificing a sense of completion in the overall product.

As such I aimed for something calm (my own bias) and serene… an environment offering subtle, endearing movement and potential for interaction. I focused on setting up the environment as a whole, much like a stage set, rather than going for sheer quantity of assets: I wanted a well-rendered, atmospherically polished environment rather than a more lackluster one with more features. The whole setup was contingent on object-oriented programming, something I had barely done in previous projects this semester, but it was crucial here because every characteristic of the setting is its own object class: waves, landscapes, fireflies/particles, rain, stars, and branches and leaves (aggregated into trees), each carefully rendered in the right order so they stack into the final environment. I was initially inspired to include more air-related features from some beautiful OpenProcessing samples I wanted to customize, but the noise used to render the smoke and clouds slowed my entire program down to the point that movement was hardly visible, so I had to abandon that. Weighing the idealistic assets I had in mind against how increasingly sluggish the program ran as the code grew, I made the decision to ship a workable, seamless product that still stayed aesthetic. I am incredibly happy with the resulting soft and subtle features, with the highlight movements from the stars, waves, and fireflies, and the interactions of the light particles following the user’s cursor and the vector forces applied on clicking trees, which make branches sway and leaves scatter and fall. It is definitely not a game, or an interactive interface with a goal per se; the whole point was to imagine a user mindlessly enjoying the program, as if it were a virtual art piece.

In contrast to how I had to present it on the final Friday of class, I really think the program benefits from being run alone, in a dark room, with headphones. That is how I believe the piece should be seen and felt; funnily enough, a dark room and headphones are exactly the environment I mostly coded it in, working through many nights. With this setup I feel the user would become more fully immersed in the cathartic and serene nature of Nocturne.

This project is definitely something I want to continue developing in my free time, or if given the chance in the future. That would probably also mean porting and enhancing it in OpenFrameworks, because I think I’m already hitting the limit of what Processing will run and handle smoothly. Next I intend to add interactions with new objects such as plant growth and animals, and perhaps some more precipitation and particle options.

GitHub repository//

/*
REFERENCES:
https://processing.org/examples/sinewave.html
https://processing.org/examples/multipleparticlesystems.html
https://processing.org/examples/simpleparticlesystem.html
*/


import processing.sound.*;

int xspacing = 16;   // How far apart should each horizontal location be spaced
int w;              // Width of entire wave

float theta = 0.0;  // Start angle at 0
//float amplitude = 75.0;  // Height of wave --> moved as parameter
float period = 500.0;  // How many pixels before the wave repeats
float dx;  // Value for incrementing X, a function of period and xspacing
float[] yvalues;  // Using an array to store height values for the wave

SoundFile file;

int Y_AXIS = 1;
color c1, c2;

color cloudFill, fade, far, near, mist;

int rainNum = 80;
Rain[] drops = new Rain[rainNum];

ArrayList<Tree> trees = new ArrayList<Tree>();

void setup() {
  size(1500, 700);
  smooth();
  
  file = new SoundFile(this, "khiitest.wav");
  //file = new SoundFile(this, "khii.mp3"); // testing audio loop?? 
  file.loop();
  
  c1 = color(17, 24, 51);
  c2 = color(24, 55, 112);

  // some setups aborted
  fade = color(64, 85, 128);

  w = width+16;
  dx = (TWO_PI / period) * xspacing;
  yvalues = new float[w/xspacing];

  //for (int i = 0; i < particleCount; i++) {
  //  sparks[i] = new Particle(176, 203, 235);

  for (int i = 0; i < smallStarList.length; i++) {
    smallStarList[i] = new smallStar();
  }
  
  for (int i = 0; i < bigStarList.length; i++) {
    bigStarList[i] = new bigStar();
  }
  
  for (int i = 0; i < fireflyList.length; i++) {
    fireflyList[i] = new firefly();
  }
  
  trees.add(new Tree(600,0));
  trees.add(new Tree(-500,0));
  trees.add(new Tree(300,0));
  trees.add(new Tree(50,0));
  trees.add(new Tree(400,0));
  for (int i = 0; i < rainNum; i++) {
    drops[i] = new Rain();
  }
  
  ps = new ParticleSystem(new PVector(400,600)); // buffer default loc
}

smallStar[] smallStarList = new smallStar[110];
bigStar[] bigStarList = new bigStar[50];
firefly[] fireflyList = new firefly[70];
float gMove = map(.15,0,.3,0,30);
ParticleSystem ps;


void draw() {
  background(0);
  setGradient(0, 0, width, height, c1, c2, Y_AXIS);

  makeFade(fade);
  //clouds(cloudFill); //cloud reference from https://www.openprocessing.org/sketch/179401

  for (int i = 0; i < smallStarList.length; i++) {
    smallStarList[i].display();
  }
  
  for (int i = 0; i < bigStarList.length; i++) {
    bigStarList[i].display();
  }

  drawMountains();
  
  ps.addParticle();
  ps.run();
  for (Tree tree : trees) {
    tree.display(); 
  }
  
  anotherNoiseWave();

  calcWave(30.0);
  renderWave();
  
  for (int i = 0; i < fireflyList.length; i++) {
    fireflyList[i].update();
    fireflyList[i].display();
  }
  
  ps.setOrigin(new PVector(mouseX,mouseY)); 
  
  //if (raining) {  for temp rain no-respawn fix 
    for (int i = 0; i < rainNum; i++) {
      drops[i].update();
    }
  //}
}

void makeFade(color fade) {
  for (int i = 0; i < height/3; i++) {
    float a = map(i,0,height/3,360,0);
    strokeWeight(1);
    stroke(fade,a);
    line(0,i,width,i);
  }
}

class ParticleSystem {
  ArrayList<Particle> particles;
  PVector origin;
  ParticleSystem(PVector location) {
    origin = location.copy();
    particles = new ArrayList<Particle>();
  }
  
  void addParticle() {
    particles.add(new Particle(origin));
  }
  
  void setOrigin(PVector origin) {
    this.origin = origin; 
  }
  
  void run() { 
    for (int i = particles.size()-1; i >= 0; i--) {
      Particle p = particles.get(i);
      p.run();
      if (p.isDead()) {
        particles.remove(i);
      }
    }
  }
}

class Particle {
  PVector location;
  PVector velocity;
  PVector acceleration;
  float lifespan;

  Particle(PVector l) {
    acceleration = new PVector(0,0.05);
    velocity = new PVector(random(-1,1),random(-2,0));
    location = l.copy();
    lifespan = 255.0;
  }

  void run() {
    update();
    display();
  }

  // update location 
  void update() {
    velocity.add(acceleration);
    location.add(velocity);
    lifespan -= 10.0;
  }

  // display particles
  void display() {
    noStroke();
    //fill(216,226,237,lifespan-15);
    //ellipse(location.x,location.y,3,3);
    fill(237,240,255,lifespan);
    //ellipse(location.x,location.y,5,5);
    float w = random(3,9);
    ellipse(location.x,location.y,w,w);
  }
  
  // "irrelevant" particle
  boolean isDead() {
    if (lifespan < 0.0) {
      return true;
    } else {
      return false;
    }
  }
}

class Tree {
  ArrayList<Branch> branches = new ArrayList<Branch>();
  ArrayList<Leaf> leaves = new ArrayList<Leaf>();
  int maxLevel = 8;
  Tree(float x, float y) {
    float rootLength = random(80.0, 150.0);
    branches.add(new Branch(this,x+width/2, y+height, x+width/2, y+height-rootLength, 0, null));
    subDivide(branches.get(0));
  }
  
  void display() {
    for (int i = 0; i < branches.size(); i++) {
      Branch branch = branches.get(i);
      branch.move();
      branch.display();
    }
    
    for (int i = leaves.size()-1; i > -1; i--) {
      Leaf leaf = leaves.get(i);
      leaf.move();
      leaf.display();
      leaf.destroyIfOutBounds();
    } 
  }

  void mousePress(PVector source) {
    float branchDistThreshold = 300*300;
    
    for (Branch branch : branches) {
      float distance = distSquared(mouseX, mouseY, branch.end.x, branch.end.y);
      if (distance > branchDistThreshold) {
        continue;
      }
      
      PVector explosion = new PVector(branch.end.x, branch.end.y);
      explosion.sub(source);
      explosion.normalize();
      float mult = map(distance, 0, branchDistThreshold, 10.0, 1.0); 
      explosion.mult(mult);
      branch.applyForce(explosion);
    }
    
    float leafDistThreshold = 50*50;
    
    for (Leaf leaf : leaves) {
      float distance = distSquared(mouseX, mouseY, leaf.pos.x, leaf.pos.y);
      if (distance > leafDistThreshold) {
        continue;
      }
      
      PVector explosion = new PVector(leaf.pos.x, leaf.pos.y);
      explosion.sub(source);
      explosion.normalize();
      float mult = map(distance, 0, leafDistThreshold, 2.0, 0.1);
      mult *= random(0.8, 1.2); // variation
      explosion.mult(mult);
      leaf.applyForce(explosion);
      
      leaf.dynamic = true;
    }
  }

  void subDivide(Branch branch) {
    ArrayList<Branch> newBranches = new ArrayList<Branch>();

    int newBranchCount = (int)random(1, 4);

    float minLength = 0.7;
    float maxLength = 0.85;

    switch(newBranchCount) {
      case 2:
        newBranches.add(branch.newBranch(random(-45.0, -10.0), random(minLength, maxLength)));
        newBranches.add(branch.newBranch(random(10.0, 45.0), random(minLength, maxLength)));
        break;
      case 3:
        newBranches.add(branch.newBranch(random(-45.0, -15.0), random(minLength, maxLength)));
        newBranches.add(branch.newBranch(random(-10.0, 10.0), random(minLength, maxLength)));
        newBranches.add(branch.newBranch(random(15.0, 45.0), random(minLength, maxLength)));
        break;
      default:
        newBranches.add(branch.newBranch(random(-45.0, 45.0), random(minLength, maxLength)));
        break;
    }

    for (Branch newBranch : newBranches) {
      this.branches.add(newBranch);

      if (newBranch.level < this.maxLevel) {
        subDivide(newBranch);
      } else {
        // generate random leaf positions on the last branch
        float offset = 5.0;
        for (int i = 0; i < 5; i++) {
          this.leaves.add(new Leaf(this, newBranch.end.x+random(-offset, offset), 
            newBranch.end.y+random(-offset, offset), newBranch));
        }
      }
    }
  }
}

class Leaf {
  PVector pos;
  PVector velocity = new PVector(0,0);
  PVector acc = new PVector(0,0);
  float dia;
  float a;
  float r;
  float g;
  PVector offset;
  boolean dynamic = false;
  Branch parent;
  Tree tree;
  Leaf(Tree tree, float x, float y, Branch parent) {
    this.pos = new PVector(x,y);
    this.dia = random(2,11);
    this.a = random(50,150);
    this.parent = parent;
    this.offset = new PVector(parent.restPos.x-this.pos.x, parent.restPos.y-this.pos.y);
     this.tree = tree;
    if (tree.leaves.size() % 5 == 0) {
      this.r = 232;
      this.g = 250;
    } else {
      this.r = 227;
      this.g = random(230,255);
    }
  }
  
  void display() {
    pushMatrix();
    noStroke();
    fill(this.r, g, 250, this.a);
    ellipse(this.pos.x,this.pos.y,this.dia,this.dia);
    popMatrix();
  }
  
  void bounds() {
    if (!this.dynamic) { return; }
  }
  
  void applyForce(PVector force) {
    this.acc.add(force);
  }
  
  void move() {
    if (this.dynamic) {
      // Sim leaf
      
      PVector gravity = new PVector(0, 0.025);
      this.applyForce(gravity);
      
      this.velocity.add(this.acc);
      this.pos.add(this.velocity);
      this.acc.mult(0);
      
      this.bounds();
    } else {
      // follow branch
      this.pos.x = this.parent.end.x+this.offset.x;
      this.pos.y = this.parent.end.y+this.offset.y;
    }
  } 
  
  void destroyIfOutBounds() {
    if (this.dynamic) {
      if (this.pos.x < 0 || this.pos.x > width || this.pos.y < 0 || this.pos.y > height) {
        tree.leaves.remove(this);
      }
    }
  }
}


class Branch {
  PVector start;
  PVector end;
  PVector vel = new PVector(0, 0);
  PVector acc = new PVector(0, 0);
  int level;
  Branch parent = null;
  PVector restPos;
  float restLength;
  Tree tree;

  Branch(Tree tree, float x1, float y1, float x2, float y2, int level, Branch parent) {
    this.start = new PVector(x1, y1);
    this.end = new PVector(x2, y2);
    this.level = level;
    this.restLength = dist(x1, y1, x2, y2);
    this.restPos = new PVector(x2, y2);
    this.parent = parent;
    this.tree = tree;
  }

  void display() {
    pushMatrix();
    stroke(159, 200, 195+this.level*5);
    strokeWeight(tree.maxLevel-this.level+1);
    
    if (this.parent != null) {
      line(this.parent.end.x, this.parent.end.y, this.end.x, this.end.y);
    } else {
      line(this.start.x, this.start.y, this.end.x, this.end.y);
    }
    popMatrix();
  }

  Branch newBranch(float angle, float mult) {
    // calculate new branch's direction and length
    PVector direction = new PVector(this.end.x, this.end.y);
    direction.sub(this.start);
    float branchLength = direction.mag();

    float worldAngle = degrees(atan2(direction.x, direction.y))+angle;
    direction.x = sin(radians(worldAngle));
    direction.y = cos(radians(worldAngle));
    direction.normalize();
    direction.mult(branchLength*mult);
    
    PVector newEnd = new PVector(this.end.x, this.end.y);
    newEnd.add(direction);

    return new Branch(tree, this.end.x, this.end.y, newEnd.x, newEnd.y, this.level+1, this);
  }
  
  // branch bouncing 
  void applyForce(PVector force) {
    PVector forceCopy = force.copy();
    
    // smaller branches will be more bouncy
    float divValue = map(this.level, 0, tree.maxLevel, 8.0, 2.0);
    forceCopy.div(divValue);
    
    this.acc.add(forceCopy);
  }
  
  void sim() {
    PVector airDrag = new PVector(this.vel.x, this.vel.y);
    float dragMagnitude = airDrag.mag();
    airDrag.normalize();
    airDrag.mult(-1);
    airDrag.mult(0.025*dragMagnitude*dragMagnitude); // java mode
    this.applyForce(airDrag);
    
    PVector spring = new PVector(this.end.x, this.end.y);
    spring.sub(this.restPos);
    float stretchedLength = dist(this.restPos.x, this.restPos.y, this.end.x, this.end.y);
    spring.normalize();
    float elasticMult = map(this.level, 0, tree.maxLevel, 0.05, 0.1); // java mode
    spring.mult(-elasticMult*stretchedLength);
    this.applyForce(spring);
  }
  
  void move() {
    this.sim();
    
    this.vel.mult(0.95);
    
    // kill velocity below this threshold to reduce jittering
    if (this.vel.mag() < 0.05) {
      this.vel.mult(0);
    }
    
    this.vel.add(this.acc);
    this.end.add(this.vel);
    this.acc.mult(0);    
  }
}

float distSquared(float x1, float y1, float x2, float y2) {
  return (x2-x1)*(x2-x1) + (y2-y1)*(y2-y1);
}
  
class smallStar {
  color c;
  float x;
  float y;
  float a;
  float h;
  float w;
  float centerX;
  float centerY;
  float ang;
  
  smallStar() {
    x = random(0,width);
    y = random(0,height/2);
    w = random(3,6);
    a = random(100,200);
    color[] colors = {color(232,248,255,a),color(235,234,175,a),color(242,242,208,a),
                         color(250,250,240,a),color(255,255,255,a)};
    int index = int(random(colors.length));
    c = colors[index];
    h = w;
    centerX = x + w/2;
    centerY = y + h/2;
    ang = random(0,PI)/random(1,4);
  }
  
  void display() {
    pushMatrix();
    ang = (this.ang + .01) % (2*PI);
    fill(this.c);
    noStroke();
    translate(centerX,centerY);
    rotate(ang);
    rect(-w/2,-h/2,w,h);
    popMatrix();
    //println("x" + this.x + "y" + this.y);
  }
}

class bigStar {
  float x;
  float y;
  float r1;
  float a;
  float flicker;
  float r2;
  color c;
  float ang;
  float angDir;
  
  bigStar() {
    x = random(0, width);
    y = random(0, height/2);
    r1 = random(2,5);
    a = random(40,180);
    flicker = random(400,800); 
    r2 = r1 * 2;
    color[] colors = {color(232,248,255,a),color(201,239,255,a),color(242,242,208,a),
                         color(250,250,240,a),color(255,255,255,a)};
    int index = int(random(colors.length));
    c = colors[index];
    float[] angles = {radians(millis()/170),radians(millis()/150),radians(millis()/-150),
                      radians(millis()/-170)};
    int index2 = int(random(angles.length));
    ang = angles[index2];
    angDir = (random(1)*0.1) - .05;
  }
  
  void display() {
    pushMatrix();
    //colorMode(RGB,255,255,255);
    //float newA = map(shine,-1,1,0,255);
    float newR = c >> 16 & 0xFF; //use bit shifts for faster processing
    float newG = c >> 8 & 0xFF;
    float newB = c & 0xFF;
    //float newAA = (a + newA) % 255;
    //float newC = color(newR, newG, newB, newAA); 
    float shine = sin(millis()/flicker);
    float a = this.a + map(shine,-1,1,40,100);
    //if (a < 0) { a = -a; };
    fill(newR,newG,newB,a);
    //fill(newC);
    noStroke();
    translate(x,y);
    ang = (this.ang + angDir) % (2*PI);
    rotate(ang);
    makeBigStar(0,0,r1,r2,5);
    popMatrix();
    //println("shine " + shine + "newAA " + newAA);
  }
}
    

void setGradient(int x, int y, float w, float h, color c1, color c2, int axis) {
  noFill();
  for (int i = y; i <= y+h; i++) {
    float inter = map(i, y, y+h, 0, 1);
    color c = lerpColor(c1, c2, inter);
    stroke(c);
    line(x, i, x+w, i);
  }
}

boolean raining = false;
//boolean rainToggle = false;

void keyPressed() {
  if (key == 'r') {
    if (raining == false) {
      raining = true;
      //rainNum = 80;
      //rainToggle = true;
    } else {
      raining = false;
    }
  }
}

void mousePressed() {
  PVector source = new PVector(mouseX, mouseY);
  for (Tree tree : trees) {
     tree.mousePress(source); 
  }
}

class firefly {
  PVector position;
  PVector velocity;
  float move;
  //float flicker;
  float a;
  
  firefly() {
    position = new PVector(random(0,width),random(400,650));
    velocity = new PVector(1*random(-1,1),-1*(random(-1,1)));
    move = random(-7,1);
    //flicker = sin(millis()/400.0);
    a = random(0,100); //map(flicker,-1,1,40,100);
  }
  
  void update() {
    position.add(velocity);
    if (position.x > width) {
      position.x = 0;
    }
    if (position.y > height || position.y < 360) {
      velocity.y = velocity.y * -1;
    }
  }
  
  void display() {
    pushMatrix();
    float flicker = sin(millis()/400.0);
    float a = (this.a + map(flicker,-1,1,40,100)) % 255;
    fill(255,255,240,a);
    ellipse(position.x,position.y,gMove+move, gMove+move);
    ellipse(position.x,position.y,(gMove+move)*0.5,(gMove+move)*0.5);
    popMatrix();
  }
}  

float yoff = 0.0;
float yoff2 = 0.0;

float time = 0;

void anotherNoiseWave() {
  float x = 0;
  while (x < width) {
    //stroke(255,255,255,5);
    stroke(0,65,117,120);
    //stroke(11, 114, 158, 12);
    line(x, 520 + 90 * noise(x/100, time), x, height);
    x++;
  }
  time = time + 0.02;
}

void calcWave(float amplitude) {
  // Increment theta (try different values for 'angular velocity' here
  theta += 0.02;

  // For every x value, calculate a y value with sine function
  float x = theta;
  for (int i = 0; i < yvalues.length; i++) {
    yvalues[i] = sin(x)*amplitude;
    x+=dx;
  }
}

void renderWave() {
  noStroke();
  colorMode(RGB);
  float ellipsePulse = sin(millis()/600.0);
  float ellipseColor = map(ellipsePulse, -1, 1, 150, 245);
  fill((int)ellipseColor, 220, 250, ellipseColor-60);
  // A simple way to draw the wave with an ellipse at each location
  for (int x = 0; x < yvalues.length; x++) {
    ellipse(x*1.3*xspacing, height/1.2+yvalues[x], 6, 6);
  }
  for (int x = 0; x < yvalues.length; x++) {
    ellipse(x*1.7*xspacing, height/1.3+yvalues[x], 5, 5);
  }
  for (int x = 0; x < yvalues.length; x++) {
    ellipse(x*1.4*xspacing, height/1.15+yvalues[x], 7, 7);
  }
  for (int x = 0; x < yvalues.length; x++) {
    ellipse(x*1.5*xspacing, height/1.27+yvalues[x], 6, 6);
  }
}

class Rain {
  float x = random(0, width);
  float y = random(-1000, 0);
  float size = random(3, 7);
  float speed = random(20, 40);
  void update() {
    y += speed;
    fill(255, 255, 255, 180);
    //fill(185, 197, 209, random(20, 100));
    ellipse(x, y-5, size-3, size*2-3);
    fill(185, 197, 209, random(20, 100));
    //fill(255, 255, 255, 180);
    ellipse(x, y, size, size*2);

    if (y > height) {
      if (raining) {
        x = random(0, width);
        y = random(-10, 0);
      } 
      if (!raining) { // temp fix for stopping rain: let current rainfall not respawn at top
        //drops = new Rain[0];
        y = height;
        //speed = 0;
      }
    }
  }
}

void makeBigStar(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

void drawMountains() {
  strokeWeight(15);
  strokeJoin(ROUND);
  for (int i = 0; i <= 10; i++ ) {
    float y = i*30;
    fill(map(i, 0, 5, 200, 35), map(i, 0, 5, 250, 100), map(i, 0, 5, 255, 140));
    stroke(map(i, 0, 5, 200, 35), map(i, 0, 5, 250, 110), map(i, 0, 5, 255, 150));
    beginShape();
    vertex(0, 400+y);
    for (int q = 0; q <= width; q+=10) {
      float y2 = 400+y-abs(sin(radians(q)+i))*cos(radians(i+q/2))*map(i, 0, 5, 100, 20);
      vertex(q, y2);
    }
    vertex(width, height);
    vertex(0, height);
    endShape(CLOSE);
  }
}

Keali-LookingOutwards09

Theo, Emily, and Nick’s works, and in particular Connected Worlds, have captivated me since the day I came across them; seriously, I was such a fanboy of Connected Worlds. By chance I found their Eyeo talk to write about for a past LookingOutwards, and I could not have been more grateful for that discovery.

It is December 2 and they just finished their presentation and I’m so giddy inside and I also lowkey want a picture with them but it’s okay.

Anyhow, my last project will be an attempt to also make a virtual environment; personally I’ve never been as into VR or 3D, so I want the output to run purely on the computer, reminiscent of a videogame/computer-game environment. The user manipulation and interaction persist, though, so ideally it would be some inferior Connected Worlds… I admire the aesthetics and graphics of Design I/O’s works, and would like to achieve my own style and effects with similarly sleek, endearing, and effective design. I also aim to execute a calm, serene environment: an innocent nature-scape.

Keali-Kadoin-Object

Drunken Nightlight
The Party Animals Awaken

https://www.youtube.com/watch?v=lxGlkdMtN5Q&feature=youtu.be

If there’s one thing we learned from this, it’s that Keali and Kadoin are not meant for hardware.
Our plan was to make a nightlight that would activate based on the sunset time of a given location, done with the CloudBit and IFTTT’s weather applet options. This was a simple though practical idea, which meant the rest of the project became geared toward craft and the actual construction of the nightlight. That construction went through multiple iterations: we had rotating carousels, light boxes, and planetarium-like sketches planned, but the reality was a more complicated process of getting something to stand still and rotate smoothly, all the while keeping the littleBits connected and distributed properly so that the lights would work within the tube. We went through multiple construction tests with paper, tissue paper, cardboard, and clay, finally 3D-printing a platform (shoutout to John’s help), and finalized an unfortunately wonky design: a tall tube with knifed-out designs that swings unevenly and tilts, because Keali forgot that physics was a thing when he cut all of the landscape outlines on one side, which caused the entire structure to bend over (much more so when it spins). A boxed structure served as a base that contained the motor bit and held the tube upright, while the splitter bit directed the RGB and LED lights up into the tube; clay animals were tied with string and hung from the top of the tube. But all horror aside, a final product was successfully completed (though the refinement itself is… not much of a success), and we have ourselves a rotating drunken nightlight that activates once the sun sets and halts at sunrise. We miss D3…

[images: the nightlight triggering at sunset and sunrise]

Keali-Final-Proposal

I want to make an interactive environment: perhaps a nature scene that can be controlled by its user; inspired by Theo and Emily’s works, but restricted to a screen (i.e. controllable through the keyboard/mouse, rather than projected with real-time motion); I would also rather work in 2D.

Keali-LookingOutwards08

Andante, by the Tangible Media Group at MIT, visualizes animated characters walking along a piano keyboard, as if they are playing the physical keys with each step. It was by chance that this project attracted me: its aesthetic of lit-up figures is reminiscent of my recent mocap project, where I modeled the human figure as a pedestrian walking signal. The luminescent representation of the bodies is similar, and this project as a whole feels well-considered and complete; the attributes of the visualization successfully complement each other, and seamlessly integrate the virtual visuals with the physicality of the piano keys being pressed at the right times. I especially found the motive of the work appealing as well: it is based on the expressive, full-body, communicative character of learning music, thus promoting an understanding of how music is instinctively rooted in the human body, something any audience can find relatable and introspective. The approach also took advantage of walking as one of the most fundamental human rhythms, indicating the careful and admirable consideration that went into representing the movement. I find the color palette of the visuals charming, especially the brightness in conjunction with how the video was recorded, against a darker background that strengthens the design choice of the figures; I can almost see the clips being from a dream or bedtime-story scenario, of little lit fantasy characters bringing music to life in some sort of tale. Another neat characteristic is the variety of forms and physiques represented: different body types, walking postures, and even some animals! All things considered, I think this would make practicing the piano less lonely, and playing the piano more fun; it is indeed like a little other world.

Keali-Manifesto

The excerpt from “The Epic Struggle of the Internet of Things” read to me as… overly complicated and perhaps oddly unnecessary–as did the “Critical Engineering Manifesto”, though the latter delved into a depth of societal introspection that I could not grasp nor muster enough interest in. What I absorbed from the former, at least, was the hypothetical scenario of a consumer attempting to merge the utility of two seemingly unrelated products; oftentimes this assumes the objects to be of different technological time periods, which implies the desire to technologically advance the more mundane, archaic device of the two. I personally relate to the individuals who do not see the practical point in this mindset: with modern-day technology naturally being impressive compared to the past, and constantly advancing to encompass more opportunities, there is a constant rise of arbitrary explorations into making even the most unnecessary things “more technologically advanced”. This can be the result of different factors, from pure curiosity to inherent human laziness. Though this “struggle of the internet of things” has the potential to pave the way to new, useful, innovative inventions that are practical and reasonable, people believe, and I can sympathize, that plenty of the results are probably made purely “just because”, with no real usable value. I have not dealt with this concept much at all, but perhaps a reflexive example I can come up with is Apple’s recent updates, for both the newest iPhone and MacBooks, both of which ignited anger and frustration in all my tech-savvy friends (much to my personal apathy as a non-Apple fanatic); from what I’ve heard, it feels like Apple has just been “upgrading” (debatable, haha) the specs of its newer products for the sake of implementing some change, without really thinking about practicality in actual use or the feedback of its loyal customers. Although this isn’t combining two objects, it was the closest example I could relate to as someone who has never thought about, and is confirmed to be thoroughly uninterested in, this “struggle of the internet of things”.

Keali-Mocap

Beep boop!
First iteration of ideas: a body as particles. Initially I conjured ideas that could be attributed to some sort of radiation or atmospheric effect: a figure of star particles, pixel dust, wisps of smoke–something simple and seamless, but with admirable aesthetics. This desire to represent an aura-based atmosphere also led to indirect models of the form, such as a delicate rain scene where the body is invisible and the viewer can only see it from where the rain bounces off. Another exploration regarded soul butterflies, i.e. butterflies of death, a common trope in anime where glittering butterflies fly near lost souls or people close to death. (So, perhaps an iteration where, if the model makes abrupt movements, he or she could shake the butterflies/death off–this shaking and loosening effect could be applied to any of my particle-based ideas.)

I originally partnered with Takos for this assignment to toy with some of these ideas, and hers, but as we assigned ourselves parts to develop further, we continually drifted apart in our coding approaches and end goals… which eventually led to separate projects, haha.
Ironically, my final product was an idea that she gave to me, including the link to the video below (thanks Takos!); once she presented this idea, I already thought of all the attributes I believed would make the execution successful, and ended up going with it, while she developed a completely different idea (that, ironically, was more of my usual aesthetic, with seamless monochromatic visuals…). But the cool thing is, I’m glad I explored something different anyway, and I’m actually very happy with how well-rounded my result became: even though it is a visually simple simulation, I feel all the details and characteristics were well-considered and complement each other with purpose.

As such came the walking-signal simulator, where a plane of circular bulbs lights up according to the human figure: if the figure is moving, the lights are green, and if the figure stops moving, they go red. I included audio of the walking-signal noise at Morewood and Forbes (commonly nicknamed the “beep boop” by CMU students), and the audio also pauses while the red stop signal is on. The lights are lit according to an isBoneNear function that computes the segment between each pair of connected Bvh bones and compares it to the point (x, y) at the center of each circle on the light-bulb plane; if the distance is within my hardcoded epsilon, the circle turns green or red instead of the default gradient of grays.

Final: troubleshooting the head was interesting. I assumed the head would be the bone without a parent (a conditional I had to include anyway so that there wouldn’t be a null-pointer exception), but when I upped the epsilon I saw no change, so I… guess that wasn’t the head. Golan then taught me about the function that let me check bone names directly (“Head”), which made the process easier, so raising the epsilon succeeded in making the head a little more prominent, although the default Head bone itself was still very close to the torso, so the final figure looks like it has a very short neck… (but this is still the best improvement, because the figure originally looked headless… also, thank you Golan). I even had an iteration where, because I couldn’t yet identify and isolate the head bone, my increase in epsilon accidentally made the model look pregnant (it turned out the bone I affected was at the waist, I guess…). I could not fathom how to get the red stop signal to work at random pauses, as I found it difficult to calculate whether the Bvh model moved between frames, so I coded the file to pause at the end of every loop for a bit longer than usual before relooping; at that moment of pause, I changed the lit color to red and the audio amp to 0. I also added two frames to the borders to give the effect of the walking signal’s yellow box frame. Originally I made the plane flat, but decided to give it a top-down gradient of gray rather than flat grays, to mimic some sort of shadow being cast from the top of the walking-signal box. The top four pictures of the screencaps below were the initial tinkering stages of making the colors work and align well (as you can see, I had some debugging to do).

I particularly found it fitting that the model is stretching, as if taking a break from a jog or pedestrian stroll 🙂 Take care of yourself, exercise, and remember that it’s the little things that count! (I should really take that advice…) Overall, I’m really pleased that, although the result appears uncomplicated, all its parts combine very well… it made me really happy that the class laughed once they realized exactly what my mocap attempted to mimic in real life. (The beep boop audio helped immensely, I believe… by the way, credits to this CMU remix, which is where I cropped the audio from!)

GitHub repository//

import processing.sound.*;
SoundFile file; 

// Originally from http://perfume-dev.github.io/

import java.util.ArrayList;
import java.util.List;

BvhParser parserA = new BvhParser();
PBvh bvh1, bvh2, bvh3;

long totalFrameTime;
long loopCounter;
long loopTime;

void setup()
{
  size( 600, 600, P3D );
  background( 0 );
  noStroke();
  frameRate( 30 );
  file = new SoundFile(this, "beepboop.wav");
  file.loop();

  bvh1 = new PBvh( loadStrings( "A_test.bvh" ) ); // testing w this one
  //bvh2 = new PBvh( loadStrings( "B_test.bvh" ) );
  //bvh3 = new PBvh( loadStrings( "C_test.bvh" ) );

  totalFrameTime = bvh1.parser.totalLoopTimeMillis();
  
  loop();
  
}

long lastMillis = -1;
long setToMillis = 0;

public void draw()
{
  if (lastMillis == -1) {
    lastMillis = millis();
  }
  background( 0 );
  fill(209,181,56);
  rect(0,0,width,height);
  fill(150,129,36);
  rect(20,20,width-40,height-40,8);
  fill(0);
  rect(30,30,width-60,height-60,18);

  //camera
  float _cos = 0.0;
  float _sin = 0.0;
  //camera(width/4.f + width/4.f * _cos +200, height/2.0f-100, 550 + 150 * _sin, width/2.0f, height/2.0f, -400, 0, 1, 0);
  camera(width/2, height/2, 510.0, width/2, height/2, 0.0, 0, 1, 0); 
  
  //ground 
  fill( color( 255 ));
  stroke(127);
  //line(width/2.0f, height/2.0f, -30, width/2.0f, height/2.0f, 30);
  stroke(127);
  //line(width/2.0f-30, height/2.0f, 0, width/2.0f + 30, height/2.0f, 0);
  stroke(255);

  pushMatrix();
  translate( width/2, height/2-10, 0);
  scale(-1, -1, -1);

  long currMillis = millis() % totalFrameTime;
  long elapsedMillis = currMillis - lastMillis;
  long savedCurrMillis = currMillis;
  if (currMillis < lastMillis) {
    loopCounter = 150;
    loopTime = setToMillis;
  }
  
  if (loopCounter > 0) {
    loopCounter--;
    setToMillis = 200;
  } else {
    setToMillis += elapsedMillis;
  }
    

  //model
  bvh1.update( (int)setToMillis );
  //bvh2.update( millis() );
  //bvh3.update( millis() );
  
  //bvh1.draw();
  //bvh2.draw();
  //bvh3.draw();
  
  lastMillis = savedCurrMillis;
  
  popMatrix();
  
  pushMatrix();
  int num = 54;
  int r = width / num; 
  noStroke();
  fill(64,64,64);
  //int count = 0;
  /*for (float i = 40; i < width-40; i = i+r) {
    count++;
    fill(0+count*2);
    for (float j = 40; j < height-40; j = j+r) {
      ellipse(j,i,r,r);
    }
  }*/
  
  
  fill(64,64,64); // 34
  
  for (float i = 40; i < width-40; i = i+r) {
    int count = 0;
    for (float j = 40; j < height-40; j = j+r) {
      count++;
      if (isBoneNear(bvh1.getBones(),i,j)) {
        if (loopCounter > 0) {
          fill(214,73,73);
          file.amp(0);
        } else {
          fill(182,232,169);
          file.amp(1);
        }
        ellipse(i,j,r,r);
      } else {
        fill(0+count*2);
        ellipse(i,j,r,r);
      }
    }
  }
  
  
  //ellipse(0,0,200,200);
  popMatrix();
      
}

boolean isBoneNear(List<BvhBone> bones, float x, float y) {
  float epsilon = 6.8;
  float scale = 2.7;
  x = x / scale;
  y = -y / scale;
  float xOffset = -105.0;
  float yOffset = 201.0;
  x += xOffset;
  y += yOffset;
  for (BvhBone bone : bones) {
    PVector start = bone.absPos;
    PVector end;
    epsilon = 6.8;
    if (bone.getName().equals("Head")) {
      epsilon = 12;
    }
    if (bone.getParent() == null) {
      end = bone.getChildren().get(0).absPos;
    } else {
      end = bone.getParent().absPos;
    }
    //PVector end = bone.absEndPos;
    float x1 = start.x;
    float y1 = start.y;
    float x2 = end.x;
    float y2 = end.y;
    double dist = lineDist(x1, y1, x2, y2, x, y);
    if (dist < epsilon) return true;
  }
  return false; 
}

double lineDist(float x1, float y1, float x2, float y2, float x3, float y3) {
  float px=x2-x1;
  float py=y2-y1;
  float temp=(px*px)+(py*py);
  float u=((x3 - x1) * px + (y3 - y1) * py) / (temp);
  if(u>1){
    u=1;
  }
  else if(u<0){
    u=0;
  }
  float x = x1 + u * px;
  float y = y1 + u * py;
  float dx = x - x3;
  float dy = y - y3;
  double dist = Math.sqrt(dx*dx + dy*dy);
  return dist;
}

Keali-LookingOutwards07

The (En)tangled Word Bank is a collaborative project between Stefanie Posavec and Greg McInerny that visualizes the insertions and deletions of text across the six editions of On the Origin of Species by Charles Darwin. Each diagram represents an edition, and is modelled on the ‘literary organism’ structure Posavec used for On the Road by Jack Kerouac. The visualization essentially represents the meat of the text: chapters are divided into subchapters (as per Darwin’s original text), and these subchapters are divided further into paragraph ‘leaves’; the small, wedge-shaped leaflets represent sentences, colored according to whether that sentence survives into the next edition (blue) or is deleted and absent from the next edition (orange). Some executions of the diagrams were published as large banners, each depicting a specimen plate per edition, mimicking a botanical illustrative design. Each plate shows the original diagram, first- and last-chapter excerpts from the original diagram, and four extrapolations of the diagram detailing the chapters, subchapters, paragraphs, and sentences in each edition.

This design initially appealed to me from a visual perspective–I hadn’t yet evaluated or digested the data being represented, and had not really decided whether the design template was practical or appropriate for the subject. I felt it was visually strong: well-executed and very organized without being overwhelming. Upon reading the intent and description, I feel the most successful and amusing aspect is that the tree structure is used to represent The Origin of Species, of all possible texts: nature is implemented in both subject and interface, especially reminiscent of taxonomic structures in the biological sciences. In this way the overall design is strong in combining aesthetic appeal, with its slick branches and leaves, with relevance to the subject being presented. I also like the layout of the printed banners, organizing multiple circles that focus on different depths of information.

Keali-Visualization

My visualization arranged all the stations recorded in the spreadsheet in a circle, then connected the start station of each rented bike with its end station; each line thus represents the journey of one bike from one station to another, or from one station back to itself (seen as a loop in the visualization). Drawing the lines at low opacity builds the concept of frequency into the diagram: darker, more opaque lines are ones overlapped more often, implying more bike trips between those two endpoint stations.
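
Under the hood, the frequency encoding is just additive transparency: each trip is drawn as a single low-alpha line, so often-repeated station pairs stack into darker strokes. Here is a minimal sketch of that idea in Processing, with invented station positions and trip data (the actual piece was built in D3):

// Minimal demo: overlapping low-alpha lines darken with frequency.
// The station layout and trip data here are made up for illustration.
PVector[] stations = new PVector[12];
int[][] trips = new int[300][2];

void setup() {
  size(400, 400);
  // place stations in a circle, like the final layout
  for (int i = 0; i < stations.length; i++) {
    float a = TWO_PI * i / stations.length;
    stations[i] = new PVector(width/2 + 150*cos(a), height/2 + 150*sin(a));
  }
  // fake rentals: half the trips end at station 0, a popular "hub"
  for (int i = 0; i < trips.length; i++) {
    trips[i][0] = int(random(stations.length));
    trips[i][1] = (random(1) < 0.5) ? 0 : int(random(stations.length));
  }
  background(255);
  stroke(0, 65, 117, 18); // one trip is faint; many overlaps read as dark
  for (int[] t : trips) {
    line(stations[t[0]].x, stations[t[0]].y, stations[t[1]].x, stations[t[1]].y);
  }
  noLoop();
}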

Reference 1 | Reference 2 | Reference 3

I initially found this block appealing because I felt it balanced uniqueness of style with practicality and readability (at least in general, not when picking out every single line…). I instinctively thought the only reasonable way to implement the relevant data was to have the stations connect to one another, and I stayed on track with this idea. I originally practiced with a simple network-graph example, and the results were barely readable because of the plethora of overlapping lines; I then combined two other references to reformat the data as JSON with Python, mirroring the structure of the example’s JSON data by labeling nodes and links accordingly (I also originally placed dummy data at the nodes to figure out exactly how the code worked). I then dug through the code to find out how much could be customized, and refined the node colors, opacities, edge colors, link widths, etc. to my liking. Frankly, I had the lowest expectations for this project, as D3 was incredibly overwhelming, as was even dealing with the data itself before I could get into D3, so I am quite thankful for the results, and just immensely relieved that I outputted something, because I wanted to cry multiple times throughout the work process.

//GitHub_repository

import csv
import json

bikeIDs = open('bikeids.txt', 'r').read()
ignore = 1
with open('data/HealthyRide Rentals 2016 Q3.csv', 'r') as csvfile:

    rentalReader = csv.reader(csvfile)
    output = []
    rawNodes = set()
    rawLinks = []
    stationNameDict = dict()
    useEvery = 50
    counter = 0
    for row in rentalReader:
        counter += 1
        if (counter % useEvery != 0):
            continue
        ignore -= 1
        if (ignore >= 0):
            continue
        try:
            start = int(row[5])
            startName = row[6]
            end = int(row[7])
            endName = row[8]
            stationNameDict[start] = startName
            stationNameDict[end] = endName
            rawNodes.add(start)
            rawNodes.add(end)
            rawLinks.append((start,end))
        except:
            pass
    nodes = []
    links = []
    rawNodes = list(rawNodes)
    indices = dict()
    for rawNode in rawNodes:
        name = stationNameDict[rawNode]
        group = rawNode
        nodes.append({"name": name, "group": group})
    for i in range(len(rawNodes)):
        stationId = rawNodes[i]
        indices[stationId] = i
    for (startId, endId) in rawLinks:
        source = indices[startId]
        target = indices[endId]
        sourceName = stationNameDict[startId]
        endName = stationNameDict[endId]
        weight = 1
        links.append({"source":sourceName, "target":endName, "weight":weight})
    output = {"nodes": nodes, "links":links}

    with open("graphFile.json","w") as outfile:
        outfile.write(json.dumps(output))

def getStations():
    pass

Keali-Book

[Eng Sub] by HorribleSubs (but actually though…) is a collection of randomly selected anime episode screencaps mismatched with arbitrarily generated, nonsensical, and irrelevant subtitles created through Markov chains. A light-hearted and comedic production, the book pays homage to dear memories of the classic Japanese animated shows around which my adolescent life revolved; the collection of such franchises was a big influence on my wanting to become an artist. The title is reminiscent of the general title format of such videos, as non-Japanese speakers attentively sought the episodes that had the [eng sub] tag in the search results. *HorribleSubs is a popular provider of subtitled episodes for numerous seasons.

Process:
My initial idea was entirely different–in retrospect, it’s amusing that the ideas I had throughout the process and the final resulting product both appeal to my aesthetic, though they are visually and thematically different. I originally intended to make my book using APIs related to nature and astronomy–I researched Google APIs, particularly those affiliated with Google Maps and Google Earth (which favorably extended to data on oceans, landmarks, and stars). I wanted to randomly generate latitude and longitude locations on the planet, and then visually represent the state of the sky, and possibly the surroundings, at that location using designs and typography. I sketched out some visuals, with different fonts and sizes for stars and planets laid onto the page depending on the viewing angle and the distance between the celestial bodies and Earth. This plan eventually fell from my grasp as I failed to find the relevant APIs, and discovered that many of the astronomy-related APIs were no longer updated or accessible.

It was then that I noticed that plenty of the generative book-type works I had seen up to that point were comedic, meant to be a light and comical read, and I thought perhaps I could attempt this as well, being an illustrator who does not usually work on comedy-related artworks. As such came my anime idea, where I had the most arbitrary thought of pairing random screencaps of episodes with nonsensical, maybe gibberish, captions. I laughed at the thought of the most ordinary images captioned with the most irrelevant subtitles underneath. This also meant a lot to me personally, as someone who decided to pursue art because of the cartoons and games with which I grew up. It was refreshing to, in a way, revisit and work in a field of art which, in my own experience, has been such a taboo at this institution for reasons I cannot comprehend. (Please take cartoons seriously…) It feels like it has been so long since I’ve made something this potentially funny and light-hearted.

The main key to getting started was familiarizing myself with RiTa–something completely new: I needed to learn about Markov chains. This meant Dan Shiffman tutorials, dissecting references, and downloading and analyzing the example code that generated random sentences whenever the user clicked. Essentially, I modified the example code to feed in my own text files (collected anime subtitles), and to output the JSON objects as an array of {“subtitle”:} entries.
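
For the unfamiliar: a Markov text model just records which word follows each word (or each n-word window) in the source text, then walks those observed counts to generate new sentences. RiTa does the real work in the project (with n=2); the toy order-1 sketch below, in plain Processing/Java, only illustrates the concept:

import java.util.*;

// Toy order-1 Markov text generator: the next word depends only on the current word.
HashMap<String, ArrayList<String>> model = new HashMap<String, ArrayList<String>>();

void train(String text) {
  String[] words = text.split("\\s+");
  for (int i = 0; i < words.length - 1; i++) {
    if (!model.containsKey(words[i])) {
      model.put(words[i], new ArrayList<String>());
    }
    model.get(words[i]).add(words[i + 1]); // duplicates preserve frequency
  }
}

String generate(String seed, int len) {
  String word = seed, out = seed;
  for (int i = 0; i < len; i++) {
    ArrayList<String> next = model.get(word);
    if (next == null) break;               // dead end: no observed successor
    word = next.get(int(random(next.size())));
    out += " " + word;
  }
  return out;
}

void setup() {
  train("the cat sat on the mat and the cat slept");
  println(generate("the", 8)); // e.g. "the cat sat on the cat slept"
}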

I then wrote a Python script to capture and save screenshots from each of the downloaded episode files, and to extract and condense each video’s subtitles into .txt files, organized by series. I brought all the series’ subtitles into the Processing file to generate Markov sentences from them. I then ran the program and clicked as many times as I wanted, collecting plenty of randomly generated nonsense subtitles to place in my book.

Bringing the code into basil was more bearable thanks to the sample book Golan provided; the necessary lines and files were replaced with my own, and from there the priority was to mimic the aesthetic of an adolescent’s exposure to anime: I wanted the book to be unrefined, simple, informal. The screencaps were center-aligned on each page, as if captured directly from the video itself, and the subtitles were positioned at the bottom of the screen, also center-aligned and kept within the video borders. The subtitle font was purposely chosen to be one of those typically used to sub episodes, with easy readability, reasonable size, and a white or pale-yellow fill color. I wanted to represent the online streaming environment as much as possible to fulfill the vision of the book (as direct representations of a subbed episode in some browser video player).

Streaming episodes of a foreign cartoon from the States actually felt risky at times; if you couldn’t find the episode you wanted on YouTube, you had to rely on Google, and I was always wary of which sites could be trusted / which had reliable subtitles (haha) / which had good quality / which might give my computer a virus, etc… I also factored this “experience” into the overall visual aspect of the book: once again childish, innocent, perhaps even ratchet–the outcome is simple with minimal elements, but I feel it fulfilled the task of representing the necessary attributes. The final result made me, and others, laugh, which I am more than thankful for. I would also say the outcome is deceptively simple given all the generativity behind it… (personal benchmark: first time writing a script!)

Here’s a video of the professor flipping through the book:

final pdf: kearniebook5

//GitHub repository

Python script:

import subprocess
import shutil
import shlex
import re
import ass 
import os
def takeSnapshots(fileName,ep):
    amountPerEp = 10
    episodeTime = 20*60
    ignoreTime = 60*3
    interval = int((episodeTime - 2*ignoreTime)/amountPerEp - 0.01)
    base = "kearnie/screencaps/"
    extractTimes = [i for i in range(ignoreTime,episodeTime-ignoreTime,interval)]
    for i in range(len(extractTimes)):
        time = extractTimes[i]
        args = ["mpv","-ao","null","-sid","no","-ss",str(int(time)),"-frames","1","-vo","image",
        "--vo-image-format=png", fileName]
        try:
          subprocess.run(args)
          shutil.move("00000001.png",base + "%d.png" % (ep*amountPerEp+i))
        except:
          print("fal")
trackRegex = re.compile(r"mkvextract:\s(\d)")
removeBrackets = re.compile(r".*?\((.*?)\}")
def getSubtitleTracks(fileName):
    output = subprocess.check_output(["mkvinfo",fileName],universal_newlines=True).splitlines()
    currentTrack = None
    sub_tracks = []
    for line in output:
        if "Track number:" in line:
            trackNumber = trackRegex.search(line).group(1)
            currentTrack = trackNumber
        if "S_TEXT/ASS" in line:
            sub_tracks.append(currentTrack)
    return sub_tracks
def exportSRT(fileName, track):
    srtName = fileName + "-%s.srt" % track
    args = ["mkvextract", "tracks",fileName, "%s:%s" % (track,srtName)]
    subprocess.run(args)
    return srtName
def cleanLine(line):
    newLine = ""
    inBracket = False
    lastBackSlash = False
    for c in line:
        if c == "{":
            inBracket = True
        elif c == "}":
            inBracket = False
        elif not inBracket:
            if c == "\\":
                lastBackSlash = True
            elif c != "N" or not lastBackSlash:
                newLine += c
                lastBackSlash = False
    return newLine

def extractTextFromSubtitles(fileName):
    tracks = getSubtitleTracks(fileName)
    output = ""
    for track in tracks:
        srtName = exportSRT(fileName, track)
        lines = []
        with open(srtName,"r") as f:
            doc = ass.parse(f)
            for event in doc.events:
                lines.append(cleanLine(event.text))
        combined = "\n".join(lines)
        if "in" in combined or "to" in combined or "for" in combined:
            output += combined
    return output

def extractFromFile(fileName,ep):
    os.makedirs("kearnie/screencaps/",exist_ok=True)
    text = extractTextFromSubtitles(fileName)
    with open("kearnie/subs.txt","a") as f:
        f.write(text)
    takeSnapshots(fileName,ep)
def extractSeries():
    ep = 0
    for filename in os.listdir("."):
        if filename.endswith(".mkv"):
            extractFromFile(filename,ep)
            ep += 1

extractSeries()

Processing code:

import rita.*;
import java.util.*;

//RiTa Markov Chain example template
//Does not contain JSON attributes yet 🙁

JSONArray subtitles;
int count = 0;

RiMarkov markov;
String line = "click to (re)generate!";
String[] files = { "../data/durarara.txt", 
                   "../data/fmab.txt",
                   "../data/geass.txt",
                   "../data/naruto.txt",
                   "../data/wata.txt" };
int x = 160, y = 240;

void setup()
{
  size(500, 500);

  fill(0);
  textFont(createFont("times", 16));

  // create a markov model w' n=2 from the files
  markov = new RiMarkov(2);
  markov.loadFrom(files, this);
  subtitles = new JSONArray();
}

void draw()
{
  background(250);
  text(line, x, y, 400, 400);
}

void mouseClicked()
{
  if (!markov.ready()) return;
  x = y = 50;
  String[] lines = markov.generateSentences(1);
  line = RiTa.join(lines, " ");
  
  JSONObject subtitle = new JSONObject();
  subtitle.setString("subtitle", lines[0]);
  subtitles.setJSONObject(count, subtitle);
  println(lines[0]);
  if (lines[0] != null) {
    saveJSONArray(subtitles, count + ".json");
    count++;
  }
}

Basil.js code:

#includepath "~/Documents/;%USERPROFILE%Documents";
#includepath "d:\\Documents";
#include "basiljs/bundle/basil.js";

// Load a data file containing your book's content. This is expected
// to be located in the "data" folder adjacent to your .indd and .jsx. 
var jsonString = b.loadString("finalsubs.json");
var jsonData;


//--------------------------------------------------------
function setup() {
  var pageFlip = 0;
  // Clear the document at the very start. 
  b.clear (b.doc());

  // Make a title page. 
  // b.fill(0,0,0);
  // b.textSize(24);
  // b.textFont("Calibri"); 
  // b.textAlign(Justification.LEFT_ALIGN); 
  // b.text("A Basil.js Alphabet Book", 72,72,360,36);
  // b.text("Golan Levin, Fall 2016", 72,108,360,36);

  
  // Parse the JSON file into the jsonData array
  jsonData = b.JSON.decode( jsonString );
  b.println("Number of elements in JSON: " + jsonData.length);


  // Initialize some variables for element placement positions.
  // Remember that the units are "points", 72 points = 1 inch.
  var titleX = 72; 
  var titleY = 72;
  var titleW = 72;
  var titleH = 72;

  var captionX = 38; 
  var captionY = b.height - 115;
  var captionW = 500;
  var captionH = 36;

  var imageX = 38; 
  var imageY = 72; 
  var imageW = 500; 
  var imageH = 288; 


  // Loop over every element of the book content array
  // (Here assumed to be separate pages)
  for (var i = 0; i < jsonData.length; i++) {

    // Create the next page. 
    b.addPage();
    b.println(pageFlip);
    if (pageFlip == 0) {
        b.println("*******");
        // Load an image from the "images" folder inside the data folder;
        // Display the image in a large frame, resize it as necessary. 
        b.noStroke();  // no border around image, please.
        var anImageFilename = "images/" + b.floor(b.random(0,40)) + ".png";
        var anImage = b.image(anImageFilename, imageX, imageY, imageW, imageH);
        anImage.fit(FitOptions.PROPORTIONALLY);
        pageFlip = 1;
        b.println("*")
    }
    
    if (pageFlip == 1) {
      // This branch runs on the same page, immediately after the image
      // branch above, so each page receives both an image and a caption.

      // Create textframes for the "title" field.
      // Draw an ellipse with a random color behind the title letter.
      /*b.noStroke(); 
      b.fill(b.random(180,220),b.random(180,220),b.random(180,220)); 
      b.ellipseMode(b.CORNER);
      b.ellipse (titleX,titleY,titleW,titleH);

      b.fill(255);
      b.textSize(56);
      b.textFont("Calibri"); 
      b.textAlign(Justification.CENTER_ALIGN, VerticalJustification.CENTER_ALIGN );
      b.text(jsonData[i].title, titleX,titleY,titleW,titleH);*/

      // Create textframes for the "caption" fields
      b.stroke(0);
      b.fill(255);
      b.textSize(20);
      b.textFont("Calibri"); 
      b.textAlign(Justification.CENTER_ALIGN, VerticalJustification.TOP_ALIGN );
      b.text(jsonData[i].caption, captionX, captionY, captionW, captionH);

      pageFlip = 0;
    }
  }
}

// This makes it all happen:
b.go(); 

Keali-LookingOutwards06

The bot * (@soft_focuses) is a “poetic experiment” whose tweets evoke an atmospheric and serene theme–I spent considerable time deciding between this and poem.exe (@poem_exe), as both appealed to me with their introspective and calm sense of literary aesthetic. Much of the generated poetry is short, subtle, and perhaps mysterious; while not always grammatically correct, the diction and brief eloquence of the vocabulary are enough to present some soft and gentle imagery (as the username suggests). The tweets are non-formulaic, almost cursory reminders to pause, reflect inward, and think of the outside world–a sense of “it’s the little things in life…”–and to laugh at the occasional hilarious outputs. In a way, it’s lovely for such fragments to possibly lead to extensive, deep thoughts.

tweets

Keali-FaceOSC

dreamscapeoscgif

screenshot-2016-10-15-12-59-33
screenshot-2016-10-15-13-00-23

Rather than experimenting with a concept that was instinctively face-controlled, such as objects inherently representing faces, or masks/robots/other images, I wanted to explore something more abstract and closer to my own aesthetics: that translated into the more introspective, tranquil, and atmospheric ideas on which I base a lot of my interactive works. I almost automatically thought of skies and seas, gearing towards the starry path of galaxies and stardust in which someone can lose him or herself.

As such came the daydreaming concept of my project, reliant on the openness of one’s eyes; I would allow a dreamscape of night stars and light specks to fade in if the user’s eyes are closed or lowered past their default openness. However, if the user opens his or her eyes wider than a noticeable threshold, the dreamscape snaps back to a blank reality abruptly, as if one were suddenly woken from a daydream by a third party. I also wanted to toy with the juxtaposition of having visuals that one ironically wouldn’t be able to see if his or her eyes were opened (haha)–afterwards I changed this from an actual sleeping concept, where the eyes had to be fully closed, to a daydream, for practical reasons; I feel this didn’t stray too far from my intrinsic intention of “being” in another world.

The visuals this time hinged on the atmosphere of the dreamscape, handled through gradients, flickering/fading opacities, and the subtle movements of a new particle class that I learned. Rather than having a blank template or default face, I thought to translate the daydream aspect abstractly to the “awake” portion as well. In the default, “awake” phase of the simulation, I removed the traces of an avatar and instead designated two particle bursts where the eyes would be–these minor explosions follow the user’s eyes and represent how the stardust from the dreamscape realm is almost itching to come to fruition, to burst out–indicating how difficult it can be not to daydream. Once someone blanks out or loses focus, or in this case lessens the openness of his or her eyes, the dreamscape reappears and the stardust particles float effortlessly and arbitrarily throughout the screen. When the simulation is first run, the stardust of the dreamscape expands from a cluster in the middle, abstracted as the center of the user’s mind, and travels outwards to the boundaries, giving a sense of the user moving forward and falling deeper, traveling further into the dream; the dreamscape is otherwise decorated by flickering stars and a fading night gradient.

I also wanted to advance the dreamscape with calming, therapeutic music, which I did by utilizing Processing’s audio capabilities: raising amp() when the eyes are smaller and the dreamscape is occurring, and setting amp() to 0 so the music stops abruptly when the eyes are open and the avatar is in “reality”. The audio I selected is the soundtrack Will of the Heart from the animated show Bleach.

Hence I wanted this to be a sort of introspective interaction between the person and him or herself: that people are fighting their own concentration and awareness to not accidentally fade into a daydream–or to purposely daydream and enjoy the experience. Beyond that, it is an interaction between the user and the interface, to enjoy the visuals of an abstract, idyllic, and atmospheric setting.

faceoscsketches

**personal notes with troubleshooting/my own laptop in particular: I think it’s because my laptop is a newer Windows model, so its screen resolution is supposedly “crazy high”–meaning a lot of my software windows (i.e. Processing, Photoshop, CamStudio, OBS, etc.) turn up tiny because of their inherently smaller window resolutions… as such, when I run the Processing code, the output window is just a small square 🙁 It’s unfortunate that I feel the size takes away from fully being immersed in the visuals, but it can’t be helped–I’ve checked online for solutions, and this has actually been tagged as a “high-priority” problem in Processing’s open-source GitHub forum, but no fix has been provided yet… this project, especially when it got to the screengrab/video-recording portion, also really made me wish I had a Mac/used OSX haha… finding software to properly record the screen was hard–and then working with that software’s windows to properly click the Play/Stop button on my “high-res” laptop was even harder because–surprise–the windows for the software were tiny too. And then the outputted files were huge (my first attempt outputted a file of 4.0 GB…), but I just wanted to record some benchmarks for my own sake. This was quite a journey–and in the end I’m thankful I got what I needed to get done, done.

GitHub repository//

//
// a template for receiving face tracking osc messages from
// Kyle McDonald's FaceOSC https://github.com/kylemcdonald/ofxFaceTracker
//
// 2012 Dan Wilcox danomatika.com
// for the IACD Spring 2012 class at the CMU School of Art
//
// adapted from Greg Borenstein's 2011 example
// http://www.gregborenstein.com/
// https://gist.github.com/1603230
//

//modified 10.14.2016 for faceosc project 
//dreamscape by kearnie lin
import oscP5.*;
OscP5 oscP5;
import java.util.concurrent.ThreadLocalRandom;
import java.util.*;

import processing.sound.*; //for dreamscape audio
SoundFile file;

// num faces found
int found;

// pose
float poseScale;
PVector posePosition = new PVector();
PVector poseOrientation = new PVector();

// gesture
float mouthHeight;
float mouthWidth;
float eyeLeft;
float eyeRight;
float eyebrowLeft;
float eyebrowRight;
float jaw;
float nostrils;

// Constants
int Y_AXIS = 1;
int X_AXIS = 2;
color b1, b2, c1, c2;
int dim;

ParticleSystem ps;
ParticleSystem ps2;
Dust[] dustList = new Dust [60];
float gMove = map(.15,0,.3,0,30); //thank you ari!

void setup() {
  size(640, 640);
  frameRate(30);
  c1 = color(17,24,51);
  c2 = color(24,55,112);
  ps = new ParticleSystem(new PVector(eyeRight-25, -10));
  ps2 = new ParticleSystem(new PVector(eyeLeft+25, -10));
  for (int i = 0; i < dustList.length; i++) {
    dustList[i] = new Dust();
  }

  file = new SoundFile(this, "faceoscmusic.mp3");
  file.loop();
  file.amp(0); // start muted; the music only fades in during the dreamscape

  oscP5 = new OscP5(this, 8338);
  oscP5.plug(this, "found", "/found");
  oscP5.plug(this, "poseScale", "/pose/scale");
  oscP5.plug(this, "posePosition", "/pose/position");
  oscP5.plug(this, "poseOrientation", "/pose/orientation");
  oscP5.plug(this, "mouthWidthReceived", "/gesture/mouth/width");
  oscP5.plug(this, "mouthHeightReceived", "/gesture/mouth/height");
  oscP5.plug(this, "eyeLeftReceived", "/gesture/eye/left");
  oscP5.plug(this, "eyeRightReceived", "/gesture/eye/right");
  oscP5.plug(this, "eyebrowLeftReceived", "/gesture/eyebrow/left");
  oscP5.plug(this, "eyebrowRightReceived", "/gesture/eyebrow/right");
  oscP5.plug(this, "jawReceived", "/gesture/jaw");
  oscP5.plug(this, "nostrilsReceived", "/gesture/nostrils");
}

float eyesClosedValue = 0;

void draw() {
  background(255);
  stroke(0);
  boolean eyesClosed = false;
  if (found > 0) {
    pushMatrix();
    translate(posePosition.x, posePosition.y);
    scale(poseScale);
    noFill();
    if (eyeLeft < 3.0 || eyeRight < 3.0 || eyebrowLeft < 7.8 || eyebrowRight < 7.8) {
      eyesClosed = true;
    }
    print(eyeLeft);  // debugging (finding threshold vals)
    print(eyeRight);
    if (eyesClosed == false) {
      // awake: run the two particle bursts that track the eyes
      ps.addParticle();
      ps.run();
      ps2.addParticle();
      ps2.run();
    }
    popMatrix();
  }
  if (eyesClosed) {
    // fade the dreamscape and its audio in gradually
    file.amp(eyesClosedValue/255.0);
    c1 = color(17,24,51,eyesClosedValue);
    c2 = color(24,55,112,eyesClosedValue);
    eyesClosedValue += 3;
    if (eyesClosedValue > 255) eyesClosedValue = 255;
    //gradient
    setGradient(0, 0, width, height, c1, c2, Y_AXIS);
    Random ran = new Random(50);
    //implement stars

    for (int i = 0; i < 60; i++) {
      noStroke();
      int[] r = {230,235,242,250,255};
      int[] g = {228,234,242,250,255};
      int[] b = {147,175,208,240,255};
      int starA = (int)(min(ran.nextInt(100),eyesClosedValue) + sin((frameCount+ran.nextInt(100))/20.0)*40);
      fill(r[(ran.nextInt(5))],
           g[(ran.nextInt(5))],
           b[ran.nextInt(5)], starA);
      pushMatrix();
      
      translate(width*ran.nextFloat(), height*ran.nextFloat());
      rotate(frameCount / -100.0);
      float r1 = 2 + (ran.nextFloat()*4);
      float r2 = 2.0 * r1;
      star(0, 0, r1, r2, 5); 
      popMatrix();
    }
    for (int j = 0; j < dustList.length; j++) {
      dustList[j].update();
      dustList[j].display();
    }
  } else {
    eyesClosedValue = 0;
    file.amp(0);
  }
}

class Dust {
  PVector position;
  PVector velocity;
  float move = random(-7,1);

  Dust() {
    position = new PVector(width/2, height/2);
    velocity = new PVector(1 * random(-1,1), -1 * random(-1,1));
  }

  void update() {
    position.add(velocity);
    if (position.x > width) {
      position.x = 0;
    }
    if ((position.y > height) || (position.y < 0)) {
      velocity.y = velocity.y * -1;
    }
  }
  void display() {
    fill(255,255,212,100);
    ellipse(position.x,position.y,gMove+move, gMove+move);
    ellipse(position.x,position.y,(gMove+move)*0.5,(gMove+move)*0.5);
  }
}

class ParticleSystem {
  ArrayList<Particle> particles;  // typed list so get() returns Particle without a cast
  PVector origin;
  
  ParticleSystem(PVector location) {
    origin = location.copy();
    particles = new ArrayList<Particle>();
  }
  
  void addParticle() {
    particles.add(new Particle(origin));
  }
  
  void run() {
    for (int i = particles.size()-1; i >= 0; i--) {
      Particle p = particles.get(i);
      p.run();
      if (p.isDead()) {
        particles.remove(i);
      }
    }
  }
}

class Particle {
  PVector location;
  PVector velocity;
  PVector acceleration;
  float lifespan;

  Particle(PVector l) {
    acceleration = new PVector(0,0.05);
    velocity = new PVector(random(-1,1),random(-2,0));
    location = l.copy();
    lifespan = 255.0;
  }

  void run() {
    update();
    display();
  }

  // update location 
  void update() {
    velocity.add(acceleration);
    location.add(velocity);
    lifespan -= 5.0;
  }

  // display particles
  void display() {
    noStroke();
    //fill(216,226,237,lifespan-15);
    //ellipse(location.x,location.y,3,3);
    fill(248,255,122,lifespan);
    ellipse(location.x,location.y,2,2);
    ellipse(location.x,location.y,2.5,2.5);
  }
  
  // "irrelevant" particle
  boolean isDead() {
    return lifespan < 0.0;
  }
}

//draw stars
void star(float x, float y, float radius1, float radius2, int npoints) {
  float angle = TWO_PI / npoints;
  float halfAngle = angle/2.0;
  beginShape();
  for (float a = 0; a < TWO_PI; a += angle) {
    float sx = x + cos(a) * radius2;
    float sy = y + sin(a) * radius2;
    vertex(sx, sy);
    sx = x + cos(a+halfAngle) * radius1;
    sy = y + sin(a+halfAngle) * radius1;
    vertex(sx, sy);
  }
  endShape(CLOSE);
}

//draw gradient
void setGradient(int x, int y, float w, float h, color c1, color c2, int axis ) {
  noFill();
  if (axis == Y_AXIS) {  // Top to bottom gradient
    for (int i = y; i <= y+h; i++) {
      float inter = map(i, y, y+h, 0, 1);
      color c = lerpColor(c1, c2, inter);
      stroke(c);
      line(x, i, x+w, i);
    }
  }  
}

// OSC CALLBACK FUNCTIONS

public void found(int i) {
  println("found: " + i);
  found = i;
}

public void poseScale(float s) {
  println("scale: " + s);
  poseScale = s;
}

public void posePosition(float x, float y) {
  println("pose position\tX: " + x + " Y: " + y );
  posePosition.set(x, y, 0);
}

public void poseOrientation(float x, float y, float z) {
  println("pose orientation\tX: " + x + " Y: " + y + " Z: " + z);
  poseOrientation.set(x, y, z);
}

public void mouthWidthReceived(float w) {
  println("mouth Width: " + w);
  mouthWidth = w;
}

public void mouthHeightReceived(float h) {
  println("mouth height: " + h);
  mouthHeight = h;
}

public void eyeLeftReceived(float f) {
  println("eye left: " + f);
  eyeLeft = f;
}

public void eyeRightReceived(float f) {
  println("eye right: " + f);
  eyeRight = f;
}

public void eyebrowLeftReceived(float f) {
  println("eyebrow left: " + f);
  eyebrowLeft = f;
}

public void eyebrowRightReceived(float f) {
  println("eyebrow right: " + f);
  eyebrowRight = f;
}

public void jawReceived(float f) {
  println("jaw: " + f);
  jaw = f;
}

public void nostrilsReceived(float f) {
  println("nostrils: " + f);
  nostrils = f;
}

// all other OSC messages end up here
void oscEvent(OscMessage m) {
  if(m.isPlugged() == false) {
    println("UNPLUGGED: " + m);
  }
}

Keali-LookingOutwards05

Created by Milica Zec and Winslow Turner Porter III, Giant is a virtual reality experience detailing the story of a family amidst an active war zone; inspired by the true events of Zec’s family in war-torn Europe, the vision is of two parents struggling to distract their daughter by inventing a fantastical tale: that the belligerence and commotion above ground are the mere antics of a giant. The audience is transported into the makeshift basement shelter in which the characters hide, becoming fully immersed in a dark and ominous atmosphere, complete with sound effects and physical motion, as if living vicariously through someone in that virtual reality.
Being someone who has had minimal exposure to and personal experience with VR, donning Giant’s headgear and noise-cancelling headphones was an indescribable and very intimate experience. Giant was impressive from both an artistic and a technical viewpoint, boasting emotional storytelling expertise and seamless technological execution with heavy attention to detail. This work is the first VR I’ve experienced to have a fully immersive, 360-degree view of its fictional realm; it was invigorating, yet it also made me wary, that I could turn my head and view the full surroundings of a virtual room whilst within the piece: in this case, I could omnisciently scan the basement in which the family resided.
Giant was a subtle, powerful experience, and explored a concept similarly demonstrated by the film Life is Beautiful: masking darker truths with lighthearted fantasies for the sake of the innocent. It’s an entirely bittersweet intention, especially when one sees it from a third-party point of view.

Giant website//

Keali-LookingOutwards04

(I was so tempted to do one of Theo/Emily’s works again like Connected Worlds because I loved them so much but I wrote about their lecture in LookingOutwards01 so…)

The work I chose is Lit Tree by Mimi Son, an interactive artwork consisting of a real tree in conjunction with digital emulsion: the tree is augmented with video projection, resulting in patterns created by light hitting the leaves, which act as voxels. The project is described as allowing the tree to have a “visceral conversation with human visitors”, thereby becoming a sort of aesthetic object. This project coincides with my usual preferences in interactive artwork, which is why it appeals to and interests me: nature-based, with a calm, serene, and intimate atmosphere, as the audience experiences a subtle, private, albeit dynamic relationship with the single tree. As someone who usually admires digital work on a screen–immediate and through an interface–this was impressively novel to me in that it combines technical features with a real-life, physical object to create an immersive conversation between human and nature. The work also feels very well-resolved because light was the specific attribute chosen to play across the tree’s leaves, relying on varying placement, space, surface, and texture to output the beautiful patterns.

Lit Tree link//

Keali-Plot

screenshot-2016-09-29-12-40-34

screenshot-2016-09-29-12-40-39

screenshot-2016-09-29-12-40-51

img_3793

img_3788

img_3776
img_3778

I decided to work from the template of order to disorder, which already conjured ideas of singular, streamlined designs that somehow branch out or disperse into randomness and chaos–from there it was a methodical process of polishing the visuals: I would hardcode the “organized” half of my piece to make sure relevant segments and shapes were precisely and consciously positioned with a purpose, and then gradually lose control over the “disorganized” half with randomly generated float variables.
I created several nature-based themes to represent this idea of orderly lines becoming jagged, with multiple avenues of approach: flat landscapes to jagged mountains, calm waters to dangerous waves, silence to noise, stalks to branches and swirling vines–all of which I sketched and considered.
In keeping with my past inspirations and works, I settled on the seascape version, and customized accordingly with pens of differing stroke sizes and azure colors to give the flat design a subtle sense of depth. I originally planned for the water design to sit on paper with a natural light-to-dark blue gradient, positioning the waves near the boundaries of two opposing shades, but I was unable to obtain such paper and settled on a gray-light blue, which I felt was neutral. From there, I ran the AxiDraw at paused intervals, switching out pens to accommodate each particular wave–and then I actually shifted and re-ran my program twice to give the visual impression of more lines and depth in the overall product, and to retrace some of the lighter, thinner wavelengths to make them darker.
If time and my own coding capabilities allowed, I would strive to expand this series to encompass my other nature ideas as well (landscape, vines, trees, noise), each with its own considered color palette and background texture.

// see https://processing.org/reference/libraries/pdf/index.html
// deliverable04 template by golan 
import processing.pdf.*;
boolean bRecordingPDF;
int pdfOutputCount = 0; 
 
void setup() {
  size(600, 400);
  smooth();  // moved here from the draw loop; smooth() belongs in setup()
  bRecordingPDF = true;
}
 
void keyPressed() {
  // When you press a key, it will initiate a PDF export
  bRecordingPDF = true;
}

class LineInfo {
  int x1, x2, y1, y2;
  LineInfo(int x1, int y1, int x2, int y2) {
   this.x1 = x1;
   this.x2 = x2;
   this.y1 = y1;
   this.y2 = y2;
  }
}
 
void draw() {
  if (bRecordingPDF) {
    background(255); // this should come BEFORE beginRecord()
    beginRecord(PDF, "myName_" + pdfOutputCount + ".pdf");
 
    //--------------------------
    noFill(); 
    int mid = width/2;
    strokeWeight(1.5);
    LineInfo[] lines = { new LineInfo(10,200,mid+10,200),
                         new LineInfo(10,210,mid-10,210),
                         new LineInfo(10,190,mid+5,190),
                         new LineInfo(10,185,mid-5,185),
                         new LineInfo(10,172,mid,172),
                         new LineInfo(10,230,mid,230),
                         new LineInfo(10,240,mid+2,240),
                         new LineInfo(10,168,mid-15,168) };

    for (int i = 0; i < lines.length; i++) {
      beginShape();
      LineInfo l = lines[i];
      line(l.x1,l.y1,l.x2,l.y2); 
      curveVertex(l.x2,l.y2);
      curveVertex(l.x2,l.y2);
      float rx = l.x2;
      float ry= l.y2;
      for (int j=0; j < 100; j++) {
        rx = rx + random(12,14); 
        ry = ry + random(-30, 25); 
        curveVertex(rx, ry);
      }
      endShape();
    }
     
    endRecord();
    bRecordingPDF = false;
    pdfOutputCount++;
  }
}

GitHub repository//

Keali-Clock-Feedback

I am incredibly thankful that my clock seems to have generally been well-received; the general consensus on improvements involved making the constellation fragments connect more orderly, constructing shapes of some sort rather than being as random as they are now. I agree with this sentiment and came across the thought during my process, but for lack of time did not finalize my product with it in mind. I hope to revisit this asset and ensure that each later line’s (x1,y1) or (x2,y2) endpoint reuses an endpoint of a previous line, so that the constellations have a higher chance of forming polygons and avoid haphazard intersections (see the sketch below). I feel this would also alleviate the critique that the hours are hard to read because of the random placement of the fragments; with fewer intersections, each fragment would ideally be distinctly apparent and the reader could easily count the lines.
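A minimal sketch of that endpoint-reuse idea (hypothetical, not my actual clock code; all constants are assumed): each new fragment starts from a randomly chosen endpoint of an earlier fragment, so segments chain into connected figures instead of scattering.

// Hypothetical endpoint-reuse sketch: every new constellation fragment
// begins at the endpoint of a previous one, encouraging closed shapes.
ArrayList<PVector> endpoints = new ArrayList<PVector>();

void setup() {
  size(400, 400);
  background(10, 14, 40);
  stroke(255, 240, 180);
  // seed the chain with one random starting point
  endpoints.add(new PVector(random(width), random(height)));
  int fragments = 9;  // e.g. one fragment per hour
  for (int i = 0; i < fragments; i++) {
    // reuse an existing endpoint instead of a fresh random position
    PVector a = endpoints.get(int(random(endpoints.size())));
    PVector b = new PVector(a.x + random(-60, 60), a.y + random(-60, 60));
    line(a.x, a.y, b.x, b.y);
    endpoints.add(b);
  }
}

Because every segment shares at least one point with the chain, intersections happen only at deliberate joints, which should make counting the fragments much easier.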

Keali-AnimatedLoop

seastarsgif

kealiseastarsgif

The introduction to the assignment and the subsequent examples made me instinctively assume that the gif project had to be approached in a very geometrical and structured way–different from my usual ideas. As such, a lot of my original sketches and planning were rooted in shapes and solids, trying to integrate a seamless transformation while manipulating perspective in space. However, I was unable to come up with a polished idea along this track, mainly because I couldn’t tell, without modeling, whether some figure would be able to translate or rotate into a continuously looping gif. So I worked off my personal aesthetic of streamlined design again, particularly inspired by the sinusoidal wave from Golan’s sample gif. With my affinity for nature, almost reflexive after my constellation clock from the previous assignment, I thought of utilizing multiple sinusoidal waves as strings and curtains of stars against another dynamically rendered blue backdrop–but finally decided to contrast this by customizing horizontal waves of differing heights and angles to mimic water; I reinforced the idea of a body of water by adding noise waves that barely blend in with a top-down gradient, over which I loop stars raining down and losing opacity as they fall, as if swallowed by the waves. I believe I succeeded in sticking true to the type of design and atmosphere I wanted to output–like my clock, I wanted the visuals to exude a calm sense of smoothness and serenity, rather than the mind-boggling, logically impressive illusions and riddles of the complicated, geometric gifs. Food for thought on improvements: an alternative to noise waves that better complements the sinusoidal waves by being more wavy and streamlined, while also visualizing a solid color beneath its boundary; comets/shooting stars in the sky to add texture (streaks from the changes in velocity); and perhaps another approach to gradient backgrounds (increasing complexity/layers if necessary).

**in regards to exporting the gif: I used Photoshop after exporting the frames from the p5.js framework–but as to why it’s incredibly slow on some browsers, I unfortunately can’t tell…
I also realize that my noise waves end up being the bug in the animated loop–they don’t carry over seamlessly like the sine waves do (see the sketch below), so it probably would have been better if I had just kept and multiplied the sine waves (I’m so sorry);
for future reference, I’ll try to learn how to somehow “hardcode” noise waves–I could numerically manipulate the sine waves to customize height, girth, and speed, and ensure that the exported frames matched up across the loop, but I could only change the potential height and depth of the noise waves, so they remain “random” with each run of the program.
I also made the conscious decision to keep the noise waves because the bottom of the graphic felt a bit too empty with just the linework of the sine waves and the gradient of the background…
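For reference, here is a minimal Processing sketch (assumed frame count and wave constants, not my original p5.js code) of why phase-driven sine waves loop seamlessly: driving the phase with TWO_PI * frameCount / NUM_FRAMES guarantees that frame NUM_FRAMES lands exactly on frame 0, something noise() cannot promise.

// Assumed loop length; export exactly this many frames for a seamless gif.
int NUM_FRAMES = 60;

void setup() {
  size(600, 400);
  noFill();
  stroke(255);
}

void draw() {
  background(20, 30, 60);
  // phase advances by exactly one full cycle over NUM_FRAMES frames
  float phase = TWO_PI * (frameCount % NUM_FRAMES) / float(NUM_FRAMES);
  beginShape();
  for (int x = 0; x <= width; x += 5) {
    float y = height/2 + 30 * sin(TWO_PI * 3 * x / width + phase);
    vertex(x, y);
  }
  endShape();
  // saveFrame("frames/wave-####.png");  // then assemble the frames in Photoshop
}

A noise()-driven wave has no such period, which is exactly why my noise layer breaks the loop.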

gifnotes

GitHub repository//

Keali-Interruptions

  1. the artwork is square
  2. many short black lines
  3. white background
  4. thin white border at edge
  5. lines are the same length
  6. empty gaps in piece with no lines
  7. most lines are not parallel
  8. not all lines intersect
  9. intersections are not crowded or clumped
  10. many lines form a V/Y formation
  11. appears to be a general direction the lines point to

kealiinterruptions

GitHub repository//

My gut feeling when I first viewed what we were to copy was worry–and I was not let down. Instinctively I started with a structure similar to how I coded my linesIntersect from deliverable01, and as expected, the lines were arranged too randomly, so I had to look closer. I noted the generally one-directional slope of each line–this prompted me to control the slopes so that they were not too haphazard, which I did with trial-and-error constants applied to PI. The next consideration was that the lines were not too messily spaced–they seemed to follow a grid pattern, with one line appearing per cell of the overall square canvas. So I scrapped the truly random placement of my lines and restructured my loops so that the x and y values of each line iterated strictly by the hard-coded size of my imaginary grid; this, along with many other variables like gaps (where lines didn’t show) and slopes, was visually estimated and set to actual numbers (a sketch of the approach follows below). It goes without saying that I have immense respect for any coder, programmer, or artist, regardless of the era in which they were active; if anything, it is incredible that Vera Molnar created this at a time when technology was not as advanced or as thoroughly explored. Re-coding this was a challenge, as I admittedly expected–and I’m curious how Vera created it herself.
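A minimal sketch of that grid-plus-controlled-slope approach (all constants are assumed, not the values from my deliverable): one fixed-length line per cell of an imaginary grid, slopes clustered around a shared angle, with occasional cells skipped to leave gaps.

// Hypothetical recreation sketch: grid placement with jittered slopes.
int CELL = 14;         // size of the imaginary grid
float LEN = 16;        // all lines share the same length
float BASE = PI/2.4;   // the general direction the lines point to

void setup() {
  size(560, 560);
  background(255);
  stroke(0);
  for (int gx = CELL; gx < width - CELL; gx += CELL) {
    for (int gy = CELL; gy < height - CELL; gy += CELL) {
      if (random(1) < 0.04) continue;          // leave an empty gap
      float a = BASE + random(-PI/5, PI/5);    // controlled, not fully random, slope
      float dx = cos(a) * LEN/2;
      float dy = sin(a) * LEN/2;
      line(gx - dx, gy - dy, gx + dx, gy + dy);
    }
  }
  noLoop();
}

Tightening or widening the random(-PI/5, PI/5) jitter is the trial-and-error knob that decides how haphazard the field looks.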

 

Keali-LookingOutwards03

tree
tree2
tree3

These are select projects by Kajima Sawako and Michalatos Panagiotis: the interactive flash samples a_garden, plant_growth, and fishes, whose nature-inspired, stylized linework and smooth motions are characteristics I admire. When it comes to generative artwork, I instinctively focus on the user experience and outward aesthetic of a piece before observing closer, and the execution of mark-making in these mini projects is quite seamless and impressive. The algorithms that generated the work seem to reflect some sort of fractal, or a similarly based structure, to effectively grow the trees and branches. The fishes are reminiscent of sprite- or object-oriented spawns that track and follow the user’s cursor. I originally found Sawako and Panagiotis on the AKT-UK website through scriptedbypurpose, which focuses on utilizing geometrical templates to design architecture, and I could connect the algorithm-based tendencies in both their digital and architectural works, all of which seem inspired by some seamless, mathematical visualization. These samples strike a balance of effective complexity by generating growth and movement in an ordered, controlled manner (expected fractal branches, paths of fish movement) without limiting that action to a strict boundary: this provides randomness and unexpectedness in where new plants generate, and the constant respawning of new forms in the flash presents a disorganized sense of space.

flash projects link//
scriptedbypurpose profile//
architectural designs//

Keali-Reading03

rain1
rain2 rain4

1A. The particular frames are from the animated film Kotonoha no Niwa, which I selected specifically for its applauded visuals and realistic, beautiful rendering of nature in an illustrative medium. Relevant to the overall backdrop and atmosphere of the movie, these selected scenes show what I perceive as dynamic intersections of order and randomness on the effective complexity curve: randomness from the excessive motion of uncontrolled, unpredictable rainfall, interacting with the ordered, immobile existence of buildings and other rooted structures, both in nature and in human infrastructure.

1B. While growing up both as an amateur and as an artist, I’ve been exposed to The Problem of Meaning multiple times, more often in recent years, as I have almost been trained to think that, in a professional setting, most works I output are instinctively met with the question of purpose. Why did you make this? What does this mean? How does this affect yourself or society? For my part, I grew to realize that ultimately these introspective questions are perhaps cursory food for thought as I prioritize and solidify my own artistic intent. As Galanter applies this to generative art, saying that a generative system may be merely pragmatic and create products without intrinsic meaning, I too accept this viewpoint–and arguably, this may be one of the charms of generative art after all: that it can have no explicit meaning.

Keali-Clock

constellationclock

Personal preference and style instinctively propelled me toward a design-oriented, illustrative, streamlined, nature/environmental approach to the assignment; the fact that it was an assignment with a timekeeping topic kept my ideas in check by balancing them with functionality and utility. As “abstract” was a key word in the work’s intentions, I geared my ideas towards nature, going through processes such as a growing tree (trunk growing by the hour, branches by the minute, leaves/flowers by the second), and eventually the sky. The sky idea stuck with me: since its origin, time has reflexively been associated with the aerial plane–stargazing, orbits of planets, rotations of celestial bodies, etcetera. I furthered this idea as I settled on my approach: little star specks appear every second, bigger and brighter stars every minute, and one constellation fragment per hour of the current time; all of these are also characterized by continuous rotations and dimming/brightening cycles via some calculation of milliseconds (a counting sketch follows below). Hence the product is a non-overwhelming yet abstract starry-night interface, appealing to both my environmental and design-based tendencies; while it is reasonably possible to tell the time by counting constellation segments and stars, the effect is still subtle and the utility requires some effort. In connection with how time and mother nature wait for no one, I want the audience to feel relaxed while viewing the atmospheric vibe of the work–to sink into a mere vague notion of time and admire the characteristics of this nature-based simulation.
Food for future thought: generation of mist/galaxy patterns/smoke throughout, possible integration of calming instrumental music, other celestial bodies/comets (may sacrifice utility and functionality however), subtle rotation of constellation figures around some origin…
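A minimal counting sketch of that time mapping (assumed sizes and colors, not my actual clock code): one small speck per second(), one brighter star per minute(), and one line fragment per hour(), with randomSeed() keeping placements stable between frames so marks accumulate rather than jump.

// Hypothetical sketch: read the time by counting marks of each kind.
void setup() {
  size(400, 400);
}

void draw() {
  background(10, 14, 40);
  randomSeed(0);  // same positions every frame; only the counts change
  noStroke();
  fill(200, 200, 255, 120);
  for (int i = 0; i < second(); i++) {   // small specks: one per second
    ellipse(random(width), random(height), 2, 2);
  }
  fill(255, 255, 220);
  for (int i = 0; i < minute(); i++) {   // brighter stars: one per minute
    ellipse(random(width), random(height), 5, 5);
  }
  stroke(255, 240, 180);
  for (int i = 0; i < hour() % 12; i++) {  // constellation fragments: one per hour
    float x = random(width);
    float y = random(height);
    line(x, y, x + random(-40, 40), y + random(-40, 40));
  }
}

The dimming/brightening cycles in the real clock would come from modulating the fill alpha with millis(), layered on top of this counting scheme.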

clock

Keali-LookingOutwards02

Initially when given the assignment, I worried over the fact that I had no personally recognizable exposure to what I defined as interactive/computational art, projects, or installations; perhaps I overthought and unnecessarily limited myself, but my aspiration to utilize technology and design together stems from my experience of sequential narratives such as animations and videogames–and I ultimately decided to write this reflection on what I consider a compromise: an interactive work in a medium with which I was more familiar:
Such is the videogame Flower by thatgamecompany, in which the player is the wind, guiding and collecting petals by interacting with the surrounding environment; the goals and journey in each level vary, but involve flight and exploration to create an idyllic atmosphere. The game was created by a development team that included producers, directors, engineers, designers, illustrators, writers, and composers, among other members, so that every last detail could be successfully integrated into the interface. Flower was actually the second project in a three-game deal with PlayStation, in which Sony offered to fund three games from the company, meaning that it is specific to the PS3 and would likely not be available on other platforms in the future.
However, Flower challenges traditional gaming conventions by delivering simple gameplay with accessible controls (SIXAXIS motion sensors) and a medium meant to evoke positive emotions in the gamer; the team viewed their efforts as creating a work of art, removing elements and mechanics that were not provoking the desired response in players. The result is a narrative arc that progresses through visual aesthetic and emotional cues as the audience fades away from the external, stressful world. As a student whose goal is to have technology and art effectively coexist, this game suggests future opportunities for advanced visual, audio, and interactive escapes that engage players and strum the chords of feelings all consumers naturally have: this is something important to me as a gamer and an artist, that there is a feedback or response between the resulting products and the audience.

Official project website//
PlayStation based website//

https://www.youtube.com/watch?v=nJam5Auwj1E

Keali-LookingOutwards01

Theo Watson is an artist, designer, and experimenter who received a BFA in Design and Technology from Parsons School of Design, and is currently based in Brooklyn, NY. His works are inspired by the curiosity and excitement of his audience, which leads to the production of impressive interactive art installations in which people can immerse themselves. He founded Design I/O, a creative studio specializing in the design and development of such media, continuing to push the boundaries of what an art-and-technology intersection makes possible.

Having personally been interested in and inspired by the dynamic possibilities of art and technology, Theo engaged me as a listener to his lecture and as a fan of his projects. My goals as an artist have long stemmed from my experiences as a child, when I was an avid fan of digital media: cartoons, animations, and videogames alike. It is especially encouraging for me to see Theo’s execution of immersive environments that transport viewers to a wholly different, stimulating, and reactive world. Being an environmentalist as well as an aspiring illustrator with more design-leaning aesthetics, I am fond of his works ‘Funky Forest’ and ‘Connected Worlds’, both of which integrate amazing, streamlined graphics with the scientific importance of how humans can affect the earth; the process of creating ‘Connected Worlds’, as detailed in the lecture, shows the amount of thought, love, and experimentation that went into the installation, as well as the series of troubleshooting involved.

It was interesting to hear about how every detail implemented was formulated from an artistic, practical, or ideological reason. For instance, Theo and Nick discussed how they wanted to steer away from mindless interaction by implementing a rewards-like system in the ecosystems, where the children have to grow trees and put effort into planting seeds and nourishing plants to attract rarer creatures, while learning how to effectively manipulate water flow to sustain different biomes; this reflects the care and effort that citizens should put into the habitats around them in order to coexist with other species. Another notable dialogue from the lecture was the multiple times he admitted, “–and we never thought this would work but…”, indicating how innovative, almost impulsive solutions emerged from the problem-solving of the creation process. There is something very human, down-to-earth, and realistic about the presentation and his approach to the project: how such a large-scale, time-committing installation would present troubles along the way, and how one can just try things for the sake of trying things, oftentimes leading to unexpected and beneficial results. ‘Connected Worlds’ seamlessly implements art, technology, environmental worldviews, and the arduous but rewarding journey of bringing a dynamic, interactive space to life.

Theo Watson’s website//

Keali-FirstWordLastWord

Naimark’s essay resonated with me in that I feel the ideology of first word and last word art is accurate, especially today; in critiquing and reviewing artworks, there is a natural instinct on the audience’s part to make comparisons to older and similar pieces. Hence, when something arguably novel for its kind is created, there is initial shock and acknowledgement of the difference in craft, and this experience of coming across something new often overwhelms any evaluation of objective, technical strengths and weaknesses. This, however, is an inevitable and positive contribution to the unending influx of artists, as new ideas, media, and tools can be discovered and utilized as time progresses. First word art sets a precedent for future generations, spurring new varieties of works and subsequently affecting the cultures involved; generally speaking, creativity is often perceived as a positive asset of an individual, and so first word art encourages and gives vitality to more such creativity. This in turn leads to more development of last word art and, eventually, other first word art, all of which together dynamically contribute to the building blocks of an ever-changing, artistic society; both types should be recognized as important. I personally find myself toward the last-word end of the spectrum, as a lot of my aspirations stem from already established genres, tools, and media. However, I thoroughly advocate against the downgrading of last word art as unoriginal and inferior; there is still beauty, curiosity, skill, and fun in creating works inspired by others’, and oftentimes there is the unrealistic expectation that art must be completely original, unprecedented, and revolutionary to be considered worthy, or good. This unfair judgment is one that I believe should be reconsidered.