szh-techniques

Library

p5.geolocation looks like a really cool and useful library for many projects; given how location-reliant so many apps are these days, I can imagine a handful of projects where I would want a library like this to extend what I can build.

Example

I really liked the soft body example shown on p5.js because it offers many ways to transition from one shape to another in a smooth, aesthetic pattern that I may want to use.

Glitch

The square-connect app looks like a very useful thing to have as the company Square becomes more and more popular for sellers to have in their back pocket. It also has a very simple interface, and I think more people could take advantage of it.

szh-Body

Can you beat Mario Bros 1-1 with ONLY your face? 

(Turns out yes, but with many trials, slow gameplay, and very little chance of becoming a top-ranked gamer.)

I downloaded FaceOSC, investigated how its different components worked, and read over the examples and their code to see what I could do with FaceOSC and how the program runs.

I really love the "Mario Bros 1-1 but with a twist" trend, so I thought it would be funny to make a "Can you beat Mario Bros 1-1 with ONLY your face?" game.

 

As I tested my own game, the exact threshold for the "center" of the game was rather unclear, because I was estimating where the face would be (assuming it was relatively centered). Therefore, I later added a "centering" section at the beginning, where the program waits a few seconds for the player to calibrate themselves before setting the thresholds for the left and right sides.

I also later swapped the left and right controls because the camera mirrors the user: on screen, "left" corresponds to the player's left eye, which is confusing during play because movement appears opposite to the intended result.

FaceOSC and Processing need to be installed before use:

//
// Sabrina Zhai
// 9.23.19
//
// This program is intended to be used with SuperMarioBros, 
// allowing players to play 1-1 with a twist.
//
// This code is adapted from FaceOSCReceiver.
//
// A template for receiving face tracking osc messages from
// Kyle McDonald's FaceOSC https://github.com/kylemcdonald/ofxFaceTracker
//
// 2012 Dan Wilcox danomatika.com
// for the IACD Spring 2012 class at the CMU School of Art
//
// adapted from Greg Borenstein's 2011 example
// http://www.gregborenstein.com/
// https://gist.github.com/1603230
// 
//

import oscP5.*;
OscP5 oscP5;

import java.awt.*;
import java.awt.event.*;
import java.awt.event.KeyEvent.*;

// num faces found
int found;

// pose
float poseScale;
PVector posePosition = new PVector();
PVector poseOrientation = new PVector();

// gesture
float mouthHeight;
float mouthWidth;
float eyeLeft;
float eyeRight;
float eyebrowLeft;
float eyebrowRight;
float jaw;
float nostrils;

Robot robot;

float openThreshold = 4;
float leftThreshold; // for the left SIDE and not the left EYE
float rightThreshold;

float previousMouthHeight;
float previousPosition;

int begin; 
int duration = 3;
int time = 3;
boolean faceSet = false;

void setup() {
  size(640, 480);
  frameRate(30);

  begin = millis();  

  oscP5 = new OscP5(this, 8338);
  oscP5.plug(this, "found", "/found");
  oscP5.plug(this, "poseScale", "/pose/scale");
  oscP5.plug(this, "posePosition", "/pose/position");
  oscP5.plug(this, "poseOrientation", "/pose/orientation");
  oscP5.plug(this, "mouthWidthReceived", "/gesture/mouth/width");
  oscP5.plug(this, "mouthHeightReceived", "/gesture/mouth/height");
  oscP5.plug(this, "eyeLeftReceived", "/gesture/eye/left");
  oscP5.plug(this, "eyeRightReceived", "/gesture/eye/right");
  oscP5.plug(this, "eyebrowLeftReceived", "/gesture/eyebrow/left");
  oscP5.plug(this, "eyebrowRightReceived", "/gesture/eyebrow/right");
  oscP5.plug(this, "jawReceived", "/gesture/jaw");
  oscP5.plug(this, "nostrilsReceived", "/gesture/nostrils");

  //Sets up the Robot to type into the computer
  try {
    robot = new Robot();
    robot.setAutoDelay(0);
  } 
  catch (AWTException e) { // (Exception e) {
    e.printStackTrace();
  }
}

void draw() {
  background(255);
  stroke(0);
  textSize(18);

  if (time > 0) { 
    time = duration - (millis() - begin)/1000;
    text("Setting current face position as center in..." + time, 10, 20);
  } else if (!faceSet) {  
    text("Setting current face position as center in...Face set!", 10, 20);

    //Set the face's threshold positions
    leftThreshold = posePosition.x - 75; 
    rightThreshold = posePosition.x + 75; 
    faceSet = true;
  }

  //Helps user see where the threshold to move their head is
  line(leftThreshold, 0, leftThreshold, height);
  line(rightThreshold, 0, rightThreshold, height);

   // Actions after a face is found
  if (found > 0) { 

    // Draw the face
    translate(posePosition.x, posePosition.y);
    scale(poseScale/2);
    noFill();
    ellipse(-20, eyeLeft * -9, 20, 7);
    ellipse(20, eyeRight * -9, 20, 7);
    ellipse(0, 20, mouthWidth* 3, mouthHeight * 3);
    ellipse(-5, nostrils * -1, 7, 3);
    ellipse(5, nostrils * -1, 7, 3);
    rectMode(CENTER);
    fill(0);
    rect(-20, eyebrowLeft * -5, 25, 5);
    rect(20, eyebrowRight * -5, 25, 5);

    // Makes Mario jump: hold UP while the mouth is open
    if (mouthHeight > openThreshold) {
      robot.keyPress(java.awt.event.KeyEvent.VK_UP);
    } else if (previousMouthHeight > openThreshold) { // mouth just closed
      robot.keyRelease(java.awt.event.KeyEvent.VK_UP);
    }
    previousMouthHeight = mouthHeight;

    // Moves Mario to the left (user moves to the right)
    if (posePosition.x < leftThreshold && previousPosition < leftThreshold) {
      robot.keyPress(java.awt.event.KeyEvent.VK_LEFT);
    } else {
      robot.keyRelease(java.awt.event.KeyEvent.VK_LEFT);
    }

    // Moves Mario to the right (user moves to the left)
    if (posePosition.x > rightThreshold && previousPosition > rightThreshold) {
      robot.keyPress(java.awt.event.KeyEvent.VK_RIGHT);
    } else {
      robot.keyRelease(java.awt.event.KeyEvent.VK_RIGHT);
    }
    previousPosition = posePosition.x;
  }
}


// OSC CALLBACK FUNCTIONS
public void found(int i) {
  //println("found: " + i);
  found = i;
}

public void poseScale(float s) {
  //println("scale: " + s);
  poseScale = s;
}

public void posePosition(float x, float y) {
  println("pose position\tX: " + x + " Y: " + y );
  posePosition.set(x, y, 0);
}

public void poseOrientation(float x, float y, float z) {
  println("pose orientation\tX: " + x + " Y: " + y + " Z: " + z);
  poseOrientation.set(x, y, z);
}

public void mouthWidthReceived(float w) {
  //println("mouth Width: " + w);
  mouthWidth = w;
}

public void mouthHeightReceived(float h) {
  //println("mouth height: " + h);
  mouthHeight = h;
}

public void eyeLeftReceived(float f) {
  //println("eye left: " + f);
  eyeLeft = f;
}

public void eyeRightReceived(float f) {
  //println("eye right: " + f);
  eyeRight = f;
}

public void eyebrowLeftReceived(float f) {
  //println("eyebrow left: " + f);
  eyebrowLeft = f;
}

public void eyebrowRightReceived(float f) {
  //println("eyebrow right: " + f);
  eyebrowRight = f;
}

public void jawReceived(float f) {
  //println("jaw: " + f);
  jaw = f;
}

public void nostrilsReceived(float f) {
  //println("nostrils: " + f);
  nostrils = f;
}

// all other OSC messages end up here
void oscEvent(OscMessage m) {
  if (m.isPlugged() == false) {
    //println("UNPLUGGED: " + m);
  }
}

szh-clock

Menstrual Clock / Code

The circle represents the full period cycle, while the red arc represents the time spent actually bleeding. In this case, the program uses the average: a 28-day cycle with 6 days of bleeding.
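As a minimal sketch (plain Java, not the clock's actual code), the red arc's sweep can be computed from those two day counts; the name `bleedArcRadians` is mine, and 6/28 are just the averages mentioned above:

```java
public class CycleArc {
    // Fraction of the cycle spent bleeding, expressed as radians of arc
    // on the clock face (a full cycle is the full 2*PI circle).
    public static double bleedArcRadians(int bleedDays, int cycleDays) {
        return (double) bleedDays / cycleDays * 2.0 * Math.PI;
    }

    public static void main(String[] args) {
        // 6 bleeding days of a 28-day cycle cover about 77 degrees of the circle.
        System.out.println(Math.toDegrees(bleedArcRadians(6, 28)));
    }
}
```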

I think an interesting and practical place for my clock to live is the landing page of a period tracker app. It can sit in the background as the user checks their calendar and expected periods. Information-wise, it may not be as accurate as, say, a table, but I think the visual representation could be a nice touch to the app.

In the future, I would like the day that the period starts and the length of bleeding + period cycle to be dependent on the actual user, and take data from previous cycles.

Process

I originally intended this clock to be used over the course of days, with the fluid simulating the actual flow of a period (the color getting darker over the last couple of days; the flow starting off light, peaking on day two, then gradually decreasing). The amount of fluid being added (density, rather than area) was intended to follow roughly a bell curve, and I tried using an easing function to simulate this, but my results didn't show a drastic change in the fluid's density.
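The bell-curve idea above could be sketched with a Gaussian. This is a hypothetical stand-in, not the sketch's actual code; the names `density`, `peakDay`, and `sigma` are mine, with the peak on day two as described:

```java
public class FlowDensity {
    // Gaussian-shaped density: 1.0 at peakDay, falling off with spread sigma.
    public static double density(double day, double peakDay, double sigma) {
        double d = (day - peakDay) / sigma;
        return Math.exp(-0.5 * d * d);
    }

    public static void main(String[] args) {
        // Density of added fluid over a 6-day bleed, peaking on day two.
        for (int day = 1; day <= 6; day++) {
            System.out.printf("day %d: %.2f%n", day, density(day, 2.0, 1.5));
        }
    }
}
```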

As I got my fluid simulation to work, I realized that, visually, it works best at a one-second time step rather than a day, hour, or even minute step:

 

(You can see there is hardly any difference between day vs. hour vs. minute.)

Therefore, I decided to base the fluid simulation on seconds passing rather than any larger time unit (especially since the density of added fluid, which was supposed to be set at a day scale, wasn't working as well as I had hoped anyway).

Changing between days:

As you can see, the line increases, but because the bleeding period is over, the arc is white.

 

szh-LookingOutwards03

Portée

The interaction of this piece consists of people plucking the literal strings they see crisscrossing the room; a piano then plays the note corresponding to each string.

I think it's so interesting to represent a string instrument with actual string; the spatiality of this piece really draws me towards it, and the feedback (music) I get makes me want to continue playing it.

The concept is centered on this question: What if we could express architecture through music? To me, architecture and music engage very different senses, but through this piece I get a multi-sensory response as I walk through the installation.


I enjoy the process by which the artists created this: they utilized 3D modeling tools, coded the layout and music, and wired everything with Arduino, yet the final piece keeps all of that in the background; the tech doesn't feel like it's at the forefront of the installation at all. The piece focuses on the strings and the music that accompanies them, and I feel it's an immersive way for people to interact with song.

szh-AnimatedLoop

 

I created the wave/oscillating motion with the easing function sineOut(x). I achieved the oscillating motion I intended, but I fell short in trying to make the add-ons/effects paired with this animation loop fully (without hiccups):
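For reference, sineOut is the standard sinusoidal ease-out curve (fast start, decelerating finish). A minimal plain-Java version follows; the easing library the sketch actually uses may name or parameterize it differently:

```java
public class Easing {
    // sineOut easing: starts fast and decelerates; maps t in [0,1] to [0,1].
    public static double sineOut(double t) {
        return Math.sin(t * Math.PI / 2.0);
    }

    public static void main(String[] args) {
        // An oscillating wave can be driven by sweeping the eased phase
        // back and forth: position = amplitude * sineOut(phase).
        for (double t = 0.0; t <= 1.0; t += 0.25) {
            System.out.printf("t=%.2f -> %.3f%n", t, sineOut(t));
        }
    }
}
```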

I had hoped to achieve a motion blur effect, which, combined with the arcs changing dimensions (closing and opening like Pac-Man), would give the trail left behind a certain quality.

I was inspired after looking through several of beesandbombs' gifs: https://twitter.com/beesandbombs/status/1090673528758849536?s=20