nixel-arsculpture

AR Naruto summoning scrolls, created as an homage to my favorite childhood anime.

In Naruto, scrolls are used to store objects for later use. When a scroll is unsealed, the object stored inside appears from it. Anything can be sealed, from weapons to energy to creatures. I saw this as a perfect opportunity to use Vuforia, and as a sort of wish fulfillment: for lots of us as kids (and adults), mimicking the abilities of our favorite characters is great fun. I searched for various scrolls used in the show, for example:

[Reference images: a Naruto summoning scroll, a weapon scroll, and a water scroll from the show]

I re-edited them from 360p screenshots to HD for printing, and handmade the scrolls from discount paper and wooden sticks. Using Unity and a lot of tutorials, I created the fire and water from particle systems and grabbed the shuriken from online archives. The concept aligned with the assignment amusingly well, and I was pleasantly surprised by how well the scrolls worked as Vuforia image targets. One thing I wish I could change: in Unity, I set a delay so that the object would appear after the glowing ring, but the delay didn't survive the build to the Pixel phone.

Thank you to my friends for fooling around with this concept and indulging my love for Naruto.

nixel-nerual-justaline

Snowscape, by Nixel and Nerual: An AR environment of suspended snowflakes, frozen mid-breeze.

This AR piece is experienced as a peaceful walk through a snowscape frozen in time. We originally contemplated making the project a first-person-shooter-type experience by drawing short lines in the app while walking backwards. However, we discovered that this instead created a wintry environment of falling snow that mirrored our previous location, so we decided to go with that.

nixel-book

Title: Fake Love

Description: A collection of hypocritical songs and ill-fated stars.

PDF zip: here

Chapter sample: here

This project was of interest to me since I've spent a considerable amount of time writing fiction. In the early brainstorming phases, I considered making a generative chapter based on my own writing, but I quickly deviated from that idea after browsing the projects from previous years, where this one stood out to me as fun and relatable. I didn't want to remake the same project (although it was tempting), so I thought about how I could apply the same format to my own interests. I remembered seeing posts from this twitter account going around, and I really liked their aesthetic.

So, the resulting idea was to combine the imagery of Kpop music videos about love with corresponding lyrics. Something that has always struck me as strange is that Kpop idols sing about love all the time, yet they are signed under contracts that forbid them from dating, which is to say, from falling in love. This is pretty ironic, and it makes me wonder about the sincerity of their love songs when choosing to date someone is treated as a huge offense. In response, this project heavily favors artists who have gone through dating scandals, and it corrupts any lyrics mentioning love.

In terms of coding, this project was less generative, perhaps, and more corruptive. For a while, I tried to work with RiTa and Wordnik before realizing I didn't need a new library to achieve what I wanted. In this case, less was more.
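Since the whole trick is a single String.replace over a lookup map, the approach can be sketched minimally like this (the map values are shortened stand-ins for the zalgo and strikethrough variants used in the real project). Two details are easy to miss: "loved" has to come before "love" in the alternation, and the matched text must be lowercased before the map lookup so capitalized matches still resolve.

```javascript
// Minimal sketch of the corruption step: plain String.replace with a
// lookup map, no extra libraries. Map values are stand-in glyphs.
var mapObj = { love: "l-o-v-e", loved: "?????" };

function corrupt(line) {
  // "loved" before "love", or "loved" would match as "love" + "d";
  // lowercase the match so "Love"/"LOVE" still hit the map keys.
  return line.replace(/loved|love/gi, function (matched) {
    return mapObj[matched.toLowerCase()];
  });
}

console.log(corrupt("I Loved you, love")); // "I ????? you, l-o-v-e"
```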

For grabbing screenshots:

// Grab a frame from the playing video every 5 seconds.
var video;
 
function setup(){
	createCanvas(1080, 700);
	video = createVideo('taeyeon-make-me-love-you.mp4');
	video.size(1080, 720);
	setInterval(takeSnap, 5000); // snapshot every 5000 ms
}
 
function takeSnap(){
	save('0.jpg'); // the browser auto-numbers repeated downloads
}
 
function draw(){
	image(video, 0, 0);
}
 
function mousePressed(){
	video.loop(); // click the canvas to start playback
}

For corrupting lyrics & exporting JSON:

var myVersesArray = [];
var lyricsArray = [];
var verseArray = [];
var verseObjectArr = [];
var data;
var newWords;
var currVerse = "";
var corruptedLove = ["ḽ̴̼̃͒̀̽́̔̚̚ơ̶͚͛̿̕̕v̵͓̻͚͔̒͑͛̓͝e̸̼̱̟̖̳̼͙͘",
"l̷̙̮̻̚͝ŏ̴̰͍̩̆̔v̸̨̀͂͝e̸̺̜̱̎̃͘",
"l̸̥̍̽͐͗̑̊ơ̵͓͓͖̘͖̒̈̀͜v̷̛͇̾͂̔́͆̃̓̈̿̈́͊̉̕͘e̸̼̓̍̍̊̒͐̓͠͠",
"ľ̵̡̨̧̢̡̰̺̰͈͇̞̞̺̭͇̞̠̯̬̬̓͝ͅo̴̧̢̲͇̥̟͎̦̝͓͖̗̣͌̅̎̅̈̎̊̃͑̀͒͆͐̍́͊̋̐͋̆͛͐̔͘̚ͅͅṿ̵̧̨̧̲͚̙̙̻̟̝̤̬̟͖̭̮̃̔́͑̀̀̋̓̑̄̄̈̒̅̿̾͆̚͝ę̵̞̩̖̣͔̱͉̻̳̘͉͙͍̆̑͊̈̏͛͊̈̿̕͜͝ͅ",
"l̶͈̹̘̹̫̄̓̓̀̀̿̉͌͝͠͝ǒ̵̘̲̩̖̹͎̻̝̐̑̄ͅv̸̧͍̻͙̗̠̜̱̹̖̠͇͚͛͋̓̈́͂̀̆͋̓́̎̕ȩ̵̢̭̻̙̤̪̫̳̤͇͓̎͋̀̍̾̊͋",
"l̷̴̛o͏v̸̸́e̡",
"l͓̟̗̻̞̪̙̘̹͇̍̋̍̅̋́̉̿̕ơ̡̼̹͓͚̹̎́͊̏͌̍͂̽̓͟v̜̬̭̪̝̽̋̆̅͊e̶̮͔̟̪͍̼̦̐̓̀̄͞ͅ",
"ǝʌol",
"l̶o̶v̶e̶",
"l̶o̶v̶e̶",
"☐☐☐☐",
"☐☐☐☐",
"☐☐☐☐",
"----",
"¿¿¿¿"]
 
var corruptedLoved = ["l̶o̶v̶e̶d̶", "l̵͏o̧͟v̵͏e̸d̛͟", "ḽ̸̛̪̎̅͗́̄͝͠o̵̗̹͕̗̹̾̕̕͜v̵͍͈̲̙̤͔̩͉̟͈̆̉̋͛͌͠e̶̦̲̖͉̋́̔̓͗̔d̸̩̬̮͙̦̲̬̲͈̳̈́", "☐☐☐☐☐","¿¿¿¿¿"]
 
class Verse {
  constructor(verse) {
    this.verse = verse;
  }
}
 
function setup() {
  createCanvas(300, 100);
  background(200);
  // loadStrings is asynchronous; the data is ready well before
  // the user clicks the canvas or the save button.
  data = loadStrings('lyrics.txt');
  createButton('SAVE LYRICS BUTTON')
    .position(10, 10)
    .mousePressed(function() {
      saveJSON(myJsonObject, 'myVerses.json');
    });
}
 
function replaceLove(){
  for (var i = 0; i < data.length; i++){
 
    var mapObj = {
      love: corruptedLove[Math.floor(random(0, corruptedLove.length))],
      loved: corruptedLoved[Math.floor(random(0, corruptedLoved.length))]
    };
 
    var words = data[i];
    // "loved" must come before "love" in the alternation (otherwise
    // "loved" matches as "love" + "d"), and the match is lowercased
    // so "Love"/"LOVE" still hit the map keys.
    newWords = words.replace(/loved|love/gi, function(matched){
      return mapObj[matched.toLowerCase()];
    });
    lyricsArray.push(newWords);
  }
 
  // Group the corrupted lines into verses: a blank line ends the
  // current verse.
  for (var y = 0; y < lyricsArray.length; y++){
    if (lyricsArray[y] != " "){
      currVerse = currVerse + "\n" + lyricsArray[y];
    }
    else {
      myVersesArray.push(currVerse);
      // push (not indexed assignment) so the array has no holes,
      // which would become nulls in the saved JSON
      verseObjectArr.push(new Verse(currVerse));
      currVerse = "";
    }
  }
}
 
var myJsonObject = {};
myJsonObject.verses = verseObjectArr; // filled in by replaceLove()
 
function mousePressed(){
  replaceLove();
  console.log(lyricsArray);
}
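The verse-grouping pass can also be sketched in isolation (plain JavaScript, hypothetical helper name). One pitfall worth noting: if the lyrics file doesn't end in a blank line, the loop finishes with a verse still in progress, so a final flush is needed or the last verse is silently dropped.

```javascript
// Group lyric lines into verses: a blank line closes the current
// verse. The output matches the { verses: [{ verse: "..." }] } shape
// that saveJSON() writes out.
function groupVerses(lines) {
  var verses = [];
  var curr = "";
  for (var i = 0; i < lines.length; i++) {
    if (lines[i].trim() !== "") {
      curr = curr === "" ? lines[i] : curr + "\n" + lines[i];
    } else if (curr !== "") {
      verses.push({ verse: curr });
      curr = "";
    }
  }
  if (curr !== "") verses.push({ verse: curr }); // flush the last verse
  return { verses: verses };
}

console.log(groupVerses(["line a", "line b", " ", "line c"]).verses.length); // 2
```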

For layout in InDesign:

#include "../../bundle/basil.js";
 
 
var jsonString;
var jsonData;
 
//--------------------------------------------------------
function setup() {
 
  jsonString = b.loadString("Verse1.json");
 
  b.clear (b.doc());
 
  jsonData = b.JSON.decode( jsonString );
  b.println("Number of elements in JSON: " + jsonData.length);
 
  var captionX = 300; 
  var captionY = b.height/2-50;
  var captionW = b.width/2 + 210;
  var captionH = 136;
 
  var imageX = 0; 
  var imageY = 0; 
  var imageW = 72*8.5; 
  var imageH = 72*8.5; 
 
  for (var i = 0; i < 12; i++) {
 
    b.addPage();
    b.rotate(-1.57);
    b.noStroke(); 
 
    var imgNum1 = b.floor(b.random(1,200));
    var imgNum2 = b.floor(b.random(201,208));
 
    var anImageFilename1 = "images2/" + imgNum1 + ".jpg";
    var anImage1 = b.image(anImageFilename1, imageX, imageY, imageW, imageH);
    var anImageFilename2 = "images2/" + imgNum2 + ".jpg";
    var anImage2 = b.image(anImageFilename2, imageX, imageY, imageW, imageH);
    anImage1.fit(FitOptions.PROPORTIONALLY);
    anImage2.fit(FitOptions.PROPORTIONALLY);
    b.transformImage(anImage1, 0, 0, 615, 900);
    b.transformImage(anImage2, imageX + 800, 0, 615, 900);
 
    b.pushMatrix();
    b.fill(0);
    b.rect(72*6.7,72*12,72*13,140);
    b.popMatrix();
 
    b.fill(255);
    b.textSize(20);
    b.textFont("Helvetica","Italic"); 
    b.textAlign(Justification.CENTER_ALIGN, VerticalJustification.TOP_ALIGN );
    // Math.random() takes no arguments, so Math.random(1, 39) would
    // always floor to 0; pick a random verse with basil's random instead.
    var verseNum = b.floor(b.random(0, jsonData.length));
    b.text(jsonData[verseNum].verse, captionX, captionY, captionW, captionH);
 
  }
}
 
b.go();

 

nixel-parrish

As someone with no previous experience creating, or even thinking about, the topics Parrish covers in this video, I thought she did a great job of explaining her work and getting me interested in it. I have spent a considerable amount of time writing myself, but fiction rather than poetry, and most definitely not procedural poetry. I had also never considered exploring nonsense, or why it's important to do so, but Parrish's analogy between exploring the unknowns of poetry and language and exploring the unknowns of science and space helped me understand. In a way, I had always kind of looked down on nonsense and didn't get people who were obsessed with the random and unintelligible, but now I see why this exploration is valuable.

nixel-sapeck-Automaton

Perry and I let this project grow organically. We didn't create any sort of pre-formed idea or plan, but instead let our explorations at The Center for Creative Reuse and the other available resources guide our process.  We returned from the shopping trip with a bag of intriguing junk that we knew had potential to become some sort of disturbing creation, and got to work putting it all together in a way that would be both unsettling and aesthetically pleasing.

We collaborated on coming up with weird ways to use the materials we had on hand, and as Perry is more familiar with what the Arduino can do, he set up most of the physical computing and code. My role was in design, aesthetic input, and documentation.

An interesting note about this project is that because we didn't start with a concept, it was almost ludicrous how it all came together in the end. One of the random objects we got from The Center for Creative Reuse was a stack of old religious booklets and fliers. When flipping through them near the project's completion, we found that one of the booklets was about a girl named Margaret whose backstory bore an uncanny resemblance to the situation we had put our doll in. And so, accidentally but fittingly, our project happened upon a narrative.

nixel-LookingOutwards04


Mediated Encounters is a project by Ken Rinaldo. It's composed of two hanging robotic arms with a Siamese fighting fish on each end. The arms move based on the swimming patterns of the fish. These fish are naturally aggressive towards each other, so the arms let them get close, but never close enough to actually fight.

I think this is an interesting project. Like many of Ken Rinaldo's works, it connects fish and other living organisms with robotics to give them 'mobility', and it is sold as giving them a way to navigate our world freely, outside the limitations of fish bowls. I see this method as somewhat contradictory and hypocritical: the fish could easily move about the world if only they weren't trapped in fish bowls to begin with. It's also something of an exercise in power; these fish would have no way to interact on land without human help. It's like playing god, giving them something they might not want or need.

This project is very similar to the project at CMU with the fish that drove a robotic platform. Rinaldo also has an almost identical project of his own, with three moving robotic fish tanks, which he made more than a decade earlier.

nixel-Body

Video:

Gif:

Description:

This project came naturally to me since I'm interested in dance and videography. From the beginning, I was curious about how I could convey movement, choreography, and group dynamics with the motion capture resources provided.  I wanted to study how choreography that relied on more than one dancer created compositions with bodies and movements on screen.

Process:

I went through several iterations of how to convey what I wanted. After figuring out how PoseNet worked, I ended up making a lot of visual representations of bodies in movement using some of my favorite dance videos. I trashed many of these because they looked too clunky and didn't feel right. It stung to throw out several hundred lines of usable material, but in the end, I decided that visual simplicity and readability were more important than the technical work behind them.

I was not aiming for accuracy with this project, and I was okay with an end result that was not so minutely tied to the body in particular, since I wanted to visualize several bodies in relation to each other: some sort of big-picture, holistic experience rather than something dependent on precision. Early on, I found that PoseNet would never capture the bodies as well as I wanted, so I decided to use that messiness to my advantage. I kind of like how it sometimes mistakes body parts for each other, or sees a phantom body part in the wall somewhere.

The first, second, and fourth video in the compilation are all choreographies/videographies that I love a lot. The third video is one that I filmed for KPDC, a dance club on campus. I was interested to see how a video that I produced compared to those that I admired.

Still Images:

Code:

// Copyright (c) 2018 ml5
//
// This software is released under the MIT License.
// https://opensource.org/licenses/MIT
 
/* ===
ml5 Example
PoseNet example using p5.js
=== */
 
// PoseNet with a pre-recorded video, modified from:
// https://github.com/ml5js/ml5-examples/blob/master/p5js/PoseNet/sketch.js
 
let poseNet;
let poses = [];
let poseArray = [];
let noseArray = [];
let leftWristArray = [];
let rightWristArray = [];
let rightAnkleArray = [];
 
let video;
var videoIsPlaying;
 
function setup() {
  frameRate(10);
  videoIsPlaying = false;
  createCanvas(833, 480);
  video = createVideo('yabbay.mp4', vidLoad);
  video.size(width, height);
  poseNet = ml5.poseNet(video, modelReady);
  poseNet.on('pose', function(results) {
    poses = results;
    if(poses.length > 0){
      poseArray.push(poses);
    }
  });
 
  video.hide();
}
 
function modelReady() {
    console.log('Model Loaded');
}
 
function mousePressed(){
  vidLoad();
}
 
function draw() {
  image(video, 0, 0, width, height);
  //background(0);
  drawKeypoints();
}
 
function drawKeypoints() {
  // Earlier experiment, kept for reference: draw red curves between
  // every 10th keypoint.
  // for (let i = 0; i < poses.length; i++) {
  //   let pose = poses[i].pose;
  //   for (let j = 2; j < pose.keypoints.length; j=j+10) {
  //     for (let k = 3; k < pose.keypoints.length; k=k+10){
  //       for (let l = 4; l < pose.keypoints.length; l=l+10){
  //         for (let m = 5; m < pose.keypoints.length; m=m+10){
  //           let keypoint = pose.keypoints[j];
  //           let keypoint2 = pose.keypoints[k];
  //           let keypoint3 = pose.keypoints[l];
  //           let keypoint4 = pose.keypoints[m];
  //           if (keypoint.score > 0.2) {
  //             strokeWeight(1);
  //             stroke(255,0,0);
  //             beginShape();
  //             curveVertex(keypoint.position.x, keypoint.position.y);
  //             curveVertex(keypoint2.position.x, keypoint2.position.y);
  //             curveVertex(keypoint3.position.x, keypoint3.position.y);
  //             curveVertex(keypoint4.position.x, keypoint4.position.y);
  //             endShape();
  //           }
  //         }
  //       }
  //     }
  //   }
  // }
 
  for (var i = 0; i < poses.length; i++){
    let pose = poses[i].pose;
    for (var j = 0; j < pose.keypoints.length; j++){
      noFill();
      strokeWeight(1);
      makeVectors("nose", pose, j, noseArray);
      makeVectors("leftWrist", pose, j, leftWristArray);
      makeVectors("rightAnkle", pose, j, rightAnkleArray);
      stroke(255);
      makeConnections(leftWristArray);
      makeConnections(noseArray);
      makeConnections(rightAnkleArray);
    }
  }
}
 
// Track one named body part: mark it with a blue dot and keep its
// last four positions in partArray.
function makeVectors(part, pose, j, partArray){
  if(pose.keypoints[j].part == part){
    stroke(0, 0, 255);
    ellipse(pose.keypoints[j].position.x, pose.keypoints[j].position.y, 10);
    partArray.push(createVector(pose.keypoints[j].position.x, pose.keypoints[j].position.y));
    if (partArray.length > 4){
      partArray.splice(0,1);
    }
  }
}
 
function makeConnections(partArray){
  beginShape();
  for (let k = 0; k < partArray.length; k++){
    curveVertex(partArray[k].x, partArray[k].y);
  }
  endShape();
}
 
function vidLoad() {
  video.stop();
  video.loop();
  videoIsPlaying = true;
}
 
function keyPressed(){
  if (videoIsPlaying) {
    video.pause();
    videoIsPlaying = false;
  } else {
    video.loop();
    videoIsPlaying = true;
  }
}
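The fixed-length trail inside makeVectors is the core trick; in isolation (with a hypothetical helper name), it is just a capped push:

```javascript
// Keep only the last `cap` positions so each tracked body part draws
// a short comet-tail curve instead of its entire movement history.
function pushCapped(arr, item, cap) {
  arr.push(item);
  while (arr.length > cap) {
    arr.splice(0, 1); // drop the oldest position first
  }
  return arr;
}

var trail = [];
[10, 20, 30, 40, 50, 60].forEach(function (x) {
  pushCapped(trail, x, 4);
});
console.log(trail); // [30, 40, 50, 60]
```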

nixel-telematic

Embedded app:

Direct Link:

Custom Emoji Creator

How to interact:

Simply click on the small icons to customize the big one. There are 4 categories: eyebrows, eyes, mouth, and accessories. There can only ever be one (or none) of each type on the big emoji at once; click the X if you want to clear that feature. Many people, each on their own browser, can interact with it at once.

Description:

Sometimes you want to express an emotion that the current range of emojis just doesn't cover. That was the purpose of this project. I took inspiration from iconic internet-created emojis and wondered how many other emotions we feel but aren't able to express.

Gif of me playing with it along with a few friends:

My process:

For this project, I essentially mashed together two of the given prompts for when we're 'stuck for ideas': making a space where people can only interact through emojis, and having the participants construct a monster corpse out of body parts. This is what I ended up with.

My process was pretty straightforward. I looked for existing dress-up games on Glitch to reference, found this one for cats, and based the majority of my structure on it. I spent a long time cutting apart emojis, formatting HTML and CSS, and messing with jQuery, none of which I'd worked with before. I hit some pretty dumb bugs that I fought with for a long time (for example, not realizing I hadn't imported jQuery). I struggled a little with applying the client-server concepts to my own project, but it was a lot simpler than it looked once I broke down the pathways.
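As a sketch of those pathways (not the project's actual code; the message shape and ids here are hypothetical): each click can be reduced into the shared face state by a pure function, and the server's only job is to rebroadcast the click to everyone else.

```javascript
// One feature per category on the big emoji; picking a new feature
// replaces the old one, and the "X" button is modeled as id = null.
function applyFeature(face, msg) {
  var next = Object.assign({}, face);
  next[msg.category] = msg.id;
  return next;
}

// Over socket.io the flow would be roughly:
//   client:  socket.emit('feature', { category: 'eyes', id: 'eyes-3' });
//   server:  socket.on('feature', msg => socket.broadcast.emit('feature', msg));
//   client:  socket.on('feature', msg => { face = applyFeature(face, msg); });

var face = { eyebrows: null, eyes: null, mouth: null, accessories: null };
face = applyFeature(face, { category: "eyes", id: "eyes-3" });
face = applyFeature(face, { category: "eyes", id: null }); // the X clears it
console.log(face.eyes); // null
```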

Here are some notes I took on the given templates:

I made remixes of all of the templates and took notes on how each one worked with socket.io. This is how I brainstormed to apply it to my program:

Part of the point of this project is that I want it to be kind of chaotic and unpredictable. I want multiple people to be able to explore and experiment with the different options available. Having many people on at once, all with equal roles, allows for lots of surprises, though it also works well as a solo experience. I think of the multi-user interaction as a feature rather than the defining point.

I shared it with my friends, and they had a lot of fun with it. At most, I think there were around 6 people messing with it at once, and everyone was amused at the results we were getting.

By the end of this project, I'd learned a lot about HTML, CSS, jQuery, socket.io, and Glitch, and made something that made my friends laugh, so it is pretty successful by my standards.