paukparl-book

Generated Self-Help Books

These are generated book covers (front and back) of self-help books, each with a different instruction on how to live your life.

 

Process

Lately I've been consumed by an unhealthy obsession with how to lead my life and how to start my career. Since we were making a book, I wanted it to be about me. I hoped to resolve some of my issues through introspection.

During the process, I was somehow reminded of the sheer number of self-help books out there that instruct you on how to live your life. When you see too many of them, you start to think that living your life to the fullest must be terribly important, when it just as well might not be. My book was an attempt to emulate, and thereby mock, these self-help books.

I based most of my word selections on the Corpora text collections and used minimal code from the RiTa.js library. For example, the title was drawn from the corpora/data/words/common.json file with a few filters such as (!RiTa.isNoun()). I also made a few template strings for subtitles, and a few arrays of words related to success. I think I could have hidden the repeating pattern and made more clever, controlled title-subtitle matches by using the Wordnik API. But there are still some examples that show the script is doing its job.

google drive link to 25 instances

 

Code

Some snippets of code showing functions for title and subtitle generation, and where I got the data.

adv = ['conveniently', 'quickly', 'certainly', 'effortlessly', 'immediately', 'completely']
adj = ['convenient', 'undisputed', 'quick', 'secret', 'true', 'groundbreaking', 'revolutionary']
obj = ['success', 'a successful career', 'true happiness', 'a new life', 'a healthier lifestyle']
v = ['succeed', 'win', 'challenge yourself', 'be successful', 'achieve success', 'get ahead in life', 'be famous', 'be happy', 'lose weight', 'make your dreams come true', 'plan ahead', 'make more friends']
 
function preload() {
    firstNames = loadJSON('/corpora-master/data/humans/firstNames.json');
    lastNames = loadJSON('/corpora-master/data/humans/authors.json');
    common = loadJSON('/corpora-master/data/words/common.json');
    adverbs = loadJSON('/corpora-master/data/words/adverbs.json');
    adjectives = loadJSON('/corpora-master/data/words/encouraging_words.json');
    newspapers = loadJSON('/corpora-master/data/corporations/newspapers.json');
}
 
...
...
...
 
function genTitle() {
    var temp = common[floor(random(common.length))];
    if (RiTa.isNoun(temp) || RiTa.isAdjective(temp) || RiTa.isAdverb(temp)
        || !RiTa.isVerb(temp) || temp.length >7) return genTitle();
    else return temp;
}
 
function randomDraw(array) {
    return array[floor(random(array.length))];
}
 
function genSubtitle() {
    var temp = random(10);
    var str;
    if (temp<2.3) str = 'How to ' + randomDraw(adv) + ' ' + randomDraw(v) + ' and ' + randomDraw(v);
    else if (temp<4.6) str = 'A ' + floor(random(12)+3) + '-step guide to ' + randomDraw(obj);
    else if (temp<6.6) str = floor(random(9)+3) + ' ways to ' + randomDraw(adv) + ' ' + randomDraw(v);
    else if (temp<8.6) str = 'A ' + randomDraw(adj) + ' guide on how to ' + randomDraw(v);
    else str = 'If only I had known then these ' + floor(random(12)+3) + ' facts of life';
 
    return str;
}
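// Sample outputs from genSubtitle(), assembled from the word arrays above:
//   "How to effortlessly win and be happy"
//   "A 7-step guide to true happiness"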
 
function genAuthor() {
    var gen = {};
    gen['books'] = [];
    gen['author'] = firstNames[floor(random(firstNames.length))]
            + ' ' + lastNames[floor(random(lastNames.length))];
 
    for (let i = 0; i < 8; i++) {
        var book = {};
        book['title'] = genTitle();
        book['subtitle'] = genSubtitle();
 
        book['quotes'] = [];
        book['reviews'] = [];
        for (let j = 0; j < 5; j++) {
            var review = {};
            var temp = random(4);
            var reviewString;
            if (temp < 1.2) reviewString = adverbs[floor(random(adverbs.length))] + ' ' + adjectives[floor(random(adjectives.length))] + ' and ' + adjectives[floor(random(adjectives.length))] + '.';
            else if (temp < 2.0) reviewString = adverbs[floor(random(adverbs.length))] + ' ' + adjectives[floor(random(adjectives.length))] + ' but ' + adjectives[floor(random(adjectives.length))] + '.';
            else if (temp < 2.8) reviewString = adverbs[floor(random(adverbs.length))] + ' ' + adjectives[floor(random(adjectives.length))] + '!';
            else if (temp < 3.5) reviewString = adverbs[floor(random(adverbs.length))] + ' ' + adjectives[floor(random(adjectives.length))] + '.';
            else reviewString = adjectives[floor(random(adjectives.length))] + '...';
            review['text'] = uppercase(reviewString);
            review['by'] = newspapers[floor(random(8, newspapers.length))];
            book['reviews'].push(review);
        }
 
        book['price'] = '$' + floor(random(7, 11)) + '.50'; // prices range from $7.50 to $10.50
        gen['books'].push(book);
    }
    saveJSON(gen, 'book' + n + '.json');
}
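As mentioned above, the title-subtitle matches could probably be made smarter with the Wordnik API. The snippet below is only a sketch of that idea, not code from the project: the endpoint, query parameters, and response shape are my assumptions about the Wordnik v4 API, and YOUR_API_KEY is a placeholder.

// Hypothetical sketch, not part of the project: ask Wordnik for synonyms of
// the generated title so the subtitle could echo it. The endpoint, parameters,
// and response shape are assumptions; YOUR_API_KEY is a placeholder.
function fetchRelatedWords(title, callback) {
    var url = 'https://api.wordnik.com/v4/word.json/' + title +
              '/relatedWords?relationshipTypes=synonym' +
              '&limitPerRelationshipType=5&api_key=YOUR_API_KEY';
    loadJSON(url, function(data) {
        // expected shape: [{relationshipType: 'synonym', words: [...]}]
        if (data && data.length > 0) callback(data[0].words);
    });
}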

The Big XR Post

Using touch to place and resize objects

If you want to be able to place your object in front of you by touching your screen and then resize it with two fingers, use this script! It works by putting it on the GameObject you want to be able to move around!

(or you can point it at whatever you want it to move by dragging the sculpture/GameObject onto the m_HitTransform public variable)

Here is the placing and resizing script: https://drive.google.com/open?id=1tudhCw5cxn6GxUlzM9HBJQnSgnbwqcfA

Some Quick reminders

(all of these things are covered in the tutorial, but I have just listed some stuff here that I have noticed students struggle with)

  • The XR Surface Controller should be on all AR objects for them to work.
    • Deform To Surface is for floors (add the transparent XR Shadow material by going into the XR folder, then Materials, and dragging it onto the GameObject).
    • Lock To First Surface simply sets the initial location of the GameObject; this can be changed via scripting.
  • There is wall detection, but it is proving to be slower than plane detection. It might be easier to simply place the object using the script at the bottom of this page.
  • When the app first starts, the phone doesn't remember anything about its environment, so you may need to search around every time.
  • XR Remote is slow and glitchy; it is just for development. For better tracking and a high-res experience, build to your phone! Your final piece should be built to your phone.

FOR iOS PEOPLE WHO HAVEN'T BUILT TO THEIR iPHONE BEFORE:

When you "build and run" it will open up xcode & you will need to set up your developer account:

  • go to "Window->Organizer" and then click on the Tab "Energy" then it will prompt you to log in to create a personal team developer account.
  • Then back in xcode click "Unity-iPhone" in the left side panel, you will now go into another left side panel to "Unity-iPhone" under "Targets" it has a dark Unity logo next to it
  • Then go to signing dropdown menu and click "your name (Personal Team)"
  • Hit play!

It should work after that! Keep in mind that you have a limit to how many different apps you can build to your phone, so just keep building to the same app name if you don't want to run into that issue!

From the XR tutorial class: 8th Wall tutorial:

 https://docs.8thwall.com/#tutorial

Go to: https://console.8thwall.com/downloads

  • Download 2 things:
    • Demo Project
    • 8th wall XR unity package (you need to be logged in)

Here are some resources to get 3D models:

  1. https://www.thingiverse.com/ -  Usually used for 3d printing stuff
  2. https://www.models-resource.com/ - ripped assets
  3. http://www.turbosquid.com/ - For professionals (most cost $$$)
  4. https://free3d.com/ - vanilla! from dan and pussy krew
  5. https://grabcad.com/ - more realism, it's for mechanical engineers
  6. https://3dwarehouse.sketchup.com/?hl=en - for sketchup (arch)
  7. http://www.makehumancommunity.org/ - software for making 3d model

casher-ocannoli-Automaton

"Skipper's Dreamhouse"
    

"Skipper's Dreamhouse" is a scary house that first reacts to movement at a far distance, then waits for the viewer to come closer. Once the viewer is at a certain close distance, a random timer is triggered and the deconstructed Barbie doll, Skipper, pops up to scare the viewer. Throughout making the project, various decisions were made often having to compromise aesthetic over practicality; however, on the whole the house fits the spooky, homemade, childish vibe. If we were to further this project, we'd love to add sound, either with sound effects or music that reacts to the viewer, etc. As for the title, we slowly created this narrative that Barbie's little sister, Skipper, wanted her own dream house and it is terribly spooky. We joked that it could represent somewhat of a critique on Barbie and the stigma surrounding her, as we have deconstructed the doll to just her flailing limbs, showing how her appeal consists only of mindless, emotionless body -- an ultimately ~spooky~ idea to be teaching little girls. We contributed equally to the concept of the project and problem solving, where Cassie contributed more to the Arduino software and Olivia to the outer appearance. Overall, although we hit a lot of speed bumps, it was fun and the problem solving resulted in a satisfyingly spooky project.

chewie-rigatoni-Automaton

Working on this project was an immensely informative experience for me. My partner Huw had a lot of prior experience with Arduino and physical computation in general, while I was relatively new to working in that way. We went through many different ideas before we came up with the idea of a marionette drummer made of trash, playing in a wacky, improbable manner in a room also made up of disposable "trashy" material. We were inspired by the materials we were working with, such as the dirty-looking carpet fuzz sheets, carpet crumbs, pieces of a broken Nerf toy and colored felt.

It was the use of these unconventional materials that led to a source of conflict for me and my partner (chewie) as we were trying to figure out to what extent we should try to use our materials to emulate a representation of a real space, or stick purely to our materials and not try to hide what we were working with. At one point our automaton sported a costume, a paint job and went through a few different caricaturized heads. It was after a great deal of discussion that we concluded that this attempt to aestheticize our materials was actually just clashing with the rest of the project. We removed the costume and the paint and added in the tennis ball bobbing on a spring instead, to both of our satisfaction. On the other hand, using the playing cards to make a poster, dowels and scrap metal for the drums and carpet, cardboard and foam for the bed turned out to be pretty effective as a story-telling device. We took to calling our automaton Wilson (for obvious reasons).


As far as the division of labor went, I had been learning about puppetry in my video class and was able to apply those skills to making a simple marionette puppet. I then used code from the servo sweep example as well as the motor party example to get Wilson to play. I used the spare parts I had to put together a simple drumset. My partner (chewie) was instrumental in creating the room set, and without him our project wouldn't be nearly as sturdy, nor would our wiring work. He came up with the look for the interior too, for things like the poster and the unmade bed. The most valuable contribution chewie made however, was his unwavering perfectionism. I tend to let things slide and be ok with parts of my work not being effective. chewie made us iterate multiple times through elements of the piece that just weren't working. In addition to changing the costume and color, the initial armature for Wilson was rolled cardstock, the furniture was fully cardboard and the head was a sketched-in face of John Bonham with tacked-on hair. These were all either details that were lacking in thought, or they impaired the wild and free motion of Wilson, which was clearly the focus of the piece. I was surprised by just how much one can roll back and revert changes when working with physical objects. Although we had no CTRL-Z or Github to fix our mistakes, working with actual physical layers of materials felt like there was a concrete sense of version control after all.

-Rig

nixel-sapeck-Automaton

Perry and I let this project grow organically. We didn't create any sort of pre-formed idea or plan, but instead let our explorations at The Center for Creative Reuse and the other available resources guide our process.  We returned from the shopping trip with a bag of intriguing junk that we knew had potential to become some sort of disturbing creation, and got to work putting it all together in a way that would be both unsettling and aesthetically pleasing.

We collaborated on coming up with weird ways to use the materials we had on hand, and as Perry is more familiar with what the Arduino can do, he set up most of the physical computing and code. My role was in design, aesthetic input, and documentation.

An interesting note about this project is that, because we didn't start with a concept, it was almost ludicrous how it all came together in the end. One of the random objects we got from The Center for Creative Reuse was a stack of old religious booklets and fliers. When flipping through them near the project's completion, we found out one of the booklets was about a girl named Margaret whose backstory bore an uncanny resemblance to the situation we had put our doll in. And so, accidentally but fittingly, our project happened upon a narrative.

nixel-Body

Video:

Gif:

Description:

This project came naturally to me since I'm interested in dance and videography. From the beginning, I was curious about how I could convey movement, choreography, and group dynamics with the motion capture resources provided.  I wanted to study how choreography that relied on more than one dancer created compositions with bodies and movements on screen.

Process:

I went through several iterations of how I could convey what I wanted. After figuring out how PoseNet worked, I ended up making a lot of visual representations of bodies in movement using some of my favorite dance videos. A lot of these I trashed because they looked too clunky and didn't feel right. It felt rough getting rid of several hundred lines of usable material, but in the end, I decided that visual simplicity and readability were more important than the technical work behind them.

I was not aiming for accuracy with this project, and I was okay with an end result that was not so minutely tied to the body in particular, since I wanted to visualize several bodies in relation to each other. I wanted some sort of big-picture, holistic experience rather than something dependent on accuracy. Early on, I found that PoseNet would never be able to capture the bodies as well as I wanted it to, so I decided to use that messiness as an advantage. I kind of like how it sometimes mistakes body parts for each other, or sees a phantom body part in the wall somewhere.

The first, second, and fourth videos in the compilation are all choreographies/videographies that I love a lot. The third video is one that I filmed for KPDC, a dance club on campus. I was interested to see how a video that I produced compared to those that I admired.

Still Images:

Code:

// Copyright (c) 2018 ml5
//
// This software is released under the MIT License.
// https://opensource.org/licenses/MIT
 
/* ===
ml5 Example
PoseNet example using p5.js
=== */
 
// PoseNet with a pre-recorded video, modified from:
// https://github.com/ml5js/ml5-examples/blob/master/p5js/PoseNet/sketch.js
 
let poseNet;
let poses = [];
let poseArray = [];
let noseArray = [];
let leftWristArray = [];
let rightWristArray = [];
let rightAnkleArray = [];
 
let video;
var videoIsPlaying;
 
function setup() {
  frameRate(10);
  videoIsPlaying = false;
  createCanvas(833, 480);
  video = createVideo('yabbay.mp4', vidLoad);
  video.size(width, height);
  poseNet = ml5.poseNet(video, modelReady);
  poseNet.on('pose', function(results) {
    poses = results;
    if(poses.length > 0){
      poseArray.push(poses);
    }
  });
 
  video.hide();
}
 
function modelReady() {
    console.log('Model Loaded');
}
 
function mousePressed(){
  vidLoad();
}
 
function draw() {
  image(video, 0, 0, width, height);
  //background(0);
  drawKeypoints();
}
 
function drawKeypoints()  {
//   Earlier experiment (kept for reference): connect every tenth keypoint
//   across four offsets with red curves.
//
//   for (let i = 0; i < poses.length; i++) {
//     let pose = poses[i].pose;
//     for (let j = 2; j < pose.keypoints.length; j = j + 10) {
//       for (let k = 3; k < pose.keypoints.length; k = k + 10) {
//         for (let l = 4; l < pose.keypoints.length; l = l + 10) {
//           for (let m = 5; m < pose.keypoints.length; m = m + 10) {
//             let keypoint = pose.keypoints[j];
//             let keypoint2 = pose.keypoints[k];
//             let keypoint3 = pose.keypoints[l];
//             let keypoint4 = pose.keypoints[m];
//             if (keypoint.score > 0.2) {
//               strokeWeight(1);
//               stroke(255, 0, 0);
//               beginShape();
//               curveVertex(keypoint.position.x, keypoint.position.y);
//               curveVertex(keypoint2.position.x, keypoint2.position.y);
//               curveVertex(keypoint3.position.x, keypoint3.position.y);
//               curveVertex(keypoint4.position.x, keypoint4.position.y);
//               endShape();
//             }
//           }
//         }
//       }
//     }
//   }
 
  // Current approach: track the nose, left wrist, and right ankle of each
  // detected pose and draw short curves through their recent positions.
  for (var i = 0; i < poses.length; i++) {
    let pose = poses[i].pose;
    for (var j = 0; j < pose.keypoints.length; j++) {
      noFill();
      strokeWeight(1);
      makeVectors("nose", pose, j, noseArray);
      makeVectors("leftWrist", pose, j, leftWristArray);
      makeVectors("rightAnkle", pose, j, rightAnkleArray);
      stroke(255);
      makeConnections(leftWristArray);
      makeConnections(noseArray);
      makeConnections(rightAnkleArray);
    }
  }
}
 
// Mark a keypoint of the given part and remember its last few positions.
function makeVectors(part, pose, j, partArray) {
  if (pose.keypoints[j].part == part) {
    stroke(0, 0, 255);
    ellipse(pose.keypoints[j].position.x, pose.keypoints[j].position.y, 10);
    partArray.push(createVector(pose.keypoints[j].position.x, pose.keypoints[j].position.y));
    if (partArray.length > 4) {
      partArray.splice(0, 1);
    }
  }
}
 
function makeConnections(partArray){
  beginShape();
  for (let k = 0; k < partArray.length; k++){
    curveVertex(partArray[k].x, partArray[k].y);
  }
  endShape();
}
 
function vidLoad() {
  video.stop();
  video.loop();
  videoIsPlaying = true;
}
 
function keyPressed(){
  if (videoIsPlaying) {
    video.pause();
    videoIsPlaying = false;
  } else {
    video.loop();
    videoIsPlaying = true;
  }
}

rigatoni-viewing4

Spectacle: I see "spectacle" as work that pushes the limits of our notion of what we can do with the tools and knowledge we have at our disposal as a community of artists. An entirely-spectacle driven work would likely give the audience a new way of generating content, but lack in self-awareness or meaning that transcends its time and medium.
Speculation: I see "speculation" as an experiment of some immutable artistic concept in a novel way. An entirely-speculation driven work would probably cause the audience to question the way they think about a certain concept, but lack real world applicability as a product.

A few weeks ago I came up with an idea for an idle-clicker game called Penis Simulator. This was during a period in Concept Studio: Systems and Processes where we were to create work in response to a system within the body, and I was working with the reproductive system. I think my project erred on the side of Speculation while modestly dabbling in Spectacle. Penis Simulator gives the player a virtual, abstracted representation of a penis and testes; through rapid mouse clicks and vertical dragging, the player can control temperature and testosterone levels to achieve the open-ended goal of raising the overall sperm count. The result was one of the rare times I have seen something as sexualized and perverted as a human penis perceived as a totally non-sexual, sterile object with simple inputs, outputs and controlling variables. My professor expressed her discomfort and called the project problematic on a few occasions, but overall the game had a positive response, both among testers required by the class to play my game and among people interested in trying it of their own volition. I see my work so far as a proof of concept and want to use what I have learned from Warburton's argument to integrate a more substantial element of spectacle into the speculation piece I have created.

 

 

In the beginning phases of working on this project my ambitions were much higher; I considered working with Spotify/music APIs and drum machines. However, I quickly decided that I just wanted to make something simple and fun, which would let me spend time learning and experimenting with new techniques and features. A constant concept throughout my process has been the idea of equal roles between all users. When text is manipulated and messed with to the point of incoherence, it really puts all users on the same level of not understanding what's going on and having a good laugh. Experimenting with RiTa and regex really shaped most of this project, because I was mainly text-focused and had also never used either of them before. Although there are a lot of bugs and the project could be a lot more, I had a lot of fun and I learned a lot. (Also, all of the trucker slang is real; here is a breakdown of what most of it means.)

Trucker Slang Meanings 
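For the curious, the core of a slang swap like this can be done with a lookup table and a word-boundary regex, roughly as in the sketch below. This is a simplified stand-in rather than the project's actual code, and the table entries are just examples.

// Simplified stand-in, not the project's actual code: swap plain words for
// trucker slang using a lookup table and word-boundary regexes.
var slang = {            // example entries only
    'police': 'bear',
    'truck': 'rig',
    'coffee': 'mud'
};

function truckerize(text) {
    var out = text;
    for (var plain in slang) {
        var re = new RegExp('\\b' + plain + '\\b', 'gi');
        out = out.replace(re, slang[plain]);
    }
    return out;
}

// truckerize('the police stopped my truck') -> 'the bear stopped my rig'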

yuvian-LookingOutwards02

Strata #3 - Excerpt from Quayola on Vimeo.

"Strata #3" is a film and installation project commissioned by Evento and Lumin for the 2009 Bordeaux Art & Architecture Biennale. Directed by Quayola, the project utilizes the aesthetics of classical architecture and Renaissance art combined with generative patterns and technological analysis of the original artwork in order to create a psychedelic and fragmented artistic experience.

One thing that inspired me is the creative use of "old" art and new technologies in one project. I love how the project feels familiar and yet completely new at the same time. The music in the video also pairs very well with the artistic direction of the film. The visuals are stunning and the concept itself is brilliant. I'm not sure how this would have looked in real life, since it was mentioned that this is an installation as well, but from the video it is amazing to look at.

rigatoni-LookingOutwards02

Rogue (1980) is the progenitor of a PCG genre known as "Rogue-like". A player (denoted typically with an '@' character) traverses the floors of a procedurally generated 2D map, navigating obstacles, traps and enemies to get to the bottom. The player has a limited amount of food that may be replenished by looting, but each move in this turn-based tile game costs food; to venture further into the floors is to risk starvation and the brutal perma-death mechanic of this game.

Rogue, and its spin-off NetHack, was my first experience with games that run in the command line. I can only imagine the complexity that co-creators Michael Toy and Glenn Wichman faced as they developed this game for Unix, using a very nascent graphics library known as curses (developed by Ken Arnold). They were also restricted to using only ASCII characters to represent their procedurally generated world. Rogue ended up being included in a popular software distribution on the internet's predecessor, ARPANET.

"Rogue is the biggest waste of CPU cycles in history." --Denis Ritchie

The map generation algorithm used in Rogue, and consequently in many titles of the "rogue-like" genre, follows these steps (a rough JavaScript sketch follows below):
1) Divide the map into a 3x2 grid of cells
2) Decide whether or not to draw a room within each cell
3) Use a "drunken walk" to connect the rooms

Prior to Rogue, most adventure games lacked replay value and complexity. Toy and Wichman realized this and came up with two clever ways to deal with effective complexity: the first, of course, being the procedural map generation and mob placement algorithm. The other, "perma-death," was a controversial topic during development. I would argue the perma-death mechanic aids the effective complexity of Rogue, since the player is unlikely to come across similar levels given the persistent consequences of death.

Here's a video of one of my favorite YouTubers, Scott Manley, playing the original Rogue game.