Guodu-Proposal

I want to create an interactive calligraphy experience where people can enter a word they want to practice writing, select different brushes, write on top of a template, and save their calligraphy. I've had a few opportunities to help out at drawing and calligraphy workshops, and after a recent digital drawing workshop I was inspired by how excited people were to try digital drawing. But a small problem I see is that people who have just bought a new iPad or a new sketching program often don't know where to start on their blank white pixels. I'm also curious about how practice transfers between digital and physical mediums for drawing and calligraphy.

I'd also like to improve my Generative Alphabet Book by making it actually alphabetical, improving the graphic design, and making it feel more generative, though there likely won't be time to print it.

 

 


kander – proposal

For my final project, I want to create generative horoscopes: astrological symbols that resemble the shapes of existing symbols (Aquarius, Libra, etc.), along with names that resemble the Greek ones. I'd also like to use RiTa.js to generate "horoscopes" for the signs. I'm not sure what I'd want the final product to be — maybe a set of cards or a plot.
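
To get a feel for the text side, here is a rough Processing sketch (plain string recombination, not RiTa.js) that mashes Greek-ish syllables into sign names and drops them into a horoscope template; all of the syllable and phrase lists are invented placeholders.

    // Recombine Greek-ish syllables into sign names and drop them into a blurb template.
    // Every list below is an invented placeholder (no RiTa.js involved).
    String[] starts = { "Aqu", "Sag", "Capri", "Scor", "Gem", "Vir" };
    String[] mids   = { "ari", "itt", "pio", "ic", "in" };
    String[] ends   = { "us", "a", "ius", "orn", "go" };
    String[] moods  = { "bold", "cautious", "curious", "restless" };
    String[] advice = { "trust an old friend", "avoid sharp objects", "start something small" };

    String pick(String[] a) { return a[int(random(a.length))]; }

    String makeName() {
      return pick(starts) + pick(mids) + pick(ends);
    }

    String makeHoroscope(String name) {
      return name + ": a " + pick(moods) + " week ahead; " + pick(advice) + ".";
    }

    void setup() {
      for (int i = 0; i < 12; i++) {
        println(makeHoroscope(makeName()));
      }
    }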


Jaqaur – Object

Screaming Monster Alarm

Okay, yes. This project was done fairly last minute, while I had no creative ideas, and it depends on the use of a CloudBit, which I didn’t have. So, all in all, I wouldn’t really call it a success.
Still, here is a little video I made showing my Baby Monster in action:

Basically, the screaming monster alarm is an alarm clock that is active between 8:00 and 8:30 (that's the networked part) and will scream unless it sees motion with its motion sensor. I thought it would be good because it would stop me from accidentally falling back asleep after turning it off. I made the tube to cover it so it wouldn't pick up miscellaneous other motion, like that of my roommate, because the sensor is very sensitive. I like how you can make it be quiet by covering its mouth (it was that covering action that gave me the idea to make the tube into a creature). But in retrospect, it's really not much different from a regular alarm clock.

I got the sensor/buzzer part working, but I couldn’t actually get the timed aspect to work because, as I mentioned above, I didn’t really have access to a CloudBit and didn’t want to wait around for one for the sake of this project. I did create some events on “If This Then That” that WOULD activate and deactivate the cloud bit at 8:00 and 8:30 respectively, if I had one.

I have very few sketches for this project and no code to embed, due to all of the reasons above. Not my best work, but I'm glad I learned about "If This Then That," and it's always fun to play with Little Bits. Had networking our object not been part of the assignment, I would have loved to experiment with the Makey Makey. That device was definitely my favorite thing from this last week.


Keali-Kadoin-Object

Drunken Nightlight
The Party Animals Awaken

nightlight

If there’s one thing we learned from this, it’s that Keali and Kadoin are not meant for hardware.
Our plan was to make a nightlight that would activate based on the sunset time of a certain location, using the CloudBit and IFTTT's weather applet options. This was a simple but practical idea, which meant the rest of the project became geared towards craft and the actual construction of the nightlight. The design went through multiple iterations: we had rotating carousels, light boxes, and planetarium-like sketches planned, but the reality was a more complicated process of getting something to stand still and rotate smoothly while keeping the littleBits connected and distributed properly so that the lights would work within the tube.

We went through multiple construction tests with paper, tissue paper, cardboard, and clay, and finally 3D-printed a platform (shoutout to John for his help). The final design is an unfortunately wonky tall tube with knifed-out designs that swings unevenly and tilts, because Keali forgot that physics was a thing and cut all of the landscape outlines on one side, which causes the entire structure to bend over (and much more so when it spins). A boxed structure served as the base, containing the motor bit and keeping everything upright, while the splitter bit directed the rest of the RGB and LED lights up into the tube; clay animals were tied with string and hung from the top of the tube. But all horror aside, a final product was successfully completed (though the refinement itself is… not much of a success), and we have ourselves a rotating drunken nightlight that activates once the sun sets and halts at sunrise. We miss D3…

(Images: the nightlight at sunset and at sunrise)


Keali-Final-Proposal

I want to make an interactive environment: perhaps a nature scene that can be controlled by its user. It would be inspired by Theo and Emily's works, but restricted to a screen (i.e. controllable through the keyboard/mouse rather than projection and real-time motion); I would also rather work in 2D.


Because Lumar can’t decide…

1. Make this concept a reality with a Raspberry Pi. Have another application where people can input their own knocking patterns, so that when the door is knocked on and opened, we know who is coming in and out of the studio.

2. Body mocap – coding is so bad for your body posture. Could we code by dancing and talking? Or I could finish what I first intended for my body mocap (music and particle systems).

3. Make an entire computational font set for the plotter.

4. Body mocap alphabet – mocap that takes the shapes you make with your body to form letters, then converts them for the plotter.

 

 

 

 


Anson/Claker-NetworkedObject

Claire and I made a trampoline that can dial any phone number, input as digits corresponding to a user's jumps. The trampoline essentially functions as a keyboard that types the numbers for you. Phones are often used absentmindedly; we may feel tethered to them to stay connected, and we often lead sedentary lives sitting at laptops all day. So, here comes TrampolineDial, breaking up the monotony and turning phone calls into an active, fun interaction!

We wrote the code using Arduino. When a user jumps, the Little Bits act as a cursor, typing in the digits that make up the phone number. Apparently, if we used java.awt.Robot we could automate the calling, so that the call would be placed once the 10 digits were generated by the jumps.
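
For reference, a minimal Processing sketch of what that java.awt.Robot idea could look like: it types a digit into whatever text field has focus and (hypothetically) presses Enter after ten digits to place the call. The jump is simulated with a key press here, since the real trigger would come from the Little Bits input.

    import java.awt.Robot;
    import java.awt.AWTException;
    import java.awt.event.KeyEvent;

    Robot robot;
    int digitsTyped = 0;

    void setup() {
      size(200, 200);
      try {
        robot = new Robot();          // posts synthetic key events to the OS
      } catch (AWTException e) {
        e.printStackTrace();
      }
    }

    void draw() { }

    // Type one digit of the phone number into whatever field currently has focus.
    void typeDigit(int d) {
      int keyCode = KeyEvent.VK_0 + d;   // VK_0..VK_9 are contiguous
      robot.keyPress(keyCode);
      robot.keyRelease(keyCode);
      digitsTyped++;
      if (digitsTyped == 10) {           // hypothetically place the call after 10 digits
        robot.keyPress(KeyEvent.VK_ENTER);
        robot.keyRelease(KeyEvent.VK_ENTER);
        digitsTyped = 0;
      }
    }

    // Stand-in trigger: a number key simulates a counted jump from the Little Bits.
    void keyPressed() {
      if (key >= '0' && key <= '9') typeDigit(key - '0');
    }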


2nd_class_lever

Here’s the diagram:

trampoline_diagram_new1

Here’s a video of TrampDial in action! :


Zarard-final-proposal

Preface: The Teenie Harris Archive is a collection of photographs, currently held by the CMOA, from the photographer Teenie Harris, who captured African-American life in Pittsburgh from the 1930s through the 1980s. His work is one of the most comprehensive lenses we have for viewing African-American life in America during that period.

So as I've been working on computationally analyzing the Teenie Harris Archive, I've realized that my artistic practice lies more in the realm of the artifact. My main interest is information visualization, but not in the sense of conveying data and ideas to other people; rather, I'm interested in the imagery and aesthetics of data as the computer sees it. The images below are just a few of the artifacts produced through my analysis of the archive, and they are a far cry from what we usually think of when we talk about data visualization. However, these images are what the data looks like. So my final project is going to be the creation of another artifact: 8-bit image tags, somewhat like QR codes for photos. The way I see it, each pixel has six dimensions (row, column, red, green, blue, and alpha). Each of those dimensions can be used to represent a feature of the data, and multiple pixels can be used to represent the multiple features that the photo is comprised of.
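
A minimal Processing sketch of the tag idea, assuming the features have already been computed and scaled into 0–255: each group of three feature values becomes the RGB channels of one pixel in a tiny tag image. The feature values below are made up.

    // Sketch of the image-tag idea: pack (hypothetical) feature values into pixel channels.
    // Each value is assumed to be pre-scaled into the 0-255 range.
    int TAG_W = 4;
    int TAG_H = 2;

    PImage encodeTag(float[] features) {
      PImage tag = createImage(TAG_W, TAG_H, ARGB);
      tag.loadPixels();
      for (int i = 0; i < tag.pixels.length; i++) {
        // three features per pixel, one per color channel (alpha kept opaque here)
        float r = (i*3   < features.length) ? features[i*3]   : 0;
        float g = (i*3+1 < features.length) ? features[i*3+1] : 0;
        float b = (i*3+2 < features.length) ? features[i*3+2] : 0;
        tag.pixels[i] = color(r, g, b, 255);
      }
      tag.updatePixels();
      return tag;
    }

    void setup() {
      size(400, 200);
      noSmooth();
      // placeholder values standing in for face count, line rho/theta, tag confidences, etc.
      float[] fakeFeatures = { 80, 128, 200, 30, 90, 15, 64, 220 };
      image(encodeTag(fakeFeatures), 0, 0, width, height);  // enlarged so the 4x2 tag is visible
    }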

Features that might be used

  • Faces (Number of them, male or female, approximate age, position in photo)
  • Line of best fit (number of them, rho, theta, sum of squared residuals)
  • Hough Lines (see above)
  • Tags by Microsoft API (tag, confidence of tag, location of tag in photo)
  • Classification of Courier vs. Non-Courier
  • Edge Detection results
  • Fourier Transform results
  • More will be considered, but the end result could lead to new methods of searching, sorting, or storing this information, hopefully leading to a fuller understanding of the data.

    (Images: sample artifacts produced through the computational analysis of the archive)


    Catlu – Final – Proposal

    For my final project I’d like to continue working in Maya and learning more about python scripting. I’m not sure if I want to continue with my mocap project and improve it (then I wouldn’t have to remodel stuff) or if I want to start a new project instead. I think I’m leaning towards continuing my mocap project so I can really dig into the python scripting of Maya.


    kander + lumar – object

    For our Networked Object, we decided to make one of Lumar's long-desired ideas, a bubble wrap dispenser, under the assumption that everybody finds popping bubble wrap useful for relieving nervous energy. We improved upon it by installing a roll trigger: every time a piece of bubble wrap is pulled out, the trigger is pulled, and the connected CloudBit activates the IFTTT Applet that posts to the Slack channel called "#stresspopper" (you should all subscribe). Now everybody can feel your pain!

    Diagram of the StressPopper. The power source is connected to a roll trigger, which is triggered every time a piece of bubble wrap is pulled out. The trigger is connected to a CloudBit, which activates the IFTTT Applet that posts to the #stresspopper Slack channel.

    We used a ceramic flower pot to house the bubble wrap, and we laser cut a top with a slit for the bubble wrap to come out of. We had to do several iterations of the top, as the first tops we made broke. We also had to figure out how to fold the bubble wrap so that pulling one piece brings the next piece out of the slit (like Kleenex), and used a little bit of rubber cement between each piece to aid in this process.


    Our original idea was to create a stress ball with a push button inside, and every time the ball was squeezed, it posted to the Slack channel, but the balloons we were using kept tearing, and we were worried about the Little Bits getting crushed.


     


    Catlu – Arialy – Object

    IoT – Umbrella

    Physical Object Documentation:


    If This Then That Documentation:


    Diagram:

    file_000

    For our project, we were drawn to the idea of using the internet and the Little Bits to establish a connection between two people. We thought a lot about how we could use the bits we had (mostly lights, buttons, and a speaker) to establish a relationship between different places. In the end we decided to use the sound of rain. When people live apart in different places, it's often hard to feel connected because you no longer experience things together. The final idea we decided on was a set of two umbrellas, each of which uses a CloudBit to connect to If This Then That. Each umbrella would be set to the place where the other person is, so when that person is experiencing rain, you will hear the rain too. Sometimes it only takes a little to help establish a greater feeling of intimacy, and our project was an attempt to do so.

    For the umbrella, we connected a wall-charger power bit to a CloudBit, a speaker bit, and an mp3 player. We then connected the CloudBit to IFTTT and created an applet with Weather.com that lets you set a location. If the status of that location changes to "rain," IFTTT sends a signal to the CloudBit, and the speaker starts playing the audio from the mp3 player. Optimally we would have wanted something small like an iPod Nano or Shuffle, but we had to make do with a phone. Originally we also wanted to use a battery for the power supply, but the p1 battery power didn't work with the CloudBit, which then wouldn't connect to IFTTT. We ended up using the wall-charger bit and running the strip along the edge of the wall.


    Kelc – Final-Proposal

    For the final project, I plan to use a Skin Color Detection API to write a Python function that detects a face in the camera, takes a picture of the user, extracts two colors from the skin tone (using the API), and maps that as a texture onto a character model in Maya.
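
    The face detection and Maya texturing are separate problems, but the two-color extraction step could look roughly like this Processing sketch: given an (assumed, already-detected) face rectangle, split its pixels at the median brightness and average the lighter and darker halves into two tones. The synthetic test image here just stands in for a captured face crop.

    // Sketch of the two-color extraction step only (no face detection, no Maya).
    color[] twoSkinTones(PImage img, int fx, int fy, int fw, int fh) {
      float[] bright = new float[fw * fh];
      int n = 0;
      for (int y = fy; y < fy + fh; y++)
        for (int x = fx; x < fx + fw; x++)
          bright[n++] = brightness(img.get(x, y));
      bright = sort(bright);
      float median = bright[bright.length / 2];  // splits "light" pixels from "dark" ones

      float rL = 0, gL = 0, bL = 0, nL = 0, rD = 0, gD = 0, bD = 0, nD = 0;
      for (int y = fy; y < fy + fh; y++) {
        for (int x = fx; x < fx + fw; x++) {
          color c = img.get(x, y);
          if (brightness(c) >= median) { rL += red(c); gL += green(c); bL += blue(c); nL++; }
          else                         { rD += red(c); gD += green(c); bD += blue(c); nD++; }
        }
      }
      nL = max(nL, 1);
      nD = max(nD, 1);
      return new color[] { color(rL/nL, gL/nL, bL/nL), color(rD/nD, gD/nD, bD/nD) };
    }

    void setup() {
      size(200, 200);
      // synthetic gradient standing in for a detected face crop
      PImage test = createImage(100, 100, RGB);
      test.loadPixels();
      for (int i = 0; i < test.pixels.length; i++)
        test.pixels[i] = color(170 + (i % 70), 130 + (i % 50), 115);
      test.updatePixels();
      color[] tones = twoSkinTones(test, 0, 0, test.width, test.height);
      background(tones[0]);
      fill(tones[1]);
      rect(50, 50, 100, 100);
    }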


    Anson-Final-Proposal

    For my final project, I think it would be useful if I did something using p5.js and also string parsing. I am not sure what this project would be – maybe some more mocap or a better FaceOSC – but I think that I could use a review / more in-depth project with the concept of String Lists, Matrices, etc. I don’t have a hugely artistic vision for this yet but I know the technical review will be very beneficial to me. I’m interested in three.js but that may be far beyond my scope currently.


    Antar-Krawleb-object

     

    A compilation of some of our experimenting throughout the week. This includes playing with the sequencer, placing two oscillators one after another, and our generative melodies.


    This is the view from Audition, from when we were mixing our SoundCloud track together.

    This is the visualizer that we used to help our debugging process: the top bar is the melody, and the bottom is the bass line.

    Here are some of the little bit arrangements we used.


    This is our computer, which is connected to the mp3 piece. The music is manipulated through a filter and a delay, then played out of the speaker, which is connected to a splitter. One branch of the splitter goes to the headphones, and the other to the microphone, which is also attached to the computer. The tracks were first recorded with Audacity, then mixed with Audition.

    Little Bits pieces

    • power
    • oscillator
    • mp3 out
    • midi in/out
    • synth speaker
    • filter
    • delay
    • envelope
    • keyboard
    • pulse
    • noise
    • sequencer

    Initial Concept
    We both immediately got excited over the Korg synth Little Bits pieces, and knew that we wanted to see how far we could push the affordances of the hardware. At the beginning we thought of only creating music collaboratively, one on the keyboard piece, and one manipulating the sound through playing with cutoffs, feedback loops, and attacks/delays. Our first instinct was to work with Makey Makey to use IFTTT. The general idea was to have a jamsesh and when we fell into a rhythm that we liked, we would hit the Makey Makey to begin recording, then hit it again to stop recording and to upload the track to Soundcloud. However, after some investigation, we realized the limitations of IFTTT and that we would rather spend time trying to learn new ways to create music, rather than to dwell over uploading issues.

    2nd Iteration
    Deciding to focus on the music, we discovered the mp3 Little Bits piece. This piece allowed us to create music through p5.js, which we could then modify and manipulate the way we had been for the music generated with the Little Bits pieces. Our concept was that we could have a stream of melodies being generated by our program, and then have a collaborative jamsesh between the coded music and our own improvisations on the Little Bits pieces. After some brainstorming, we came up with an idea to translate strings, possibly scraped from places like Twitter or the NYTimes, into melodies, in a very meta way. The program translates any letter from A-G into its corresponding note; if a character is not one of those letters, it simply holds the previous note, or rests. Given a string such as "a ce melody", our program would translate it into AAAACEEEEEEDD. This required us to also learn about the p5 sound library and to create our own scale using an array of MIDI numbers.
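
    A small Processing sketch of that mapping, with one possible choice of MIDI numbers for A through G (an assumption, not our exact scale); any non-letter character just sustains the previous note.

    // Letters a-g become notes, anything else holds the previous note.
    int[] noteForLetter = { 57, 59, 60, 62, 64, 65, 67 };  // A3 B3 C4 D4 E4 F4 G4

    int[] melodyFromString(String s) {
      s = s.toLowerCase();
      int[] melody = new int[s.length()];
      int current = noteForLetter[0];            // default until the first letter appears
      for (int i = 0; i < s.length(); i++) {
        char c = s.charAt(i);
        if (c >= 'a' && c <= 'g') current = noteForLetter[c - 'a'];
        melody[i] = current;                     // non-letters just hold the last note
      }
      return melody;
    }

    void setup() {
      println(melodyFromString("a ce melody"));  // one MIDI value per character
    }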

    We wanted our music to have some depth, so we figured we could also generate a bass line. If we were using the content of the article or the tweet as the melody, then the username or author's name could be used for the bass line. This was when we had the pleasure of learning about Euclidean rhythms (Godfried Toussaint). Given two numbers, the algorithm creates a beat by distributing the smaller number of onsets as evenly as possible across the larger number of steps. Some of these patterns can be found in traditional music, such as E(3,8): "In Cuba it goes by the name of the tresillo and in the USA is often called the Habanera rhythm" [Toussaint]. We would then use the mp3 Little Bit to bring in the melody and the MIDI Little Bit for the bass line, which would enable us to manipulate the two parts differently.
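
    For reference, one compact way to generate Euclidean-style patterns in Processing: step i is an onset when (i * k) % n is less than k. This is a Bresenham-like shortcut rather than Bjorklund's algorithm proper (not the code we wrote), but E(3,8) still comes out as the tresillo.

    boolean[] euclid(int k, int n) {
      boolean[] pattern = new boolean[n];
      for (int i = 0; i < n; i++) {
        pattern[i] = (i * k) % n < k;   // spread k onsets as evenly as possible over n steps
      }
      return pattern;
    }

    String show(boolean[] p) {
      String s = "";
      for (boolean b : p) s += b ? "x" : ".";
      return s;
    }

    void setup() {
      println(show(euclid(3, 8)));  // x..x..x.  (tresillo / habanera)
      println(show(euclid(5, 8)));  // a rotation of E(5,8)
    }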

    antarkrawleb

    Unfortunately, because the melody was following patterns within the string, it wasn't really following any rhythmic time at all. This made it difficult to align it with any bass line. Though we thoroughly enjoyed coding and learning about Euclidean rhythms, we ended up not using this algorithm.

    3rd Iteration
    Once we had finalized our melody code, we began to set up all of our pieces. We connected Krawleb's computer with an aux cord to the mp3 Little Bits piece, to play our coded melodies into the bit. We then attached a filter and a delay to the mp3 and connected them to the speaker. We used a splitter connected to the speaker, with one branch connected to a microphone that recorded into Antar's computer, and the other branch connected to headphones. We were really excited to start manipulating the melodies and adding in our own improvised music, but we quickly discovered the limitations of the Little Bits. Unfortunately, the audio would have these sharp clips that sounded like the speaker was blowing out. At first we thought one of the bits was broken, but after testing each piece individually, we realized that it was not a single piece causing this, but rather the quantity: whenever we added a 10th piece to our chain, no matter which piece it was, the error appeared. Unfortunately, to do any sort of collaboration with our generative melody, we would need at least 10 pieces connected. We decided to simply record our manipulated melody and then, separately, record ourselves creating our own music.

    Throughout the project we felt some other limitations of the Little Bits, such as overheating, weak power, sound compression, and a fatality (a delay blew out). We spent a ton of time making really interesting music, but failed to record most of it. In retrospect, we should have begun recording earlier, and much more frequently, so as to have a larger library to mix with. We should have also incorporated a simple networked component.

    Jaqaur-Proposal

    For my final project, I want to redo my plotter project, once again with ink and brush, but with more compelling content than the binary trees from before. I would like to replace those trees with lines generated from motion capture data. I could make a series of pictures with different colors, each one from a different motion (including ballet, martial arts, and break-dancing; I know a few people who are willing to help).

    Ideally, each picture would feature a few different colors (with multiple trays of ink), perhaps each one corresponding to a joint whose progressive x and y coordinates will direct the line on the paper. I plan to experiment with which joints are best to draw; it might be different for different motions.
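
    A rough Processing sketch of the joint-to-line idea: collect one joint's (x, y) position per frame and write the path out as a bare-bones SVG polyline that a plotting workflow could pick up. The getJointXY() function here is a placeholder for however the mocap data is actually read (BVH file, Kinect, etc.).

    ArrayList<PVector> jointPath = new ArrayList<PVector>();

    PVector getJointXY(int frame) {
      float t = frame * 0.05;                               // placeholder motion
      return new PVector(300 + 200 * cos(t), 300 + 150 * sin(1.7 * t));
    }

    void setup() {
      size(600, 600);
      for (int f = 0; f < 600; f++) jointPath.add(getJointXY(f));
      saveSVGPolyline("joint_path.svg", jointPath);
    }

    void draw() {
      background(255);
      stroke(0);
      noFill();
      beginShape();
      for (PVector p : jointPath) vertex(p.x, p.y);
      endShape();
    }

    void saveSVGPolyline(String filename, ArrayList<PVector> pts) {
      String d = "";
      for (PVector p : pts) d += nf(p.x, 1, 2) + "," + nf(p.y, 1, 2) + " ";
      String[] svg = {
        "<svg xmlns=\"http://www.w3.org/2000/svg\" width=\"600\" height=\"600\">",
        "  <polyline fill=\"none\" stroke=\"black\" points=\"" + d + "\"/>",
        "</svg>"
      };
      saveStrings(filename, svg);
    }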

    This project will require setting up and using the Kinect and also the Axidraw. So, it will take some time. But I’m very excited about it!


    ngdon-Proposal

    For my final project I’m going to expand on my generative book. I’m going to fix the imperfections of the things I already have, and add a lot of new things.

    First I'm going to improve the eyes of the creatures, and devise an algorithm for generating descriptions for them, as Golan recommended. For the descriptions I'm thinking of grabbing descriptions of real animals from Wikipedia, scrambling them, and substituting key information. I'm also thinking about extracting animal features from my generation algorithm and using them in the descriptions.
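
    A tiny Processing sketch of the substitute-key-information idea, using an invented template sentence and word lists rather than scraped Wikipedia text:

    String templateSentence =
      "The NAME is a SIZE, HABIT creature found in HABITAT, feeding mainly on FOOD.";

    String[] sizes    = { "small", "cat-sized", "lumbering" };
    String[] habits   = { "nocturnal", "burrowing", "arboreal" };
    String[] habitats = { "coastal marshes", "inland cliffs", "dense canopy" };
    String[] foods    = { "insects", "tubers", "smaller creatures" };

    String pick(String[] a) { return a[int(random(a.length))]; }

    String describe(String creatureName) {
      String s = templateSentence;
      s = s.replace("NAME", creatureName);
      s = s.replace("SIZE", pick(sizes));
      s = s.replace("HABITAT", pick(habitats));   // replace the longer slot before "HABIT"
      s = s.replace("HABIT", pick(habits));
      s = s.replace("FOOD", pick(foods));
      return s;
    }

    void setup() {
      println(describe("Morlup"));
      println(describe("Tessivore"));
    }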

    For the new things, I'm also going to generate terrain and plants. The terrain, the plants, and the animals are going to be the three major components of my new book.

    For the terrain, I’m going to render some images in 3D (or pseudo-3D) with the plants and animals I generated, so that it looks like a photograph of a real landscape. I’m also going to analyze the terrain, and assign the different animals and plants to different parts of the island based on the geography.

    For the generative plants, I'm probably going to combine fractals with generative textures. I'll look into L-systems, which I hear are very good at generating plants. Last week I wrote an algorithm for generating realistic wood textures using Perlin noise, and I will probably use similar algorithms to generate other plant-like textures.
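
    As a starting point, here is a minimal L-system sketch in Processing (a standard bracketed plant grammar, not my final algorithm): F -> FF and X -> F-[[X]+X]+F[+FX]-X, drawn with a simple turtle.

    String axiom = "X";
    int generations = 5;
    float turnAngle;
    float step = 3;

    // apply the rewrite rules once
    String expand(String s) {
      String out = "";
      for (char c : s.toCharArray()) {
        if (c == 'X')      out += "F-[[X]+X]+F[+FX]-X";
        else if (c == 'F') out += "FF";
        else               out += c;
      }
      return out;
    }

    void setup() {
      size(600, 600);
      background(255);
      stroke(60, 110, 60);
      turnAngle = radians(25);

      String s = axiom;
      for (int i = 0; i < generations; i++) s = expand(s);

      translate(width/2, height);       // start at the bottom, growing upward
      for (char c : s.toCharArray()) {
        if (c == 'F')      { line(0, 0, 0, -step); translate(0, -step); }
        else if (c == '+') rotate(turnAngle);
        else if (c == '-') rotate(-turnAngle);
        else if (c == '[') pushMatrix();
        else if (c == ']') popMatrix();
      }
    }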


    ngdon-manifestoReading

    “5. The Critical Engineer recognises that each work of engineering engineers its user, proportional to that user’s dependency upon it.”

    I think the tenet means that a work of engineering modifies the way the user thinks about the problem, so the more the user uses it, the more the user thinks in its way. I found this particularly interesting because I’m gradually becoming aware of the fact that I’m being engineered by so many things around me. I’m also thinking about common things that could have been engineered in another way and what would happen to us if they were.

    One example would be the Processing language. After using Processing a lot for a semester, I have begun thinking in Processing's way. I think of the screen as a canvas, with lines and shapes drawn onto it to make a frame. Even when I'm not using Processing, or when I'm just thinking about random ideas (or about things not even related to programming), this mode of thinking sticks. Similarly, if I had been using d3 for the whole semester, I might have come to think of programming as data entering and exiting.

    arialy-mocap

    I wanted to decorate two bodies through this mocap project. Both figures would be created from cubes. Figure A would be the light source, mostly dark since the inner light would cast a shadow on the visible parts of its cubes, and figure B would be lit only by figure A's light. I liked the irony of the light source being dark, and of the figure without its own light being bright.

     

    arialymocapgif

    screen-shot-2016-11-17-at-6-32-22-pm

    mocapscreenvideo

     

    https://github.com/acdaly/60-212/tree/master/brekel_mocap

    
    import java.util.List;
    
     
    
    class PBvh {
      BvhParser parser;

      PBvh(String[] data) {
        parser = new BvhParser();
        parser.init();
        parser.parse( data );
      }

      void update( int ms ) {
        parser.moveMsTo( ms ); //30-sec loop
        parser.update();
      }

      //------------------------------------------------
      void draw() {
        // Previous method of drawing, provided by Rhizomatiks/Perfume
        fill(color(255));
        for ( BvhBone b : parser.getBones()) {
          pushMatrix();
          translate(b.absPos.x, b.absPos.y, b.absPos.z);
          ellipse(0, 0, 2, 2);
          popMatrix();
          if (!b.hasChildren()) {
            pushMatrix();
            translate( b.absEndPos.x, b.absEndPos.y, b.absEndPos.z);
            ellipse(0, 0, 10, 10);
            popMatrix();
          }
        }
      }

      //------------------------------------------------
      // Alternate method of drawing, added by Golan

      void drawBones(int light) {
        //noFill();
        stroke(255);
        strokeWeight(1);

        List<BvhBone> theBvhBones = parser.getBones();
        int nBones = theBvhBones.size(); // How many bones are there?
        BvhBone cBone = theBvhBones.get(1);
        PVector boneCoordc = cBone.absPos;
        float x2 = boneCoordc.x; // Get the (x,y,z) values
        float y2 = boneCoordc.y; // of its start point
        float z2 = boneCoordc.z;
        /*
        BvhBone bBone = theBvhBones.get(16);
        PVector boneCoordb = bBone.absPos;
        float x3 = boneCoordb.x; // Get the (x,y,z) values
        float y3 = boneCoordb.y; // of its start point
        float z3 = boneCoordb.z;
        line(x2, y2, z2, x3, y3, z3);
        */
        BvhBone dBone = theBvhBones.get(0); // Get the i'th bone
        println(dBone.getName());
        PVector boneCoordx = dBone.absPos; // Get its start point
        float xl = boneCoordx.x; // Get the (x,y,z) values
        float yl = boneCoordx.y; // of its start point
        float zl = boneCoordx.z;

        //LIGHT SOURCE
        if (light == 1) {
          pointLight(255, 255, 225, xl, yl - 60, zl);
        }

        for (int i=0; i<nBones; i++) { // Loop over all the bones
          BvhBone aBone = theBvhBones.get(i); // Get the i'th bone

          PVector boneCoord0 = aBone.absPos; // Get its start point
          float x0 = boneCoord0.x; // Get the (x,y,z) values
          float y0 = boneCoord0.y; // of its start point
          float z0 = boneCoord0.z;
          String boneName = aBone.getName();

          if (aBone.hasChildren()) {
            stroke(255);

            // If this bone has children,
            // draw a line from this bone to each of its children
            List<BvhBone> childBvhBones = aBone.getChildren();
            int nChildren = childBvhBones.size();
            for (int j=0; j<nChildren; j++) {
              BvhBone aChildBone = childBvhBones.get(j);
              String childName = aChildBone.getName();

              PVector boneCoord1 = aChildBone.absPos;
              float x1 = boneCoord1.x;
              float y1 = boneCoord1.y;
              float z1 = boneCoord1.z;

              int cubeNum = 10;
              float deltaZ = (z1 - z0)/cubeNum;
              float deltaY = (y1 - y0)/cubeNum;
              float deltaX = (x1 - x0)/cubeNum;
              float maxDelta = max(deltaZ, deltaY, deltaX);

              for (int c = 0; c < cubeNum; c++) {
                pushMatrix();
                noStroke();
                translate( x0 + deltaX*c + random(-5, 5), y0 + deltaY*c + random(-10, 10), z0 + deltaZ*c + random(-5, 5));
                //translate(x0 + deltaX*c, y0 + deltaY*c, z0 + deltaZ*c);
                box(random(2, 5));
                popMatrix();
              }

              //line(x0, y0, z0, x1, y1, z1);
            }
          } else {
            // Otherwise, if this bone has no children (it's a terminus)
            // then draw it differently.
            stroke(255);
            PVector boneCoord1 = aBone.absEndPos; // Get its start point
            float x1 = boneCoord1.x;
            float y1 = boneCoord1.y;
            float z1 = boneCoord1.z;

            //line(x0, y0, z0, x1, y1, z1);
            boneName = aBone.getName();

            if (boneName.equals("Head")) {
              if (light == 1) {
                noStroke();
                //stroke(255, 50);
                pushMatrix();
                translate(x1+random(-5,5), y1+random(-5,5), z1+random(-5,5));
                box(8);
                popMatrix();
              } else {
                noStroke();
                shininess(1.0);
                pushMatrix();
                translate(x1+random(-5,5), y1+random(-5,5), z1+random(-5,5));
                box(8);
                popMatrix();
              }
            }
          }
        }
      }
    }
    
    

    Ngdon-Darca-Object

    The Toilet Paper Printer

    wechatimg5

    The Toilet Paper Printer plots data in real time on a roll of toilet paper. It can accept data such as a simple sine wave, the amplitude of noises in its surroundings, or even any data curled from the internet. Its circuit mainly consists of three littleBits DC motors and a CloudBit, while the entire structure is built with cardboard boxes, straws, and toothpicks.

    Video

    plotterv

    Inspiration

    I like plotters. As a kid I enjoyed ripping off the top of a printer while it was printing and peeking into the machine as it worked. So when I saw all the motors we had, I suggested that we make our own plotter/printer thing.

    wechatimg3

    The Print Head

    We spent most of our time figuring out how to make a print head that actually works.

    At first we tried to make a conveyor-belt-like mechanism to move the pen. That didn't work because the Little Bits motors couldn't change their direction of rotation in real time. So I designed a logic circuit which splits the analog input to control the motors separately. This didn't work either, because the axles are locked even when the motors are not activated.

    I came up with the idea of a gear and a rack when I suddenly realized that the wavy layer in cardboard could serve perfectly as the teeth of the gears. We made both the gear and the rack out of peeled cardboard, and powered the gear with the only motor that can rotate both ways. It worked.

    There is a rail at the bottom of the print head, so it can slide smoothly left and right.

    wechatimg1

    ezgif-com-optimize

    The Paper Feeder

    We had a couple of ideas for how this could look. One was that the printer should have wheels like a car and drive back and forth on the paper. Later we decided to print onto motor-driven toilet paper, because it's both easy to implement and interesting as a concept.

    We made two gears, rotating in opposite directions, fixed very close to each other. Thus the paper which is placed in between them gets driven out.

    wechatimg2

     

    Littlebits Circuit

    snip20161117_15

     

    The Software

    We used a CloudBit to receive data from the computer over wifi. Basically, a Processing program gets input from the microphone and writes it in real time into a text file, which is then read in real time by an AppleScript script that uses shell commands to communicate with the CloudBit API.

    snip20161118_18

    Code

    Processing:

    import processing.sound.*;
    Amplitude amp;
    AudioIn in;
    
    void setup() {
      size(640, 360);
      background(255);
        
      amp = new Amplitude(this);
      in = new AudioIn(this, 0);
      in.start();
      amp.input(in);
    }      
    
    void draw() {
      background(255);
      float aa = min(0.99,amp.analyze()*10);
      //println(aa);
      line(0,aa*height,width,aa*height);
      saveStrings("audioin.txt", new String[]{""+aa});
    }
    
    

    AppleScript:

    set p to "/Users/lingdonghuang/Documents/Processing/audioin/audioin.txt"
    set f to 0
    set lf to 0
    set i to 0
    try
    	repeat
    		delay 0.2
    		try
    			set f to (read p) * 100
    		end try
    		set d to (f - lf) * 0.4 + 50
    		if d > 80 then
    			set d to 80
    		else if d < 20 then
    			set d to 20
    		end if
    		set d to d - 3
    		set a to do shell script "curl \"https://api-http.littlebitscloud.cc/v2/devices/243c201dc4f7/output\" -X POST -H \"Authorization: b97ba2fe26cdb3de50b2ead1c2838e0c13e244f0d628a0c5a20a8ca3d6d358ab\" -H \"Content-type: application/json\" -d '{ \"percent\": " & d & ", \"duration_ms\": " & -1 & " }'"
    		set lf to f
    		log {d}
    	end repeat
    on error
    	log ("STOP!")
    	set a to do shell script "curl \"https://api-http.littlebitscloud.cc/v2/devices/243c201dc4f7/output\" -X POST -H \"Authorization: b97ba2fe26cdb3de50b2ead1c2838e0c13e244f0d628a0c5a20a8ca3d6d358ab\" -H \"Content-type: application/json\" -d '{ \"percent\": " & 50 & ", \"duration_ms\": " & -1 & " }'"
    	
    end try
    
    

    The Box

    Made of laser-cut white and frosted plastic.

    img_4729

    Reflections

    Little Bits suck. They might sound like a nice idea, but when you actually want to make something with them, they only "kinda work". They make you want to trample on them and fling them out of the window. But in general we're quite satisfied with what we were able to achieve.

    I enjoyed the process of struggling with the littleBits and our cardboard-and-straw mechanisms. Instead of being able to make whatever I want, like when coding in Processing or Python, we had to constantly take into consideration the flakiness of the Little Bits and the straws, and challenge ourselves to come up with more and more robust and reliable solutions.

    The thing that excites me most is being able to make a working printer entirely out of trash. In the future I can probably improve the hardware and software so that it will be able to write letters and even make drawings. I’m going to publish the recipe so even beggars can own printers.

     

    img_4711


    hizlik-proposal

    “I don’t know”

    this page will be updated when I do


    kadoin-lookingOutwards08

     

     

    Adam Ben-Dror seems to have a Pixar theme going with his physical computing projects. The Abovemarine reminds me of the fish in the bags at the end of Finding Nemo, and you can't not think about Luxo the lamp when watching the Pinokio video. Despite being very Pixar, they show a fair amount of creativity and originality, especially in the documentation.

    They both use pretty simple motion tracking to make the objects move around, but those movements give them a great deal of personality. I would very much like to go on a walk down the street with Jose once he is not bound by the wires.

    I was also told Adam was an exchange student here from New Zealand so I feel like I gotta keep upping my game because of how simple but fun these concepts and executions are.

    cambu-mocap

     

    The Story

    When I was about 12, I visited the North American Veterinary Conference (NAVC) with my mom in Orlando, Florida. I was walking around the show floor with her when we decided to stop at the Bayer booth. In the middle of the booth was an original Microsoft Surface table — many people were congregating around it to see what it was all about. My mom and I played with it for a while and then she left to enjoy the rest of the conference, but I stayed in the Bayer booth for easily 3 or 4 more hours, becoming good friends with the booth attendants. I think it was the first highly responsive touch interface I'd ever used, and it played on in my dreams for weeks. When I returned home, I tried to get my dad to buy one for our house, but at the time it was ~10-15K to install and you had to be a commercial partner…

     

    Documentation

    60-212: cambu-mocap demo

     

    giphy

    process_sketching

    Code

    //include statements for the library
    import oscP5.*;
    import netP5.*;
    
    img image1; // the image object that gets moved and scaled (see the img class below)
    hand leftHand; //the object that will contain all of the leftHand Data 
    hand rightHand; //the object that will contain all of the rightHand Data
    OscP5 oscP5; //name the oscP5 object
    NetAddress serverAddress; //name the addresses you'll send and receive @
    PImage imageFill1;
    
    int listeningPort; //server and client ports
    
    float rectX = 200;
    float rectY =  200;
    float rectWidth = 350;
    float rectHeight = 250;
    
    //now set the addresses, etc
    void setup()
    {
      imageFill1 = loadImage("IMG_1087.JPG");
      //if listening and sending are the same then messages will be sent back to this sketch
      listeningPort = 12345;
      oscP5 = new OscP5(this, listeningPort);
    
      size(1200, 700);
      background(rectX, rectY, rectWidth, rectHeight);
    
      // create image object 
    
      image1 = new img(rectX, rectY, rectWidth, rectHeight);
    
      // create hand objects
      leftHand = new hand();
      rightHand = new hand();
    }
    
    void oscEvent(OscMessage receivedMessage) {
      String[] message = receivedMessage.addrPattern().split("/");
    
      //ripping out all joint:hand data
      boolean isHand = message[4].equals("HandLeft") || message[4].equals("HandRight");
      if (message[3].equals("joints") && isHand == true) {
    
        if (message[4].equals("HandLeft")) {
          float handLeftXPos = receivedMessage.get(0).floatValue();
          float handLeftYPos = receivedMessage.get(1).floatValue();
          String tracked = receivedMessage.get(3).stringValue();
    
          leftHand.updateXYC(handLeftXPos, handLeftYPos, tracked);
        }
        if (message[4].equals("HandRight")) {
          float handRightXPos = receivedMessage.get(0).floatValue();
          float handRightYPos = receivedMessage.get(1).floatValue();
          String tracked = receivedMessage.get(3).stringValue();
    
          rightHand.updateXYC(handRightXPos, handRightYPos, tracked);
        }
      }
      //ripping out all hand:closed data
      if (message[3].equals("hands")) {
        String leftOrRight = message[4];
        String grabVar = (receivedMessage.get(0).stringValue() + "/" + leftOrRight);
    
        if (grabVar.contains("Left")) {//change something about left
          if (grabVar.contains("Open")) {
            leftHand.updateIsClosed(false);
          } else {
            leftHand.updateIsClosed(true);
          }
        }
        if (grabVar.contains("Right")) {//change something about the right hand
          if (grabVar.contains("Open")) {
            rightHand.updateIsClosed(false);
          } else {
            rightHand.updateIsClosed(true);
          }
        }
      }
      //println ("rectX" + rectX);
      //println ("rectY" + rectY);
      //println ("rectWidth" + rectWidth);
      //println ("rectHeight" + rectHeight);
    }
    void hoverCheck() {
      //check if right hand is hovering over the object
      if (rightHand.xPos >= image1.xPosition && rightHand.xPos <= image1.xPosition + image1.rectWidth && rightHand.yPos >= image1.yPosition && rightHand.yPos <= image1.yPosition + image1.rectHeight) {
        //println(rightHand.xPos + " >= " + rectX + " && " + rightHand.xPos + " < = " + (rectX+rectWidth));
        image1.updateHoverState(true);
        if (rightHand.closed == true) {
          println("hoverGrab");
          image1.move(rightHand.xPos, rightHand.yPos);
          toScale();
        }
      } else {
        image1.updateHoverState(false);
      }
    }

    void toScale() {
      if (leftHand.xPos >= image1.xPosition && leftHand.xPos <= image1.xPosition + image1.rectWidth && leftHand.yPos >= image1.yPosition && leftHand.yPos <= image1.yPosition + image1.rectHeight) {
        //left hand also hovering
    
        if (leftHand.closed == true) {
          //get distance
          float rightToLeftDist = dist(rightHand.xPos, rightHand.yPos, leftHand.xPos,leftHand.yPos);
          println(rightToLeftDist);
          float scaleVar = map(rightToLeftDist, 0, 0.5*image1.rectWidth, 0, 1.5);
          image1.rectWidth = image1.rectWidth*scaleVar; 
          image1.rectHeight = image1.rectHeight*scaleVar;
          //scale by some multiplier
        }
      }
    }
    
    void draw() {
      noStroke();
      fill(255, 255, 255, 100);
      rect(0, 0, width, height);
      hoverCheck();
      //image1.render();
    
      image(imageFill1, image1.xPosition, image1.yPosition);
      imageFill1.resize(int(image1.rectWidth), int(image1.rectHeight));
      image1.render();
      scale(1);
      leftHand.render();
      rightHand.render();
    }
    class hand { //class that allows the creation of any hand method
    
      boolean closed;
      float xPos;
      float yPos;
      color fillColor;
      String trackingConfidence; //is either Tracked, Inferred, or (maybe something else)
    
      hand() {
        closed = false;
        xPos = 200;
        yPos = 200;
        fillColor = color(200, 200, 200);
      }
    
      void updateXYC(float newXPos, float newYPos, String trackedState) { // a function to update x position, y position, and tracking confidence
    
        //direct map
        //xPos = map(newXPos, -1, 1, 0, width);
        //yPos = map(newYPos, 1, -1, 0, height);
    
        //smooothed map
        //X------
        float mappedNewXPos =  map(newXPos, -1, 1, 0, width);
        //println(mappedNewXPos);
        xPos = 0.5 * xPos + 0.5 * mappedNewXPos;
        //Y------
        float mappedNewYPos =  map(newYPos, 1, -1, 0, height);
        //println(mappedNewXPos + "," + mappedNewYPos);
        yPos = 0.5 * yPos + 0.5 * mappedNewYPos; 
    
        trackingConfidence = trackedState;
      }
    
      void updateIsClosed(boolean openOrClose) {
        if (openOrClose == true) {
          fillColor = color(230, 50, 100);
          closed = true;
        } else { // open
          fillColor = color(200, 200, 200);
          closed = false;
        }
      }
    
      void render() {
        fill(fillColor);
        ellipse(xPos, yPos, 25, 25);
      }
    }
    class img {
    
      color c;
      float xPosition;
      float yPosition;
      float rectWidth;
      float rectHeight;
      boolean isHovering;
    
      img(float xPos, float yPos, float rWidth, float rHeight) {
        c = color(200, 200, 200, 0);
        xPosition = xPos;
        yPosition = yPos;
        rectWidth = rWidth;
        rectHeight = rHeight;
        isHovering = false;
      }
    
      void render() {
        fill(c);
        rect(xPosition, yPosition, rectWidth, rectHeight);
      }
    
      void updateHoverState(boolean hoverState) {
        isHovering = hoverState;
        if (isHovering) {
          c = color(245, 50, 100, 50);
        } else {
          c = color(245, 50, 100, 0);
        }
      }
    
      void move(float x, float y) {
        
        //xPosition = xPosition + deltaX;
        //yPosition = yPosition + deltaY;
        xPosition = x-rectWidth/2;
        yPosition = y-rectHeight/2;
      }
    }
    

    Xastol – Proposal

    For my final project, I want to revisit generative works. Specifically, I want to create a program that randomly generates movie concepts. I am currently researching machine learning algorithms and searching for a large movie database, as these will be imperative to my program.
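
    As a placeholder for the eventual machine-learning approach, here is a very low-tech Processing sketch: an order-1 word-level Markov chain trained on a few invented loglines (not a real movie database or ML model).

    import java.util.HashMap;
    import java.util.ArrayList;

    HashMap<String, ArrayList<String>> transitions = new HashMap<String, ArrayList<String>>();

    String[] loglines = {
      "a retired detective returns to a small town to solve one last case",
      "a young chef discovers a recipe that can rewrite the past",
      "a crew of strangers wakes up on a ship with no memory of the mission"
    };

    // record which word follows which
    void train(String line) {
      String[] words = split(line, ' ');
      for (int i = 0; i < words.length - 1; i++) {
        if (!transitions.containsKey(words[i])) transitions.put(words[i], new ArrayList<String>());
        transitions.get(words[i]).add(words[i + 1]);
      }
    }

    // walk the chain from a starting word
    String generate(String start, int maxWords) {
      String word = start;
      String out = start;
      for (int i = 0; i < maxWords && transitions.containsKey(word); i++) {
        ArrayList<String> next = transitions.get(word);
        word = next.get(int(random(next.size())));
        out += " " + word;
      }
      return out;
    }

    void setup() {
      for (String l : loglines) train(l);
      for (int i = 0; i < 5; i++) println(generate("a", 14));
    }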


    Krawleb-Mocap

    (Images of the duplicated-limb skeletons, with captions:)

    • extra right arm on right knee
    • extra set of arms on hips
    • extra set of legs attached at elbows
    • extra left arm on right hand and right arm on left hand
    • extra right arm on left knee and left arm on right knee
    • 'antler arms': extra arms attached at the hand
    • mouth grabbers: extra arms attached at the face

    This project was a collaboration with the wonderful Kadoin, in which we explored the idea of arbitrary limb duplication to create strange—sometimes horrific—semi-human skeletons.

    By imagining the skeleton as a collection of bones, we looked at what could happen if skeletons could augment themselves by collecting more bones, adding extra limbs, and building new bodies.

    The frankensteined results are uncanny but familiar forms that make us wonder about what each of these creatures might do with those extra limbs, how they might walk and interact, and what their unique structure allows them to do.

    This project was created in processing, using sample code from Golan’s BVH player. We ran into an unfortunate heap of technical trouble when executing this rather conceptually simple project, which caused bugs with rendering anything other than lines for the limbs, as well as being unable to render the terminals of the duplicated bones.

    Ideally, we would have loved to attach more fleshy, deforming geometry to the skeletons, and refine the visual presentation beyond just wireframes, but were satisfied with how compelling the simple output was.

    Github here

    Processing BvhBone Class Below:

    (majority of the custom code is here)

    import java.util.List;
    
    class PBvh
    {
      BvhParser parser;  
      PBvh(String[] data) {
        parser = new BvhParser();
        parser.init();
        parser.parse( data );
      }
      void update( int ms ) {
        parser.moveMsTo( ms );//30-sec loop 
        parser.update();
      }
      void drawBones(boolean renderExtras) {
        noFill(); 
        stroke(255); 
        strokeWeight(2);
    
          List<BvhBone> theBvhBones = parser.getBones();
          int nBones = theBvhBones.size();       // How many bones are there?
          BvhBone aBone;
    
          /////////MAIN BONE LOOP/////////
          for (int i=0; i<nBones; i++) {         // Loop over all the bones
          
            PVector anchorTranslation = new PVector (0,0,0);
            pushMatrix();
            
            aBone = theBvhBones.get(i);
            
            /////////////////////////////////////////////////////////////////
            //Manual Duplicated adding
            if (aBone.getName().equals("LeftForeArm") && renderExtras == true) {
              aBone.duplicates.add("Head");
              
              stroke(255);
            
              //draw root bone in original position
              line(
                aBone.absPos.x, 
                aBone.absPos.y, 
                aBone.absPos.z, 
                aBone.getChildren().get(0).absPos.x, 
                aBone.getChildren().get(0).absPos.y, 
                aBone.getChildren().get(0).absPos.z);
    
              // Look through duplicates array, find the matching translation Vectors (where to attach the duplicated limb)
              for (String dupe : aBone.duplicates) {
                for (int l = 0; l < theBvhBones.size(); l++) {
                  if (theBvhBones.get(l)._name.equals(dupe))
                  {
                    //then, save the translation in preparation for drawing duplicate
                    anchorTranslation = new PVector(theBvhBones.get(l).absPos.x, theBvhBones.get(l).absPos.y, theBvhBones.get(l).absPos.z);
                  }//end if
                }//end the for loop
              }//end for dupe
    
              BvhBone currentBone = aBone;
              float modifier = 4.0;
              translate(-currentBone.absPos.x,-currentBone.absPos.y,-currentBone.absPos.z);
              while (currentBone.hasChildren()) {
            List<BvhBone> currentChildren = currentBone.getChildren();
    
                for (int j = 0; j < currentChildren.size(); j++) {
                  pushMatrix();
                  translate(anchorTranslation.x,anchorTranslation.y,anchorTranslation.z);
                  
                  line(
                    currentBone.absPos.x, 
                    currentBone.absPos.y, 
                    currentBone.absPos.z, 
                    currentChildren.get(j).absPos.x, 
                    currentChildren.get(j).absPos.y, 
                    currentChildren.get(j).absPos.z);
    
                  println(currentBone);
                  println(currentChildren.size());
                  println(currentChildren.get(0));
                  println("--------");
    
              List<BvhBone> grandchildren = currentChildren.get(j).getChildren();
    
                  for (int k = 0; k < grandchildren.size(); k++) {
    
                    //line(
                    //  currentChildren.get(j).absEndPos.x*0, 
                    //  currentChildren.get(j).absEndPos.y*0, 
                    //  currentChildren.get(j).absEndPos.z*0, 
                    //  grandchildren.get(0).absPos.x, 
                    //  grandchildren.get(0).absPos.y, 
                    //  grandchildren.get(0).absPos.z);
                  }//end grandchildren for
                  popMatrix();
                }//end current children for
    
                BvhBone nextBone = currentChildren.get(0);
                currentBone = nextBone;
                
              }//end of while loop
            }//end specific bone if
            /////////////////////////////////////////////////////////////////
            
            /////////////////////////////////////////////////////////////////
            //Manual Duplicated adding
            if (aBone.getName().equals("RightForeArm") && renderExtras == true) {
              aBone.duplicates.add("Head");
              
              stroke(255);
            
              //draw root bone in original position
              line(
                aBone.absPos.x, 
                aBone.absPos.y, 
                aBone.absPos.z, 
                aBone.getChildren().get(0).absPos.x, 
                aBone.getChildren().get(0).absPos.y, 
                aBone.getChildren().get(0).absPos.z);
    
              // Look through duplicates array, find the matching translation Vectors (where to attach the duplicated limb)
              for (String dupe : aBone.duplicates) {
                for (int l = 0; l < theBvhBones.size(); l++) {
                  if (theBvhBones.get(l)._name.equals(dupe))
                  {
                    //then, save the translation in preparation for drawing duplicate
                    anchorTranslation = new PVector(theBvhBones.get(l).absPos.x, theBvhBones.get(l).absPos.y, theBvhBones.get(l).absPos.z);
                  }//end if
                }//end the for loop
              }//end for dupe
    
              BvhBone currentBone = aBone;
              float modifier = 4.0;
              translate(-currentBone.absPos.x,-currentBone.absPos.y,-currentBone.absPos.z);
              while (currentBone.hasChildren()) {
            List<BvhBone> currentChildren = currentBone.getChildren();
    
                for (int j = 0; j < currentChildren.size(); j++) {
                  pushMatrix();
                  translate(anchorTranslation.x,anchorTranslation.y,anchorTranslation.z);
                  
                  line(
                    currentBone.absPos.x, 
                    currentBone.absPos.y, 
                    currentBone.absPos.z, 
                    currentChildren.get(j).absPos.x, 
                    currentChildren.get(j).absPos.y, 
                    currentChildren.get(j).absPos.z);
    
                  println(currentBone);
                  println(currentChildren.size());
                  println(currentChildren.get(0));
                  println("--------");
    
              List<BvhBone> grandchildren = currentChildren.get(j).getChildren();
    
                  for (int k = 0; k < grandchildren.size(); k++) {
    
                    //line(
                    //  currentChildren.get(j).absEndPos.x*0, 
                    //  currentChildren.get(j).absEndPos.y*0, 
                    //  currentChildren.get(j).absEndPos.z*0, 
                    //  grandchildren.get(0).absPos.x, 
                    //  grandchildren.get(0).absPos.y, 
                    //  grandchildren.get(0).absPos.z);
                  }//end grandchildren for
                  popMatrix();
                }//end current children for
    
                BvhBone nextBone = currentChildren.get(0);
                currentBone = nextBone;
                
              }//end of while loop
            }//end specific bone if
            /////////////////////////////////////////////////////////////////
    
       
            ////////////////////////////////STUFF THAT DRAWS THE ORIGINAL SKELETON/////////////////////////////////
    
            PVector boneCoord0 = aBone.absPos;   // Get its start point
            float x0 = boneCoord0.x;             // Get the (x,y,z) values 
            float y0 = boneCoord0.y;             // of its start point
            float z0 = boneCoord0.z;
    
            if (aBone.hasChildren()) {
              println(aBone);
               
              // If this bone has children,
              // draw a line from this bone to each of its children
          List<BvhBone> childBvhBones = aBone.getChildren();
              int nChildren = childBvhBones.size();
              for (int j=0; j<nChildren; j++) {
                BvhBone aChildBone = childBvhBones.get(j);
                PVector boneCoord1 = aChildBone.absPos;
    
                float x1 = boneCoord1.x;
                float y1 = boneCoord1.y;
                float z1 = boneCoord1.z;
    
    
                line(x0, y0, z0, x1, y1, z1);
                
                
              }//end if children loop
            } else {
              // Otherwise, if this bone has no children (it's a terminus)
              // then draw it differently. 
    
              PVector boneCoord1 = aBone.absEndPos;  // Get its start point
              float x1 = boneCoord1.x;
              float y1 = boneCoord1.y;
              float z1 = boneCoord1.z;
    
              line(x0, y0, z0, x1, y1, z1);
    
              String boneName = aBone.getName(); 
              if (boneName.equals("Head")) { 
                pushMatrix();
                translate( x1, y1, z1);
                ellipse(0, 0, 30, 30);
                popMatrix();
              } //end if head
            } //end else
            popMatrix();
          }//end loop over all bones
        } //end drawbones
      } //end class BVH
    
    