
Looking Outwards

Eunoia by Lisa Park

Eunoia collects the artist’s brain activity with an EEG sensor and uses it to vibrate five bowls of water. Using Processing and MaxMSP, each bowl receives one of the frequency bands of her brain activity (Alpha, Beta, Delta, Gamma, Theta) along with her eye movement. The piece visualizes data that is usually difficult for the average person to grasp.
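
The pipeline is roughly: EEG band levels in, one tone per bowl out, with louder tones shaking the water harder. Below is a minimal sketch of that mapping in Processing, assuming the Processing Sound library; the five band values are placeholders (Perlin noise) standing in for the Alpha/Beta/Delta/Gamma/Theta levels the real piece receives from the EEG headset and MaxMSP, and the bowl frequencies are arbitrary.

```
// Minimal sketch: placeholder "EEG band" levels drive the loudness of five tones,
// one per bowl. Requires the Processing Sound library.
import processing.sound.*;

SinOsc[] bowls = new SinOsc[5];
float[] bandLevel = new float[5];            // normalized 0..1 band levels (placeholders)
float[] bowlFreq  = {60, 80, 100, 120, 140}; // one low tone per bowl (arbitrary choices)

void setup() {
  size(400, 100);
  for (int i = 0; i < 5; i++) {
    bowls[i] = new SinOsc(this);
    bowls[i].freq(bowlFreq[i]);
    bowls[i].play();
  }
}

void draw() {
  background(0);
  for (int i = 0; i < 5; i++) {
    // Stand-in signal: in the real piece this would come from the EEG stream.
    bandLevel[i] = noise(frameCount * 0.01 + i * 10);
    bowls[i].amp(bandLevel[i] * 0.2);        // louder tone -> stronger water vibration
    rect(i * 80 + 10, 90 - bandLevel[i] * 80, 60, bandLevel[i] * 80);
  }
}
```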

This is the data she is using, shown as a graph, and it seems incomprehensible.

Elements by Britzpetermann

Put simply, Elements is an interactive walkway: one of four elements builds up wherever the subject puts their hands. Because of its simplicity, the experience of Elements relies heavily on its visuals, and the graphics aren’t as natural-looking as they could be. Looking at Elements, you are very aware that the images are computationally generated. It is very difficult to re-create something as magnificent and beautiful as natural fire and sand dunes without seeming lame compared to the real thing.

Equilibrium by Memo Akten

Like Elements, Equilibrium becomes disturbed when touched and settles to a calmer state when left alone, except with this project the visuals are stellar. It is very complex, but soothing at the same time, all in very good taste. Interestingly, I read on Memo Akten’s website that it was the wild landscapes of Madagascar which served as inspiration for this project, which adds a whole new frame of reference. Equilibrium creates its own platform by becoming something that is incomparable to anything in nature, yet still reflects natural behavior.

Looking Outwards: Final Project

(apologies for posting this so late)

For the past year or so, I’ve been very interested in surveillance conducted by machines using hidden/mysterious/proprietary algorithms and giant databases. These three projects are very relevant to this idea and gave me a lot of inspiration for my final ‘panopticon’ project.


“Data Masks” by Sterling Crispin, 2013-present

“Sterling Crispin’s “Data Masks” use raw data to show how technology perceives humanity…Reverse-engineered from surveillance face-recognition algorithms and then fed through Facebook’s face-detection software, the Data Masks confront viewers with the realization that they’re being seen and watched basically all the time.”


“Stranger Visions” by Heather Dewey-Hagborg, 2012-2013

“In “Stranger Visions”, Heather Dewey-Hagborg analyses DNA from found cigarette butts, chewed gum and stray hairs to generate portraits of each subject based on their genetic data…While not so exact as to readily identify an individual, the portraits demonstrate the disquieting amount of information that can be derived from a single strand of a stranger’s hair and the disturbing potential for surveillance of our most personal information.”


“CAPTCHA Tweet” by Shin Seung Back and Kim Yong Hun, 2013

“‘CAPTCHA Tweet’ is an application that users can post tweets as CAPTCHA. Since computers can hardly read it, humans can communicate behind their sight.”

LO & Final Project

I’ve recently been interested in the text-based gaming revival started by Porpentine, who works in Twine, a platform where anyone is able to create their own text-based video game (some examples: http://aliendovecote.com/intfic.html). The games sit on the border between an immersive story/plot and contemporary poetic elements.


After thinking about this, I was inspired to create a sort of facial simplification via prose: software that would recognize elements of your face (level of the eyebrows, face structure, expression, facial direction, colour of the shirt below the face) and then give the participant a small snippet of prose based on those elements.
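
As a rough sketch of how that could start (not the finished piece), the OpenCV for Processing library can find a face in the webcam image, and a crude, hypothetical mapping from the face’s position and size can pick a snippet of prose; the finer features listed above (eyebrow level, expression, shirt colour) would need something like FaceOSC and are not handled here.

```
// Rough sketch: detect a face, then choose a prose snippet from a crude mapping.
// Requires the Processing Video and "OpenCV for Processing" libraries.
import gab.opencv.*;
import processing.video.*;
import java.awt.Rectangle;

Capture cam;
OpenCV opencv;
String[] snippets = {
  "You lean in close, as if the room owed you an answer.",
  "Something about the light today makes you look unfinished.",
  "You hold still the way people do when they are deciding."
};

void setup() {
  size(640, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
  opencv = new OpenCV(this, 640, 480);
  opencv.loadCascade(OpenCV.CASCADE_FRONTALFACE);
}

void draw() {
  if (cam.available()) cam.read();
  opencv.loadImage(cam);
  image(cam, 0, 0);

  Rectangle[] faces = opencv.detect();
  if (faces.length > 0) {
    Rectangle f = faces[0];
    noFill();
    stroke(0, 255, 0);
    rect(f.x, f.y, f.width, f.height);
    // Crude mapping: which snippet you get depends on where your face sits and
    // how large it appears (a placeholder for the richer features listed above).
    int which = (f.x + f.width) % snippets.length;
    fill(255);
    text(snippets[which], 10, height - 20);
  }
}
```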

Looking Outward Final Project

“Nomis is a musical instrument created with the aim of making loop based music more expressive and transparent through gesture and light.” For this project, the artist, Jonathan Sparks, uses MaxMSP to allow a viewer to become a music creator by using their hands to activate certain positions on a circle and two vertical columns.

“Voice Lessons is an electronic, audio device that interrogates the popular myth that every musical instrument imitates the human voice. Touching the screen allows the participant to manipulate the visuals and vocalizations of the ‘voice teacher’ as he recites vocal warm-up exercises.” This piece, even though we have already viewed it in class, is important to my final project because it engages the viewer’s hand movement in controlling different aspects of the sound of the video. These aspects include speed, pitch, and vowel sound. However, it is not as successful for me because the viewer is forced to engage with a screen and does not have quite as free a range of motion as I would like my piece to have.

“Move is a technology garment designed by Electricfoxy that guides you toward optimal performance and precision in movement in an ambient, precise, and beautiful way.” I chose this piece because I enjoyed how it is a wearable object, similar to the glove I plan on using, that tracks the performer’s movements and visually displays them. Through these displays, the performer is also able to see improvements the clothes calculate could be made in order to add precision to the performer’s form.

All of these projects are interesting to me because they actively engage the viewer and require a gesture or movement of their body in order to create a sound. This translation from a tactile sense to an auditory/visual sense is a theme I am looking for in my final project.


Looking Outwards Final Project

For my final project I’m not exactly sure what I want to do. I only know that I want to work with sound. Without getting my expectations too high, I’d like to attempt to create a work which reacts to the conversations and interactions in a room. The projects I have selected each react to information presented to them: some to sound and some to visuals.

Conversnitch – Conversnitch by Kyle McDonald is a device which listens to conversations around it, secretly uploads them to Mechanical Turk to be transcribed, and then tweets to the world what was supposed to be a private conversation. The integration of turking into this project is extremely interesting in that it is very difficult for computers themselves to transcribe audio. Integrating a “silent human” element into the work is extremely powerful because it makes the process still seem automated even though a majority of the difficult work is done by humans.

Conversnitch from Kyle McDonald on Vimeo.

Descriptive Camera – Descriptive Camera by Matt Richardson is a device which snaps an image of an area and, rather than output that image, “develops” it into a description of the scene in words. This project also uses Mechanical Turk to transcribe information, but what’s most interesting about it is that it changes what we expect. When a photo is taken, we as digital individuals expect a lasting snapshot, and when we are returned a description we are both jarred and freed. Freed in the sense that we can now use this information as we wish.

Giver Of Names – Giver of Names by David Rokeby literally gives names to objects which are placed in its viewpoint. What intrigues me about this piece is that the computer is actively attempting to describe what’s in front of it with a name. It is immediately responding to information presented to it and then allowing the participants in the room to know its interpretation. I hope to achieve this in my final project.

The Giver of Names from David Rokeby on Vimeo.

Looking Outwards – Projection Art

Lit Tree by Kimchi and Chips

Kimchi and Chips map the projectable surfaces of a tree in order to project onto the branches, stems, and leaves. In this way they are using the tree to create a 3D voxel display. I like this because it is an unconventional way to display a 3D image. It relates to my project in that I plan to work with projectors and 3D imagery.

Lighting of the Sails by URBANSCREEN

In this project the group URBANSCREEN projected onto the Sydney Opera House. I liked it because they used imagery which caused you to question the shape of the opera house, particularly when it appeared to be sails blowing in the wind. This is relevant to the work I’d like to do because I plan to work with projection.

Box by Bot & Dolly

Look Here.

BOX is interesting because it looks like magic. The camera movement is well planned and works perfectly with the movement of the screens and the projections. While watching, I forgot at times that I was seeing a screen and not a moving box. This is relevant to my work because I would like people to watch a projection I made and forget that they are looking at something formed from technology.


Looking Outwards

RUAH by Giulia Tomasello

RUAH by Giulia Tomasello reacts to human breathing and makes people think about diaphragmatic breathing and how important taking time is in modern society. The corset helps the wearer perform slow, deep diaphragmatic breathing, taking time to relax and providing a chance to reflect on oneself. Her work consists of muscle wire and a LilyPad Arduino. I think her work is meaningful given the busy lifestyle people have, and also since it’s finals week.

Her work is wearable, which relates to my final idea, and it is inspired by nature, which I can draw a lot from. I want to investigate the use of muscle wire and the movements created by the Arduino, since in the video it looks as if an actual organ is moving rather than something mechanical.

Blog Post

Wooden Mirror

This interactive artwork, Wooden Mirror (by Daniel Rozin), uses a camera whose image drives motors that tilt blocks of wood so that the camera’s image reappears on the face of the mirror. The program reduces the camera’s pixels to large blocks, and each block’s brightness sets the angle of its motor; as the angle changes, the amount of light a single wood piece catches changes, and with it the apparent shade of the wood.
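
A minimal sketch of that brightness-to-tilt mapping, assuming a webcam and the Processing Video library: it downsamples each frame into a grid of “blocks” and converts each block’s brightness into a tilt angle, previewing the result on screen instead of driving physical motors.

```
// Minimal sketch: downsample the camera into blocks, map each block's
// brightness to a tilt angle, and preview the grid (no motors are driven).
import processing.video.*;

Capture cam;
int cols = 30, rows = 24;   // number of "wood blocks"

void setup() {
  size(600, 480);
  cam = new Capture(this, 640, 480);
  cam.start();
}

void draw() {
  if (cam.available()) cam.read();
  cam.loadPixels();
  if (cam.pixels.length == 0) return;

  float cw = width / (float) cols, ch = height / (float) rows;
  for (int y = 0; y < rows; y++) {
    for (int x = 0; x < cols; x++) {
      // Sample one pixel per block (a real version would average the block).
      int px = int(map(x, 0, cols, 0, cam.width));
      int py = int(map(y, 0, rows, 0, cam.height));
      float b = brightness(cam.pixels[py * cam.width + px]);
      // Brighter pixel -> steeper tilt toward the light; here the angle is
      // just rendered as a gray level instead of being sent to a servo.
      float tilt = map(b, 0, 255, 0, 90);
      fill(map(tilt, 0, 90, 30, 230));
      rect(x * cw, y * ch, cw, ch);
    }
  }
}
```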

I think this project helped me develop my second idea of simulated waves that change color or reflection according to the movement a person makes. Right now I’m thinking of just an interaction, since making a mirror-like project is out of the range of what I can do, but this project made me think more about the movement of one piece making up the whole.

Clothes for Orciny by Lea Albaugh

This project is a set of clothes that change as the person wearing them walks across a border. This simple interaction inspired me to develop my own ideas for clothing, as well as some conceptual ideas about how to make my design work.


Looking Outwards: THE FINAL PROJECT OF ULTIMATE DESTINY

Believe it or not, I had a really silly amount of trouble finding any large-scale installations or no-expenses-spared versions of what I’m aiming to do. There are, however, a lot of rough built-in-mom’s-basement versions on YouTube, complete with horrible pop soundtracks playing in the background like so much elevator muzak.

This first project is along the same lines as what I’m up to; it’s a set of LED lights that respond to the music in much the same way that an equalizer does:

Second, we have someone working with a whole lot of LEDs at once; they’ve made a full equalizer as well as various patterns that seem to move in time with the music:

Lastly, we’ve got this guy giving a full video tutorial of his project, including the code. What he’s done, essentially, is pretty much exactly what I want to do, only on a much smaller scale: he’s got three large LEDs, one responding to each of the low, middle, and high frequency bands. As far as mine goes, I’m mostly just doing this with more lights and more colors, but we’ll see how advanced I can manage. Anyway, here’s his video:

In addition to these, there are a lot of really nice little tutorials out there that have either elements of, or a full version of, what I’m trying to do, so I’m pulling together a decent idea of the work I have cut out for me.
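
For reference, here is a hedged sketch of the three-band idea in Processing rather than on the Arduino itself, assuming the Processing Sound library and a microphone or line input; a real build would send the low/mid/high levels over serial to an Arduino driving the LEDs, and the bin ranges and gain below are guesses.

```
// Hedged sketch: split the live audio spectrum into low/mid/high bands and
// light three on-screen "LEDs" from the band levels. Requires the Sound library.
import processing.sound.*;

AudioIn in;
FFT fft;
int bands = 256;
float[] spectrum = new float[bands];

void setup() {
  size(360, 160);
  in = new AudioIn(this, 0);
  in.start();
  fft = new FFT(this, bands);
  fft.input(in);
}

float bandSum(int from, int to) {
  float s = 0;
  for (int i = from; i < to; i++) s += spectrum[i];
  return s / (to - from);
}

void draw() {
  background(0);
  fft.analyze(spectrum);
  // Rough low / mid / high split (the exact bin ranges are a guess).
  float low  = bandSum(1, 10);
  float mid  = bandSum(10, 60);
  float high = bandSum(60, bands);
  float[] levels = {low, mid, high};
  for (int i = 0; i < 3; i++) {
    // Each circle stands in for one LED; brightness follows its band level.
    // The gain factor is arbitrary and would need tuning for a real input.
    fill(constrain(levels[i] * 4000, 0, 255));
    ellipse(60 + i * 120, 80, 80, 80);
  }
}
```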

Looking Outwards Assignment 13

Poppy is an open-source 3-ft humanoid robot and the main inspiration for my final project design-wise.  It was created by the Flowers Lab in Bordeaux, France for use in both the arts and sciences, and as such it uses an Arduino, a Raspberry Pi, and other relatively easy-to-use components.  However, there are some distinct differences that make Poppy quite different from my design.  Although Poppy is designed as a more affordable medium-size humanoid robot than its competitors, it is still quite pricey: roughly $11,000 for all the parts, not assembled.  While I wish to have the same form factor in my design, I simply do not have the budget to recreate it exactly.  The main difference is the servomotors – Poppy uses expensive Dynamixel servos ($200 per motor), and I use cheaper hobby servos ($11 per motor).  Also, the parts for Poppy are all 3D-printed; I used laser-cut acrylic, as I did not have the time to 3D-print all the components, which would take well over 80 hours on a standard MakerBot.  Despite this, it was very useful for me to study all the joints and the form of this robot.

RoboThespian is a large humanoid robot created by Engineered Arts Ltd.  I think this project is very similar to mine in that both robots are designed to emulate humans: RoboThespian an actor on a stage, and mine a student in a classroom.  Another similarity is that both robots use LCD screens as a method of displaying a wide range of emotions.  I cannot say whether this robot’s motions are pre-animated or whether it can generate behaviors in real time.  It is interesting to ask when a robot becomes more than a mere animatronic and is actually an interactive machine, which is what most people imagine when they think of robots.

Pepper is an expressive large humanoid robot created by Aldebaran Robotics, and it begins to cross into the realm of commercial robots.  Reportedly costing around $2,000 (although it is not for sale yet, this is very cheap for the robot’s size and functionality), the robot is designed to act as a store guide.  It has already taken on some roles advertising Nescafe machines and selling mobile devices in a phone store in Japan.  In commercial environments, it can be quite tricky to match large humanoid robots such as these to practical uses that justify the asking price.  Despite the fact that it may be a glimpse into a possible future where humans and humanoid robots coexist abundantly, one wouldn’t be too wrong to think of Pepper as nothing more than a giant toy.

MAJ: Looking Outwards #7

Admiration: Puppet Parade

Puppet Parade, by Emily Gobeille and Theo Watson of Design I/O, is an interactive installation that uses arm motions to puppeteer giant projected creatures. This project uses openFrameworks 007, an infra-red camera, and two Kinects to track motions and translate them into visuals. Puppet Parade was featured at the 2011 Cinekid festival.

I enjoy the bright, gaudy-yet-simplistic visuals that characterize Puppet Parade. I’m particularly impressed with how simple hand-motions create such a complex environment, and would be interested to see the process of fine-tuning the projected outputs of these simple inputs.

For more info on Puppet Parade, click here. For more info on Design I/O, click here. For more on Cinekid, click here.

Surprise: The Treachery of Sanctuary

The Treachery of Sanctuary, conceived and directed by Chris Milk, is an interactive triptych inspired by the cave drawings on the walls of Lascaux. From left to right, each screen represents birth, death, and regeneration. Infra-red sensors and Kinect cameras are used to sense participants. The Treachery of Sanctuary made its debut at The Creators Project: San Francisco 2012.

I’m impressed by the visual effects The Treachery of Sanctuary utilizes. The fluidity of the bird’s wings is quite striking, and I imagine how harrowing it must feel to become the subject of such projections.

For more on The Treachery of Sanctuary, click here. For more on Chris Milk, click here. For more on The Creators Project, click here.

What Could Have Been: Bird on a Wire

Bird on a Wire was created by Ben Light, Christie Leece, Inessah Selditz, and Matt Richardson, for a Master’s course at NYU’s Interactive Telecommunications Program (ITP). The birds animate when a specific phone-number is called.

I like the playful interaction Bird on a Wire has the potential to inspire, but I do have a nitpick: the flight of the birds is not fluid. I think more variation in the birds’ flight paths and animations would give this installation the finishing touch it needs, although I understand such an addition is not critical to the success of the project itself.

For more on Bird on a Wire, click here. For more on Ben Light, click here.  For more on Christie Leece, click here. For more on Inessah Selditz, click here. For more on Matt Richardson, click here.