Similar to the BeatBearing project I chose last time, this is an Arduino-based project that produces music based on user input. It is called Arduinome because it is based on a previous project called "64 Monome," which has been recreated using the Arduino. Simply put, buttons toggle on or off when they are pressed, and the row of a lit button determines which sound is output when the timer sweeps across its column. I think the design is very simple and, in the same way as BeatBearing, will continue to produce the beat after it is determined.
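The grid logic described above can be sketched as a small simulation. This is a hypothetical 8x8 layout with made-up function names, not code from the actual Arduinome firmware:

```python
# Minimal simulation of a monome-style step sequencer: buttons toggle,
# and each timer step reports which rows (sounds) fire in that column.
# Grid size and names are assumptions for illustration.

ROWS, COLS = 8, 8

def make_grid():
    """Grid of toggle states: grid[row][col] is True when that button is lit."""
    return [[False] * COLS for _ in range(ROWS)]

def toggle(grid, row, col):
    """Pressing a button flips it on or off."""
    grid[row][col] = not grid[row][col]

def step(grid, col):
    """When the timer reaches a column, return the rows whose sounds should play."""
    return [row for row in range(ROWS) if grid[row][col]]
```

Because the grid state persists between sweeps, the pattern keeps playing until a button is toggled off, which matches the "beat continues after it is determined" behavior.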
Using Control and OSC, this project is able to control the display on the glass block window using an Android tablet. I think the ability to use pre-recorded displays and also allow the user to draw the desired output is a cool concept, and the interface is so simple that it makes it easy for anyone to draw with it. Comparing this project with the Arduinome, the grid interface is very similar, yet one produces a visual display and the other an audio display. Perhaps these works could be combined together in some way. What is interesting about this project (in addition to its use of the tablet interaction) is that the person who made the OSC interaction seemed to be building off code that was already out there, and simply updating it for this interactive use. It comes from "Hive13," which is a Cincinnati-based "Hackerspace" community.
This project also seemed relevant to what we have been talking about in terms of the various uses for the Arduino. Using the Wi-Fi shield for the Arduino, the "Beeri" device checks Twitter for any new posts containing the word "pour." If any are found, the car is activated to run, driving into the spike on the wall and pouring the beer into a conveniently placed cup. When multiple pour commands are found, the Beeri "pours drinks for all" according to the description, while in reality it freaks out and drives back and forth. While this part isn't quite worked out (and the whole project doesn't quite seem to make getting a beer more convenient), it is definitely a cool idea.
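The polling step could look something like the sketch below. This is a guess at the logic, not the real firmware; the actual device does this on the Arduino via the Wi-Fi shield, and the function name and post format here are invented:

```python
# Hypothetical sketch of Beeri's Twitter-polling logic: count unseen posts
# containing "pour" so the car knows how many pour runs to perform.
# Post structure (dicts with "id" and "text") is an assumption.

def count_pour_commands(posts, seen_ids):
    """Return how many new posts mention 'pour', marking them as seen."""
    new = [p for p in posts
           if p["id"] not in seen_ids and "pour" in p["text"].lower()]
    seen_ids.update(p["id"] for p in new)
    return len(new)
```

Tracking already-seen post IDs is what keeps the device from re-pouring on the same tweet every poll; the "freaks out" behavior suggests the multiple-command case isn't queued this cleanly in practice.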
I was inspired to look up Lee Byron after we saw his clock in class. On his website he also has a series of three Processing exercises he did. They all initially appeared to be quite simple, but they require viewer interaction for their true complexity to be revealed. My favorite is his project called "Pastures," in which the viewer creates creatures by clicking the mouse. The viewer chooses not only where the creature is created, but what the creature will be: a different creature is created depending on how long you hold your mouse down. I enjoy how the program hides its complexity and thereby forces interaction.
Touch and Reaction – Lee Byron
I found this project really interesting and inspiring. The piece explores the gestural reaction to texture. Byron uses an IR camera to capture the viewer's hand movements as they interact with a mystery texture box. The box is designed with light surrounding the hole in which the viewer is to place their hand, so that they cannot see inside. Byron presents the footage of each of the hands in a grid style, so you can see different hands interacting with different textures. Underneath each texture there are also continuously changing one-word descriptors, drawn from audio recordings of the participants during the interaction. I think seeing the juxtaposition between different gestures and words is a good combination. I wish there were an audio component.
Future Fragments – Kyle McDonald
I went to Art and Code this weekend, so there is really way too much to write about. I am just going to write about one specific project from one artist who attended. I really liked Kyle McDonald's work, so I looked up his website. Although he has done many more impressive projects, I thoroughly enjoyed Future Fragments. He describes the project as an "anti-time-capsule." He encoded sound bites from his classmates and inscribed them as color on little pieces of paper. He then had those same individuals carry around those slips of paper for a summer. Some of the slips were lost and some were damaged. I assume, although it is not mentioned, that he re-translated these bits into sound. I'd be interested to see a sound and video piece made based on this project.
The Sonic Body, by artists Harry Neve, Thomas Michalak, and Anna Orliac, is an installation that on the outside looks like a neutral cylindrical shape. On the inside, however, is an interactive, tactile installation composed of fabric forms that mimic organs in the human body. As visitors enter the installation and interact with the fabric forms, they trigger a symphony of sounds that have been recorded from the human body. The Sonic Body was made using Arduinos to trigger the sounds.
NTQ (Near Tag Quality)
NTQ, created by the French branch of GRL (Graffiti Research Lab), is a DIY graffiti printer. Comprised of an Arduino, spray cans, and solenoids, NTQ prints messages that can be programmed into the Arduino board onto a variety of surfaces, including walls, cars, and paper. GRL's goal with this project was to open-source these graffiti printers so that others may be able to build their own (all the parts total about 200 euros).
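One way a message could become solenoid firing patterns is a column-by-column bitmap font, where each bit selects one spray can. The font data, can count, and names below are entirely invented for illustration and are not from GRL's actual code:

```python
# Illustrative sketch: turn a programmed message into per-column lists of
# which solenoid-driven spray cans should fire. Assumes 5 cans stacked
# vertically; the two-letter "font" here is made up.

NUM_CANS = 5

FONT = {
    "H": [0b11111, 0b00100, 0b00100, 0b11111],  # columns of bits, LSB = bottom can
    "I": [0b10001, 0b11111, 0b10001],
}

def columns_for(message):
    """Return, for each printed column, the indices of cans that fire."""
    cols = []
    for ch in message:
        for bits in FONT.get(ch, []):
            cols.append([can for can in range(NUM_CANS) if bits >> can & 1])
        cols.append([])  # blank column between characters
    return cols
```

On the device itself, each column list would translate into briefly energizing the corresponding solenoids as the rig moves sideways across the wall.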
Kulbuto, by Émile Sacré, is both an installation and an instrument. The work explores what Sacré calls non-uniform compositions by providing visualizations whose "changing rates determined by graphic collisions" influence the rhythmic cycles. Over time these rhythmic cycles, which are created by tensions in the visualizations, become synchronized and unsynchronized.
"Moony," an interactive installation created by Takehisa Mashimo, Akio Kamisato, and Satoshi Shibata.
This artwork is a challenge to create a marginal space between virtual images and physical objects. When visitors look into the vapor screen, they can see animated butterflies inside. If a visitor tries to touch them, the butterflies swarm around or escape beside the visitor's hand. But of course, it is impossible to touch them physically. Visitors experience a mystical phenomenon between real and unreal.
An interactive installation experimenting with illusion and perception. The classic LED screen as a medium was simulated and disintegrated by the creation of a pixel-like LED optic with the ability to change and transform with the viewer's movement and, hence, their point of view.
Eggert's often kinetic installations invite viewers to physically participate with her in "puzzling out" themes of time, change, and language. Humming behind much of her recent work are motors, clockworks, and motion sensors that track the proximity of viewers, changing aspects of the work as people approach.
The Creators, by students at the University of Sydney, have created a work in which viewer interaction is multidimensional. The idea that the viewer’s existence has both a direct and indirect relationship to the images on screen lends itself to a much greater notion of the impact a human has on the rest of the world. As stated, the amount of viewer interaction has a significant impact on the image, which means every individual has a completely unique experience with the piece.
Alison Mealy has created an incredibly interesting representation of information available over the internet. The image is a real-time (10-second delay) map of where characters in an online role-playing game are in the world, and where they are in relation to one another. The dialogue between the characters' relationships to one another and their relationships to the people controlling them is highly interesting to me. The map becomes a representation of real and imaginary space.
Alphabot, by Nikita Pashenkov, caught my interest because of its relation to the puppet project we completed. I myself could visualize what code I would need to make something like this. However, I can also appreciate the smoothness of the motion, and how the robot really becomes an organism, and not an assembly of parts. I think this piece is a good example of continuous movement, something a lot of the class struggled with, myself included, on the puppet assignment.
A set of incandescent light bulbs light up in relation to the visitors in a gallery. The positions of the participants are recorded and the information is used to make sure that the visitor’s shadows from the light bulbs are always overlapping, no matter the visitor’s position in the room. This interesting piece allows viewers to modify their behavior in response to a simple consequence. I like this piece because it forces the user’s interaction with the piece and with other people also viewing the piece.
Time Travelers – Toby Schachman
As the user moves closer to and farther from the piece, the user travels back and forth in time. Initially a still image, the piece changes in time where the user's silhouette falls. I like the idea of creating a kind of time machine, using technology to explore and control concepts of time. However, I think the concept could be pushed much farther than just an exploration of time. The images Schachman uses (a sunset, decaying strawberries, jellyfish, an atomic bomb) are too stereotypical, generalized, and kitschy, making the piece much less interesting than it could have been. Schachman has a whole world of problems and issues he could engage with, and he chose a sunset?
Auto Rosary – Chris Eckert
A pretty self-explanatory piece: a user engages the machine, which then proceeds to pray the rosary for you, so that you don't have to do it yourself. I think it's a simple yet poignant piece that explores some of the very basic rituals of the Catholic religion. Meant to be a very personal and devotional activity, the rosary is undermined as it becomes automated by a machine. But is this machine any more automated than an actual person repeating the same prayers over and over again?
Awesome breathalyzer that talks to you. Even though it won't tell you your BAC, it tells you how drunk you are… in different voices! You can read the Pure Data patch if you want to, and the buttons at the top of the device can be programmed to do whatever you want. It can be programmed with an Arduino, so it's super fun times.
This interactive sound installation focuses on making human movement more involved in creating music. It also emphasizes people interacting with each other in order to make the music. With the use of three pumps, different beats are created with different drum sounds that project from the speakers.