104FinalVideo from Jessica Headrick on Vimeo.
For my final project in this class I wanted to take the opportunity to explore generative art using input from my microphone. I began by researching the p5.js sound library, but ultimately had to do some external research to collect aspects of the microphone input such as pitch. I drew inspiration from Jer Thorp’s work, which includes radial graphic imaging. I really wanted to take the typical linear visualization of sound to another form by creating a spiral-like design. My program visualizes several aspects of sound: the amplitude (volume), the pitch, and the peaks (the beat).
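Since the p5.js sound library does not expose pitch directly, external research is needed for that piece. Below is one minimal, illustrative way to estimate pitch from a buffer of raw samples using the zero-crossing rate; the function name and the technique are my own sketch, not necessarily what the project actually used.

```javascript
// Rough pitch estimate from raw audio samples via zero-crossing rate.
// Works for clean, tonal input; real projects often use autocorrelation.
function estimatePitch(samples, sampleRate) {
  let crossings = 0;
  for (let i = 1; i < samples.length; i++) {
    // Count each sign change between consecutive samples.
    if ((samples[i - 1] < 0) !== (samples[i] < 0)) {
      crossings++;
    }
  }
  // A pure tone crosses zero twice per cycle.
  const duration = samples.length / sampleRate;
  return crossings / (2 * duration);
}
```

Fed a clean 440 Hz sine wave, this returns a value close to 440; noisy microphone input would need smoothing or a more robust method.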
Here is a picture of one of my earlier iterations, when I had only one segment of my code working. Every fifth frame a rectangle is drawn; its color depends on the pitch (red is high, blue is low), and its length depends on the volume of the microphone input at that moment. The fill is slightly transparent in order to show layering as time progresses.
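The two mappings described above can be sketched as plain functions. The ranges and the exact interpolation here are assumptions for illustration, not the project's actual constants:

```javascript
// Linear map of v from [inMin, inMax] to [outMin, outMax] (like p5's map()).
function mapRange(v, inMin, inMax, outMin, outMax) {
  return outMin + ((v - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Pitch interpolates blue (low) -> red (high); returns [r, g, b, a].
// Alpha is fixed low so earlier layers stay visible underneath.
function pitchToColor(pitch, minPitch, maxPitch) {
  const t = Math.min(1, Math.max(0, (pitch - minPitch) / (maxPitch - minPitch)));
  return [Math.round(255 * t), 0, Math.round(255 * (1 - t)), 80];
}

// p5's mic level is reported in 0..1; scale it to a rectangle length.
function volumeToLength(level, maxLen) {
  return mapRange(level, 0, 1, 0, maxLen);
}
```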
After this iteration I realized I had to create an array of rectangle objects if I wanted the fading beat effect, which required some back-tracking in my code. Once I had all the objects created, however, the program still performs rather quickly, and the beat visualizations definitely add to the playfulness. As a final part of my visualization, I created a way to toggle the color scheme of my program. In the top right-hand corner the user can select any of the four color choices to view the visualization.
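A minimal sketch of that back-tracked design: each drawn rectangle becomes an object in an array so its alpha can fade frame by frame. The field names and fade constant here are my own assumptions, not the original code:

```javascript
// Each rectangle remembers its own state so it can fade independently.
class FadingRect {
  constructor(angle, len, color) {
    this.angle = angle;   // position around the spiral
    this.len = len;       // length from mic volume
    this.color = color;   // [r, g, b]
    this.alpha = 255;     // fades toward 0 after a beat
  }
  update() {
    this.alpha = Math.max(0, this.alpha - 8); // fade a little each frame
  }
}

let rects = [];

// Called once per frame: update every rect, drop fully faded ones.
function stepRects() {
  for (const r of rects) r.update();
  rects = rects.filter((r) => r.alpha > 0);
}
```

Keeping the objects in an array and filtering out the dead ones keeps the per-frame cost bounded, which is consistent with the program still running quickly.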
The first source of inspiration that I discovered is an artist by the name of Jer Thorp. Jer visualizes data in spectacular and intriguing 2D and 3D representations. A particularly interesting piece of his is a project called Random Number Multiples: a set of screen-printed posters stemming from radial abstractions of word-frequency visualizations that he created using Processing and the NYTimes Article Search API. These brilliantly striking pieces are my inspiration for how to visualize the audio input from the microphone.
Find out more about Jer Thorp and his work here!
Another source of inspiration for my project is an audio visualization named Piano Phase. This project visualizes the idea of Steve Reich’s 1967 piece, Piano Phase: two pianists repeat the same twelve-note sequence, but one gradually speeds up. The musical patterns are visualized by drawing two lines, one following each pianist. The lines are iteratively drawn in a radial pattern, much as I would like to do.
Visit the live site here!
For my final project I would like to create a program that takes audio input from the computer microphone and visualizes the synthesis of the sound. As time passes, my program will use the frame rate to draw the visual representation in a radial design. I would like to include aspects of the audio such as volume, pitch, duration, etc. As long as the program is running, the visualization will continue to grow and become more representative of the sound. For example, after singing a song to the program you will be able to see a representation of the song musically. Furthermore, you could use this program during a conversation, and then have an artifact that is unique to exactly that conversation and its participants.
As you can see in my sketch below, I would like to use various line weights to represent the volume and pitch of the audio. Additionally, I would like my program to identify when the user claps, at which point the color palette of the visualization will change. I would also like to include ellipse elements that represent staccato-like sounds. Finally, the background of the visualization will softly fade between warm and cool colors depending on the pitch of the current sound input: if the pitch is low the background will be cooler, and if the pitch is high it will be warmer.
If time allows, I would also like to include a way to represent the passing of time in the visualization. My current idea is to have the drawing incrementally spiral outward so that there is more depth to the final artifact.
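One plausible way to realize that outward spiral (a sketch under my own assumptions, not a committed design): advance an angle each frame and slowly grow the radius, so marks placed at the resulting point trace an Archimedean spiral around the center.

```javascript
// Position on an outward spiral for a given frame number.
// angleStep controls how fast the drawing sweeps around;
// growth controls how quickly it spirals away from the center.
function spiralPosition(frame, cx, cy, baseRadius, growth, angleStep) {
  const angle = frame * angleStep;
  const radius = baseRadius + frame * growth;
  return {
    x: cx + radius * Math.cos(angle),
    y: cy + radius * Math.sin(angle),
  };
}
```

Drawing each frame's mark at `spiralPosition(frameCount, ...)` makes elapsed time directly legible as distance from the center.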
**CLICK TO DROP TREATS**
My creature was initially a formless blob that chased after treats left by the mouse; however, it iterated into a school of fish. When the mouse is clicked, a treat is dropped into the water and the school of fish swarms over to get it. If left alone, the fish will surround the treat until another treat is dropped. The closer the fish get to the treat, the more closed their mouths become, anticipating the treat.
I combined what I learned from the walking man, Pac-Man, and particles assignments to create this code. The most challenging part of writing this was implementing the orientation and mouth openness of each particle. All in all, I am pleased with the final school of fish!
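The distance-to-mouth mapping described above can be sketched as a single function. The names and constants here are my own, not the project's actual code: the mouth is fully open beyond some far distance and closes linearly as the fish nears the treat.

```javascript
// Returns the angle of the open mouth wedge, in radians:
// 0 at the treat, maxAngle at or beyond maxDist away.
function mouthOpenness(fishX, fishY, treatX, treatY, maxDist, maxAngle) {
  const d = Math.hypot(treatX - fishX, treatY - fishY);
  const t = Math.min(1, d / maxDist); // 0 near the treat, 1 far away
  return t * maxAngle;
}
```

The same distance check can drive orientation, by pointing each fish along the vector from its position to the treat.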
I discovered Emily Gobeille in my exploration of female computational designers. She is a visual, motion-graphics, and interaction designer. Emily Gobeille has been widely recognized for her stunning and thought-provoking work; she has even been featured in a couple of magazine spreads. A particularly interesting project of hers was an installation in the city. Gobeille and her team replaced a storefront window with a screen that looks like an aquarium. By calling in and using voice activation, passersby can create personalized fish and move them around the tank using their phone keypads. Gobeille wanted to explore how to grab the attention of passersby and engage them in a game through her installation. The response she received to her installation is quite inspiring. Gobeille’s work encourages me to seek new ways to engage a community through my design and really alter an environment.
Carnival Interactive Aquarium from Todd vanderlin on Vimeo.
Find out more! http://zanyparade.com/v8/about.php
** CLICK TO ADD PARTICLE **
** PRESS ANY KEY TO CHANGE ATTRACTION POINT, ACCORDING TO THE LIGHTNESS OF THE IMAGE **
The composition that I created is one where particles interact with the image beneath them. When you click, a particle is added that hops around the canvas. When a key is pressed, the particles gravitate toward pixels of a particular brightness, and this target brightness increases every time a key is pressed. The result is particles swarming around gradations of lightness as you continuously press keys. I chose this image so that I could watch the particles first swarm around the dark green parts of the image, then the grey, then the dark yellow, and so on, until they are only attracted to the very brightest parts of the flower.
This was an excellent opportunity for me to understand the behavior of particles and the application of this to what we have learned about image manipulation. Enjoy!
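The attraction rule described above can be sketched as a search over a grayscale pixel grid: find the location nearest the particle whose brightness matches the current target, then nudge the particle toward it. The helper name, the tolerance parameter, and the flat-array layout are my assumptions for illustration:

```javascript
// Find the pixel nearest (px, py) whose brightness is within tol of
// target. gray is a flat row-major array of w*h brightness values (0-255).
function nearestMatch(gray, w, h, target, tol, px, py) {
  let best = null;
  let bestD = Infinity;
  for (let y = 0; y < h; y++) {
    for (let x = 0; x < w; x++) {
      if (Math.abs(gray[y * w + x] - target) <= tol) {
        const d = (x - px) ** 2 + (y - py) ** 2; // squared distance is enough
        if (d < bestD) {
          bestD = d;
          best = { x, y };
        }
      }
    }
  }
  return best; // null when no pixel matches the target brightness
}
```

Each keypress would then raise `target`, sending the particles hopping toward progressively brighter regions of the image.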
The portrait that I created is a geometric translation of the image I chose. Rows of triangles are created to form the pattern. Every second the rows shift back and forth, making it difficult to see the image, but once the rows align the comical image appears. The mirage of shifting triangles creates interest and ambiguity until the true image appears, which is my intended effect.
Below are stills from the transformed image (the actual animation may require significant processing power to display).
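The back-and-forth row shift can be sketched as a simple triangle wave of time, with odd and even rows moving in opposite directions so the triangles only align periodically. The constants and the alternation scheme here are my own guesses at the approach, not the project's code:

```javascript
// Horizontal offset for a row at a given time, in pixels.
// Period is 2 seconds: rows slide out for one second, back the next.
function rowOffset(row, seconds, amplitude) {
  const direction = row % 2 === 0 ? 1 : -1; // alternate rows move oppositely
  const phase = seconds % 2;
  const shift = phase < 1 ? phase : 2 - phase; // triangle wave, 0..1..0
  return direction * amplitude * shift;
}
```

All rows return to offset 0 together at even-numbered seconds, which is the moment the hidden image snaps into view.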
I reviewed Lily Fulop’s Looking Outward 04 post, which explored Mark Wheeler, Russ Chimes, and Clay Wieshaar’s work on a visual sound experiment. The project is quite fantastic, and I very much agree with all that Lily had to say in her post. As Lily explained, Wheeler creates openFrameworks apps linked to an Ableton Live set and MIDI (Musical Instrument Digital Interface) controllers to make visual sound experiments. Wheeler created visuals for Chimes’ track We Need Nothing to Collide and then projected this video onto various suburban settings. I totally agree with Lily that the stills of the projected light on these settings are very intriguing. Initially, I didn’t identify the projected lights as such; I thought it was a video effect. I wonder how the project could be algorithmically applied to other songs. Is this a project made specifically for the Chimes song ‘We Need Nothing to Collide’, or could it be applied to a wider variety? Further exploration of this project could be very interesting!
For my project I created a generative landscape in which multicolored kitties float toward you endlessly! Overhead, puffy white clouds soar past as randomly colored cats pass by. I enjoyed this project because, although it is quite abstract, I had a lot of fun imagining this environment. The challenge I faced was having the cats move naturally toward the viewer with randomized trajectories. I believe I succeeded in making an entertaining generative landscape!
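One plausible way to get the "floating toward you" effect (a sketch under my own assumptions; the original code may differ): give each cat a depth value that shrinks toward the camera, scale its drawn size up as it nears, and jitter its path a little each frame.

```javascript
// Advance one cat by dt seconds. z is distance from the viewer;
// a cat that reaches the viewer is recycled to the far plane.
function updateCat(cat, dt) {
  cat.z -= cat.speed * dt;              // approach the viewer
  cat.x += (Math.random() - 0.5) * 2;   // small random horizontal drift
  cat.scale = 100 / Math.max(cat.z, 1); // nearer => drawn larger
  if (cat.z <= 1) cat.z = 500;          // loop endlessly
  return cat;
}
```

Because the scale is inversely proportional to depth, cats grow smoothly rather than popping in size as they approach.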
Roman Verostko, born in Western Pennsylvania in 1929, maintains an experimental studio in Minneapolis where he has developed original algorithmic procedures for creating his art. Active as an exhibiting artist since 1963, his earliest use of electronics consisted of synchronized audio-visual programs dating from 1967-68.
In his talk at Eyeo in 2014, Roman first focused on his background as an artist: his expressionistic approach to art and how his background in abstract art led him to a deeper focus on how abstract art comes about. As an abstract artist, he not only worked in traditional mediums, but soon became interested in using new technology to display his artwork. He experimented with projection and light shows that correlated his abstract art with sound. He then moved on to very low-res, pixel-based abstract art that celebrated the technological minds of the time – I really appreciate this about Verostko. Verostko found it important to always push himself in the technological realm to make his work more dynamic. In the early 1980s he began experimenting with algorithmic procedures to create art. A large breakthrough for him was the software he modified in 1987 that utilized a pen plotter’s drawing arm to reinterpret paint strokes into smaller scattered strokes. This was the most inspiring piece of his work for me because it exhibited a communication between algorithmic rhythm and abstract/random artwork. I am inspired to pursue this type of generative artwork that is brought into the physical world through technology.
Find out more about Roman Verostko at http://www.verostko.com/
Eyeo 2014 – Roman Verostko from Eyeo Festival // INSTINT on Vimeo.