Ji Tae and I went through several suggested ideas, such as information visualization or games, before settling on an interactive instrument involving plants. We were inspired by the room we were in (the top floor of Margaret Morrison Hall), which had plants all over the place, and we were interested in incorporating sound into our final project. Our original idea was to create a branch that drew itself randomly on every new canvas, with leaves distributed along the branches as they grew. When the user clicks the canvas, raindrop particles fall from the top and make a chime noise when they hit leaves.
We implemented Roger’s recursion tree and particle system functions after realizing that turtle functions and branch objects probably weren’t the best way to build a tree structure. We spent several days attempting to use turtles and objects before learning recursion a few days before the project was due. From there, we moved on to creating the tree aesthetic that we wanted and having moving particles interact with a seemingly static canvas. Our original idea was to have the raindrops roll off the leaf, but we liked the glass-like style that the change in opacity gave the leaves, so we made the raindrops slow down upon hitting a leaf instead of rolling around its edge. We also originally had a bunch of random chimes, but it became increasingly annoying whenever they all played at once on hitting a leaf. The final version incorporates three chimes that work pleasantly together.
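To make the two core ideas concrete, here is a minimal, p5-free sketch of a recursive branch generator and a raindrop that slows down when it overlaps a leaf instead of rolling off. All names and constants are illustrative, not our actual code.

```javascript
// Recursively generate branch segments for a simple binary tree.
// Each segment is {x1, y1, x2, y2}; child branches spread left and
// right and shrink by a fixed ratio at every level of recursion.
function buildTree(x, y, angle, len, depth, segments) {
  if (depth === 0) return segments;
  const x2 = x + len * Math.cos(angle);
  const y2 = y - len * Math.sin(angle);
  segments.push({ x1: x, y1: y, x2: x2, y2: y2 });
  buildTree(x2, y2, angle + Math.PI / 6, len * 0.7, depth - 1, segments);
  buildTree(x2, y2, angle - Math.PI / 6, len * 0.7, depth - 1, segments);
  return segments;
}

// A raindrop falls at full speed until it overlaps a leaf, then it
// slows to a fraction of its speed (matching the glassy look we
// kept) rather than rolling around the leaf's edge.
function updateDrop(drop, leaves) {
  const onLeaf = leaves.some(
    (leaf) => Math.hypot(drop.x - leaf.x, drop.y - leaf.y) < leaf.r
  );
  drop.y += onLeaf ? drop.speed * 0.2 : drop.speed;
  return onLeaf; // true would be the moment to trigger a chime
}
```

In a real sketch, the returned segments would be drawn each frame and `updateDrop` would run once per raindrop, playing a chime whenever it first returns true.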
I’m pretty happy with the way the project turned out, but I still wish we had incorporated more interaction. Our initial sketch included the idea of wind making the leaves shake. We had also tried to generate a more randomized tree every time the canvas was drawn, or even when a key was pressed, but we had trouble implementing those features. The chimes are very pleasant to listen to and I like the aesthetic of the leaves, so it’s a relaxing experience that incorporates what we learned throughout the semester.
I’m collaborating with Joseph Ji Tae Kim to create an interactive instrument inspired by leaves and mother nature. When we thought of the project, I was immediately reminded of a stage in Super Smash Bros. Brawl that used a similar concept. In that stage, there were many two-dimensional leaves that played certain notes when people fought on top of them or when fish jumped up from the bottom of the screen and hit them. Some research also led me to a project called “Audible Color” (2012) by Hideaki Matsui and Momo Miyazaki at the Copenhagen Institute of Interaction Design. Both projects revolve around the idea of using natural objects (leaves and water droplets) to generate sound, and both are fun to listen to despite not producing a recognizable song. The project Joseph and I are doing will involve a chain reaction of noise rather than singular notes, which makes it interactive at the start but allows a more unpredictable and unexpected melody to emerge.
For the final project, I’ll be collaborating with Joseph Ji Tae Kim to create an interactive instrument influenced by mother nature. The idea is to have raindrops created by the user fall down the screen and land on leaves in the middle of the canvas, with each leaf playing its own musical note when a raindrop lands on it. The raindrop will fall from one leaf to another, creating a chain reaction of sounds that will hopefully be pleasant and relaxing. To add an element of unpredictability and randomness, we plan to incorporate wind that blows on the leaves and pushes the raindrops to other parts of the canvas. A spring function inside the leaf object will make each leaf jitter or spring up a bit whenever a raindrop lands on it, so everything isn’t totally static as a rain particle moves down the canvas.
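The spring behavior described above can be sketched as a small damped-spring update per frame, with an impulse applied when a raindrop lands. This is plain JavaScript; the constants and function names are assumptions for illustration, not our actual code.

```javascript
// Damped spring: each frame, the leaf's offset from its rest
// position is pulled back toward 0, and the velocity is damped so
// the bounce dies out. Constants are illustrative.
const STIFFNESS = 0.2;
const DAMPING = 0.85;

function springStep(leaf) {
  const force = -STIFFNESS * leaf.offset; // pull toward rest (offset 0)
  leaf.velocity = (leaf.velocity + force) * DAMPING;
  leaf.offset += leaf.velocity;
}

// When a raindrop lands, kick the velocity so the leaf springs
// and then settles back down over the following frames.
function raindropLands(leaf) {
  leaf.velocity += 4; // impulse from the drop; magnitude is arbitrary
}
```

Drawing the leaf at `y + leaf.offset` each frame would give the jitter-and-settle motion without the leaf ever drifting from its home position.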
My goal for this project was to create a robot that hunts for food (oil) on the canvas. I incorporated the walking function from week 8 so that the robot could move toward its destination. Originally I was going to have it run out of batteries (by turning red) after a certain number of frames, but I wasn’t sure how to freeze it suddenly and restart upon pressing a key. If I redid this project, I would try harder to incorporate that aspect so it has more functionality.
The robot waddles toward the oil and then turns green (happy!) when it “eats” it. After the robot eats the oil, clicking the mouse creates a new oil blob.
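As a rough illustration of the hunt-for-food logic (not the actual week 8 walking code), the movement reduces to stepping along the normalized direction toward the oil and “eating” once the robot is within one step of it; the `happy` flag stands in for the color change to green.

```javascript
// Step the robot one frame toward the oil at a fixed speed.
// Returns true on the frame the robot reaches and "eats" the oil.
function stepRobot(robot, oil, speed) {
  const dx = oil.x - robot.x;
  const dy = oil.y - robot.y;
  const dist = Math.hypot(dx, dy);
  if (dist < speed) {        // close enough: snap to the oil and eat it
    robot.x = oil.x;
    robot.y = oil.y;
    robot.happy = true;      // stands in for drawing the robot green
    return true;
  }
  robot.x += (dx / dist) * speed; // move along the unit direction vector
  robot.y += (dy / dist) * speed;
  return false;
}
```

A mouse-click handler would then just place a new oil blob and reset `happy` to false so the chase starts over.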
I was intrigued by Kate Hollenbach’s Mezzanine (2010–2014) project. She studied at RISD as a graphic design major and earned her bachelor’s in computer science and engineering from MIT. She eventually served as the design lead for Oblong Industries, the company that created Mezzanine. As a design major, it especially interested me to see her role as an interaction designer in the creation of the project. Mezzanine serves as a sort of wand device for moving content around on a digital interface, and changes to the layout take effect in connected rooms around the world as well. In other words, it’s a very useful tool for conferences or meetings with people in other locations. While its function doesn’t apply to me right now, I think it’s really interesting how it can make digital conferences much more immersive than email or video-calling.
My initial idea was to try to make a Tron-esque game where two turtles leave paths and try to avoid the paths left by either of them. I got the lines to appear, but I had a lot of trouble getting the rest of the game to work. As a result, I started messing around with the code I had left and tried to make a more visual turtle graphic that’s fun to look at. I made two shuriken/flower/windmill-type shapes, one within the other, but still found myself unsatisfied with a static image. I looked into ways of animating them and fed second() into the angles they rotate at. I found myself looking at the new program for longer periods of time, since the inside always looked like it was changing no matter what time it was. Additionally, I was really fascinated by the huge variety of patterns created by the figure in the middle.
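The clock-driven rotation can be sketched as a tiny angle function: second() in p5.js returns the current clock second (0–59), so mapping it across a full turn makes the figure different every time you glance at it, while a small per-frame term keeps it spinning. This is an illustrative reconstruction, not the original code.

```javascript
// Angle for the inner figure: the clock second picks a base
// orientation (one full turn spread over 60 seconds), and the
// frame count adds a slow continuous drift on top of it.
function innerAngle(sec, frameCount) {
  return (sec / 60) * 2 * Math.PI + frameCount * 0.01;
}
```

In the sketch itself, this angle would feed a rotate() call before drawing the inner shape, while the outer shape turns at its own rate, producing the constantly shifting interference patterns.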
I wanted a sort of geometric pattern to reveal my picture, so I made rectangles sweep across the screen from top to bottom and left to right. I wrote functions that drew the lines and allowed size changes when the keys “Z” (smaller) and “X” (bigger) were pressed. I’m moderately happy with the portrait, though it felt wrong using global variables after a TA told me it’s bad style. The portrait takes a while to become fully visible, but it’s nice to see the grids that form as the rectangles move. Special thanks to Sharon Yu for letting me use a picture of her.
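The sweep-and-resize logic could be sketched like this; passing the state in as a parameter, as below, is also one way to avoid the global variables the TA warned about. Names, speeds, and size limits are illustrative, not the actual sketch.

```javascript
// One scanning rectangle: move it across the canvas each frame
// and wrap it back to the left edge once it passes the right side.
function stepScanner(s, canvasW) {
  s.x += s.speed;
  if (s.x > canvasW) s.x = -s.size; // re-enter from the left
}

// Key handling: "z" shrinks the rectangles, "x" grows them,
// clamped so they never vanish or swallow the whole canvas.
function resize(s, key) {
  if (key === 'z') s.size = Math.max(2, s.size - 2);
  if (key === 'x') s.size = Math.min(50, s.size + 2);
}
```

Each rectangle would be filled with the color sampled from the portrait at its current position, so smaller sizes reveal the image in finer detail.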
I was interested in the project that Faith talked about in her Looking Outwards 4, where she discussed a reactive surface called .fluid that can turn itself from a two-dimensional surface into a three-dimensional object. It’s a very conceptual project, but the possibilities it suggests make it fascinating to read about and watch. I agree with Faith that the project could have been conceived from the idea that electronic communication just doesn’t give the same human feeling as in-person communication, but it also applies to other, non-electronic objects. The article discussing .fluid mentions the possibility of a sofa giving you a three-dimensional response, suggesting that this concept of a shape-shifting fluid could bring many more things to life than just electronics. It could be interesting to think about how our lives would improve if the non-electronic objects we take for granted had responses that act as signals for how we should behave. I’m sure there could be a way to keep it from being a messy liquid, but it’s probably too early to tell where the project’s findings will lead.
I partnered up with Ji Tae Kim for this project, and we created a landscape with the intention of having different layers of items. Ji Tae worked on the underground layers, where worms wiggled up and down through the soil. I worked on the parts above the ground, including the trees and the clouds. We had some trouble figuring out how to use a for loop to wiggle the worm, but eventually made a row of ellipses that move together to represent a worm wiggling through the soil. We also had a problem with the trees being lumped together in front at the start, rather than appearing spread out. After initialization, however, the trees came out fine.
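The worm wiggle we ended up with can be sketched as a for loop that lays circles out along x, with each segment’s y offset by a phase-shifted sine wave so the whole row ripples together. This is an illustrative, p5-free reconstruction; the spacing and amplitude are made-up constants.

```javascript
// Build the positions of a worm's segments at time t.
// Segments are evenly spaced along x; each one's y follows a sine
// wave whose phase is shifted by the segment index, so the row of
// ellipses appears to wiggle through the soil as t advances.
function wormSegments(baseX, baseY, count, t) {
  const segs = [];
  for (let i = 0; i < count; i++) {
    segs.push({
      x: baseX + i * 8,                     // fixed spacing along x
      y: baseY + 3 * Math.sin(t + i * 0.5), // phase shift per segment
    });
  }
  return segs;
}
```

Calling this each frame with an increasing t (e.g. frameCount * 0.1) and drawing an ellipse at every returned position gives the wiggling-worm effect.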
We were both happy with the final project because we worked pretty well together and were pleased with the colors/setting we created. The vibe sort of reminds us of a 2D platformer video game, so we found it pretty fun to look at for a while and imagine Mario or someone interacting with the landscape.
I watched Kyle McDonald’s lecture at Instint 2014, where he talked about his works and process, from his first installation to his more recent experiments. His body of work mainly consists of interactive installation pieces. One piece I found really interesting was the one where a screen was projected onto a building and people’s movements in front of a monitor were cast as huge shadows onto it. He said it was one of his first projects where real, live people were influenced by and enjoying themselves as a result of his installation. It was fun to see his evolution from a small project he did in school (“Chips”) to large-scale installations that shape how passersby interact.
His way of presenting information was very detailed and organized, tracing how certain ideas originated all the way through to how they turned out and what he would have changed in the end. He showed how each project was used through videos, but also noted the process he went through to arrive at the final pieces. It’s the process work that makes each piece interesting and understandable, which makes it all the more inspiring to see what can be done through creative computing.