Final Project – Ji Tae Kim & Albert Yang


Ji Tae and I went through several suggested ideas, such as information visualization or games, before settling on an interactive instrument involving plants. We were inspired by the room we were in (the top floor of Margaret Morrison Hall), which had plants all over the place, and we were interested in incorporating sound into our final project. Our original idea was to create a branch that drew itself randomly each time the canvas was created, with leaves distributed along the branch. When the user clicks the canvas, raindrop particles fall from the top and make a chime noise when they hit the leaves.

We implemented Roger’s recursion tree and particle system functions after realizing that turtle functions and branch objects probably weren’t the best way to build a tree structure. We spent several days attempting to use turtles and objects before learning recursion a few days before the project was due. From there, we moved on to creating the tree aesthetic that we wanted and having moving particles interact with a seemingly static canvas. Our original idea was to have the raindrops roll off the leaf, but we liked the glass-like style that the change in opacity gave the leaves, so we made the raindrops slow down upon interacting with a leaf instead of rolling around its edge. We also originally had a bunch of random chimes, but it became increasingly annoying whenever they all played at once upon hitting a leaf. The final version incorporates three chimes that work pleasantly together.
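
As a rough illustration of the recursive approach (not our actual code; the branch angles, lengths, and names below are just assumptions), a branch function in p5.js can call itself twice with a shorter length until the length is small enough to draw a leaf:

// Minimal sketch of a recursive tree in p5.js (hypothetical names and values).
function setup() {
  createCanvas(480, 480);
}

function draw() {
  background(240);
  stroke(80, 50, 20);
  translate(width / 2, height);   // start at the bottom center
  branch(100);                    // trunk length in pixels
  noLoop();
}

function branch(len) {
  line(0, 0, 0, -len);            // draw this segment upward
  translate(0, -len);             // move to the end of the segment
  if (len > 10) {
    push();                       // left sub-branch
    rotate(radians(-25));
    branch(len * 0.7);
    pop();
    push();                       // right sub-branch
    rotate(radians(25));
    branch(len * 0.7);
    pop();
  } else {
    fill(60, 160, 60, 150);       // semi-transparent, glass-like leaf
    noStroke();
    ellipse(0, 0, 12, 12);
  }
}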

I’m pretty happy with the way the project turned out, but I still wish we had incorporated more interaction. Our initial sketch had the idea of using wind to make the leaves shake. We had also tried to create a more randomized tree every time the canvas was drawn, or even when a key was pressed, but we had trouble implementing those. The chimes are very pleasant to listen to and I like the aesthetic of the leaves, so the result is a pleasing experience that incorporates what we learned throughout the semester.

15-104 Final Project (JiTae + Albert) from Albert Yang on Vimeo.

Final Project – NF

For my final project, I wanted to explore musical interaction through user-controlled and visual elements. My submission is an interactive canvas on which the participant can use certain keys to move my drawn instruments and “shoot,” or play, the music notes. I used particles to create elements of perpetual movement that also make sounds when they collide with other elements or with the boundaries of the screen. In addition, if the user chooses to drag the mouse instead of clicking, boxes are created rather than particles, allowing for more control by the user and further exploring the elements of animation and movement in p5.js. In keeping with my theme of music, the project allows players to initiate scrolling music notes, which they can follow on screen with the controlled ‘note’ objects to simulate a song. Depending on their location on the screen, the music notes play varying pitches.
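
As a rough sketch of how a note’s screen position might be mapped to a pitch in p5.sound (the frequency range and variable names here are my own assumptions, not the author’s code):

// Hypothetical sketch: map the mouse's y position to an oscillator pitch.
var osc;

function setup() {
  createCanvas(400, 400);
  osc = new p5.Oscillator();   // requires the p5.sound library
  osc.setType('sine');
  osc.start();
  osc.amp(0);                  // silent until a note is "played"
}

function mousePressed() {
  // Higher on the canvas = higher pitch (range chosen arbitrarily).
  var freq = map(mouseY, height, 0, 220, 880);
  osc.freq(freq);
  osc.amp(0.5, 0.05);          // quick fade in
  osc.amp(0, 0.5, 0.2);        // then fade back out
}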

I had trouble executing all of what I intended in this project, but overall I am satisfied with what I was able to create from the techniques we learned in class and in the assignments. If I were to continue with this project, I would focus on creating more precise and varied interactions depending on which objects collide, and on making a larger range of differing sounds. I would also like to explore making the movement of the elements more intricate and interesting than the current format (similar to the space invaders example). Lastly, I would want to implement more interaction with the scrolling music notes and make the canvas pitches more accurate, so as to allow for a better and more enjoyable experience and display of music through my project.

 

sketch40.js

Maggie Mertz: Final Project: Bound

So for my final project, I wanted to create a place where I can put my writing up in the future. I did NaNoWriMo, and this is the story I created in a month. It still needs editing, and so does this code, but I enjoyed making both.

It uses an iframe to put the randomly generated circuit board in the back, while the front canvas uses mouseWheel and translate instead of a scroll bar. I tried to incorporate my own writing, art, and ideas into this project. There are several responsive buttons and features, such as the responsive text at the top, the buttons, the arrows that click through, and the map. I don’t really have much to say about me as an author yet, hence the blank author space. But one day, when I come up with an intriguing enough description of myself, I will fill it in. Thank you so much.
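
A bare-bones sketch of the mouseWheel-and-translate scrolling idea (the content drawn and the variable names are placeholders, not the site’s actual code):

// Hypothetical sketch: scroll a tall page with the mouse wheel instead of a scroll bar.
var scrollY = 0;          // how far the page has been scrolled
var pageHeight = 2000;    // assumed total height of the content

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(255);
  fill(0);
  push();
  translate(0, -scrollY);            // shift everything up by the scroll amount
  for (var i = 0; i < 20; i++) {     // ...placeholder for the long page content...
    text("Section " + i, 20, 100 * i + 50);
  }
  pop();
}

function mouseWheel(event) {
  scrollY = constrain(scrollY + event.delta, 0, pageHeight - height);
  return false;                      // keep the browser page itself from scrolling
}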

Here is the website:

http://www.andrew.cmu.edu/user/mmertz/

 

It wouldn’t allow me to upload the video capture, so here are some Screen Shots:

[Two screenshots of the website]

Omar Cheikh-Ali – Final Project

This is my final product. Inspired by popular rhythm games like DDR and Guitar Hero, beats drop from the top along paths to the buttons corresponding to Q, W, E, and R respectively. Points are awarded for hitting beats accurately, while points are deducted for misses (no points are deducted for letting a beat go by, however). In addition to this, I wanted to give the game an “Undertale” theme. Undertale is the massively popular indie game by Toby Fox that came out this past October. If you’ve played Undertale, you might recognize the characters in the background and some of the color references. Wait until the end of the game if you want to see the end screen! (It’s a treat!)

Me sucking at my own game (hey, it’s hard to use the GIF capture software and play at the same time):
Click to view:
GameOperation

Click to view:
GameOperation2

My process started with rough sketches, as seen in my proposal. My first step was to create two distinct classes, Beats and SweetSpots. This would allow me to create objects from both, get them to interact with each other, and assign them unique traits (timers, for example).
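
A stripped-down sketch of what those two classes might look like in p5.js (the property names and constructor arguments are my guesses, not the game’s actual code):

// Hypothetical object constructors for the two classes described above.
function Beat(x, lane) {
  this.x = x;          // horizontal position of the beat's lane
  this.y = 0;          // beats start at the top of the canvas
  this.lane = lane;    // which sweetspot (0-3, i.e. Q/W/E/R) it belongs to
  this.speed = 4;

  this.update = function() {
    this.y += this.speed;            // fall toward the sweetspot
  };
  this.draw = function() {
    ellipse(this.x, this.y, 30, 30);
  };
}

function SweetSpot(x, y, key) {
  this.x = x;
  this.y = y;
  this.key = key;      // 'Q', 'W', 'E', or 'R'
  this.timer = 0;      // e.g. used to flash the button briefly after a hit

  this.draw = function() {
    rect(this.x - 20, this.y - 20, 40, 40);
  };
}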

sketch

The second step, after creating these objects, was to get them to interact with each other; I needed my beat objects to register when they were off the screen and when they were “hit” by the sweetspot buttons – basically, when they were no longer needed on the canvas. Originally I used the filter() function, similar to the space invaders code, but in the end it was easier to use the shift() method to remove the beginning of each beat array. I had also originally added a “pitch” value to the Beat objects, but that proved to be too broad a way to match beats to sweetspots – I later sorted beats into four separate arrays based on their corresponding sweetspot.
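
For example, removing a finished beat from the front of its lane’s array with shift() might look something like this (the array and variable names are assumptions on my part):

// Hypothetical cleanup step: each lane keeps its own array of Beat objects,
// and because beats spawn in order, the oldest beat is always at index 0.
var laneBeats = [[], [], [], []];   // one array per sweetspot (Q, W, E, R)

function removeFinishedBeats() {
  for (var lane = 0; lane < laneBeats.length; lane++) {
    var beats = laneBeats[lane];
    // If the oldest beat has fallen past the bottom of the canvas, drop it.
    if (beats.length > 0 && beats[0].y > height) {
      beats.shift();
    }
  }
}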

screenshot

I had previously used a massive hit-detection function that ran through every beat in the beat array, checked which sweetspot button was hit, and then checked whether the beat was the right pitch as well as how close it was to the sweetspot (sounds complicated, right? It was). I eventually streamlined it into a switch statement with a counter for each beat array, so an index only moves up when necessary, and I also pulled the distance check out into its own separate function to further optimize the code.
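
A hedged sketch of what that streamlined key handling could look like (the scoring thresholds, globals, and helper names here are illustrative guesses, not the actual game code):

// Hypothetical key handler: each key checks only the oldest beat in its own lane.
function keyPressed() {
  switch (key.toUpperCase()) {
    case 'Q': scoreHit(0); break;
    case 'W': scoreHit(1); break;
    case 'E': scoreHit(2); break;
    case 'R': scoreHit(3); break;
  }
}

function scoreHit(lane) {
  var beats = laneBeats[lane];      // assumed global from the sketch above
  if (beats.length === 0) return;
  var d = distanceToSweetSpot(beats[0], sweetSpots[lane]);
  if (d < 20) {
    score += 100;       // accurate hit
  } else if (d < 50) {
    score += 50;        // close enough
  } else {
    score -= 25;        // a press with no beat nearby counts as a miss
  }
  beats.shift();        // this beat has been dealt with
}

// Separate distance helper, as described above.
function distanceToSweetSpot(beat, spot) {
  return abs(beat.y - spot.y);
}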

Finally, I went through the song in an audio editor and manually input each beat’s corresponding time, as well as what I thought the button mapping should be – in other words, I created the beatmap by hand. There is likely software out there that could analyze audio and create beatmaps, but I have not found it yet, nor did I want to make the beatmapping process overly complicated with software that may or may not meet my needs. (Ugh, there must be at least 100 lines of code that are just the beatmap.)
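
For reference, a hand-made beatmap like the one described could be as simple as a long array of time/key pairs (these particular values are made up):

// Hypothetical beatmap format: time (in seconds into the song) and target key.
var beatMap = [
  { time: 1.25, key: 'Q' },
  { time: 1.75, key: 'W' },
  { time: 2.10, key: 'E' },
  { time: 2.60, key: 'R' }
  // ...and so on, one entry per beat in the song
];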

All in all, I’m pretty satisfied with the finished product. There are a few kinks, such as the “Q” and “W” buttons losing functionality until you intentionally miss a beat with them (then they mysteriously start working again), and the fact that I have been unable to get the timing of the music matched with the beat spawning (despite the beat timings being pulled directly from the audio that is used – I think it may be a problem with how the program loads), but it runs fairly smoothly and works as intended otherwise.

sketch

Final Project- Samantha Mack

For my final project, I wanted to create a work that combined handmade elements, computational imagery, and storytelling. I chose to develop an interactive comic using scanned images of hand-painted work and bring them to life for the user. The program begins with a title screen complete with swirling smoke footage and a text button inviting the user to begin. Once on the panel screen, the user discovers the panels by clicking on each of the black curtains in order. Below the panels are three simple dots that light up when the mouse is over the corresponding panel. Each panel involves animated effects that appear while the mouse is hovering over that panel. In the second panel, the user can play with the waterbending effect by moving the mouse, and while hovering over the goddess’s hand, the animated streams of water snap to a dynamic position. The goddess’s eyes glow when she’s using her powers. The user can choose to view all the panels at once by clicking the text button in the bottom left, and bringing the mouse over to the side allows for viewing of the original artwork without animation. Finally, the user can also choose to return to the title screen at any time, which resets all of the curtains in the panel screen.

The most challenging and rewarding aspects of the project were the incorporation of video, the sliding panel transitions (getting them to work only in a particular order), and the development of convincing smoky particle systems for the panels. I learned a great number of new skills from this project. I was glad to discover how to manipulate the position of the canvas and the color of the web page, and I also found new ways to get around creative problems. For example, the second panel involved an interesting approach: I wanted to work with transparent video, but discovered that the tint() function in p5 does not work for this. I researched further for solutions but found none, and then realized that I could simply overlay the static image of the panel over the video and make the image transparent instead. This, however, faded the character too far into the background, so I incorporated another image layer that isolated the character (using Photoshop) and left her completely opaque, balancing her appearance in the middle panel with those in the other panels. Meeting these challenges in the second panel was well worth it, as the effects make the piece much more dynamic.

It was really fun to work through challenges like these and develop a consistent aesthetic for the program. This project made me feel capable of far more than I’d previously imagined, and the more challenges I invented for myself, the more I got to unlock. I’d love to look into other programming languages that make room for even more intense visual effects and development, and I fully intend to expand this comic with more integration of physical and digital work and even more innovative effects. This class has really inspired me to see what’s possible in computational creativity, and it’s opened so many doors for me.
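
A rough p5.js sketch of the layering workaround described above (the file names and tint value are assumptions; the actual panel uses the author’s own assets):

// Hypothetical layering for the second panel, from back to front:
// video -> semi-transparent copy of the painted panel -> opaque character cutout.
var panelVideo, panelImage, characterCutout;

function preload() {
  panelImage = loadImage("panel2.png");          // placeholder file names
  characterCutout = loadImage("character.png");
}

function setup() {
  createCanvas(800, 600);
  panelVideo = createVideo("smoke.mp4");         // requires the p5.dom library
  panelVideo.loop();
  panelVideo.hide();       // draw the video onto the canvas instead of the page
}

function draw() {
  image(panelVideo, 0, 0, width, height);        // bottom layer: the video
  tint(255, 160);                                // fade the static panel image
  image(panelImage, 0, 0, width, height);
  noTint();
  image(characterCutout, 0, 0, width, height);   // keep the character fully opaque
}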

I’d like to give special thanks to Math major Gidon Orelowitz and ECE major Dan Saad for talking through various logical aspects of the program with me (particularly in restricting the order of the curtain transitions), as well as providing their feedback on user experience. A special shoutout to Sydney Ayers for her feedback on aesthetic considerations as well – be sure to check out her amazing final project! Finally, I’d like to thank Golan and Aprameya for their help and encouragement in this and every project.

Visit my Looking Outwards 11 to see some inspirational interactive web comics:

Looking Outwards 11: Interactive Comics- Samantha Mack

 

 

Hand-painted panels in Progress: 

[Photos of the hand-painted panels in progress]

Program in Progress:

Earlier version of title screen (while the smoke’s interaction with the transparent blue bar was interesting, I found the black and white version to be much cleaner).

Screen Shot 2015-12-11 at 1.02.47 AM

Final version of Title Screen (“Click to Begin” is white when hovered over and dark gray otherwise)

Screen Shot 2015-12-11 at 1.05.16 AM

Character Cutout (The second panel has four layers:

-original panel

-video file

-transparent copy of second panel

-opaque character overlay)

Screen Shot 2015-12-11 at 9.59.45 AM

Panel Screen Interactions:

[Six screenshots of the panel-screen interactions]

Screen Capture Documentation:

Note: The soundtrack in the video below is not part of the program, but is added to enhance the documentation and fits the feel of the work. Shoutout to Dan Saad for helping me find the perfect soundtrack.

sketch

 

zislamha RogerSectionB Final Project: Save the World – One Drop at a Time

Working from my original idea, this project was a redemption of my failed assignment from the Landscapes week. The game was inspired by that, and also by recent developments in reshaping the world’s outlook on climate change and conserving resources. The game is simple: collect water drops before time runs out. As you do, the earth in the center becomes greener and greener, acting as a form of data visualization. However, watch out – if time runs out, so does the planet. It is a simple game, but there are intricacies in it I did not even know I would have to pay attention to, and it is something I am still really proud of. Of course I wish I had time to add things like a home screen, a replay option, and enemy water, but I still think this is fun to play. Even my friends got stressed playing it because they didn’t know if they would make it in time. Enjoy, and thanks for an interesting semester.
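
As a hedged sketch of how the “greener and greener” earth could be tied to the drops collected (the variable names, color endpoints, and win threshold are my assumptions, not the actual game code):

// Hypothetical coloring: interpolate the earth's color based on drops collected.
var dropsCollected = 0;
var dropsToWin = 30;    // assumed number of drops needed for a fully green earth

function setup() {
  createCanvas(480, 480);
}

function draw() {
  background(135, 206, 235);
  drawEarth();
}

function drawEarth() {
  var barren = color(150, 120, 80);    // dry, brown planet
  var healthy = color(60, 170, 75);    // fully green planet
  var progress = constrain(dropsCollected / dropsToWin, 0, 1);
  noStroke();
  fill(lerpColor(barren, healthy, progress));
  ellipse(width / 2, height / 2, 200, 200);
}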

**Sorry for the typo in the Autolab submission, TAs, Golan, and Roger!

sketch

Screen Shot 2015-12-11 at 12.17.52 AM

Screen Shot 2015-12-11 at 12.19.21 AM

Screen Shot 2015-12-11 at 12.18.39 AM

FinalProjectVideo

Final Project – A Study in Population Visualization

This project was an exploration in visualizing data. Taking city populations from across the world, I mapped them onto a canvas and represented them with pulsing circles. Each circle’s size and opacity correspond to the size of the population. Hovering over a circle will display the name of the city, while clicking will produce a sound and display the population in the bottom right corner.
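
A small sketch of how a city’s population might be mapped to a circle’s size and opacity with p5’s map() (the data format and population range are illustrative, not the project’s actual dataset):

// Hypothetical data-to-circle mapping for the population visualization.
var cities = [
  { name: "Tokyo",      x: 420, y: 150, population: 13500000 },
  { name: "Pittsburgh", x: 120, y: 130, population: 305000 }
];

function setup() {
  createCanvas(600, 400);
}

function draw() {
  background(20);
  noStroke();
  for (var i = 0; i < cities.length; i++) {
    var c = cities[i];
    var size = map(c.population, 0, 15000000, 5, 80);     // bigger city, bigger circle
    var alpha = map(c.population, 0, 15000000, 60, 220);  // and more opaque
    fill(255, alpha);
    ellipse(c.x, c.y, size, size);
    if (dist(mouseX, mouseY, c.x, c.y) < size / 2) {
      fill(255);
      text(c.name, c.x + size / 2 + 5, c.y);              // show the name on hover
    }
  }
}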

Screen Shot 2015-12-10 at 11.51.14 PM

Final Project Video

sketch

Final Project

Link to project:
zainab.co/nstellations

For my final project, I created an interactive information visualization of zodiac constellations in the night sky.
Through this project, I was able to use a myriad of skills that I learned throughout the semester: objects to animate twinkling background stars in the night sky, shape primitives to draw the constellations, rotation to show the earth rotating around the stars, and functions like mousePressed to show different levels of information.
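
For example, a twinkling background star as an object in p5.js might look roughly like this (the flicker logic and names here are my own illustration, not the project’s code):

// Hypothetical Star object whose brightness flickers each frame.
function Star(x, y) {
  this.x = x;
  this.y = y;
  this.brightness = random(100, 255);

  this.update = function() {
    // Nudge the brightness randomly so the star appears to twinkle.
    this.brightness = constrain(this.brightness + random(-20, 20), 80, 255);
  };
  this.draw = function() {
    noStroke();
    fill(this.brightness);
    ellipse(this.x, this.y, 2, 2);
  };
}

var stars = [];

function setup() {
  createCanvas(600, 600);
  for (var i = 0; i < 200; i++) {
    stars.push(new Star(random(width), random(height)));
  }
}

function draw() {
  background(10, 10, 40);
  for (var i = 0; i < stars.length; i++) {
    stars[i].update();
    stars[i].draw();
  }
}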

Screen Shot 2015-12-11 at 12.29.39 PM

Screen Shot 2015-12-11 at 12.29.55 PM

Screen Shot 2015-12-11 at 12.30.16 PM

Screen Shot 2015-12-11 at 12.30.22 PM

“Final-Project” by Ashley Chen

For my project, I wanted to create a relaxing, aesthetically pleasing landscape while still making it interactive. I made an underwater window that displays subtle waves. There is also a series of buttons at the bottom of the screen that the user can click; each button does something different. There are also other hidden things users can do if they click around the screen.
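
One simple way subtle waves like these can be drawn in p5.js is with a slowly drifting sine curve; this is only a guess at the approach, with made-up amplitudes and speeds:

// Hypothetical water surface: a gently moving sine curve across the canvas.
function setup() {
  createCanvas(600, 400);
  noStroke();
}

function draw() {
  background(10, 60, 100);           // deep water behind the surface
  fill(40, 120, 180, 180);
  beginShape();
  vertex(0, height);
  for (var x = 0; x <= width; x += 10) {
    // Offset the phase by frameCount so the wave drifts over time.
    var y = 80 + 10 * sin(x * 0.03 + frameCount * 0.05);
    vertex(x, y);
  }
  vertex(width, height);
  endShape(CLOSE);
}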

When I began this project, I wasn’t really sure what to do; there were so many possibilities that I felt overwhelmed, so I made a broad proposal in order not to be restricted in what I made. I settled on this project because it was, at its core, an art project, which is what I enjoy creating the most. I also added user interaction to give my project another level of depth.


sketch

Final Project – Beneath The Strain

For my final project, I decided to add user-interactive effects to a video. This process was mainly influenced by Aaron Koblin, who does a lot of interactive video work.

The project was created using Adobe Premiere Pro and Three.js. Premiere Pro was used to create the video that would be interacted with, and Three.js was used to implement the effects.

“Three.js is a library that makes WebGL – 3D in the browser – easy to use.” (www.threejs.org/) It makes 3D rendering much faster and easier for those wanting to learn.
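
For a sense of how little setup that takes, here is a minimal Three.js scene (just the standard spinning-cube boilerplate, unrelated to this project’s actual effects):

// Minimal Three.js example: a spinning cube rendered with WebGL.
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.z = 3;

var renderer = new THREE.WebGLRenderer();
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var cube = new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
);
scene.add(cube);

function animate() {
  requestAnimationFrame(animate);
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
animate();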

The entire process took about a month. In early November, I started to take video of nature: scenes on the CMU campus that were absolutely beautiful but very much overlooked by those around me. I wanted to show that, although individuals may feel skeptical about their success at CMU, “beneath the strain” things will be okay. In fact, nature shows that things will be more than okay.

Personally, I think I did a good job of getting at this idea. However, I believe I could have been more effective by implementing smoother, more attractive sprites. This project is definitely something I will continue to work on and improve.

 

Early Sketch –

xapostol-FinalProject

 

Shots I Enjoyed The Most –

[Two stills from Beneath The Strain]

ThreeJS – http://threejs.org/