Omar Cheikh-Ali – Final Project

This is my final product. Inspired by popular rhythm games like DDR and Guitar Hero, beats drop from the top along paths to the buttons corresponding to Q, W, E, and R respectively. Points are awarded for hitting beats accurately, while points are deducted for misses (no points are deducted for letting a beat pass, however). In addition, I wanted to give the game an “Undertale” theme. Undertale is the massively popular indie game by Toby Fox that came out this past October. If you’ve played Undertale, you might recognize the characters in the background and some of the color references. Wait till the end of the game if you want to see the end screen! (It’s a treat!)

Me sucking at my own game (hey, it’s hard to use the GIF capture software while playing):
Click to view:


My process started with rough sketches, as seen in my proposal. My first step was to create two distinct classes, Beats and SweetSpots. This would allow me to create objects for both, get them to interact with each other, and assign them unique traits (timers, for example).
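To give a sense of the structure, here is a minimal sketch of what two such classes might look like. The names, fields, and numbers are my illustration of the idea, not the actual project code:

```javascript
// Hypothetical sketch of the two classes; fields are illustrative only.
class Beat {
  constructor(x, y, lane, speed) {
    this.x = x;        // horizontal position (fixed per lane)
    this.y = y;        // vertical position; beats fall downward
    this.lane = lane;  // 0..3, corresponding to Q, W, E, R
    this.speed = speed;
  }
  move() {
    this.y += this.speed; // advance toward the sweet spot each frame
  }
  isOffscreen(canvasHeight) {
    return this.y > canvasHeight;
  }
}

class SweetSpot {
  constructor(x, y, key) {
    this.x = x;
    this.y = y;     // fixed position near the bottom of the canvas
    this.key = key; // 'Q', 'W', 'E', or 'R'
  }
}
```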


The second step, after creating these objects, was to get them to interact with each other; I needed my beat objects to register when they were off the screen and when they were “hit” by the sweetspot buttons – basically, when they were no longer needed on the canvas. Originally I had used the filter() function, similar to the space invaders code, but in the end it was easier to use the shift() method to remove the beat at the front of each beat array. I had also originally given the Beat objects a “pitch” value, but that proved too broad a way to match beats to sweetspots – I later sorted beats into four separate arrays, one per sweetspot.
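The shift()-based cleanup works because beats in each lane array spawn in time order, so the oldest beat is always at index 0. A minimal sketch of that idea (representing each lane as an array of y-positions, which is a simplification of my actual objects):

```javascript
// Illustrative sketch: each lane is an array of beat y-positions,
// spawned in time order, so the oldest (lowest on screen) is at the front.
function cleanupLane(lane, canvasHeight) {
  // Remove beats from the front while they have fallen past the bottom edge.
  while (lane.length > 0 && lane[0] > canvasHeight) {
    lane.shift();
  }
  return lane;
}
```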


I had previously used a massive hit-detection function that ran through every beat in the beat array, checked which sweetspot button was hit, and then checked whether the beat was the right pitch and how close it was to the sweetspot (sounds complicated, right? It was). I eventually streamlined it into a switch statement with a counter for each beat array, so an index only advances when necessary, and I moved the distance calculation into its own separate function to further optimize the code.
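A rough reconstruction of that streamlined shape might look like the following. The sweet-spot height, hit window, and scoring formula are invented for illustration; only the switch-plus-counters structure comes from the write-up above:

```javascript
// Hypothetical sketch: one counter per lane marks the next unjudged beat;
// a separate distance helper scores how close that beat is to the sweet spot.
const SWEETSPOT_Y = 380;   // assumed sweet-spot height
const HIT_WINDOW = 40;     // assumed max distance that still counts as a hit

function scoreDistance(beatY) {
  const d = Math.abs(beatY - SWEETSPOT_Y);
  if (d > HIT_WINDOW) return 0;                  // too far: a miss
  return Math.round(100 * (1 - d / HIT_WINDOW)); // closer = more points
}

function handleKey(key, lanes, counters) {
  let i;
  switch (key) {           // map QWER to lane indices 0..3
    case 'Q': i = 0; break;
    case 'W': i = 1; break;
    case 'E': i = 2; break;
    case 'R': i = 3; break;
    default: return 0;
  }
  const beatY = lanes[i][counters[i]];
  if (beatY === undefined) return 0;   // no beat left in this lane
  const points = scoreDistance(beatY);
  if (points > 0) counters[i] += 1;    // only advance the index on a real hit
  return points;
}
```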

Finally, I went through the song in an audio editor and manually input each beat’s time, along with what I thought the button mapping should be – in other words, I created the beatmap by hand. There is likely software out there that can analyze audio and create beatmaps, but I haven’t found it yet, nor did I want to complicate the beatmapping process with software that might not meet my needs. (Ugh, there must be at least 100 lines of code that are just the beatmap.)
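A hand-made beatmap like this can be pictured as a long list of (time, lane) pairs that a spawner walks through in order. The timings below are made up, not taken from the actual song:

```javascript
// Illustrative beatmap fragment: times in seconds (invented values),
// lane 0..3 for Q/W/E/R.
const beatmap = [
  { t: 0.50, lane: 0 },
  { t: 1.00, lane: 2 },
  { t: 1.25, lane: 1 },
  { t: 2.00, lane: 3 },
];

// Return the lanes of every beat whose time has passed, plus the new index.
function dueBeats(beatmap, nextIndex, songTime) {
  const spawned = [];
  while (nextIndex < beatmap.length && beatmap[nextIndex].t <= songTime) {
    spawned.push(beatmap[nextIndex].lane);
    nextIndex += 1;
  }
  return { spawned, nextIndex };
}
```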

All in all, I’m pretty satisfied with the finished product. There are a few kinks, such as the “Q” and “W” buttons losing functionality until you intentionally miss a beat with them (then they mysteriously start working again), and the fact that I haven’t been able to match the music’s timing with the beat spawning (despite the beat timings being pulled directly from the audio being used – I think it may be a problem with how the program loads the sound). But it runs fairly smoothly and otherwise works as intended.


Omar Cheikh-Ali – Looking Outwards 11

In this final looking outwards, I want to take the opportunity to display some of the work that inspired me to attempt a rhythm game. I have always enjoyed playing rhythm games (though I’ve never been particularly good at them), like Guitar Hero. The first example I want to discuss is an old iPod Nano game called Phase.

Phase is a Guitar Hero-type game in which the player clicks and rotates the click wheel to hit targets – all in all, not a very innovative design. However, this was the first game of its kind I’d encountered that allowed you to use your own music. What made Phase great in my eyes was that I could import my own songs and play along to them.


The second examples I’d like to discuss are the rhythm games StepMania and OSU. StepMania is another DDR/Guitar Hero-type game, but it’s the first I’ve seen that actively moves targets and sweet spots around the screen. In any case, no matter how crazy the spawn locations get, the targets always move towards the sweet spots. OSU works differently: the player has to move the cursor to click on targets that spawn in time with the song. Perhaps I could integrate the “target to sweetspot” gameplay with the “cursor to target” gameplay? However, implementing a moving style of gameplay would mean less functionality for importing custom songs – each course would have to be made by hand to ensure it doesn’t get too hectic for the player. Perhaps I could use this for songs I deem “default,” with other songs being “custom,” to show how the code can analyze audio and create basic beatmaps.



Phase was released in 2007 for the iPod Nano and Classic by Harmonix. It was designed by Kasson Crooker and Chris Foster.

StepMania was released in 2001, developed by Chris Danford and Glenn Maynard for the PC.

OSU! was released in 2007 by Dean Herbert for both PC and mobile devices, and has lately also been ported to tablets (it is easier to play on a tablet than to ruin a mouse and keyboard).

Omar Cheikh-Ali – Final Project Proposal

For my final project, I would like to try to make a rhythm game. Not only do I want to make a rhythm game, I also want the game to be able to analyze imported sound and create levels from it (perhaps by analyzing a song’s amplitude or volume to find beats and draw targets at those times?). In addition, I want to see if I can play around with where targets spawn and where the sweet spots are placed (perhaps a bit of a reach, but I think it would make the game very interesting).

So for a step by step:
1. Be able to spawn falling targets, and implement a way to tell if the user has hit the button when the target is on the sweet spots.
2. Sync the drawing of targets to the beats of a song.
3. Add functionality to register beats in imported songs (?). Perhaps the user won’t be able to add songs, but for the sake of my including multiple songs this would be much easier.
4. Experiment with the changing of target/sweet spot locations for increased difficulty?
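Step 3 could plausibly be approached as naive onset detection: track the song’s amplitude over time and register a beat whenever it jumps sharply. A rough sketch of that idea in plain JavaScript, where the threshold and minimum gap are made-up, untuned parameters:

```javascript
// Naive onset detection over a precomputed amplitude envelope
// (one amplitude sample per frame). Parameters are guesses, not tuned.
function detectBeats(amplitudes, threshold = 0.3, minGap = 10) {
  const beatFrames = [];
  let lastBeat = -minGap; // allow a beat at the very start
  for (let i = 1; i < amplitudes.length; i++) {
    const jump = amplitudes[i] - amplitudes[i - 1];
    // Register a beat on a sharp rise, but not too soon after the last one.
    if (jump > threshold && i - lastBeat >= minGap) {
      beatFrames.push(i);
      lastBeat = i;
    }
  }
  return beatFrames;
}
```

Real music would need something smarter (filtering, adaptive thresholds), but even this crude version shows why amplitude alone might be workable for finding strong beats.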


Omar Cheikh-Ali – Looking Outwards 10

For this looking outwards, I would like to write about Madeline Gannon, a researcher, designer, and educator at our very own Carnegie Mellon University. I had the privilege to study under her in my freshman year, participating in her intro to digital media course in the architecture program. She has done a lot of work in integrating design, robotics, computer science, and human-computer interaction, and is currently pursuing a PhD in Computational Design from CMU.

Her website is at:

The project I’d like to talk about specifically is Tactum, an augmented modeling tool that lets a user design 3D-printed wearables in real time. It uses depth sensing and projection mapping to analyze touch gestures on the user’s skin and convert them into ready-to-print, ready-to-wear forms. This system, while incredibly versatile, can be slightly imprecise in its modeling because form is dictated by natural movement; the tolerance of the system is around 20 mm. It’s incredibly interesting, however, and provides insight into the future of wearable forms.

Tactum was presented at SXSW Interactive 2015 and CHI 2015. It was also featured on NOTCOT, Wired, Motherboard, The Creators Project, Gizmodo, Dezeen, Fast Company, 3D Printing Industry, and Leap Motion.

Omar Cheikh-Ali – Project 9

In this project I wanted to experiment with having the turtle react to the surroundings it passes through. So I wrote this turtle to react to the boundaries of a box that can be changed with a click of the mouse: while it is inside the box, the turtle turns right by a random angle between 0 and 180 degrees; when it gets outside the box, it turns towards a point within 10px above or below the center and arcs outside the box.
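The inside/outside decision can be sketched as a simple rule applied each step. The turtle representation and numbers here are my own illustration, not the project code, and the 10px jitter around the center is left out for brevity:

```javascript
// Illustrative turtle step rule: inside the box, turn right by the supplied
// random angle in [0, 180); outside, compute the turn needed to face a point
// at (roughly) the box's center.
function pickTurn(turtle, box, randomAngle) {
  const inside =
    turtle.x >= box.x && turtle.x <= box.x + box.w &&
    turtle.y >= box.y && turtle.y <= box.y + box.h;
  if (inside) {
    return randomAngle; // caller supplies something like random(0, 180)
  }
  const cx = box.x + box.w / 2;
  const cy = box.y + box.h / 2; // real sketch jitters this by up to 10px
  const target = Math.atan2(cy - turtle.y, cx - turtle.x) * 180 / Math.PI;
  return target - turtle.heading; // turn needed to face the center
}
```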

Click to start the iteration.


Omar Cheikh-Ali – Project 8

In this project, learning about brightness values, I immediately thought of ASCII art – pixel brightness values (0–255) can be mapped onto a range of ASCII characters – and of the online generators that translate an image’s pixel data into corresponding text. In addition, I wanted to provide a contrast and show the base image for comparison. It runs a little slowly; I’m not quite sure how to optimize it yet.
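The core mapping can be sketched like this. The character ramp below is a conventional dark-to-light example, not necessarily the one my sketch uses:

```javascript
// Map a pixel brightness (0-255) to a character from a dark-to-light ramp.
// Dense glyphs stand in for dark pixels; a space stands in for near-white.
const RAMP = '@%#*+=-:. ';

function brightnessToChar(brightness) {
  // Scale 0..255 onto the ramp's indices, clamped to the last character.
  const i = Math.min(
    RAMP.length - 1,
    Math.floor((brightness / 256) * RAMP.length)
  );
  return RAMP[i];
}
```

Looping this over every pixel row by row produces the text image; sampling every few pixels instead of every pixel is one easy way to speed it up.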



Omar Cheikh-Ali – Looking outwards #8

The looking outwards post that piqued my interest this week was written by Nina. In it, she discusses how the school of architecture at Princeton University is researching the strength of gypsum shells through the use of computational algorithms.

The school of architecture is using an ABB 7600 robot to repeatedly break, analyze, and repair the provided gypsum shells. Through the process of breaking, the robot produces cracks in the shell, which it then analyzes and selectively glues in order to create stronger bonds in the material than were present before the break. The hypothesis is that by the end of the process, the gypsum shell will have stronger load-bearing properties than the original, unbroken shell.

This project is interesting to me because it presents a previously unexplored method of producing stronger building materials through computational algorithms. In addition, it raises the question of the state of the gypsum: at what point is the shell no longer composed mostly of gypsum, and instead mostly of glue? Would it be more effective to create a shell out of the glue instead, or do the properties of the gypsum combined with the glue make for a superior material?

Project 07 – Omar Cheikh-Ali

With this project I wanted to do a perspective view, as if you were moving through 3D space: walking on a pathway with trees on either side towards the horizon. Granted, this is a cartoonish, exaggerated perspective, but it was still a challenge animating three sets of objects. In the end I couldn’t figure out how to animate the third set, and since the console was returning “cannot read property ‘move’ of undefined,” I assume something weird was going on with my move and display functions within Pathway.
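For what it’s worth, that error message usually means a method was called on an array slot holding undefined. A hypothetical sketch of the common cause and a defensive fix (it avoids the crash, though not the underlying bug of never filling the slot):

```javascript
// "cannot read property 'move' of undefined" typically comes from calling
// a method on an array element that was never assigned.
const trees = [];
trees[2] = { move: () => 'ok' }; // indices 0 and 1 are left undefined

function moveAll(objs) {
  let moved = 0;
  for (let i = 0; i < objs.length; i++) {
    // Guard against holes in the array before calling the method.
    if (objs[i] !== undefined) {
      objs[i].move();
      moved += 1;
    }
  }
  return moved;
}
```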


Omar – Looking outwards #7

Burak Arikan is a contemporary artist from Turkey who creates his works using network structures and dynamics to explore issues in politics and economics. Using abstract machinery, he takes current news and issues in capitalist society as input and generates network maps, performances, and predictions, making relationships among different sources visible. Arikan is currently adjunct faculty at the Interactive Telecommunications Program, Tisch School of the Arts, New York University.


I find Arikan’s work to be extremely compelling because it displays explicit relationships between different inputs, and through those connections, it reveals a greater pattern. In addition, the display of such networks can become artistic, almost in a generative manner, since the relationships between different inputs are almost random in nature – you never quite know what you’ll get.


Arikan’s presentations during his workshops are also very interactive: with the help of his generative software, he is able to create network webs on the fly to demonstrate to viewers how he creates his works. He also explains where the sources come from before he inputs the data, and how they might relate to each other.

His website is here:

Omar – Project 05 Curves

With this investigation into curves, I decided to pick the Devil’s Curve – and if we’re being entirely honest, I first looked into the Devil’s Curve because of the name. I used two different display methods for it: one with a continuous line, and another with points represented by small ellipses. In addition, the mousePressed function controls the detail of the curve, ranging from a very blocky display to a smooth one. I’m not sure why, but for some reason the curve isn’t displayed until the user clicks once.
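For reference, the Devil’s Curve is the set of points satisfying y²(y² − a²) = x²(x² − b²), which in polar form gives r² = (a²·sin²t − b²·cos²t)/(sin²t − cos²t). A sketch of sampling points from that form (my own illustration, skipping angles where no real point exists):

```javascript
// Sample points on the Devil's curve y^2 (y^2 - a^2) = x^2 (x^2 - b^2)
// via its polar form r^2 = (a^2 sin^2 t - b^2 cos^2 t) / (sin^2 t - cos^2 t).
// Angles where r^2 is negative, or the denominator vanishes, are skipped.
function devilsCurvePoints(a, b, steps) {
  const pts = [];
  for (let k = 0; k < steps; k++) {
    const t = (2 * Math.PI * k) / steps;
    const s2 = Math.sin(t) ** 2;
    const c2 = Math.cos(t) ** 2;
    const denom = s2 - c2;
    if (Math.abs(denom) < 1e-9) continue; // avoid division by zero
    const r2 = (a * a * s2 - b * b * c2) / denom;
    if (r2 < 0) continue;                 // no real point at this angle
    const r = Math.sqrt(r2);
    pts.push({ x: r * Math.cos(t), y: r * Math.sin(t) });
  }
  return pts;
}
```

Raising `steps` is exactly the kind of detail knob the mousePressed function could adjust, going from blocky to smooth.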