yalbert-nannon-Automaton

For this project, we originally intended to make a Google Home-like device that reacted to speech input. We thought that if we used 2-3 microphones, we could triangulate the location of the user's voice. We wanted to use this information to make a robot that turned its head toward the person speaking and replied with 3 pre-recorded sound bites. Unfortunately, we really struggled to get useful input from the microphones through the Arduino and eventually had to shift directions.

We then started exploring how we could use the tools in front of us in more interesting ways. In the physical computing lab where we were working, we played around with K'nex, pipe cleaners, and light bulbs.

The light bulb was another challenge for us, as we couldn't quite figure out how to power it correctly.

Ultimately, we focused on the servo, stepper motor, and IR sensor to see what we could come up with. The K'nex pieces were especially good for creating interesting motion, since they can mesh together like gears and be driven by a single motor. We decided to lean into these and create an interactive creature built out of K'nex. As you approach the creature, it flaps its wings faster, either out of fear or excitement at a new onlooker; a rough sketch of this behavior follows.
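The sketch below is a minimal illustration of that interaction, assuming an analog IR proximity sensor and a single hobby servo driving the K'nex wing linkage: a closer reading shortens the delay between servo steps, so the wings flap faster. The pin numbers, sweep range, and sensor response direction are assumptions for illustration, not the project's actual wiring.

```cpp
#include <Servo.h>

// Pin assignments are assumptions for illustration; adjust to the actual wiring.
const int IR_PIN = A0;     // analog IR proximity sensor
const int SERVO_PIN = 9;   // servo driving the K'nex wing linkage

Servo wingServo;
int angle = 20;            // current wing position in degrees
int direction = 1;         // +1 sweeping up, -1 sweeping down

void setup() {
  wingServo.attach(SERVO_PIN);
}

void loop() {
  // For a reflectance-style IR sensor, a higher reading roughly means a closer onlooker.
  // Invert the mapping if the sensor reports distance instead.
  int proximity = analogRead(IR_PIN);            // 0-1023

  // Closer onlooker -> shorter delay between steps -> faster flapping.
  int stepDelay = map(proximity, 0, 1023, 30, 2);

  // Sweep the servo back and forth between 20 and 160 degrees to flap the wings.
  angle += direction * 5;
  if (angle >= 160 || angle <= 20) {
    direction = -direction;
  }
  wingServo.write(angle);
  delay(stepDelay);
}
```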