Capstone

A Project in the Clouds

I worked with a fellow student and friend, Kadejha Jones, to build my capstone project. It consisted of an Arduino, some ARGB LED strips, and a microphone, neatly bundled into a cloud form. I then wrote the C++ code that turns the cloud into a visual representation of what is happening around it.

The microphone listens to the world around it, and my code dissects that wash of sound into frequency ranges. Each range is analyzed to determine its intensity, and that data is sent to a section of the lights, giving a real-time breakdown of the sounds around us.
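
To give a rough sense of how that pipeline fits together, here is a minimal sketch of the idea rather than our actual capstone code. It samples the microphone on an analog pin, estimates the energy in a handful of frequency bands with the Goertzel algorithm (one simple way to pull individual frequencies out of a signal), and lights each section of the strip in proportion to its band's intensity. The pin numbers, band frequencies, scaling values, and the FastLED library are all assumptions for illustration.

    // Illustrative sketch only -- not the capstone code. Assumes a WS2812B-style
    // ARGB strip driven by FastLED and an analog electret microphone module.
    #include <FastLED.h>
    #include <math.h>

    const int MIC_PIN = A0;          // assumed analog pin for the microphone
    const int LED_PIN = 6;           // assumed data pin for the strip
    const int NUM_LEDS = 60;
    const int NUM_BANDS = 5;         // one strip "section" per band
    const int SAMPLES = 128;         // samples per analysis window
    const float SAMPLE_RATE = 8000;  // Hz; approximate, ignores analogRead time

    CRGB leds[NUM_LEDS];
    // Example center frequencies (Hz) for the bands -- pick these for your sounds.
    const float bandHz[NUM_BANDS] = {200, 500, 1000, 2000, 3500};

    int window[SAMPLES];

    // Goertzel algorithm: squared magnitude of one frequency in a sample window.
    float goertzel(const int *x, int n, float freq) {
      float k = (n * freq) / SAMPLE_RATE;
      float coeff = 2.0f * cos((2.0f * PI / n) * k);
      float s0 = 0, s1 = 0, s2 = 0;
      for (int i = 0; i < n; i++) {
        s0 = x[i] + coeff * s1 - s2;
        s2 = s1;
        s1 = s0;
      }
      return s1 * s1 + s2 * s2 - coeff * s1 * s2;
    }

    void setup() {
      FastLED.addLeds<WS2812B, LED_PIN, GRB>(leds, NUM_LEDS);
    }

    void loop() {
      // Grab one window of microphone samples, centered around zero.
      for (int i = 0; i < SAMPLES; i++) {
        window[i] = analogRead(MIC_PIN) - 512;
        delayMicroseconds(1000000UL / (unsigned long)SAMPLE_RATE);
      }

      // Light each section of the strip in proportion to its band's intensity.
      int perSection = NUM_LEDS / NUM_BANDS;
      for (int b = 0; b < NUM_BANDS; b++) {
        float energy = goertzel(window, SAMPLES, bandHz[b]);
        uint8_t level = constrain(energy / 5000.0f, 0, 255);  // crude scaling; tune to taste
        for (int i = 0; i < perSection; i++) {
          leds[b * perSection + i] = CHSV(b * 50, 255, level);
        }
      }
      FastLED.show();
    }
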

This data could be used for countless purposes, since the code breaks what it hears down into simple values (mainly the intensity of each frequency range), but it can also be made more targeted. You can narrow the frequency ranges down quite far, to within a few hertz, allowing the code to listen for specific sounds. The main limitation here was the Arduino Uno, which struggled to keep tabs on more than about 20 frequency ranges.
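
As an example of that targeting, here is a tiny variation on the sketch above: instead of spreading the strip across several bands, it watches one narrow frequency and reports whether it is present. The 1 kHz target and the threshold are made-up values you would tune for the sound you care about.

    // Hypothetical single-tone detector built on the goertzel() helper above.
    const float TARGET_HZ = 1000.0f;   // assumed frequency of interest
    const float THRESHOLD = 20000.0f;  // assumed energy threshold, tune by ear

    bool toneDetected(const int *x, int n) {
      return goertzel(x, n, TARGET_HZ) > THRESHOLD;
    }

The catch is the sample window: with a window of N samples at sample rate fs, frequencies closer together than roughly fs/N blur into each other, so narrowing a range toward a few hertz means capturing a longer window before each analysis.
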

Since I spent most of the project never touching the physical hardware, I don't have any pictures of it in action myself; I mostly wrote the code from a distance. I'll let my partner handle the visual portion at the link below.