Embedded Systems

Course Intro

Welcome to one of the most stressful courses I have ever taken: a class I worried I would be unable to complete due to a lack of documentation and changing hardware requirements. The professor was new to teaching the subject, and overall it was an uphill battle to achieve a functional result.

A Minor Setback

The most straightforward part of this whole project was the assembly. We were given an array of parts from a kit, with no instructions for how they went together and no actual parts list. With some help from my partner and some Google searches, we managed to find what looked like a match for the parts we had (shown to the right). With that for reference, we determined that we were missing the sweep motor for the acoustic sensor, meaning the robot would be mostly blind.

Our Solution

Since our robot was unable to look around at its surroundings, we had to adapt. Rather than leave it a cyclops, we felt that stereo vision would do the trick. Now the robot only had blind spots straight ahead and in the peripherals, so slightly better than before. Plus, the new look was great inspiration for a nickname for our friendly robot: Feldman.

Now for the Fun Bit

To use the saying horribly wrong: "You can lead a horse to water, but you can't make it do rocket surgery." In this case, just building the robot is akin to bringing the horse to water, while making it actually do something is rocket surgery. Most of the semester had been spent trying to get even a single wheel motor to spin on command, and even that had been an uphill battle due to a startling lack of documentation. Getting any further seemed borderline impossible. It wasn't until 2 am on the morning of the final demonstration that things started to work.

I found a similar-ish project online utilizing the same chipset and reverse engineered their code to figure out how to make my own work. I built code objects that could be easily called to ping the acoustic sensors, throttle the motors up and down, and count the rotations of the wheels. These functions combined to create a robot that could drive headlong into a wall, but it was progress.
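Since the original code is only in the attached PDF, here's a rough sketch of what two of those building blocks might look like. Everything here is hypothetical: the struct names, the duty range, and the ramp step are mine, and the actual hardware calls are stubbed out as comments.

```c
/* Hypothetical sketch of two of the building blocks described above:
   throttling a motor up and down, and counting wheel rotations.
   Hardware I/O is stubbed out; only the bookkeeping logic is real. */
#include <stdint.h>

/* Motor throttle: ramp the PWM duty toward a target in small steps
   so the robot doesn't lurch. motor_update() runs once per control tick. */
typedef struct {
    int16_t duty;    /* current PWM duty, 0..255 */
    int16_t target;  /* requested duty           */
    int16_t step;    /* max change per tick      */
} motor_t;

void motor_set(motor_t *m, int16_t target) { m->target = target; }

void motor_update(motor_t *m) {
    int16_t diff = m->target - m->duty;
    if (diff > m->step)       m->duty += m->step;
    else if (diff < -m->step) m->duty -= m->step;
    else                      m->duty  = m->target;
    /* pwm_write(m->duty);  -- hardware call omitted */
}

/* Wheel counter: bumped from an encoder interrupt on each pulse. */
typedef struct { volatile uint32_t ticks; } wheel_t;

void wheel_isr(wheel_t *w) { w->ticks++; }
```

The ramped throttle is what lets a later routine "ramp up the speed and stop faster" without stalling the motors.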

Trial and Error

Fueled by a healthy mix of anger, stubbornness and coffee, I stayed up late writing and rewriting code to make the robot do anything reactive. The first real break was when I wrote a program that was supposed to stop the robot when an object got too close, and it actually stopped.

From that moment on I worked in a fervor, tweaking existing code, throwing away and creating new subroutines to make the robot actually useful. I converted the acoustic sensor output to feet, only to find that the value overflowed once you got within a certain distance. I tried to expand the sensors' cone of vision so the robot wasn't so blind, with no luck. I made it ramp up to speed and stop faster to avoid collisions. This got it to a point where all the functions kind of worked, but were not seamless.
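For anyone curious what that conversion-plus-overflow looks like, here's a sketch. It assumes an HC-SR04-style sensor where distance in inches is roughly the echo time in microseconds divided by 148 (so hundredths of a foot = µs × 100 / 1776); the scaling constants and the exact failure mode of the original code are my guesses, not taken from the attached program.

```c
/* Echo-time-to-feet conversion for an HC-SR04-style acoustic sensor.
   1 foot of range is roughly 1776 us of echo time (148 us per inch).
   Constants are illustrative assumptions, not the robot's real values. */
#include <stdint.h>

/* Naive version: on a 16-bit micro, echo_us * 100 wraps past 65535
   (the cast forces the same wrap on a host machine), so readings turn
   into garbage -- the kind of overflow described above. */
uint16_t echo_to_feet_x100_buggy(uint16_t echo_us) {
    return (uint16_t)(echo_us * 100u) / 1776u;   /* 16-bit intermediate wraps */
}

/* Fixed version: widen the intermediate to 32 bits before multiplying. */
uint16_t echo_to_feet_x100(uint16_t echo_us) {
    return (uint16_t)(((uint32_t)echo_us * 100u) / 1776u);
}
```

The fix is a one-character idea (a wider intermediate type), but finding it at 1 am is another matter.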

The final stroke of genius occurred at 2 am, and my fiancée (who stayed up with me so I would have someone to bounce ideas off of) and I celebrated so much that we woke up our roommate. I made the robot drunk.

The Magic Moment

I know that previous sentence seems weird, but it was really the only way I could describe how the robot had to behave. Since it was missing the motor to swing the acoustic sensor, the robot was set up like a human in a neck brace: since they can't turn their head, they have to turn their whole body instead. I programmed the robot to swerve side to side by slowing and accelerating the wheel motors in an alternating pattern, creating a full 180 degree field of view for the robot.
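The swerve itself can be sketched as a simple alternating duty schedule: every half-period, swap which wheel runs slow so the body (and the fixed sensors) sweeps the other way. The duty values and period here are invented for illustration.

```c
/* Sketch of the "drunk" swerve: alternately slow one wheel, then the
   other, so the fixed sensors sweep across the hallway. CRUISE_DUTY,
   SLOW_DUTY, and HALF_PERIOD are made-up illustrative values. */
#include <stdint.h>
#include <stdbool.h>

#define CRUISE_DUTY 200  /* outside-wheel PWM duty        */
#define SLOW_DUTY   120  /* inside-wheel duty mid-swerve  */
#define HALF_PERIOD 40   /* control ticks per half swing  */

void swerve_duties(uint32_t tick, int16_t *left, int16_t *right) {
    bool lean_left = (tick / HALF_PERIOD) % 2 == 0;  /* which half-swing? */
    *left  = lean_left ? SLOW_DUTY   : CRUISE_DUTY;
    *right = lean_left ? CRUISE_DUTY : SLOW_DUTY;
}
```

Called once per control tick, this yields the side-to-side weave that stands in for a sensor sweep motor.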

Each acoustic sensor was set to ping simultaneously by using a breadboard to duplicate the ping signal, and the echoes were processed individually so either sensor could interrupt the swerving if it detected anything. Once this happened, the robot would begin what I dubbed "Evasive Maneuvers." Basically, it would swing out in a wide arc around the obstacle, using the wheel counters to track its current angle and help straighten out its trajectory on the other side. If it determined that the obstacle was still blocking it, it would keep turning until the sensor registered clear or it had rotated past 90 degrees, marking the end of the hallway.
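Tracking the angle from the wheel counters is standard differential-drive odometry: the difference in distance traveled by the two wheels, divided by the wheelbase, gives the heading change in radians. A minimal integer-math sketch, with placeholder dimensions that are not the real robot's geometry:

```c
/* Differential-drive heading estimate from wheel encoder counts, plus
   the "Evasive Maneuvers" exit test (sensor clear, or rotated past 90
   degrees). All dimensions are placeholder assumptions. */
#include <stdint.h>

#define MM_PER_TICK   5    /* wheel travel per encoder tick   */
#define WHEELBASE_MM  150  /* distance between the two wheels */

/* Heading change in degrees: (right - left distance) / wheelbase,
   converted from radians with 180/pi (pi approximated as 314/100). */
int32_t heading_deg(uint32_t left_ticks, uint32_t right_ticks) {
    int32_t diff_mm = ((int32_t)right_ticks - (int32_t)left_ticks) * MM_PER_TICK;
    return diff_mm * 180 / (WHEELBASE_MM * 314 / 100);
}

/* Keep turning until the sensor is clear or we've swung past 90 deg. */
int should_stop_turning(int sensor_clear, int32_t heading) {
    return sensor_clear || heading > 90 || heading < -90;
}
```

Integrating the counter difference like this is also what lets the robot straighten back out to its original heading after clearing the obstacle.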

The first time the robot actually made it around the object I absolutely lost my cool; between the caffeine and a horrible lack of sleep, I was very excited.

Improvements

I spent at least another hour tinkering with the code to make it run better and remove downtime. I wanted this robot to speed its way down the hallway, to offset time lost to swerving and just because I could. So now, the robot stopped for nothing and no one. Instead of stopping a wheel to turn, it would simply reduce it to around half speed, so the turn would cover more distance but happen faster. To accommodate this, I set the detection distance for objects further out to give it more time. Now any and all course corrections happened on the fly instead of requiring a stop.
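The on-the-fly turn reduces to choosing wheel duties: cut the inside wheel to roughly half the outside wheel's duty rather than zero. A tiny sketch (the half-speed ratio is the only detail taken from the text; the function shape is mine):

```c
/* Turning without stopping: instead of halting the inside wheel,
   run it at about half the cruise duty so the robot carves a wider
   but faster arc. Function shape and values are illustrative. */
#include <stdint.h>

void turn_duties(int16_t cruise, int turn_left,
                 int16_t *left, int16_t *right) {
    *left  = turn_left ? cruise / 2 : cruise;
    *right = turn_left ? cruise     : cruise / 2;
}
```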

I also found that making the robot sway, ironically, made it run in a straighter line than not. No two motors we were given were equal, so 100% throttle on one was more or less than on another. Trying to sync them to the same speed was seemingly impossible, so using the difference in the wheel counters to create a controlled sway produced an even center line for the robot's path. An unexpected result, but a welcome one.
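This self-straightening trick is essentially a proportional controller on the encoder difference: treat "left wheel ahead of right" as an error and nudge each duty the opposite way, so the robot oscillates gently around a straight center line instead of drifting. A minimal sketch, with a made-up gain:

```c
/* Proportional correction on the wheel-counter difference: if one
   wheel has traveled further, slow it and speed the other, producing
   a controlled sway around a straight line. KP is an assumed gain. */
#include <stdint.h>

#define KP 2  /* duty counts of correction per tick of error */

void straighten(uint32_t left_ticks, uint32_t right_ticks,
                int16_t *left_duty, int16_t *right_duty) {
    int32_t err = (int32_t)left_ticks - (int32_t)right_ticks;
    /* left ahead (err > 0): slow left, speed right -- and vice versa */
    *left_duty  -= (int16_t)(KP * err);
    *right_duty += (int16_t)(KP * err);
}
```

With mismatched motors the error never settles at zero for long, which is exactly what produces the sway, while the average path stays centered.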

The Final Result

So, now for what you really came for: the robot actually doing robot things. I don't have much footage, as it was late and after the demonstration I had to give the robot back to the school. I do have at least one video, though, which I will attach here. I'll also attach the code that made it all work, in case anyone down the line runs into this issue on their own project, and so you can judge my code for this mess of a program.

Overall the project was messy and fraught with documentation and explanation issues. In the end, only a select handful of groups had mostly functional robots ready for submission, each using different means of object detection and trajectory correction. It was a bit of a logistical nightmare, but a great learning experience that made me a much more independent and resourceful programmer. The video below is not a great representation of the robot working, but it is the best I have, and it was more than cause for celebration after everything I went through. Hope you enjoyed.

RobotBrain.pdf