EEL5665L Website for Sean Frucht

Weekly Report 9

This week we were required to demonstrate obstacle avoidance. Since I am still developing a method for gait generation on my biped, I could not demo with it, so to meet this week's requirement I built a smaller quadruped robot that I code-named "Kitten". Kitten has a leg design similar to CAT from last semester, and because the two robots are similar in structure I was able to modify Seon Kim's code enough to get Kitten walking. Kitten has one IR sensor on the front, which is calibrated at the robot's initialization and is used to detect stationary objects. The obstacle avoidance code I wrote for Kitten is very simple: after calibration, Kitten walks forward while displaying information on the LCD that explains where he is. This information comes in the form of three statements, "I see far away", "That thing is getting closer", and "OMG TOO CLOSE". The robot walks forward until it senses an object that is far too close for comfort, then stops and shakes in fear of the big scary obstacle. I did not want to spend too much time playing with Kitten, so I have not yet modified CAT's turning code; because of this, Kitten can only walk forward.
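To give an idea of the logic, here is a rough sketch of Kitten's loop. The helper names (ir_read, lcd_print, step_forward, shake) and the threshold values are placeholders for this write-up, not the actual functions or numbers in my code.

#include <stdint.h>

/* Hypothetical helpers -- stand-ins for the real sensor, LCD, and gait code. */
extern uint16_t ir_read(void);
extern void lcd_print(const char *msg);
extern void step_forward(void);
extern void shake(void);

#define CLOSE_THRESHOLD  60   /* placeholder values, tuned on the real robot */
#define PANIC_THRESHOLD 120

static uint16_t ir_baseline;  /* captured once at initialization */

void kitten_init(void)
{
    ir_baseline = ir_read();  /* calibrate the front IR sensor */
}

void kitten_walk(void)
{
    for (;;) {
        int16_t d = (int16_t)(ir_read() - ir_baseline);

        if (d < CLOSE_THRESHOLD) {
            lcd_print("I see far away");
            step_forward();
        } else if (d < PANIC_THRESHOLD) {
            lcd_print("That thing is getting closer");
            step_forward();
        } else {
            lcd_print("OMG TOO CLOSE");
            shake();          /* stop and shake at the big scary obstacle */
            return;
        }
    }
}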

As far as my biped "Ender" goes, I developed a much nicer obstacle avoidance routine, which is only missing finished walking functions to get him moving. This code is much more robust and will compensate for situations such as walking down an alley, along with other simple tricks. The code is written assuming close to 90 degree turns, but it can easily be modified to use smaller turns, which would increase its functionality. I also generated a very extensive flow chart that outlines in detail the behaviors I plan to implement in Ender; as soon as I find a readable way to print the massive flow chart, I will include a copy in my report. I'm currently working on learning enough about inverse kinematics to understand the outline Professor Crane created for me. I have been watching video lectures from EML6281 and following along fairly well. I hope to have this implemented fairly soon, but I am not sure what hard deadline to set for myself yet. In the meantime, Seon wants to get together with me and try to write a machine learning algorithm that will allow Ender to learn to walk on his own. I don't know anything about machine learning, so whether this is possible is a mystery to me; however, Seon's enthusiasm makes me believe it might be.
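For anyone curious, the 90-degree-turn avoidance logic is roughly along these lines. This is only a sketch built on hypothetical helpers (front_blocked, step_forward, and the turn functions); the real routine is still waiting on the finished walking functions, and the dead-end handling is what lets it cope with things like the end of an alley.

#include <stdbool.h>

/* Hypothetical helpers for the sensor check and the (unfinished) gait code. */
extern bool front_blocked(void);
extern void step_forward(void);
extern void turn_left_90(void);
extern void turn_around(void);
extern void turn_right_90(void);

/* One decision step: keep walking if clear, otherwise try left, then right,
 * then treat it as a dead end and head back the way we came. Swapping the
 * 90 degree turns for smaller ones would make the behavior less blocky. */
void avoid_step(void)
{
    if (!front_blocked()) {
        step_forward();
        return;
    }

    turn_left_90();           /* blocked ahead: check to the left */
    if (!front_blocked()) {
        step_forward();
        return;
    }

    turn_around();            /* now facing 90 degrees right of the original heading */
    if (!front_blocked()) {
        step_forward();
        return;
    }

    turn_right_90();          /* dead end (e.g. an alley): walk back out */
    step_forward();
}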

Unrelated to my robots, I wrote a tutorial and some sample functions for anyone using the dual motor drivers given out in lab. Hopefully these functions will help the people who are still struggling with their drive motors finish before the end of the day and the obstacle avoidance deadline.
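The sample functions boil down to something like the sketch below. This is not the exact code from the tutorial; it assumes a typical dual H-bridge style driver with two direction pins and one PWM pin per motor, and pin_write/pwm_write are hypothetical wrappers for whatever pin and PWM routines your board already provides.

#include <stdint.h>
#include <stdbool.h>

/* Hypothetical pin assignments -- substitute your own wiring. */
enum { LEFT_IN1, LEFT_IN2, LEFT_PWM, RIGHT_IN1, RIGHT_IN2, RIGHT_PWM };

extern void pin_write(int pin, bool level);    /* set a direction pin */
extern void pwm_write(int pin, uint8_t duty);  /* 0..255 duty cycle */

/* Drive one motor: positive speed is forward, negative is reverse. */
static void motor_set(int in1, int in2, int pwm, int16_t speed)
{
    bool forward = (speed >= 0);
    pin_write(in1, forward);
    pin_write(in2, !forward);
    pwm_write(pwm, (uint8_t)(forward ? speed : -speed));
}

/* Set both drive motors at once, -255..255 per side. */
void drive(int16_t left, int16_t right)
{
    motor_set(LEFT_IN1,  LEFT_IN2,  LEFT_PWM,  left);
    motor_set(RIGHT_IN1, RIGHT_IN2, RIGHT_PWM, right);
}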