Monday, February 20, 2012

Two-wheeled, Self Balancing Robot Platform with OpenCV Path Finding: Phase I

This year my science fair project was part of a much bigger project.  You can read about it below.  I hope you like it.
-widdakay
Purpose
The purpose of this project is to create a two wheeled balancing robot platform. This is the first phase of a project to build a robot which can autonomously follow hiking trails while recording its position, and taking 360º photos to create Google Street View-style immersive photos of hiking trails.


Background
For a project like this you need a very maneuverable platform.  A Segway-style two-wheeled base is perfect for that: it can turn on a dime and can handle more uneven terrain than a normal 4-wheeled platform.  This design also helps keep the camera level for good panoramic pictures.  Also it is just plain cooler.
An easy way to measure tilt is to measure the direction of gravity.  When the robot is nearly still, the acceleration it senses is mostly gravity, so you can figure out how gravity is split between the axes and infer the tilt angle.  To measure that, you need an accelerometer.
A couple of months ago I built a device to record the flight path of a rocket, mainly the apogee time and height and the maximum acceleration.  It also recorded temperature, humidity, and light level.  From that work I learned about sensor fusion.  Sensor fusion works around the weaknesses of individual sensors.  For example, gyros accurately show change over short periods, but drift over time: if a gyro reads 1˚/sec counterclockwise when the robot is really not turning, that error accumulates until the robot thinks it is on its side when it is actually straight up and down.  Accelerometers can sense gravity and therefore tilt, but are affected by movement.  Magnetometers do not react fast enough to be reliable for my use.  A device that combines all these sensors into one orientation sensor is called an Inertial Measurement Unit (IMU).  I did not use a full IMU for my 3rd version because it made my code more complex and I was just trying to debug my balancing routine.  The final system will require a full IMU in order to track the orientation of the camera.
This IMU technology has been made affordable by low-cost MEMS sensors.  I was able to find a board from Sparkfun with a 3D accelerometer, 3D gyro, and 3D magnetometer that made my project possible.
Now that we know what angle the robot is at, it is time to get it balancing.  If you run the motors forward when it tilts forward and backward when it tilts backward, it may balance.  But you are not done yet!  With only that rule, a slow forward lean drives the motors just as hard as a crash toward the ground.  It works much better if you also add in the value from the gyro, so the faster the robot falls, the harder the motors push back.  Combining a tilt term and a tilt-rate term like this is the idea behind a PID controller (Proportional Integral Derivative controller).
Inertial Measurement Unit


Materials
Rev 1
    • Vex robotics parts
      • 2 motors, 2 wheels
      • Assorted metal parts
    • Sparkfun 9DOF sensor stick
    • Arduino Fio 3.3v
    • Breadboard
    • Capacitors
    • Resistors
    • Wires
Rev 2
    • Vex robotics parts
      • Assorted metal parts
    • 2 roomba motors
    • Ardweeny arduino
    • 3.3v voltage regulator
    • Resistors
    • Wires
    • Capacitors
    • LEDs
    • 2 TA8080K motor controllers
    • Sparkfun 9DOF sensor stick
    • 12 volt battery
    • Breadboard
Rev 3
    • Lego parts (Not sturdy)
    • 2 NXT motors
    • 2 wheels
    • Ardweeny arduino
    • 3.3v voltage regulator
    • Breadboard
    • Resistors
    • Wires
    • Capacitors
    • LEDs
    • 2 TA8080K motor controllers
    • Sparkfun 9DOF sensor stick
    • Sparkfun bluetooth mate (for debug)
    • 9 volt battery (dies in an instant)
Procedure
Rev 1:
The process began with selecting parts.  My first robot used a 3.3v Arduino Fio, which I programmed in C++ using the Arduino IDE.  I put it on a breadboard with some motor controllers to drive the motors.  After that I soldered 4 wires to my sensor board (+, -, I2C data, I2C clock).  Then I built a metal frame out of Vex parts, attached continuous rotation servos to it, and secured the electronics on the frame.  Finally I developed the code for the Arduino and tested it.  The magnetometer got confused by the metal it was fastened to, and the robot did not work.
Rev 2 Platform
Rev 2:
I followed almost the same procedure for making the next version, but used iRobot Roomba motors instead of the Vex motors and a 12 volt 3.4Ah battery instead of the 4 AA battery pack.
This time I decided I wanted to learn what the sensors read when it fell.  I let it fall from standing straight up to flat on its side and got the results shown in Graph 1.  They show that the robot falls in about 1 second.  I also did another test where I balanced it the best I could on its wheels and read the sensor readings.  I used the accelerometer readings as a zero point for the balance routine.  
Rev 3 Platform Free-Standing

Rev 3:
The body is made out of Legos™ and NXT motors, with the same Arduino and sensor board as before.  I completely rewrote the code, this time without the FreeIMU library by Fabio Varesano, in order to avoid bugs and speed it up.


Block Diagram

Results
IMU test:
    • Successfully demonstrated a sensor-fusion based IMU
Rev 1:
    • Strengths:
      • Created basic balancing algorithm
    • Weaknesses:
      • It could not stand because the motors were too slow
      • The balancing algorithm was buggy
Rev 2:
    • Strengths:
      • Was well constructed and very sturdy
      • Completely rewrote balancing algorithm
      • Learned that response time and wheel speed are key factors in a balancing system
    • Weaknesses:
      • Once again, it could not stand because the motors were too slow
Rev 3:
    • Strengths:
      • It is able to stand because of faster motors and optimized code
      • Good balancing algorithm
    • Weaknesses
      • Lego™ construction was not sturdy enough and is inappropriate for the final project
Algorithm Discussion
Some people use a simple equation which can be summarized as:
motor speed = A ( Δθ / t)  + B θ
Where θ is the angle of tilt of the robot as seen in Figure 1 and A and B are constants.  This results in jitter in the balance.  To smooth this out I identified an alternate approach.  If the robot is tilted θ´ degrees and is rotating at θ˝ degrees/second, then in one second the center of mass of the robot will move forward:
r ( sin ( θ´ ) - sin ( θ˝ ) )
cm/sec, where r is the distance from the pivot point to the center of mass in cm.  During the same time period, the wheel will have moved forward:
( θ˝ - θ´ ) / 2 π r
This means that in order to keep the robot from tipping over further in that one second period, the robot must move forward by:
r ( sin ( θ´ ) - sin ( θ˝ ) ) + ( θ˝ - θ´ ) / 2 π r
Using this algorithm, I can smooth out the robot's balancing jitters.
Conclusion
The platform that I have balances, but is way too small to hold a camera system and a computer powerful enough to run the required vision system.  I need to redo the whole robot platform to be much larger.  I also must add a tilting pole to keep the camera level side to side.  The next step is to create a reliable control system for the robot.  After that, I can get OpenCV running properly on a compact, lightweight processor and focus on the pathfinding algorithms.  OpenCV is a complex, open source C and C++ library that helps with realtime image processing.  With it, the robot can reconstruct the park in 3D through a process called structure from motion and use that to maneuver through the paths in the park.  If I can get all these parts working together, then I can start mapping local hiking trails.
References
Code:
FreeIMU library by Fabio Varesano (used in Rev 1 and 2 only)
Arduino Reference Webpage
OpenCV library by Willow Garage
Wikipedia: Inverted pendulum (http://en.wikipedia.org/wiki/Inverted_pendulum)