Electronics distributor SparkFun held their annual Autonomous Vehicle Competition (AVC) last weekend in Boulder, Colorado, and we would like to congratulate Michael for his mbed-powered Data Bus rover coming in third place!
Simply put, the objective of the competition is to build an autonomous vehicle that can circumnavigate SparkFun headquarters without any human interference. Whoever does it the fastest wins! It should also be mentioned that the course is complicated by a narrow parking lot alongside a large building, giant red barrels, a pond, ramps, and potholes, to name a few.
The first-placed Team 0x27 managed to navigate the 270 m course in just 22.08 sec (officially 2.08 sec after a bonus time deduction), with the Data Bus rover coming in at 37.16 sec with an estimated top speed of 20 mph.
Michael describes how the vehicle's navigation information is calculated using only three key sensors.
The Data Bus control system consists of three main sensors:
The GPS is a 20 Hz Venus638FLPX on a SparkFun breakout board mounted inside, with a roof-mounted patch antenna over a ground plane cut from a square of tin that is good for 5-10 dB of signal gain. Serial communication runs at 38400 bps on one of the mbed's UARTs. The GPS supplies heading information; the robot ignores GPS position information.
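The Venus638FLPX talks standard NMEA over that UART, so the GPS heading the robot consumes is the course-over-ground field of the $GPRMC sentence. Here is a minimal, desktop-runnable sketch of extracting that field; the function names are illustrative, not from the Data Bus firmware, and the checksum is ignored for brevity:

```cpp
#include <cassert>
#include <cmath>
#include <sstream>
#include <string>
#include <vector>

// Split an NMEA sentence into its comma-separated fields.
std::vector<std::string> splitFields(const std::string& sentence) {
    std::vector<std::string> fields;
    std::stringstream ss(sentence);
    std::string field;
    while (std::getline(ss, field, ',')) fields.push_back(field);
    return fields;
}

// Extract course over ground (degrees true) from a $GPRMC sentence.
// Field 8 is the track angle; returns -1.0 if the field is missing.
double courseFromRMC(const std::string& sentence) {
    std::vector<std::string> f = splitFields(sentence);
    if (f.size() < 9 || f[8].empty()) return -1.0;
    return std::stod(f[8]);
}
```

For example, a sentence with a track-angle field of `084.4` yields a heading of 84.4 degrees.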
Additional heading information comes from an STM L3G4200D gyro on a Pololu MinIMU-9, mounted on an aluminum bracket up front. Communication is via I2C at 400 kHz, and the gyro is sampled at 100 Hz.
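At 100 Hz, each gyro sample contributes rate × 0.01 s to the heading. A minimal sketch of that dead-reckoning integration, with a bias term subtracted before integrating (all names and values here are illustrative, not taken from the Data Bus code):

```cpp
#include <cassert>
#include <cmath>

// Integrate z-axis gyro rate samples (deg/s) into a heading estimate.
// dt is the sample period (0.01 s for a 100 Hz loop); bias is the
// stationary drift rate, subtracted before integration.
double integrateHeading(const double* ratesDps, int n, double dt,
                        double bias, double initialHeading) {
    double heading = initialHeading;
    for (int i = 0; i < n; ++i) {
        heading += (ratesDps[i] - bias) * dt;
    }
    // Wrap to [0, 360).
    heading = std::fmod(heading, 360.0);
    if (heading < 0) heading += 360.0;
    return heading;
}
```

One second of samples at a constant 90 deg/s, with zero bias, turns the estimate through 90 degrees; with a perfectly estimated bias, a stationary robot's heading stays put.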
Wheel encoders on both rear wheels provide accurate distance measurement. SparkFun QRE1113 sensor boards mounted to the bearing carriers sense the stripes and send signals to a tiny surface-mount interface board built around comparators in a Schmitt-trigger configuration.
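Distance then falls out of a simple scale factor from encoder ticks to metres, averaged across the two wheels. A sketch with hypothetical calibration constants (the real scale factor on Data Bus was calibrated against Google Earth):

```cpp
#include <cassert>
#include <cmath>

// Convert rear-wheel encoder ticks to distance travelled in metres.
// stripesPerRev and wheelCircumferenceM are hypothetical calibration
// values, not Data Bus's actual numbers.
double distanceTravelled(long leftTicks, long rightTicks,
                         int stripesPerRev, double wheelCircumferenceM) {
    double metersPerTick = wheelCircumferenceM / stripesPerRev;
    // Average the two wheels so that turning cancels to first order.
    return 0.5 * (leftTicks + rightTicks) * metersPerTick;
}
```

With 32 stripes per revolution and a 0.32 m circumference, 1000 ticks on each wheel works out to 10 m; a gentle turn (say 900 ticks inside, 1100 outside) gives the same centreline distance.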
Heading is incredibly important in the SparkFun AVC: an error of only a couple of degrees is the difference between crashing and finishing. The solution on Data Bus feeds lag-compensated gyro and GPS heading data into a Kalman filter, using the results to update the current heading and position from that historical estimate.
Gyro data is the foundation of the heading estimate, and it is corrected for bias using heading data from the GPS. Unfortunately, the GPS does a massive amount of its own filtering, and the result is reduced dynamic range and lag. By saving a second's worth of gyro data and feeding that into the Kalman filter, a very good estimate is generated, and the gyro-based heading is updated from it. The end result is a heading estimate with high dynamic range and negligible bias.
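One way to sketch that fusion is a one-state Kalman filter: gyro rate drives the prediction, and the lagged GPS heading corrects against an estimate buffered roughly a second earlier. This is an illustrative reconstruction, not the Data Bus code; the noise variances, lag length, and class layout are all made up:

```cpp
#include <cassert>
#include <cmath>
#include <deque>

// One-state Kalman filter on heading. Q and R are hypothetical process
// and measurement noise variances. To compensate for GPS lag, the
// innovation compares the GPS reading against the heading estimate
// buffered lagSteps prediction steps ago.
class HeadingKF {
public:
    HeadingKF(double q, double r, int lagSteps)
        : heading_(0), P_(1.0), Q_(q), R_(r), lagSteps_(lagSteps) {}

    // Propagate heading from a gyro rate sample (deg/s) over dt seconds.
    void predict(double gyroRateDps, double dt) {
        heading_ = wrap(heading_ + gyroRateDps * dt);
        P_ += Q_;
        history_.push_back(heading_);
        if ((int)history_.size() > lagSteps_) history_.pop_front();
    }

    // Correct with a (lagged) GPS heading measurement in degrees.
    void correct(double gpsHeading) {
        double lagged = history_.empty() ? heading_ : history_.front();
        double innovation = wrapErr(gpsHeading - lagged);
        double K = P_ / (P_ + R_);          // Kalman gain
        heading_ = wrap(heading_ + K * innovation);
        P_ *= (1.0 - K);
    }

    double heading() const { return heading_; }

private:
    static double wrap(double h) {          // wrap to [0, 360)
        h = std::fmod(h, 360.0);
        return h < 0 ? h + 360.0 : h;
    }
    static double wrapErr(double e) {       // shortest angular difference
        while (e > 180.0) e -= 360.0;
        while (e < -180.0) e += 360.0;
        return e;
    }
    double heading_, P_, Q_, R_;
    int lagSteps_;
    std::deque<double> history_;
};
```

The key trick mirrored from the article is in `correct()`: the innovation is taken against the buffered historical heading, so the filter compares the sluggish GPS reading with what the gyro said at the time that fix actually applied.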
Meanwhile, distance travelled is given by the average of the two wheel encoder distances. I calibrated the wheel encoders against Google Earth, my waypoint editor, and found the error falls below 1%. So the robot knows how far it has gone and in what direction, giving a position estimate. The position is estimated in Cartesian coordinates, which I did for one very good reason: updating the position based on the historical heading estimate.
If we know what direction we were pointing a second ago, we can not only bring the gyro heading calculations up to the present but also, using a rotation matrix, very quickly update the last second's worth of position estimates.
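In Cartesian coordinates that update is just a standard 2x2 rotation applied to each buffered position increment before re-summing. A sketch, assuming increments are stored as (x, y) pairs (the names are illustrative):

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec2 { double x, y; };

// Rotate the last second's worth of position increments by a heading
// correction (radians) and re-sum them from basePos, bringing the
// position estimate up to date after the historical heading is revised.
Vec2 correctedPosition(const Vec2& basePos,
                       const std::vector<Vec2>& increments,
                       double headingCorrectionRad) {
    double c = std::cos(headingCorrectionRad);
    double s = std::sin(headingCorrectionRad);
    Vec2 p = basePos;
    for (const Vec2& d : increments) {
        // 2x2 rotation matrix applied to each increment.
        p.x += c * d.x - s * d.y;
        p.y += s * d.x + c * d.y;
    }
    return p;
}
```

Because only the small buffer of recent increments is rotated, the fix-up costs a handful of multiply-adds rather than a full re-integration of the track.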
This success comes after a less fortunate attempt in last year's competition. Lessons learned from the previous year's issues, together with months of simulation, testing, and analysis, ensured a much better result in 2012 for Team Data Bus. Well done!