It took a little vial of starter fluid—aka, “jet fuel” according to Jim—and about 10 pulls of the cord, but then, success!
Jim gets the generator started with year-old gasoline... nasty!
Of course, then the IGVC crew got the real generator going, moments later.
At least we have the generator prepped for tomorrow (or later tonight)!
Well it’s 1 pm on Sunday. This year, the first heat of the contest is today—2 pm to be precise.
We still haven’t qualified, but things are looking somewhat good:
Jeff and Mike T. have been working on their vision pipeline since last night. Mike walked the practice course and videoed it, and Jeff wrote a ROS node that accepts an AVI file and imports it frame-by-frame into the vision pipeline.
Now, they’ve got the whole pipeline together:
- dilate white pixels, apply a mean shift filter (smoothing out the grass), separate the HSV planes, and flood fill on the hue plane only (this creates a binary mask), which gets published as a binary image (a rough sketch of this stage follows the list),
- then feed this to IPM (Inverse Perspective Mapping), which uses the previously calculated transform from the calibration checkerboard,
- then this goes to the laser scan producer, based on the Bresenham filter, which “marches” out rays to produce a standard ROS laser scan data structure.
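For the curious, the first stage might look something like this in OpenCV. This is only a sketch: the kernel size, mean-shift parameters, flood-fill tolerances, and seed point below are placeholders, not the values Mike and Jeff are actually using.

```cpp
// Rough sketch of the first stage of the vision pipeline, assuming OpenCV.
// Parameter values (kernel size, sp/sr, flood-fill seed and tolerances) are
// illustrative stand-ins, not the robot's real settings.
#include <opencv2/opencv.hpp>

cv::Mat segmentSafeGround(const cv::Mat& bgrFrame)
{
    // Dilate to thicken the white line pixels before smoothing.
    cv::Mat dilated;
    cv::dilate(bgrFrame, dilated,
               cv::getStructuringElement(cv::MORPH_RECT, cv::Size(3, 3)));

    // Mean shift filtering smooths the grass into a more uniform color.
    cv::Mat smoothed;
    cv::pyrMeanShiftFiltering(dilated, smoothed, /*sp=*/10, /*sr=*/30);

    // Separate the HSV planes and work on hue only.
    cv::Mat hsv;
    cv::cvtColor(smoothed, hsv, cv::COLOR_BGR2HSV);
    std::vector<cv::Mat> planes;
    cv::split(hsv, planes);
    cv::Mat hue = planes[0];

    // Flood fill from a seed assumed to be safe grass (bottom center of the
    // image); the mask that comes back is the binary image the node publishes.
    cv::Mat mask = cv::Mat::zeros(hue.rows + 2, hue.cols + 2, CV_8UC1);
    cv::Point seed(hue.cols / 2, hue.rows - 10);
    cv::floodFill(hue, mask, seed, 255, nullptr,
                  cv::Scalar(5), cv::Scalar(5),
                  4 | cv::FLOODFILL_MASK_ONLY | (255 << 8));

    // Trim the 1-pixel border that floodFill requires around the mask.
    return mask(cv::Rect(1, 1, hue.cols, hue.rows)).clone();
}
```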
Skittles was making progress on the core nav stuff—he solved his memory leak with Jim’s help last night—but power to the tents just failed.
Jim is connecting up our generator to Skittles's computer (the rest of us are running on laptops).
I just heard someone say the generators ran out of fuel. I suppose that’s a best case scenario!
Most of the afternoon has been ridiculously hot. It’s finally cooled off now (4:30p). Jim and Ken left for cooler climes (i.e., the hotel).
Right now, there are a lot of pieces, but it's not obvious it's going to come together. Skittles is still debugging memory leaks in his nav stack. There's a lot riding on him: popping a list of GPS waypoints, doing the A* search over the occupancy grid map, and integrating WOAH. Early this afternoon, he said he had moved from debugging horrible horrible OS crashes ("illegal instruction") to just debugging bad code. He seemed happy about that, but I did not see that as a good sign.
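For reference, the A* part of that is conceptually simple. Here is a minimal sketch of A* over an occupancy grid, not Skittles's actual code: the 4-connected neighborhood, unit step cost, and straight-line heuristic are illustrative assumptions.

```cpp
// Minimal sketch (not the robot's actual nav stack) of A* over a 2D
// occupancy grid. grid[y][x] == true means the cell is occupied.
#include <algorithm>
#include <cmath>
#include <functional>
#include <queue>
#include <vector>

struct Cell { int x, y; };

std::vector<Cell> aStar(const std::vector<std::vector<bool>>& grid,
                        Cell start, Cell goal)
{
    const int H = grid.size(), W = grid[0].size();
    auto idx = [W](Cell c) { return c.y * W + c.x; };
    auto h = [&](Cell c) { return std::hypot(c.x - goal.x, c.y - goal.y); };

    std::vector<double> g(W * H, 1e18);
    std::vector<int> parent(W * H, -1);
    using QItem = std::pair<double, int>;            // (f = g + h, cell index)
    std::priority_queue<QItem, std::vector<QItem>, std::greater<QItem>> open;

    g[idx(start)] = 0.0;
    open.push({h(start), idx(start)});

    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        int ci = open.top().second;
        open.pop();
        Cell c{ci % W, ci / W};
        if (c.x == goal.x && c.y == goal.y) {        // goal reached: walk parents
            std::vector<Cell> path;
            for (int i = ci; i != -1; i = parent[i]) path.push_back({i % W, i / W});
            std::reverse(path.begin(), path.end());
            return path;
        }
        for (int k = 0; k < 4; ++k) {
            Cell n{c.x + dx[k], c.y + dy[k]};
            if (n.x < 0 || n.y < 0 || n.x >= W || n.y >= H) continue;
            if (grid[n.y][n.x]) continue;            // skip occupied cells
            double ng = g[ci] + 1.0;                 // unit cost per step
            if (ng < g[idx(n)]) {
                g[idx(n)] = ng;
                parent[idx(n)] = ci;
                open.push({ng + h(n), idx(n)});
            }
        }
    }
    return {};                                       // no path found
}
```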
Jeff got a pretty cool demo of stereo vision going, but this is not critical path, so I pulled him off of it.
Right now, Mike T. + Jeff are working on integrating their vision pipeline: Mike’s mean value smoothing (to make grass look uniform), followed by Jeff’s inverse plane projection, followed by Mike’s Bresenham March.
If this works, the vision should see the lines and other obstacles, and report them back as though they were a laser scan.
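In ROS terms, that just means packing the per-ray distances into a standard sensor_msgs/LaserScan message, something like the sketch below. The field of view, range limits, and frame name here are assumptions, not our actual settings.

```cpp
// Sketch of packing per-ray distances from the vision pipeline into a ROS
// LaserScan message. Field values (FOV, range limits, frame id) are assumed.
#include <ros/ros.h>
#include <sensor_msgs/LaserScan.h>
#include <cmath>
#include <vector>

sensor_msgs::LaserScan makeVirtualScan(const std::vector<float>& rangesMeters)
{
    sensor_msgs::LaserScan scan;
    scan.header.stamp = ros::Time::now();
    scan.header.frame_id = "camera";          // assumed frame name
    scan.angle_min = -M_PI / 2;               // assume a 180-degree fan
    scan.angle_max = M_PI / 2;
    scan.angle_increment = (scan.angle_max - scan.angle_min) /
                           (float)(rangesMeters.size() - 1);
    scan.range_min = 0.1f;                    // meters
    scan.range_max = 10.0f;
    scan.ranges = rangesMeters;               // one distance per traced ray
    return scan;
}
```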
Since yesterday, Therrien has been working on an implementation of Bresenham’s Line Algorithm, which “determines which points in an n-dimensional raster should be plotted in order to form a close approximation to a straight line between two given points” (wikipedia).
In the vision processing pipeline, the camera image will be first converted to a bird’s-eye view (using Jeff’s inverse projection mapping) and then flood-filled to find safe and not-safe areas.
Once the binary safe/not-safe image is created, Bresenham’s algorithm will be used, tracing paths radially—like a laser scan—from the robot’s location.
Mike likens it to an “ant’s march” and therefore has renamed the process Bresenham’s March.
Here’s an image from the algorithm running on a test pattern. In practice, the algorithm will report final image pipeline results with a data structure that has the same format as a laser scan.
Bresenham's March on test data. The yellow rays represent the output of the algorithm, when it finds the black (not-safe) pixels.
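In code, the march might look roughly like the sketch below. This is not Therrien's implementation; it assumes OpenCV's LineIterator for the Bresenham stepping, a mask where 255 = safe and 0 = not safe, and a 180-degree fan of rays.

```cpp
// Sketch of the "Bresenham's March" idea: cast rays out from the robot's
// position across the binary safe/not-safe image and record how far each ray
// travels before it hits a not-safe pixel. Conventions here are assumptions.
#include <opencv2/opencv.hpp>
#include <cmath>
#include <vector>

std::vector<float> bresenhamMarch(const cv::Mat& safeMask,   // CV_8UC1 binary image
                                  cv::Point robot, int numRays,
                                  float metersPerPixel)
{
    std::vector<float> ranges(numRays, -1.0f);                // -1 means no hit
    const float maxR = std::hypot((float)safeMask.cols, (float)safeMask.rows);

    for (int i = 0; i < numRays; ++i) {
        // Spread the rays across 180 degrees in front of the robot.
        float theta = (float)(M_PI * i / (numRays - 1));
        cv::Point end(robot.x + (int)(maxR * std::cos(theta)),
                      robot.y - (int)(maxR * std::sin(theta)));

        // cv::LineIterator walks the Bresenham line from the robot toward the
        // edge of the image, one pixel at a time.
        cv::LineIterator it(safeMask, robot, end, 8);
        for (int p = 0; p < it.count; ++p, ++it) {
            if (safeMask.at<uchar>(it.pos()) == 0) {          // hit a not-safe pixel
                float dx = (float)(it.pos().x - robot.x);
                float dy = (float)(it.pos().y - robot.y);
                ranges[i] = metersPerPixel * std::hypot(dx, dy);
                break;
            }
        }
    }
    return ranges;   // one distance per angle, just like a laser scan
}
```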
Jim and Nat did a super job. Pacing was good and they focused on the new work, which made for a lively and relaxed presentation. Here are the judges checking out the MCP. They seemed to enjoy it!
Jim and Nat show off the MCP to the judges after the formal presentation
MCP design presentation 2011
Sadly, Nat has been spending an hour getting a light to flash on the robot. I am not kidding. This is a new contest requirement; when the robot is under autonomous control, it must flash a large warning lamp. For us, this means threaded code and semaphores at the driver level, because the board that’s running the main motors is also going to be the lamp-flasher.
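Conceptually it's something like the sketch below. This is not the actual motor-board firmware: setLamp() is a placeholder for whatever call really drives the lamp output, and atomic flags stand in here for the driver-level semaphores.

```cpp
// Simplified sketch of a threaded lamp-flasher: one thread toggles the
// warning lamp while a shared flag says the robot is autonomous, and holds
// it on solid after an E-stop. setLamp() and the flags are placeholders.
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> autonomousMode{false};   // set by the main control code
std::atomic<bool> eStopped{false};         // set when an E-stop fires

void setLamp(bool on) { /* drive the real lamp output here */ (void)on; }

void lampFlasherThread()
{
    bool lampOn = false;
    while (true) {
        if (eStopped) {
            lampOn = true;                 // solid on after an E-stop
        } else if (autonomousMode) {
            lampOn = !lampOn;              // flash while under autonomous control
        } else {
            lampOn = false;                // assumed off under manual control
        }
        setLamp(lampOn);
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }
}
```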
Therrien and Spidey are working together on the flood-fill algorithm to figure out what’s grass and what’s obstacle.
Jeff and Jim went to Kinko's to print out a better checkerboard.
I just saw Nat flash the light for the first time. Ian was helping. Now, Nat is making it turn on steady after an E-stop has occurred.
Skittles has been working on the GPS-based code for the last few hours. He's just wandered off though. Wait, now he's back.
This light needs to be blinking.
Jeff Rousseau has demonstrated code that transforms a camera image into a virtual bird's eye view of a scene. Check out the demonstration image below. Jeff used the checkerboard as calibration, using four corner points as a known size and geometry reference.
The transform then takes the camera view and projects it as though you were viewing the image from above.
On the robot, the camera will be used to see the lines on the Autonomous course. The projection mapping will let the robot know where and how far away the lines actually are.
The next step is to process this image and output line-distance information as a virtual laser—using angle and distance to report where the lines are.
Inverse perspective mapping demonstration -- the raw camera image on the left is computationally transformed into a virtual bird's-eye view.
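The core of the transform is just a planar homography. A rough OpenCV sketch is below; the corner coordinates and output scale are placeholders, not Jeff's calibration values.

```cpp
// Sketch of the bird's-eye transform, assuming OpenCV: four image-plane
// corners of the checkerboard are mapped to where those corners sit in a
// top-down view of known size. Coordinates here are placeholder values.
#include <opencv2/opencv.hpp>

cv::Mat birdsEyeView(const cv::Mat& cameraFrame)
{
    // Four checkerboard corners as seen by the camera (placeholder values).
    std::vector<cv::Point2f> imageCorners = {
        {420.f, 300.f}, {620.f, 300.f}, {700.f, 470.f}, {340.f, 470.f}};

    // Where those same corners should land in the top-down view, e.g. a
    // square region at an assumed pixels-per-meter scale.
    std::vector<cv::Point2f> groundCorners = {
        {400.f, 200.f}, {600.f, 200.f}, {600.f, 400.f}, {400.f, 400.f}};

    // Homography that "undoes" the camera perspective for the ground plane.
    cv::Mat H = cv::getPerspectiveTransform(imageCorners, groundCorners);

    cv::Mat topDown;
    cv::warpPerspective(cameraFrame, topDown, H, cameraFrame.size());
    return topDown;
}
```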
UMass Lowell is participating in the 19th annual Intelligent Ground Vehicles Competition (IGVC) at Oakland University, in Rochester Hills, MI. It’s our seventh trip to the contest!
The ground crew left at 2 am last night: Jim Dalphond, Nat Tuck, Michael McGuinness, Ian Ndicu, Ken Cramer, and Peter Galvin. The airborne personnel arrived this morning: Jeff Rousseau, Mike Therrien, and Fred Martin.
The ground team had checked in, claimed adjoining tables, and started work by the time the air crew arrived!
The hardware team has decided to move ahead with a new E-Stop/Motor Controller Interface (MCI) in light of safety regulation changes for this year's IGVC.
The new regulations state that the Wireless E-Stop System should be able to work from 100 ft. Our solution is seemingly overkill, but it allows for expandability later.
The solution is to use 900 MHz XBee modules through an Arduino to create a heartbeat. If that heartbeat stops, the robot's motors shut down. The heartbeat will stop if the button is pressed or if the remote goes out of range, so it is truly failsafe. The XBees are rated for ~6 miles outdoors.
If at any time the Arduino crashes or loses power, the robot will also shut down and fail safe.
A proof of concept has already been built up using an Arduino Mega 2560 to show that the physical e-stops shut down the robot and that it will failsafe. It DOES!
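Here's a rough sketch of the robot-side logic, not the actual MCI firmware: the pin number, baud rate, and timeout below are placeholders.

```cpp
// Sketch of the robot-side heartbeat watchdog: the Arduino listens for
// heartbeat bytes arriving over the XBee serial link, and if they stop for
// too long (button pressed, out of range, remote powered off) it drops the
// motor-enable pin and the robot stops. Pin and timeout values are assumed.
const int MOTOR_ENABLE_PIN = 8;        // drives the motor-kill relay
const unsigned long TIMEOUT_MS = 500;  // allowed gap between heartbeats

unsigned long lastHeartbeat = 0;
bool heartbeatSeen = false;

void setup() {
  pinMode(MOTOR_ENABLE_PIN, OUTPUT);
  digitalWrite(MOTOR_ENABLE_PIN, LOW);   // start with motors disabled
  Serial.begin(9600);                    // XBee on the hardware serial port
}

void loop() {
  // Any byte from the remote e-stop box counts as a heartbeat.
  while (Serial.available() > 0) {
    Serial.read();
    lastHeartbeat = millis();
    heartbeatSeen = true;
  }

  // Enable the motors only while heartbeats keep arriving. If the Arduino
  // itself crashes or loses power, the relay de-energizes and the robot
  // stops anyway, which is what makes the design failsafe.
  if (heartbeatSeen && (millis() - lastHeartbeat < TIMEOUT_MS)) {
    digitalWrite(MOTOR_ENABLE_PIN, HIGH);
  } else {
    digitalWrite(MOTOR_ENABLE_PIN, LOW);
  }
}
```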
The hardware team met on 12-2-10 to talk about current and future work including the RICC entry, and status of Stark and the MCP. The proceedings can be found here.