Jeff Rousseau has demonstrated code that transforms a camera image into a virtual bird's eye view of a scene. Check out the demonstration image below. Jeff used a checkerboard for calibration, with its four corner points serving as a reference of known size and geometry.
The transform then takes the camera view and projects it as though you were viewing the image from above.
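The post doesn't show the code itself, but the core of this kind of transform is a homography computed from the four known corner points. Here's a minimal sketch using numpy; the corner coordinates are made-up illustrative values, not Jeff's actual calibration data (a library such as OpenCV's `getPerspectiveTransform`/`warpPerspective` would do the same job on full images):

```python
import numpy as np

def homography_from_points(src, dst):
    """Solve for the 3x3 homography H that maps four src points to four dst points.

    Each correspondence (x, y) -> (u, v) contributes two rows to an 8x8
    linear system in the entries of H (with h33 fixed to 1).
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
        b.extend([u, v])
    h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, point):
    """Apply H to a pixel coordinate and dehomogenize the result."""
    u, v, w = H @ np.array([point[0], point[1], 1.0])
    return u / w, v / w

# Illustrative values: the checkerboard corners appear as a trapezoid in the
# camera view (src) and should become a square in the top-down view (dst).
src = [(100, 300), (540, 300), (620, 460), (20, 460)]
dst = [(0, 0), (400, 0), (400, 400), (0, 400)]
H = homography_from_points(src, dst)
```

Once `H` is known, warping every pixel through it (or every detected line point, which is much cheaper) produces the overhead view.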
On the robot, the camera will be used to see the lines on the Autonomous course. The projected mapping will let the robot know where the lines actually are and how far away they are.
The next step is to process this image and output line-distance information as a virtual laser, reporting the position of each line as an angle and a distance.
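That virtual-laser step hasn't been written yet, but the geometry is straightforward once line points live in the top-down image. A sketch, assuming a hypothetical robot position at the bottom-center of a 400x400 bird's-eye image and a made-up scale of 0.25 cm per pixel:

```python
import math

def line_point_to_polar(px, py, robot_xy=(200, 400), cm_per_pixel=0.25):
    """Convert a detected line pixel in the bird's-eye image to (distance, angle).

    Assumptions (illustrative, not from the original post):
    - robot_xy is the robot's position in the top-down image,
    - image y grows downward, so "forward" is toward smaller y,
    - cm_per_pixel comes from the checkerboard's known square size.
    """
    dx = px - robot_xy[0]          # lateral offset in pixels (+ = right)
    dy = robot_xy[1] - py          # forward offset in pixels
    distance_cm = math.hypot(dx, dy) * cm_per_pixel
    angle_deg = math.degrees(math.atan2(dx, dy))  # 0 = dead ahead, + = right
    return distance_cm, angle_deg
```

Sweeping this over every line pixel yields a range-and-bearing scan, much like a laser rangefinder would report.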