Competition Event:

Visual Odometry + Scavenger Hunt

Or...The

egnellahC dnarG APRAD

(The DARPA Grand Challenge in Reverse!)

Instead of using a supplied set of coordinates to determine the path of an autonomous vehicle, you will supply a system containing a computer and sensors that will be placed on top of a mobile robot.  The robot will transport your system along an unknown trajectory for 10 minutes. From its sensor observations, your system will generate a listing of the robot's trajectory.

Some details:

  • The robot will move over a flat two-dimensional surface, so the trajectory will fall entirely within a 2-D plane.
  • The trajectory of the robot might intersect with itself at one or more locations.
  • The output of your system will be a list of the X,Y coordinates (feet) and orientation (degrees) for each second, relative to the initial position and orientation of the robot.
  • Your system will not have access to any data or signals from the robot.
  • The use of navigation instruments such as GPS, INS, gyros, accelerometers, or compasses will not be allowed.
  • Sensors such as mono or stereo cameras are recommended.
  • The use of active optical or acoustic rangefinders such as LIDAR and structured lighting sources such as laser pointers or light stripes will be allowed.  However, all entries that employ these sensors will compete in a separate class from non-emitting vision systems.
  • You will not be allowed to place any navigational markers or devices in the environment.
  • You will not be supplied with detailed information about the visual environment in which the robot will be moving.  However, it will be safe to assume that the robot will not be operating in total darkness, and the environment will not be totally devoid of visual or geometric features.
  • Dimensional and weight limits for your system will be specified well in advance of the event.
  • Because accumulated position error is highly sensitive to orientation errors, your score will be calculated from the sum of the absolute values of the incremental errors, in addition to the absolute position error at the very end of the path.
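The required output and the scoring rule can be sketched as follows. The function names, the per-second (dx, dy, dtheta) increment representation, and the equal weighting of incremental and endpoint error are all assumptions for illustration; the event only states that both error terms contribute.

```python
import math

def integrate_poses(increments):
    """Dead-reckon a pose track from per-second motion increments.

    Each increment is (dx, dy, dtheta) expressed in the robot's current
    frame: dx, dy in feet, dtheta in degrees.  Returns a list of
    (x, y, heading_deg) poses, one per second, relative to the initial
    position and orientation -- the output format the event requires.
    """
    x, y, heading = 0.0, 0.0, 0.0
    poses = [(x, y, heading)]
    for dx, dy, dtheta in increments:
        rad = math.radians(heading)
        # Rotate the body-frame step into the world frame, then accumulate.
        x += dx * math.cos(rad) - dy * math.sin(rad)
        y += dx * math.sin(rad) + dy * math.cos(rad)
        heading = (heading + dtheta) % 360.0
        poses.append((x, y, heading))
    return poses

def score(estimated, truth):
    """Sum of per-second absolute position errors plus the endpoint error.

    The event says the score combines incremental errors with the final
    position error; the simple unweighted sum here is an assumption.
    Lower is better.
    """
    step_errors = sum(math.hypot(ex - tx, ey - ty)
                      for (ex, ey, _), (tx, ty, _) in zip(estimated, truth))
    (fx, fy, _), (gx, gy, _) = estimated[-1], truth[-1]
    return step_errors + math.hypot(fx - gx, fy - gy)
```

For example, four increments of (1 ft forward, turn 90°) trace a closed square, so the integrated track should return to the origin with zero heading.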

Bonus:

  • You will be given credit for correctly recognizing any moments when the robot, if it were to continue on its current trajectory, is in danger of colliding with an obstacle within the next 5 seconds.
  • One day in advance, you will be supplied with images of one or more target objects.
    • You will be given credit for correctly recognizing these target objects whenever they come into view.
    • You will earn further credit if you are able to correctly determine the total number of target objects that are present (hence not over-counting any target object that is observed more than once).
    • You will earn additional credit for being able to accurately estimate the coordinates of target objects.
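One way to earn the collision-warning bonus is to extrapolate the current pose and speed forward and test the predicted path against detected obstacles. A minimal sketch, assuming a constant-velocity motion model, point obstacles, and a 1 ft clearance radius (none of which the event specifies; the sensing and footprint model are up to each team):

```python
import math

def collision_within(pos, heading_deg, speed, obstacles,
                     horizon=5.0, dt=0.1, radius=1.0):
    """Return True if, continuing straight at the current speed, the robot
    passes within `radius` feet of any obstacle inside `horizon` seconds.

    `pos` is (x, y) in feet, `heading_deg` in degrees, `speed` in feet per
    second; `obstacles` is a list of (x, y) points.  The constant-velocity
    model, 0.1 s sampling step, and 1 ft clearance are assumptions.
    """
    rad = math.radians(heading_deg)
    vx, vy = speed * math.cos(rad), speed * math.sin(rad)
    t = 0.0
    while t <= horizon:
        # Predicted position t seconds from now under the straight-line model.
        px, py = pos[0] + vx * t, pos[1] + vy * t
        if any(math.hypot(px - ox, py - oy) < radius for ox, oy in obstacles):
            return True
        t += dt
    return False
```

For example, a robot at the origin heading along +x at 2 ft/s reaches an obstacle 6 ft ahead at t = 3 s, so a warning fires; the same obstacle is never approached if the robot heads along +y instead.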

What you need to compete:

  • A battery powered computer, such as a laptop computer.

  • One or more cameras such as a webcam that can be directly interfaced to the computer.

  • A program development environment (such as a C++ toolchain) and a library of real-time image capture routines.

Useful Reference Information for this Event 


Competition Event:

Kidnap Problem + Obstacle Course

 

Each team will supply a robot, which will be placed at an initial location, turned on, and allowed to explore for 10 minutes.  This initial exploration may include the robot pushing or bumping various objects such as boxes, balls, or doors.  The robot will then be paused and repositioned to a new location, from which it will attempt to return to the original starting location by the shortest possible path. Beyond mapping open space and recognizing locations, this task can be used to demonstrate the ability of a mobile robot to observe, understand, and utilize mechanical interactions between itself and surrounding three-dimensional objects.
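Once the robot has a map from its exploration phase, the return leg reduces to shortest-path planning from the new location back to the start. A minimal sketch using breadth-first search over a 4-connected occupancy grid (the grid map representation is an assumption; teams may build and plan over any map they like):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a 4-connected occupancy grid.

    `grid` is a list of strings where '#' marks an occupied cell and '.'
    marks free space; `start` and `goal` are (row, col) tuples.  Returns
    the list of cells from start to goal, or None if the goal is
    unreachable.  Because every move has unit cost, BFS is guaranteed to
    find a shortest path.
    """
    queue = deque([start])
    came_from = {start: None}          # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the parent links backwards, then reverse.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] != '#' and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None
```

On a small map with a wall in the middle, the planner routes around the obstruction in the minimum number of unit moves.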

Some details:

  • The robot will move over a flat two-dimensional surface, so the trajectory will fall entirely within a 2-D plane.
  • The use of navigation instruments such as GPS, INS, gyros, accelerometers, or compasses will not be allowed.
  • Sensors such as mono or stereo cameras are recommended.
  • The use of active optical or acoustic rangefinders such as LIDAR and structured lighting sources such as laser pointers or light stripes will be allowed. 
  • You will not be allowed to place any navigational markers or devices in the environment.
  • You will not be supplied with detailed information about the visual environment in which the robot will be moving.  However, it will be safe to assume that the robot will not be operating in total darkness, and the environment will not be totally devoid of visual or geometric features.
  • Dimensional limits for your robot will be specified well in advance of the event.

What you need to compete:

  • A battery powered computer, such as a laptop computer.

  • A mobile robot that can be directly interfaced and controlled by your computer.

  • One or more sensors such as a webcam that can be directly interfaced to the computer.

  • A program development environment (such as a C++ toolchain) and a library of real-time sensor data capture and robot control routines.

Comments: Nelson Bridwell