HW 2

by Drew Bagnell on September 6, 2009

Due: Tuesday, Oct 6, by end of day

1 Assignment

The goal of this homework is to familiarize yourselves with robot localization and particle filtering. If you have your own localization data from an existing robot that you would like to work with, we’ll be happy with that as well.

Please let Drew and Ranqi know ahead of time about any external data set you intend to use. Your solution should address the global localization problem, not just local localization, meaning that you do not know the initial pose of the robot. Feel free to utilize any techniques that we have discussed in class (e.g. Markov localization, particle filtering) as well as extensions discussed in Probabilistic Robotics or elsewhere. Included is an optional data set of Wean Hall in the BeeSoft format that you may use for the purposes of this assignment (courtesy of Mike Montemerlo).

hw2_data

ascii-robotdata1.log – Data set
instruct.txt – Format description for data set
wean.dat – Map of Wean to use for localization
robotmovie1.gif – Animation of Data (just for your info)
wean.gif – Image of Map (just for your info)
bee-map.c – Example data-reader from BeeSoft that you may use if desired.

Please form groups of up to 3 for this project.
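As a rough orientation for those starting from scratch, the particle-filtering approach mentioned above boils down to a predict–weight–resample loop. Here is a minimal sketch in Python; `motion_update` and `sensor_weight` are placeholders for the motion and sensor models you will design yourselves, and the resampling shown is the low-variance (systematic) scheme:

```python
import numpy as np

def particle_filter_step(particles, weights, odom_delta, scan,
                         motion_update, sensor_weight):
    """One predict-weight-resample cycle over an (N, 3) array of
    [x, y, theta] particles. motion_update and sensor_weight are
    placeholders for your own models."""
    # Predict: push each particle through the (noisy) motion model.
    particles = motion_update(particles, odom_delta)

    # Weight: score each particle against the laser scan.
    weights = weights * sensor_weight(particles, scan)
    weights = weights / np.sum(weights)

    # Resample: low-variance (systematic) resampling.
    n = len(particles)
    positions = (np.arange(n) + np.random.uniform()) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    particles = particles[idx]
    weights = np.full(n, 1.0 / n)
    return particles, weights
```

For global localization, you would initialize the particles uniformly over the free space of the map rather than around a known starting pose.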

2 What to turn in

A short report (about 4 pages) describing your approach and visualizing the results. Send it, along with a copy of any code and links to movies showing the visualization, to dbagnell+16831@ri.cmu.edu and Ranqi. There is no real-time requirement, so a tool like Matlab might be excellent (at least for visualizing results).



Drew Bagnell September 21, 2009 at 5:43 pm

FAQ:

Q: What is the relation between the frames in the map and data files? The “instruct.txt” file says that x, y, and theta are all in the “standard odometry frame,” while the map implicitly uses some other coordinates.

A: The resolutions of the coordinate systems differ between the map and the data files. In the data files, everything is in cm (so a range of 235 = 2.35 meters). For the map, units are decimeters, so each pixel is 10cm by 10cm. The relationship between the thetas is undefined; figuring that out is part of the localization problem. Assume some fixed orientation for the map and assume that the orientation of the robot is completely unknown at the start. The given odometry-based poses are relative to some local origin, so the only things you care about are the dx, dy, dTheta between iterations.
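Concretely, under the conventions above (cm in the log, 10cm-per-pixel map), converting log coordinates to map pixels and extracting the relative motion between consecutive odometry poses might look like the following sketch (function and variable names are my own, not from the data format):

```python
import math

CM_PER_PIXEL = 10.0  # wean.dat: each pixel covers 10cm x 10cm

def cm_to_pixels(x_cm, y_cm):
    """Convert a position in the log's cm units to map pixel units."""
    return x_cm / CM_PER_PIXEL, y_cm / CM_PER_PIXEL

def relative_motion(pose_prev, pose_next):
    """Relative (dx, dy, dtheta) between consecutive odometry poses.

    The absolute odometry origin is irrelevant; only these deltas
    matter. The delta is expressed in the previous pose's own frame
    so it can be replayed from any hypothesized map pose.
    """
    x0, y0, th0 = pose_prev
    x1, y1, th1 = pose_next
    dx_world, dy_world = x1 - x0, y1 - y0
    # Rotate the world-frame displacement into the robot's previous frame.
    dx = math.cos(th0) * dx_world + math.sin(th0) * dy_world
    dy = -math.sin(th0) * dx_world + math.cos(th0) * dy_world
    # Wrap the heading change to [-pi, pi].
    dtheta = math.atan2(math.sin(th1 - th0), math.cos(th1 - th0))
    return dx, dy, dtheta
```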
----

Q: In wean.dat, which part of the matrix corresponds to (0,0) in the standard odometry frame? And in which directions do x and y point?

A: The location of (0,0) is unknown to you (hence the global localization problem). You don’t care where the robot’s local origin is; the only things important to you are the changes in x, y, and theta between iterations (the pose at time t+1 relative to the pose at time t). The top-left value in wean.dat is the top-left cell of the map. +x is to the right and +y is up, but keep in mind that you could rotate all the odometry readings around any point and still localize correctly, since you are only considering relative changes.
----

Q: Reading instruct.txt, I notice that there are two separate entries for the coordinates of the robot and the coordinates of the laser. Could we have some more information on the robot? For example, is the laser fixed on the robot, and if not, how do we transform between them?

A: The laser is fixed on the robot. You can consider this to be a problem of localizing the pose of the laser. The robot is just a fixed shape that can be trivially incorporated once the pose of the laser is known.
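If you do track the robot’s pose rather than the laser’s, recovering the laser pose is just a fixed rigid-body offset along the robot’s heading. A sketch (the offset value is whatever your data implies; it is a parameter here, not a number from this post):

```python
import math

def laser_pose_from_robot(x, y, theta, laser_offset_cm):
    """Pose of a laser mounted a fixed distance ahead of the robot
    center. Since the laser is rigidly mounted, its heading equals
    the robot's; laser_offset_cm is the fixed mounting offset."""
    lx = x + laser_offset_cm * math.cos(theta)
    ly = y + laser_offset_cm * math.sin(theta)
    return lx, ly, theta
```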
----

Q: How are we to derive p(z|x)? Is the sensor data in ascii-robotdata1.log enough to derive this?

A: Deciding how to compute p(z|x), the sensor model, is one of the main tasks of this project. :-) You are free to try whatever techniques you desire (the approaches discussed in class and in the book are a good start). There is plenty of information in the logs to develop a good sensor model, but you have to account for the uncertainty and error of both the sensor and the environment (such as the feet of people walking around the robot).
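One simple starting point along the lines discussed in class is a per-beam mixture model: a Gaussian around the range that ray casting through the map predicts, plus a uniform floor that soaks up unexplained readings (people’s feet and the like). A minimal sketch, assuming you already have a ray-casting routine that produces the expected ranges; sigma, z_max, and the mixture weights are illustrative values to tune, not prescribed ones:

```python
import math

def beam_likelihood(z, z_expected, sigma=15.0, z_max=8000.0,
                    w_hit=0.9, w_rand=0.1):
    """p(z | x) for a single beam: a Gaussian centered on the
    map-predicted range plus a uniform 'random measurement' term.
    Units are cm; all parameters are tuning knobs."""
    p_hit = (math.exp(-0.5 * ((z - z_expected) / sigma) ** 2)
             / (sigma * math.sqrt(2.0 * math.pi)))
    p_rand = 1.0 / z_max
    return w_hit * p_hit + w_rand * p_rand

def scan_log_likelihood(ranges, expected_ranges):
    """Combine beams in log space to avoid numerical underflow."""
    return sum(math.log(beam_likelihood(z, ze))
               for z, ze in zip(ranges, expected_ranges))
```

Working in log space matters in practice: multiplying hundreds of per-beam probabilities directly underflows to zero very quickly.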

Drew Bagnell September 21, 2009 at 5:52 pm

Q: What motion model is reasonable for this data?

A: You can safely use a two-wheel “trash-can” robot motion model for this project, although many models have been applied successfully to the odometry data.
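As one concrete (and simplified) possibility, a sampled motion model that replays a relative (dx, dy, dtheta) odometry delta with noise that grows with the amount of motion might look like the following sketch. The noise parameters a1 and a2 are placeholders to tune, not calibrated values:

```python
import math
import random

def sample_motion(pose, delta, a1=0.01, a2=0.01):
    """Propagate one pose hypothesis by a relative (dx, dy, dtheta)
    odometry delta, adding Gaussian noise. The delta is expressed in
    the robot's previous frame; a1/a2 are tuning knobs."""
    x, y, th = pose
    dx, dy, dth = delta
    trans = math.hypot(dx, dy)
    # Noise scales with the translation and rotation magnitudes.
    trans_n = trans + random.gauss(0.0, a1 * trans)
    dth_n = dth + random.gauss(0.0, a2 * abs(dth) + 1e-4)
    heading = th + math.atan2(dy, dx)
    x += trans_n * math.cos(heading)
    y += trans_n * math.sin(heading)
    th += dth_n
    return (x, y, th)
```

Applying this independently to every particle in the prediction step is what spreads the particle cloud out between sensor updates.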
