
Gazebo is a high-quality robot simulator that is well integrated with ROS; you will use it for your project as well as for this assignment. As described in class and on Piazza, a significant amount of the development you do in this class will happen first in simulation and then be tested on the robot. The goal of this assignment is to get familiar with the Gazebo simulator, simulating both a physical robot and input from its sensors.

1: Simulate a Husky Model in an empty world in a Gazebo environment

Your first goal is to get the Gazebo simulator running with ROS. You will:
  1. Load an empty "world" (the simulated environment the simulated robot is acting in)
  2. Load a model of the Husky mobile robot into it
  3. Drive the Husky around this world, either programmatically or using manual controls (a minimal programmatic sketch follows the links below)
  4. Take screenshots of your Husky driving around in the RViz visualizer
→  Gazebo home page
→  ROS Gazebo Page
→  Simulating the Husky in Gazebo
→  Driving a Husky
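
If you go the programmatic route, a small ROS node that publishes velocity commands is enough. The sketch below is one minimal way to do it, assuming your Husky setup listens for geometry_msgs/Twist messages on a topic like /husky_velocity_controller/cmd_vel (the exact topic name depends on the Husky packages you installed; check with rostopic list):

    #!/usr/bin/env python
    # Minimal open-loop driver: publish velocity commands so the Husky
    # drives forward while turning gently.  The topic name is an
    # assumption; verify it with `rostopic list`.
    import rospy
    from geometry_msgs.msg import Twist

    def drive():
        rospy.init_node('husky_driver')
        pub = rospy.Publisher('/husky_velocity_controller/cmd_vel',
                              Twist, queue_size=1)
        rate = rospy.Rate(10)       # send commands at 10 Hz
        cmd = Twist()
        cmd.linear.x = 0.5          # forward speed, m/s
        cmd.angular.z = 0.2         # turn rate, rad/s
        while not rospy.is_shutdown():
            pub.publish(cmd)
            rate.sleep()

    if __name__ == '__main__':
        try:
            drive()
        except rospy.ROSInterruptException:
            pass

Run it (rosrun or plain python) while the simulation is up, and watch the Husky move in both Gazebo and RViz.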

2: Add an obstacle to the Husky's world

You initially loaded an empty world for the Husky to drive around in; now add one or more objects to that world, which the Husky must avoid (one way to spawn an object programmatically is sketched after the reference link below). You will:
  1. Add an object to the Husky's world
  2. Take screenshots of your Husky in the world with the obstacle
→  Reference picture of Husky with obstacles
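
The simplest way to add an obstacle is through the Gazebo GUI's Insert panel or by editing the world file. If you would rather do it from code, the sketch below spawns a static 1 m box by calling Gazebo's /gazebo/spawn_sdf_model service; the box size and pose are placeholder values, so put the obstacle wherever makes a good screenshot.

    #!/usr/bin/env python
    # Spawn a static 1 m cube into the running Gazebo world via the
    # spawn_sdf_model service.  Size and position are placeholders.
    import rospy
    from gazebo_msgs.srv import SpawnModel
    from geometry_msgs.msg import Pose, Point, Quaternion

    BOX_SDF = """
    <sdf version="1.4">
      <model name="box_obstacle">
        <static>true</static>
        <link name="link">
          <collision name="collision">
            <geometry><box><size>1 1 1</size></box></geometry>
          </collision>
          <visual name="visual">
            <geometry><box><size>1 1 1</size></box></geometry>
          </visual>
        </link>
      </model>
    </sdf>
    """

    if __name__ == '__main__':
        rospy.init_node('spawn_obstacle')
        rospy.wait_for_service('/gazebo/spawn_sdf_model')
        spawn = rospy.ServiceProxy('/gazebo/spawn_sdf_model', SpawnModel)
        pose = Pose(position=Point(3.0, 0.0, 0.5),   # 3 m ahead of the origin
                    orientation=Quaternion(0, 0, 0, 1))
        spawn(model_name='box_obstacle', model_xml=BOX_SDF,
              robot_namespace='', initial_pose=pose, reference_frame='world')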

3: Add a Kinect to your Husky

ROS OpenNI is an open-source project focused on integrating the Kinect with ROS. Using OpenNI to add a simulated Kinect RGB-D sensor to your Husky will let you see what the Husky sees (a quick sketch for checking the simulated sensor output follows the tutorial links below). You will:
  1. Add a Kinect to the simulation, using a ROS Gazebo plugin.
  2. Take screenshots of your Husky in the world with the Kinect.
  3. Take screenshots that show the world as seen by the simulated Kinect.
→  A tutorial on using Gazebo plugins for a sensor.
→  A page of OpenNI tutorials, including a quick start.
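
Adding the sensor itself is done in the robot's URDF/Xacro description with a Gazebo camera plugin, as the tutorial above walks through. Once the plugin is in place, you can sanity-check that simulated depth data is actually flowing before opening RViz. The sketch below subscribes to a depth image topic and logs each frame's size; the topic name /camera/depth/image_raw is an assumption, so confirm the real one with rostopic list.

    #!/usr/bin/env python
    # Quick check that the simulated Kinect is publishing: subscribe to
    # its depth image topic and log frame dimensions.  The topic name is
    # an assumption; confirm it with `rostopic list`.
    import rospy
    from sensor_msgs.msg import Image

    def on_depth(msg):
        rospy.loginfo('depth frame: %dx%d, encoding=%s',
                      msg.width, msg.height, msg.encoding)

    if __name__ == '__main__':
        rospy.init_node('kinect_check')
        rospy.Subscriber('/camera/depth/image_raw', Image, on_depth,
                         queue_size=1)
        rospy.spin()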

4: Navigate Around the Simulation

This is similar to step 1.3: navigate around the new, enriched world with your sensor-equipped Husky. You will:
  1. Drive your Husky around, programmatically or manually, using the Kinect to see its viewpoint as well as the world model.
  2. Take screenshots of your Husky driving around, avoiding the obstacle, with its sensor.
  3. Store the output of the Kinect view while moving for a few seconds into a ROS bag file.
  4. Run 'rosbag info' on the resulting file (a Python alternative for summarizing the bag is sketched after the tutorial links below).
→  Tutorial on cruising around in the Gazebo world, with sensors.
→  Tutorial: Recording and playing back Kinect data.
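
Record and inspect the bag with the command-line tools the tutorial describes (rosbag record on the Kinect topics, then rosbag info on the result). If you prefer to summarize the recording from Python, the rosbag API reports roughly the same information; in the sketch below, kinect.bag is a placeholder filename.

    #!/usr/bin/env python
    # Print a per-topic summary of a recorded bag, similar to what
    # `rosbag info` shows.  'kinect.bag' is a placeholder filename.
    import rosbag

    with rosbag.Bag('kinect.bag') as bag:
        info = bag.get_type_and_topic_info()
        for topic, data in sorted(info.topics.items()):
            print('%-45s %-35s %d msgs' %
                  (topic, data.msg_type, data.message_count))
        print('duration: %.1f s' %
              (bag.get_end_time() - bag.get_start_time()))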

Turn in:

Submit the following deliverables:
  1. A writeup of the process. This should be a PDF file, 200-400 words, containing:

    • The names of anyone with whom you discussed the homework, and a brief description of what you discussed (in general terms).
    • Approximately how much time you spent on each step of the process.
    • The thing(s) you found hardest and any errors/problems you ran into.

  2. Screenshots (PNG, GIF, or JPG)
    • 1: Two visualizations of the Husky navigating in an empty world
    • 2: The Husky in a world with the obstacle
    • 3: Husky + obstacle in the world, with the Kinect attached
    • 4a-4c: Three views of the Husky navigating around the obstacle world, with the Kinect view showing

  3. Your scripts and code files if any (e.g., if you are controlling the Husky programmatically).

  4. The output of your rosbag info command, as a text file.

Upload a single zip file named yourlastname-asst2.zip, containing the above.

Remember, file types matter!