
Introduction to MoveIt!


1 Introduction

MoveIt! is the most commonly used ROS tool for motion planning and manipulation. Quoting from their website:

MoveIt! is state of the art software for mobile manipulation, incorporating the latest advances in motion planning, manipulation, 3D perception, kinematics, control and navigation. It provides an easy-to-use platform for developing advanced robotics applications, evaluating new robot designs and building integrated robotics products for industrial, commercial, R&D and other domains.

MoveIt! is the latest evolution of some packages that were specifically designed for the PR2 back when Willow Garage was operating. Back in ROS Electric, much of the PR2-specific functionality was moved to a general ROS stack called arm_navigation. Since ROS Groovy, all of this functionality has been transferred to MoveIt!.

2 Primary MoveIt! Components

2.1 User Interfaces

move_group node
The move_group node is an integrator node that pulls many of the MoveIt! capabilities together. It is similar to the move_base node in the navstack.
moveit_commander and move_group_interface APIs
These are Python and C++ APIs, respectively, that allow you to easily interact with the move_group node. Note that much of the functionality of the move_group node is also available by directly calling ROS topics/services/actions. A short Python sketch using moveit_commander follows this list.
rviz
There is an rviz plugin that is specifically for interactive motion planning. This is not all that useful for solving real planning problems, but it can be very valuable for debugging.
MoveIt! ROS
The core components of MoveIt! are available to access directly through lower-level C++ APIs. This takes a bit more work to use than the move_group node, but it is more powerful.
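
As a concrete starting point, here is a minimal sketch of driving move_group from Python with moveit_commander. It assumes a running move_group node and a planning group named "right_arm"; check your SRDF for the group names your robot actually defines.

  #!/usr/bin/env python
  # Minimal sketch of interacting with move_group via moveit_commander.
  # Assumes move_group is running and the SRDF defines a "right_arm" group.
  import sys
  import rospy
  import moveit_commander

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node("moveit_commander_sketch")

  robot = moveit_commander.RobotCommander()          # whole-robot info (groups, links, state)
  scene = moveit_commander.PlanningSceneInterface()  # access to the planning scene
  group = moveit_commander.MoveGroupCommander("right_arm")

  print("Available groups: %s" % robot.get_group_names())
  print("End-effector link: %s" % group.get_end_effector_link())
  print("Named targets from the SRDF: %s" % group.get_named_targets())

  moveit_commander.roscpp_shutdown()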

2.2 Configuration

The move_group node is configured using a large number of configuration files. You must have a URDF representing the kinematics of your robot, and you must have an SRDF representing the semantic description of your robot. The SRDF defines groups, i.e., collections of joints that should be considered together (e.g. the joints that make up a robot's arm); it contains logical collision detection rules (e.g. these two links can never contact each other, so don't bother collision checking them); and it contains named, static configurations for the groups. The move_group node also requires additional configuration files that specify data such as joint limits (angles, velocities, and accelerations), motion planning configurations, kinematics configurations, and more.

2.3 Robot Interface

For MoveIt! to be able to work with a robot, several interfaces must be provided. The robot must be publishing all of its joint information on a /joint_states topic (move_group subscribes to this information), and a robot_state_publisher must be running to provide move_group with all required transforms. In order to have the move_group node move your robot, some node on your robot must provide a ROS action server of type control_msgs/FollowJointTrajectory, and it must have low-level controllers implemented to properly respond to these action requests.
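
To make the action interface concrete, here is a skeleton of the control_msgs/FollowJointTrajectory server that move_group expects. The action namespace is illustrative, and a real implementation would stream the waypoints to your low-level joint controllers rather than just logging them.

  #!/usr/bin/env python
  # Skeleton of the control_msgs/FollowJointTrajectory action server that
  # move_group expects. The namespace below is illustrative only.
  import rospy
  import actionlib
  from control_msgs.msg import (FollowJointTrajectoryAction,
                                FollowJointTrajectoryResult)

  class TrajectoryServer(object):
      def __init__(self):
          self._server = actionlib.SimpleActionServer(
              "/robot/limb/right/follow_joint_trajectory",
              FollowJointTrajectoryAction,
              execute_cb=self.execute_cb,
              auto_start=False)
          self._server.start()

      def execute_cb(self, goal):
          # goal.trajectory is a trajectory_msgs/JointTrajectory; a real server
          # would command the joints to track these points.
          for point in goal.trajectory.points:
              rospy.loginfo("waypoint at t=%.2f s", point.time_from_start.to_sec())
          self._server.set_succeeded(FollowJointTrajectoryResult())

  if __name__ == "__main__":
      rospy.init_node("follow_joint_trajectory_server")
      TrajectoryServer()
      rospy.spin()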

2.4 Kinematics

Beyond basic transform data, MoveIt! requires more-advanced robot kinematic calculations. For example, tools for calculating forward/inverse kinematics and Jacobians are required. These calculation tools are configured via simple configuration files. Below, several common options for providing these capabilities are discussed.

2.4.1 KDL

KDL is the Kinematics and Dynamics Library; it is part of the Orocos project. Via the kdl_parser package, URDF descriptions of robots can be directly converted into KDL "chains", and then KDL libraries may be used for performing these calculations. This is the default strategy in MoveIt!, and likely the most common.

  • Pros
    • Easy to configure
    • Works for tons of robots
  • Cons
    • All IK solutions are purely numerical (they may be slow, and may suffer from numerical instabilities and difficulties finding solutions)
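
As a small illustration of the URDF-to-KDL workflow described above (independent of MoveIt!), the sketch below parses the URDF from the parameter server into a KDL tree and runs forward kinematics on a chain. It assumes the kdl_parser_py and PyKDL packages are available, and the link names "base" and "right_hand" are placeholders for your robot's actual links.

  # Sketch of the URDF -> KDL chain workflow, outside of MoveIt!.
  import PyKDL
  from kdl_parser_py import urdf as kdl_urdf

  ok, tree = kdl_urdf.treeFromParam("/robot_description")
  assert ok, "could not parse the URDF from the parameter server"

  chain = tree.getChain("base", "right_hand")   # placeholder link names
  fk_solver = PyKDL.ChainFkSolverPos_recursive(chain)

  # Forward kinematics for an all-zeros joint configuration.
  joints = PyKDL.JntArray(chain.getNrOfJoints())
  frame = PyKDL.Frame()
  fk_solver.JntToCart(joints, frame)
  print("end-effector position: %s" % frame.p)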

2.4.2 IKFast

IKFast is part of the OpenRAVE project. Ideally, you can use IKFast to analyze your robot's kinematic structure and then generate C++ headers that provide reliable, stable, and fast robot arm kinematics calculations. Once these headers are available, they can be wrapped in MoveIt! APIs to enable their functions to be called by move_group. The ROS wiki has a decent tutorial on creating a MoveIt! kinematics plugin from IKFast. As an example IKFast plugin, check out the baxter_ikfast_left_arm_plugin (there is also a right arm plugin).

  • Pros
    • Extremely fast, stable IK solutions.
  • Cons
    • Can be difficult to set up, and may not work for all robots.

2.4.3 trac_ik

This package was announced in November 2015, and its benchmarking results indicate that it could be a much more reliable alternative to KDL. Several students have used it in the past, and it has worked great.

2.4.4 Custom

For some robots, it can be advantageous to hand-code kinematics plugins for your robot. This allows you to know exactly how your code is working, and to tailor your solutions to your specific use-case. For examples, check out the pr2_kinematics and the youbot_arm_kinematics_moveit packages.

2.5 Motion Planning

One of the primary goals of MoveIt! is to provide path planners for robot arms. Technically, MoveIt! is designed to provide C++ interfaces that allow users to write their own motion planning plugins for MoveIt!. However, in practice, it seems that most people end up using the default motion planners provided by OMPL.

2.5.1 OMPL

OMPL is the "Open Motion Planning Library". It provides implementations of a variety of motion planning algorithms. MoveIt! has created move_group plugins for many of these planners so that you only need to provide configuration files to be able to use the planners.

2.5.2 Plan requests and results

You send "plan requests" to your motion planners (typically OMPL) through the move_group node. These requests typically specify start and end locations in either joint space or Cartesian space. They may also specify a variety of kinematic constraints. The planner then searches for a collision free path from the start to the goal. You can then either just ask for the computed path to be returned, or you can have move_group forward the planned path onto your robot's control_msgs/FollowJointTrajectory action server.

2.5.3 Plan adapters

MoveIt! uses a variety of "plan adapter" plugins to pre- and post-process planning requests and results. Some of these adapters are simple heuristics designed to deal with real-world nuisances that can cause issues for the planners. For example, if you start in a pose that has the robot in self-collision, the planner will never succeed because there is no collision-free path from that start state. However, this self-collision state may be perfectly acceptable, and it may be exactly the situation you are interested in; for example, the robot may be resting on a joint limit because it was intentionally put there during a calibration procedure. The FixStartStateCollision plan adapter plugin can then be used to randomly perturb the initial condition by small amounts to try to find a nearby, collision-free start state.

Likely the most important adapter for real-world robot execution is the AddTimeParameterization adapter. Most planners simply return sequences of joint angles; there is no specification of how fast the arm should move between these waypoints. This plugin time-parameterizes the planned path by applying velocity and acceleration constraints.
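
A quick way to confirm that the AddTimeParameterization adapter is doing its job is to inspect the time stamps on a returned trajectory, as in this small sketch (the group name and named target are placeholders):

  # Quick check that the returned trajectory has been time-parameterized.
  import sys
  import rospy
  import moveit_commander

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node("check_time_parameterization")
  group = moveit_commander.MoveGroupCommander("right_arm")
  group.set_named_target("right_neutral")

  plan = group.plan()   # a moveit_msgs/RobotTrajectory (a tuple on newer versions)
  points = plan.joint_trajectory.points
  duration = points[-1].time_from_start.to_sec() if points else 0.0
  print("%d waypoints spanning %.2f seconds" % (len(points), duration))
  # Without the AddTimeParameterization adapter, these time_from_start values
  # (and the velocity/acceleration fields) would all be left at zero.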

2.5.4 The planning scene and collision detection

The robot itself and the world around it (including grasped objects) are represented in the "Planning Scene". The move_group node implements a planning_scene_monitor that listens to several different sources to determine the state of the world around the robot.

  1. The /joint_states topic plus the collision geometries specified in the URDF and the semantic collision descriptions in the SRDF are all used in the planning scene.
  2. Various 3D sensors can be combined with the "World Geometry Monitor" to automatically add sensed collision objects to the planning scene.
  3. Users may manually add various collision primitives and meshes to the planning scene via the /planning_scene topic.
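
As a minimal sketch of the third option, the snippet below publishes a planning scene diff that adds a box to the world. The frame name and dimensions are placeholders; the moveit_commander.PlanningSceneInterface class discussed in Section 3.7 provides a higher-level way to do the same thing.

  # Manually adding a box to the world by publishing a planning scene diff.
  import rospy
  from geometry_msgs.msg import Pose
  from shape_msgs.msg import SolidPrimitive
  from moveit_msgs.msg import CollisionObject, PlanningScene

  rospy.init_node("add_box_to_scene")
  pub = rospy.Publisher("/planning_scene", PlanningScene, queue_size=1, latch=True)

  box = CollisionObject()
  box.id = "table"
  box.header.frame_id = "base"              # placeholder reference frame
  box.operation = CollisionObject.ADD

  primitive = SolidPrimitive()
  primitive.type = SolidPrimitive.BOX
  primitive.dimensions = [1.0, 1.5, 0.05]   # x, y, z extents in meters

  pose = Pose()
  pose.position.z = -0.1
  pose.orientation.w = 1.0

  box.primitives.append(primitive)
  box.primitive_poses.append(pose)

  scene_diff = PlanningScene()
  scene_diff.is_diff = True
  scene_diff.world.collision_objects.append(box)
  pub.publish(scene_diff)
  rospy.sleep(1.0)   # give the latched message time to reach move_group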

The actual collision detection algorithms are largely transparent to the user. The user simply has to set up the planning scene well, and the move_group node makes calls to the "Flexible Collision Library" to determine collision states. Note that collision detection is computationally expensive; care should be taken when setting up the SRDF and the planning scene to minimize the number of collision checks that happen.

3 Practical MoveIt! for This Course

MoveIt! is a huge piece of software that is extremely configurable, and gaining an understanding of all of the components is a nontrivial task. Fortunately for this course, we really only need to be concerned with a small subset of MoveIt!. Further, extensive work has already been done on configuring Baxter and Sawyer to work with MoveIt!, so we have much less setup work than would typically be required. This section is a small set of pointers that is designed to help guide how you use MoveIt! for your final projects. Much of this section is my opinion based on my experiences with these robots and MoveIt!. Your group is free to do whatever you wish.

3.1 Which files are we interested in?

3.1.1 Configurations:

MoveIt! requires many configuration files. The following files are the most important files to analyze for this project. For Baxter, all of the files are part of the moveit_robots package, and they are located in moveit_robots/baxter/baxter_moveit_config/config. For Sawyer, all of these files can be found in sawyer_moveit_config/config. There are many other files in each of these repositories, but I think these are the most important.

baxter.srdf/sawyer.srdf
This is the SRDF file for Baxter/Sawyer that is used by default. This SRDF defines planning groups for each of the arms, and a planning group for both arms together. There are several group states that define "neutral" poses that can be directly planned to, i.e., you can refer to them by name rather than specifying all of the joint angles. One way you could leverage this file is by specifying your own named group states. Typically, we don't use this SRDF file directly; instead, we use xacro to build a properly configured SRDF at run-time from the baxter.srdf.xacro file. For Sawyer, see the sawyer.srdf.xacro file. One of the most common issues people encounter is having an SRDF that doesn't match the state of the grippers attached to the end of the robot's arm(s). For example, if you have grippers plugged into a Rethink arm, then your URDF will also include the grippers (via the mutable_robot_state_publisher), and the robot will automatically publish the gripper positions on the /robot/joint_states topic. You then also need an SRDF that properly represents these grippers, or MoveIt! will constantly think that your fingers are in collision with the end of the arm (the SRDF is what explicitly tells MoveIt! to ignore these collisions).
kinematics.yaml
This file specifies the inverse kinematics solvers for the robot. Baxter has been configured to use either IKFast or KDL. If you are having issues with IK, you can try editing this file. Sawyer doesn't currently have an IKFast plugin (that I'm aware of), but you could still edit this file to try a different IK solver (like trac_ik or bio_ik).
ompl_planning.yaml
This is the file that defines a variety of planner configurations for each of the move groups. Frankly, the parameters available for each planner are not well documented. In my opinion, your best bet is to search through source code and refer to the OMPL documentation. The best documentation I've found is in the MoveIt! Setup Assistant source code. Currently the default planner for Baxter is RRTConnect which, in my experience, works well. Sawyer's planner file includes some nice comments about what the planner options mean. It doesn't explicitly set a default planner, so I believe it will default to the SBL planner.
joint_limits.yaml
This file defines the velocity and acceleration limits for each of the robot's joints. Looking at the values in this file, it is clear that these limits are not necessarily physically realistic. If you are having issues with the Joint Trajectory Action Server preempting because execution errors are too high, you could try lowering some of these limits. Note that Sawyer's copy of this file includes some commented-out choices for "fast" and "slow" limits. Of course, you could also use the move_group/trajectory_execution/execution_velocity_scaling dynamic reconfigure parameter to modulate the execution speed (a short Python sketch of this follows this list).
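
Here is a small sketch of adjusting that dynamic reconfigure parameter from Python rather than editing joint_limits.yaml. The server path "/move_group/trajectory_execution" is an assumption based on the parameter name above, and see Section 3.4 for caveats about this parameter on newer MoveIt! versions.

  # Sketch of lowering the trajectory execution speed at run time instead of
  # editing joint_limits.yaml. The server path below is an assumption.
  import rospy
  import dynamic_reconfigure.client

  rospy.init_node("slow_down_execution")
  client = dynamic_reconfigure.client.Client("/move_group/trajectory_execution",
                                             timeout=5.0)
  client.update_configuration({"execution_velocity_scaling": 0.3})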

3.1.2 Launch:

There are many launch files in the PATH_TO_WORKSPACE/src/moveit_robots/baxter/baxter_moveit_config/launch directory (or the sawyer_robot/sawyer_moveit_config/launch directory). However, the most important file is the move_group.launch file. This includes many other launch files, and those launch files eventually include the configuration files from the config directory. Note, this file may not work out-of-the-box with either robot. So many things can be configured, and must be configured correctly, that you should expect to need to edit this launch file, and possibly other launch files. Don't be afraid to either fork these repos and push your edits to your fork, or to copy the files that need to be edited into your package. Here are a few primary things to look out for:

Gripper Settings
If the SRDF loaded by your launch file stack does not include the robot's grippers, but you are connected to a robot that does have grippers, you may have trouble motion planning because MoveIt! will always detect collisions between parts of the gripper and the end of the arm. This happens because the URDF has collision objects associated with the grippers and you haven't explicitly told MoveIt! to ignore these collisions. Be sure that the URDF and SRDF agree with each other. To control this, you can use arguments to the SRDF xacro files; if you are not connected to a real robot, you can use arguments to the URDF xacro files, and if you are connected to a real robot, you can use the mutable_robot_state_publisher.
Plugin List
The move_group node used to look at a capabilities parameter to decide which plugins to load. Since Kinetic, all plugins are loaded automatically, so this parameter isn't strictly necessary, but the launch files for both robots still set this parameter, and you may encounter some issues with legacy plugins. The one I know for sure can cause problems is the MoveGroupExecuteService plugin. This used to be a plugin that offered a service call to execute a trajectory that had been planned with MoveIt!, but it is now deprecated. If you have references to this plugin in your launch files, you'll want to delete them. The replacement plugin is MoveGroupExecuteTrajectoryAction. Check out my change to Sawyer's move_group.launch.
Joint States
On both robots, the joint states are published on the /robot/joint_states topic, and the default launch files have multiple remapping steps to ensure that nodes are looking at this topic instead of /joint_states. One issue I've encountered occasionally is that the robots publish joint state information on that topic for the arm and for the grippers separately. When grippers are connected, I've had numerous issues with move_group complaining that it cannot get complete robot states. The way I've typically fixed this is by using the joint_state_publisher to assemble the messages containing incomplete joint state information and publish them as complete joint state messages on the normal /joint_states topic. Check out the current state of my launch files to see how I've implemented changes to deal with this.

3.3 Controlling grasped object mass

Baxter is capable of compensating for the mass of grasped objects. If you are picking up objects with any substantial mass, it may be worth your time to let Baxter know about their weight; doing so will give you better luck at accurately tracking reference trajectories. Read more here. I haven't used this feature on Sawyer, and it isn't documented, but it should work very similarly.

3.4 Setting trajectory execution speed

If you find that Baxter/Sawyer is moving faster than you would like (it's usually a good idea to start with the robot moving very slowly), a very easy way to limit the speed is to edit the joint velocity and acceleration limits set in the joint_limits.yaml file. This is a bit of a hack, however. Wouldn't it be better to be able to change the execution speed online rather than having to edit the actual parameters of the robot? The MoveIt! developers thought this was a good idea too, and they implemented the /move_group/trajectory_execution/execution_velocity_scaling parameter. For years this worked well, but on newer versions of MoveIt! I haven't had much luck getting this to work (and I haven't really dug into the source code to figure out why). What I have noticed is that they've added new tutorials on time parameterization that specifically discuss how to modify the execution speed using the max_velocity_scaling_factor and max_acceleration_scaling_factor fields in the MotionPlanRequest.msg. Fortunately, we don't really have to worry about using this message directly because they have also added controls for these variables in the rviz MoveIt! plugin, and they've added interfaces to the moveit_commander. Here are links to the API documentation: set_max_acceleration_scaling_factor, set_max_velocity_scaling_factor, and retime_trajectory.
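
Here is a short sketch of the moveit_commander interfaces mentioned above. The group name and named target are placeholders, and plan() returns a tuple instead of a bare trajectory on newer MoveIt! versions.

  # Sketch of the moveit_commander speed-control interfaces linked above.
  import sys
  import rospy
  import moveit_commander

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node("slow_trajectories")
  robot = moveit_commander.RobotCommander()
  group = moveit_commander.MoveGroupCommander("right_arm")

  # Scale all subsequent plans to 20% of the joint_limits.yaml velocities
  # and accelerations:
  group.set_max_velocity_scaling_factor(0.2)
  group.set_max_acceleration_scaling_factor(0.2)

  # Or re-time an existing plan after the fact:
  group.set_named_target("right_neutral")
  plan = group.plan()
  slow_plan = group.retime_trajectory(robot.get_current_state(), plan,
                                      velocity_scaling_factor=0.2)
  group.execute(slow_plan, wait=True)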

3.5 Path simplification and smoothing

In OMPL, once a path is found, there is a path simplification step that runs. During this step, they minimize the number of vertices, they smooth paths, they attempt to eliminate loops, etc. However, I've occasionally had issues where this simplification process ends up producing trajectories that actually have collisions. I think this is because the collision checking is not re-run after simplification (by default). Note that you can use the longest_valid_segment_fraction parameter inside of ompl_planning.yaml to control the distance between nodes in the simplified path. When I've required navigation very near obstacles, I've avoided collisions by setting this number to be small. The best way to do this is to create new planner configurations in ompl_planning.yaml and use the Python method move_group.set_planner_id() to change which planner you are using.
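
For example, this small sketch switches between two planner configurations; "RRTConnectkConfigDefault" is the conventional name for the stock RRTConnect configuration in ompl_planning.yaml, while "RRTConnectkConfigCareful" is a hypothetical configuration you would add yourself (e.g. with a smaller longest_valid_segment_fraction).

  # Sketch of switching between planner configurations from Python.
  import sys
  import rospy
  import moveit_commander

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node("planner_switching")
  group = moveit_commander.MoveGroupCommander("right_arm")

  group.set_planner_id("RRTConnectkConfigDefault")   # coarse planning far from obstacles
  # ... set a target, plan, and execute ...

  group.set_planner_id("RRTConnectkConfigCareful")   # fine-grained planning near obstacles
  # ... set a target, plan, and execute ...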

3.6 Joint trajectory action server

Note that the Joint Trajectory Action Server always has to be running in order to use MoveIt! for controlling either Baxter or Sawyer. Running the joint trajectory action server sometimes conflicts with the built-in arm movement API commands. So if you are trying a combination of the built-in API and the action server, and you find that you can't get the API to move the arm, you'll need to figure something out (e.g. a ROS interface to start and stop the action server). If you find yourself stuck with this, let me know and I'll help you develop something.

3.7 Collision scenes

You will likely want to add some collision objects to your planning scene (e.g. you may want to add a table so that Baxter doesn't create plans that hit the table). The way that I'd recommend doing this is through the moveit_commander.PlanningSceneInterface() Python class. This class also allows you to attach objects to, and detach them from, other frames in the scene. So, for example, after you've picked an object up you could attach a collision object representing it to the gripper; this ensures that MoveIt! won't plan paths that cause the grasped object to run into things.
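
Here is a sketch of that approach; the frame names, link names, and dimensions are placeholders.

  # Adding and attaching collision objects with PlanningSceneInterface.
  import sys
  import rospy
  import moveit_commander
  from geometry_msgs.msg import PoseStamped

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node("collision_scene_sketch")
  scene = moveit_commander.PlanningSceneInterface()
  rospy.sleep(1.0)   # give the interface time to connect to the scene topics

  # A table in front of the robot:
  table = PoseStamped()
  table.header.frame_id = "base"
  table.pose.position.x = 0.8
  table.pose.position.z = -0.2
  table.pose.orientation.w = 1.0
  scene.add_box("table", table, size=(0.8, 1.6, 0.05))

  # After a successful grasp, attach the object to the gripper so that
  # planning accounts for it:
  grasped = PoseStamped()
  grasped.header.frame_id = "right_gripper"   # placeholder end-effector link
  grasped.pose.orientation.w = 1.0
  scene.attach_box("right_gripper", "grasped_object", grasped,
                   size=(0.05, 0.05, 0.2))

  # When the object is released:
  scene.remove_attached_object("right_gripper", name="grasped_object")
  scene.remove_world_object("grasped_object")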

3.8 Octomaps and depth cameras

MoveIt! can be configured to generate collision scenes directly from sensor data. If you're interested in doing this, I'd start by looking at the demo_xtion.launch file in the moveit_robots package to see an example of how to configure this. We have plenty of Kinects and Xtions if you want to try this. I'm happy to help debug the sensors.

3.9 MoveIt! for grasping

MoveIt! does have pick and place capabilities that could potentially be useful in these projects. That being said, in my experience, getting this working is actually quite difficult. If you do use MoveIt!, I'd recommend using it for planning to a pre-grasp location, implementing your own custom grasping strategy, and then using MoveIt! again after you've successfully grasped the object.
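
A sketch of that workflow, with placeholder poses and a stubbed-out gripper call, might look like this:

  # MoveIt! to a pre-grasp pose, a custom grasp routine, then MoveIt! again.
  import sys
  import rospy
  import moveit_commander
  from geometry_msgs.msg import Pose

  def close_gripper():
      # Placeholder: call your gripper's own API here
      # (e.g. baxter_interface's Gripper class).
      pass

  moveit_commander.roscpp_initialize(sys.argv)
  rospy.init_node("grasp_workflow_sketch")
  group = moveit_commander.MoveGroupCommander("right_arm")

  # 1. Use MoveIt! to plan to a pre-grasp pose above the object.
  pre_grasp = Pose()
  pre_grasp.position.x, pre_grasp.position.y, pre_grasp.position.z = 0.7, -0.2, 0.15
  pre_grasp.orientation.w = 1.0
  group.set_pose_target(pre_grasp)
  group.go(wait=True)

  # 2. Run your own grasp strategy (e.g. a straight-line Cartesian descent
  #    plus the gripper API); MoveIt! is not involved in this step.
  close_gripper()

  # 3. Optionally attach the grasped object to the gripper (see the sketch in
  #    Section 3.7), then plan away with MoveIt! again.
  group.set_named_target("right_neutral")
  group.go(wait=True)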

3.10 Using fake execution controllers

MoveIt! has the moveit_fake_controller_manager package that implements the moveit_fake_controller_manager/MoveItFakeControllerManager plugin. This plugin implements a "fake" version of the control_msgs/FollowJointTrajectory action server required for MoveIt! to be able to control a robot. It is loaded as a plugin for the move_group node, and once it is loaded, move_group effectively provides its own action server for controlling a robot. When a plan execution request is submitted (via rviz, the moveit_commander, etc.), this plugin processes the request and simply moves the robot along the planned trajectory with no error. As it moves the robot along this trajectory, pretending that the robot is perfectly executing the planned path, it publishes the robot's joint state on the /move_group/fake_controller_joint_states topic. If you add a robot_state_publisher and remap this topic, you can then have a complete visualization of the robot in rviz. If you are debugging MoveIt!, this is a super easy way to see how the robot would move given a MoveIt! plan request without ever bothering to boot up Gazebo. Essentially you are only simulating the kinematics of the robot: there is no inertia, friction, collision, etc., but it is much easier to configure and much lighter weight than a full-blown Gazebo simulation.

When you use the MoveIt! Setup Assistant, launch files are automatically generated that allow you to run MoveIt! with this plugin loaded. For Baxter, check out this demo_dummy.launch file, and for Sawyer check out the changes that I pushed to sawyer_moveit.launch.

4 Resources

MoveIt! Concepts Page
This is a very useful page; I highly suggest reading through the whole thing.
moveit_robots
This repository contains MoveIt! configurations for a variety of commonly-used robots (including Baxter).
sawyer_moveit
Sawyer's MoveIt! configuration is in its own repository.
moveit_ros
This is the ROS metapackage that provides much of the ROS-to-MoveIt! interfaces. Quoting from the GitHub page "This is where much of the functionality MoveIt! provides is put together."
MoveIt! Tutorials
Below are a few Python and Baxter-specific MoveIt! tutorials.
Dave Coleman ROSCon Talk
At ROSCon 2015, one of the primary MoveIt! developers gave a fantastic talk that provides a great description of the core MoveIt! components. Additionally, he provides many insights on what has been done well, and what needs more work. The video is 45 minutes, but is definitely worth your time. The slides are also worth looking at.
moveit_python
A few groups in the past have stumbled onto this repo from Mike Ferguson at Fetch Robotics. As far as I can tell, this doesn't add any functionality over moveit_commander, but it does have slightly easier-to-use interfaces.
Our Baxter Examples
Over the last few years, we have had a few people put together working demos that used MoveIt! well. Below are links and descriptions to a few demos that you may find useful.
Baxter Block Stacking
This demo from Athulya Simon shows how to use simple calls to the Python moveit_commander. Additionally, it has some nice examples of adding collision objects.
Baxter Dual Arm Manipulation
This demo from Josh Marino has nice examples of how to adjust planner configurations. He was having issues with path simplification producing trajectories that resulted in collisions. We solved this by creating two different planner configurations. One was used to plan up to a grasp, and the other was used to plan the actual grasp. Check out his planning configuration file here, and look at this code to see how he switched planners.
NxR Baxter Demos
This is the code that currently runs on Baxter when he is running his demos in the Willens Atrium. Much of the code is focused on implementing the Kinect-only user interface and the workarounds required to get the software to run day-in and day-out. However, MoveIt! is used in an interesting way. The moveit_interface.py script uses a service client that communicates with the baxter_controller.py script. Requests are sent asking the controller where Baxter's arms should be (either in Cartesian space or in joint space). Once a goal is known, MoveIt! is used to plan a collision-free trajectory to that goal, and immediately, Baxter begins executing this trajectory. Asynchronously, the moveit_interface.py node immediately requests a new goal and begins planning again. As soon as a new plan is found, the script then incorporates the new plan with the plan that is currently being executed.
ME 495: Embedded Systems in Robotics by Jarvis Schultz is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.