Thoughts about path-planning within roboticslab-uc3m #40

Open · jgvictores opened this issue Jan 6, 2018 · 12 comments

@jgvictores (Member)

Thoughts about path-planning within roboticslab-uc3m, and considerations on creating a path-planning repository within the organization.

Considerations to be taken into account before blindly doing this:

  1. Path-planning is not required for much of our research. 2D/3D path-planning is mostly needed in the presence of obstacles, which is not the general use case of our current research (this may change over time). Cases where we are doing some path-planning:
    1. [2D navigation] TIAGo navigation, via the ROS navigation stack.
    2. [2D navigation] There should be some remnant code in asibot-main (ravebot or tasks) where OpenRAVE was used. It should be migrated to openrave-yarp-plugins if that hasn't been done yet.
    3. [3D grasping] There has been some progress with TEO in OpenRAVE at https://github.com/roboticslab-uc3m/teo-grasp for grasping a bottle. This may remain independent or some day be merged into openrave-yarp-plugins.
  2. We already have some trajectory generation in kinematics-dynamics, which is a good place for it (not exactly path-planning, since it does not take obstacles into account).

As seen, the above candidates already have their place. Therefore, my recommendations are the following:

  1. Do not create a new path-planning repository within this organization, at least for now.
  2. Use this issue to track path-planning developments within this organization.
  3. If results are in Cartesian space, keep close integration with kinematics-dynamics, which handles Cartesian-to-joint-space conversions (a minimal sketch of such a conversion follows this list). Therefore, new issues will potentially arise at kinematics-dynamics too.
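
For context, kinematics-dynamics builds on orocos_kdl (the thread below also mentions KDL), so the Cartesian-to-joint-space conversion mentioned above ultimately boils down to an inverse kinematics call. A minimal sketch written directly against the KDL API, not the kinematics-dynamics interface itself; the two-link chain and target pose are made up for illustration, not TEO's actual model:

```cpp
#include <kdl/chain.hpp>
#include <kdl/chainiksolverpos_lma.hpp>
#include <kdl/frames.hpp>
#include <kdl/jntarray.hpp>

int main()
{
    // Illustrative 2-link planar chain; a real robot model would be loaded instead.
    KDL::Chain chain;
    chain.addSegment(KDL::Segment(KDL::Joint(KDL::Joint::RotZ),
                                  KDL::Frame(KDL::Vector(0.4, 0.0, 0.0))));
    chain.addSegment(KDL::Segment(KDL::Joint(KDL::Joint::RotZ),
                                  KDL::Frame(KDL::Vector(0.4, 0.0, 0.0))));

    // Levenberg-Marquardt position IK solver.
    KDL::ChainIkSolverPos_LMA ik(chain);

    KDL::JntArray qInit(chain.getNrOfJoints()); // zero initial guess
    KDL::JntArray qOut(chain.getNrOfJoints());  // joint-space result

    // Made-up Cartesian target within the workspace.
    KDL::Frame target(KDL::Vector(0.5, 0.3, 0.0));

    // Negative return values signal failure to converge.
    return ik.CartToJnt(qInit, target, qOut) < 0 ? 1 : 0;
}
```
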
@jgvictores (Member Author)

cc: @PeterBowman @rsantos88

@jgvictores (Member Author) commented Jan 6, 2018

> If results are in Cartesian space, keep close integration with kinematics-dynamics, which handles Cartesian-to-joint-space conversions. Therefore, new issues will potentially arise at kinematics-dynamics too.

Regarding trajectories in Cartesian space, new issues have been opened to keep that integration close: roboticslab-uc3m/kinematics-dynamics#134 and roboticslab-uc3m/kinematics-dynamics#135
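
As a point of reference for those issues, the kind of Cartesian trajectory involved can be sketched with KDL's trajectory classes. This is a generic illustration, not the kinematics-dynamics API itself, and the poses, velocity and acceleration limits are made up:

```cpp
#include <kdl/frames.hpp>
#include <kdl/path_line.hpp>
#include <kdl/rotational_interpolation_sa.hpp>
#include <kdl/trajectory_segment.hpp>
#include <kdl/velocityprofile_trap.hpp>

int main()
{
    // Straight-line Cartesian path between two made-up poses.
    KDL::Frame start(KDL::Vector(0.5, 0.0, 0.3));
    KDL::Frame end(KDL::Vector(0.5, 0.2, 0.5));

    // The path takes ownership of the interpolation object; 0.1 is the
    // equivalent radius used to weigh rotation against translation.
    KDL::Path_Line path(start, end, new KDL::RotationalInterpolation_SingleAxis(), 0.1);

    // Trapezoidal profile: max velocity 0.05 m/s, max acceleration 0.02 m/s^2.
    KDL::VelocityProfile_Trap profile(0.05, 0.02);
    profile.SetProfile(0.0, path.PathLength());

    // aggregate=false: the segment does not take ownership of path/profile.
    KDL::Trajectory_Segment trajectory(&path, &profile, false);

    // Sample the pose at mid-duration; a controller would stream these at a fixed rate.
    KDL::Frame mid = trajectory.Pos(trajectory.Duration() / 2.0);
    (void)mid;
    return 0;
}
```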

@jgvictores (Member Author)

Regarding trajectories in joint space, just a reminder that we have some tools in the, well, tools repository. Namely, as commented here:

> Specifically, you'll want the PlaybackThread. You can find an example of use at examplePlaybackThread and its corresponding test.
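
PlaybackThread is the tool to use here; purely to illustrate what joint-space playback involves, here is a stripped-down YARP sketch. The port names, file format, and 50 Hz rate are assumptions for the example, not the actual PlaybackThread interface:

```cpp
#include <fstream>
#include <sstream>
#include <vector>

#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/os/Time.h>
#include <yarp/dev/IPositionDirect.h>
#include <yarp/dev/PolyDriver.h>

int main()
{
    yarp::os::Network yarp;

    // Assumed remote port name; the real robot's port may differ.
    yarp::os::Property options;
    options.put("device", "remote_controlboard");
    options.put("remote", "/teo/leftArm");
    options.put("local", "/playbackExample");

    yarp::dev::PolyDriver driver(options);
    yarp::dev::IPositionDirect* posd = nullptr;

    if (!driver.isValid() || !driver.view(posd))
        return 1;

    // Hypothetical file: one line per sample, space-separated joint positions [deg].
    std::ifstream file("trajectory.txt");
    std::string line;

    while (std::getline(file, line))
    {
        std::istringstream iss(line);
        std::vector<double> q;
        double value;

        while (iss >> value)
            q.push_back(value);

        posd->setPositions(q.data()); // stream one setpoint per sample
        yarp::os::Time::delay(0.02);  // assumed 50 Hz playback rate
    }

    return 0;
}
```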

@PeterBowman (Member)

See also: https://github.com/personalrobotics/aikido.

> AIKIDO is a C++ library, complete with Python bindings, for solving robotic motion planning and decision making problems. This library is tightly integrated with DART for kinematic/dynamics calculations and OMPL for motion planning. AIKIDO optionally integrates with ROS, through the suite of aikido_ros packages, for execution on real robots.

@PeterBowman (Member)

Not exactly path-planning, but kinda related: https://github.com/robotology/navigation.

@PeterBowman (Member)

I think the demo developed by @elisabeth-ms has some bits of path planning. It's a DL-based object-detection app for grabbing stuff with one of TEO's arms.

@jgvictores (Member Author)

> I think the demo developed by @elisabeth-ms has some bits of path planning. It's a DL-based object-detection app for grabbing stuff with one of TEO's arms.

Cool, nice catch! I'm totally seeing some OMPL at https://github.com/elisabeth-ms/teo-sharon/blob/cfc3a62270e130d0f3a8a8418c18b1a901508bea/programs/TrajectoryGeneration/TrajectoryGeneration.hpp#L17-L24 in addition to the KDL and FCL code. Thanks!
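
For anyone skimming this thread, the OMPL usage behind those headers looks roughly like the sketch below: a minimal RRTConnect query over a made-up 6-DOF joint space. The bounds and the always-true validity checker are placeholders; the real TrajectoryGeneration code checks collisions with FCL at that point:

```cpp
#include <cmath>
#include <iostream>
#include <memory>

#include <ompl/base/spaces/RealVectorStateSpace.h>
#include <ompl/geometric/SimpleSetup.h>
#include <ompl/geometric/planners/rrt/RRTConnect.h>

namespace ob = ompl::base;
namespace og = ompl::geometric;

int main()
{
    // Made-up 6-DOF joint space with symmetric limits [-pi, pi].
    auto space = std::make_shared<ob::RealVectorStateSpace>(6);
    ob::RealVectorBounds bounds(6);
    bounds.setLow(-M_PI);
    bounds.setHigh(M_PI);
    space->setBounds(bounds);

    og::SimpleSetup ss(space);

    // Placeholder validity checker; a real app would query FCL for collisions here.
    ss.setStateValidityChecker([](const ob::State*) { return true; });

    ob::ScopedState<> start(space), goal(space);
    for (int i = 0; i < 6; i++) { start[i] = 0.0; goal[i] = 0.5; }
    ss.setStartAndGoalStates(start, goal);

    ss.setPlanner(std::make_shared<og::RRTConnect>(ss.getSpaceInformation()));

    if (ss.solve(1.0)) // plan for up to one second
    {
        ss.simplifySolution();
        ss.getSolutionPath().printAsMatrix(std::cout); // one joint vector per row
    }

    return 0;
}
```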

@PeterBowman (Member)

See also this grasping demo featuring the iCub: robotology/community#573.

@PeterBowman (Member)

Moar grasping straight from the ongoing Nvidia GTC AI conference (thanks @imontesino):
