VVV09 Manipulation Group

From Wiki for iCub and Friends

A place for meetings and discussion for people interested in doing manipulation with the iCub. Topics:

  • Reaching
  • Grasping
  • 3-D vision

Last year's reaching/grasping group: Link

NEW!! Videos on YouTube.

  • iCub playing with lego using one hand/arm: [1]
  • iCub playing with lego using two hands/arms: [2]


Ideas / Message wall

  • Demo idea: Two-handed manipulation, putting lego pieces together using both arms of the iCub.
  • A git repository for sharing our code is ready! Please send me (Alexis) your public ssh key (usually ~/.ssh/id_dsa.pub or ~/.ssh/id_rsa.pub). Then you will be able to pull and push to the GIT repository. Instructions below!

Divide and conquer

We had a small informal meeting on Tuesday (21.7.) before lunch and came up with a possible work distribution:

Perception team

This team has the following goals:

  • First we detect the lego pieces on the table, and after grasping them, we need to detect them again in the hand, to see where they really are held.
  • We need a 3D position in the coordinate frame of the cameras (or better, of the iCub).
  • Initially, we can keep the eyes fixed, but moving the head and eyes would be a nice plus.
  • Matteo has a working ball tracker that might be adjusted (if you want to use that, please see: 3D_ball_tracker).
  • We could put the legos together 'with eyes closed' (see first, then move the pieces together), or use visual servoing to improve the process.
  • The potential-field inverse kinematics now takes cable and joint limits into account correctly.
  • We have converted our code to handle two arms at the same time, and the state machine to control the complete demo is planned.
  • Right now, we are working on a state machine to control all the modules. We need to define how the modules will talk to each other!


Participants:

  • Matteo
  • Jakob
  • Federico T.
  • Giovanni

Here you can find a set of images illustrating possible operating conditions for the grasp demo. They are also in the git repository, under vvvmanipulation/perception/testImages2009_07_21.

Grasping team

This team has the following goals:

  • Make a small library of grasps (e.g. 3-finger pinch, 2-finger pinch, power grasp).
  • When a grasp movement is being executed, monitor the positions and currents of the hand -> Detect contact with the object and stop.
  • When holding the object, monitor the currents/positions and figure out if the piece is falling. -> Learn a classifier using joint positions/currents?
  • Avoid destroying the iCub's hand!

Participants:

  • Julian
  • Theo
  • Yan
  • Kail

We tried to detect collisions with an object by observing the motor currents, but the data is very noisy. A better way is to observe the PID error and threshold it. We have now implemented both and are going to check how we can use them.

We have implemented a first version of a grasping library, which provides grasp and pregrasp positions parameterized by two degrees of freedom. Furthermore, we provide routines for grasping, including checking encoder data, motor currents, and controller error. Examples of available functions are

 int checkCurrents()
 int checkPidErrors()
 int checkCollisions()
 int check_encoders(Vector target_positions)

for checking if the fingers are in contact or if they have reached their target position and

 bool move_to_pregrasp_pos()
 bool doGrasp()

to execute the grasping.

The collision checking has been tested on both robots. It works great on the black (evil) robot, but not so well on the white (good) robot. The reason is that the black (evil) robot has position sensors in the finger joints, so a rise in the PID error can be recognized much more reliably. That's why we will use the black (evil) robot for our project.

We are going to test and optimize the whole grasping routine today (Saturday). After that we will implement additional functions.

We tested the grasping and it works quite well. See Video.

Then we tried to grasp two bricks (actually even three) and put them together. See Pictures. (We were cheating a little bit.)

The grasping team is doing well controlled grasps using the black iCub.

Final status

The grasping library can be found in the cvs repository. To use our library, you have to include the header file 'grasp_vvv09.h' and instantiate an object of this type in your source code. After that you can send the commands shown above to grasp or to move to the pregrasp position. For executing the grasp we interpolate linearly between two pregrasp positions and also between two end positions. By setting the interpolation parameter between 0.0 (pinch) and 1.0 (fist) you can adjust the grasp style as you need it.

We have also written a small program, 'grasp', which listens on the YARP port '/grasp/doGrasp' and accepts the commands '(left|right) (grasp|pregrasp)'. For example, 'left grasp' closes the left hand. To run the program, first compile it by typing:

 $ cmake .
 $ make

After that you can run the program by typing:

 $ ./grasp

Reaching team

This team has the following goals:

  • Use inverse kinematics to get the hand in the right position and orientation for grasping.
  • After grasping the piece, bring it close to the face, and rotate it until the perception team finds the exact position of the piece in the hand.
  • Redefine the tool of the arm as the position of the piece, and move in cartesian coordinates based on it.
  • Put the two pieces together; detect the event by monitoring the force sensors, or by vision?
  • Try the new iKin-based inverse kinematics, done by Ugo Pattacini

Participants:

  • Alexis
  • Federico
  • Boris


git repository

  • We will use git locally (inside the summer school) for now, since accessing the CVS server on SourceForge is very slow. But we will integrate the working system into the iCub repository before we finish next week. (Giorgio's suggestion)

Install git on your laptop, give Alexis your public ssh key, and then do the following:

 git clone gitosis@10.0.0.253:vvvmanipulation.git

You should get a directory called vvvmanipulation

When you make your changes, remember to do:

 git add files
 git commit            #Add some good comment
 git pull --rebase     #This synchronizes with the server
 git push              #Sends your changes to the server
  • Remember to tell git your name and email (only once):
 git config --global user.name "James Bond"
 git config --global user.email jb@icub_rules.com
  • The IP of the server changed! Please reconfigure the URL by running this in the repository:
 git config remote.origin.url gitosis@10.0.0.253:vvvmanipulation.git