
Simulator README

From Wiki for iCub and Friends


Before trying to compile:

iCub Simulator prerequisites: CMake, ACE, YARP, GTK, the iCub repository, ODE, SDL. Please make sure you have everything installed correctly by following the installation instructions.

CMAKE:

Once the libraries are installed, set your environment variables as follows:

ODE_DIR -> root directory of the compiled ODE library
SDLDIR -> root directory of the SDL installation (at least on Windows; also remember to add sdl.dll to your PATH)

By default the simulator uses ODE compiled in double precision.

Make sure that USE_ODE_DOUBLE is checked in the CMake flags.

If you decide to use ODE in single precision, make sure to uncheck the USE_ODE_DOUBLE flag.

RUNNING THE PROGRAM:

The iCubSimulator uses the ResourceFinder to read configuration parameters from a file placed in a directory called simConfig in $ICUB_ROOT/app. This makes it much easier to change the parameters used by the module by switching its "initialization context". The iCubSimulator looks for all the configuration parameters in a file called simulator.ini, which sets up all the required parameters accordingly. By default the parameters are in $ICUB_ROOT/app/simConfig, so running iCub_SIM from the source directory picks up all the default configuration files. If you would like to run the simulator with your own configuration files, you have to tell the module (the ResourceFinder) where to look for the file 'simulator.ini'. You do this by specifying the initialization context, that is, the name of the directory that should be used to locate the file 'simulator.ini'.

An "iCub_parts_activation" initialization file is provided in the "conf" subdirectory. This file contains the setup for the simulated iCub and for the vision module (turning it directly on/off). The iCub setup file looks like this:

/// initialization file for the icub simulation parts

[SETUP]
elevation off
startHomePos on

[PARTS]
legs on
torso on
left_arm on
left_hand on
right_arm on
right_hand on
head on
fixed_hip on

[COLLISIONS]
self_collisions off
covers_collisions off

[SENSORS]
pressure off
whole_body_skin_emul off

[VISION]
cam on

[RENDER]
objects off
screen off
head_cover on
legs_covers on
torso_covers on
left_arm_covers on
right_arm_covers on

Make sure you enable the parts you would like to use, otherwise they will not respond. It is also recommended to turn off the parts you do not need, to save computational effort.

On Linux, you'll need to be careful not to put other windows over the simulator window - otherwise the output on the camera ports will be random in those areas :-).

USING THE FEATURES:

  • Please set the desired iCub parts activation before running.
  • Manual navigation in the environment is possible: use the w, a, s, d keys and the mouse to rotate.

Run the simulator with one of:

iCub_SIM
iCub_SIM --name anyName       (anyName can be any string; note that this also changes all the simulator port names accordingly)
iCub_SIM --verbosity VERBOSITY_LEVEL

CONTROLLING THE ROBOT:

Current robot ports: please refer to http://eris.liralab.it/wiki/ICub_joints for further information (only the rpc ports are reported here; there are also command ports and state ports for all parts)

yarp rpc /icubSim/left_leg/rpc:i  (The left leg has 6 joints in the standard configuration)
yarp rpc /icubSim/right_leg/rpc:i  (The structure is identical to the left leg)
yarp rpc /icubSim/torso/rpc:i     (The Torso has 3 joints in the standard configuration)
yarp rpc /icubSim/left_arm/rpc:i  (The arm includes the hand for a total of 16 controlled degrees of freedom)
yarp rpc /icubSim/right_arm/rpc:i (The structure is identical to the left arm)
yarp rpc /icubSim/head/rpc:i      (The head has 6 joints in the standard configuration)

A user can see the commands that the "rpc:i" port supports by using "yarp rpc" to send it the message "help".

For example:

yarp rpc /icubSim/left_arm/rpc:i
set pos 0 45
will command axis 0 of the left_arm (Shoulder pitch) to 45 degrees.

Note, there is a simple short-cut to make the robot go to its home position. Just press 'H' while the window is active.


CAMERA:

images:

It is possible to get the images from the left and right eyes, as well as a "world" view. The available camera formats are:

- Cartesian ( 320 x 240 )
- Fovea     ( 128 x 128 )
- Logpolar  ( 252 x 152 ) 

ports available:

/icubSim/cam/left (left camera Cartesian images)
/icubSim/cam/left/fovea (left camera fovea images)
/icubSim/cam/left/logpolar (left camera log polar images)
/icubSim/cam/right (right camera Cartesian images)
/icubSim/cam/right/fovea (right camera fovea images)
/icubSim/cam/right/logpolar (right camera log polar images)
/icubSim/cam/ (the "world" view)

example:

yarpview --name /example
yarp connect /icubSim/cam/left /example


Triangulation basics:

To perform triangulation in the iCub simulator you can use the iKinHead module. The default configuration file used by iKinHead refers to the real cameras of the robot. To use iKinHead with the simulator you need to specify a different configuration file, located in $ICUB_ROOT/main/app/cameraCalibration/conf:

iKinHead --robot icubSim --from icubSimEyes.ini

Triangulation advanced:

This file contains different values for the parameters fx, fy, cx, cy, that for the simulator are:

fx 257.34
fy 257.34
cx 160
cy 120

Usually the camera parameters are expressed in terms of 'field of view'. The 'field of view' is related to the fx and fy parameters required by iKinHead. The relation is:

fx = 0.5* w* cotan(0.5*fovx)
fy = 0.5* h* cotan(0.5*fovy)

where:

w = window width in pixels = 320
h = window height in pixels = 240

The field of view specified in the simulator code is fovy (it is set to 50 degrees), and it is different from fovx. Since in the simulator the aspect ratio of the viewing frustum (set by the method gluPerspective) is equal to the aspect ratio of the cameras (set by the method glViewport), i.e. there is no distortion, the relation between fovx and fovy is:

cotan(0.5*fovx) = cotan(0.5*fovy) * h/w

In the end it turns out that fx = fy, so there is no need to know fovx: fovy is enough for computing both fx and fy. The current values of fx and fy set in the file "icubSimEyes.ini" are computed for a 50 degree fovy:

fx = fy = 0.5*240*cotan(25 degrees) = 257.34

If you want to change the FOVY, then you also have to change fx and fy accordingly.
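As a quick sanity check, the computation above can be reproduced in a few lines of Python (a sketch; the helper name is ours, not part of the iCub code):

```python
import math

def focal_from_fov(pixels, fov_deg):
    # f = 0.5 * pixels * cotan(0.5 * fov)
    return 0.5 * pixels / math.tan(math.radians(fov_deg) / 2)

w, h = 320, 240   # simulator camera resolution in pixels
fovy = 50.0       # vertical field of view set in the simulator code

fy = focal_from_fov(h, fovy)   # 0.5 * 240 * cotan(25 degrees)
fx = fy                        # no distortion, so fx = fy (see relation above)

print(round(fx, 2), round(fy, 2))   # 257.34 257.34
```

Changing fovy in the simulator code therefore means recomputing fx and fy with this formula and updating icubSimEyes.ini accordingly.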

OBJECT INFORMATION:

There are currently different objects in the world: a ball, a cube, and a box. For the box, please refer to 3.2.

in the world port:

"yarp rpc /icubSim/world"

  • GET example:
"world get cube"  (or ball) this will return the x y z coordinates of the object
  • SET example:
"world set box x y z"  eg: "world set ball 1.0 0.1 2.0"

Object reference frame:

(image: Simulator-reference-frames.jpg)

Objects are placed in the <world> reference frame, located on the floor between the legs of the robot as in the picture (y axis pointing up, x towards the left of the robot and z towards the front). Notice that the robot kinematics (Cartesian interface) uses the <robot> reference frame.

The rototranslational matrix describing the homogeneous transformation from the <robot> reference frame to the <world> reference frame of the simulator is:

T = [ 0 -1 0 0; 0 0 1 0.5976; -1 0 0 -0.026; 0 0 0 1 ] (given in a matlab-like format, ordered by rows)

Note that this rototranslation applies as long as the 'elevation' flag in the simulator configuration file is set to off. Otherwise, the robot is lifted, changing the position of the root reference frame w.r.t. the world reference frame.
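As an illustration, the matrix can be applied to a point expressed in the <robot> frame with a few lines of plain Python (a sketch; the function name is ours):

```python
# The robot-to-world rototranslation T (valid with 'elevation off'),
# row-ordered as in the matlab-like format above.
T = [[ 0, -1, 0,  0.0],
     [ 0,  0, 1,  0.5976],
     [-1,  0, 0, -0.026],
     [ 0,  0, 0,  1.0]]

def robot_to_world(p):
    """Map a point (x, y, z) from the <robot> frame to the <world> frame."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

# The robot root frame sits 0.5976 m above the world origin and 0.026 m
# behind it along the world z axis:
print(robot_to_world((0.0, 0.0, 0.0)))   # (0.0, 0.5976, -0.026)
```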

OBJECT MANIPULATION:

Object Creation:

It is possible to create boxes, spheres and cylinders in the world. You can also specify whether the object can collide with the robot, via an extra boolean parameter when creating the object. By default collision is on; it is disabled only if explicitly requested by the user.

in the world port:

yarp rpc /icubSim/world
world mk box (three params for size) (three params for pos) (three params for colour)     ------- This creates a box affected by gravity
world mk sph (radius) (three params for pos) (three params for colour)                    ------- This creates a sphere affected by gravity
world mk cyl (radius) (length) (three params for pos) (three params for colour)           ------- This creates a cylinder affected by gravity
world mk sbox (three params for size) (three params for pos) (three params for colour)    ------- This creates a static box
world mk ssph (radius) (three params for pos) (three params for colour)                   ------- This creates a static sphere
world mk scyl (radius) (length) (three params for pos) (three params for colour)          ------- This creates a static cylinder

eg:

world mk box 0.03 0.03 0.03 0.3 0.2 1 1 0 0
world mk sph 0.04 0.0 1.0 0.5 1 0 1  
world mk cyl 0.1 0.2 0.0 0.9 1.0 0 0 1     

OPTIONAL: removing collision with the robot - by default the flag is set to "true". eg:

world mk box 0.03 0.03 0.03 0.3 0.2 1 1 0 0 false
world mk box 0.03 0.03 0.03 0.3 0.2 1 1 0 0 FALSE
world mk box 0.03 0.03 0.03 0.3 0.2 1 1 0 0 0

The first box will be named box 1, the second box 2, and so on; likewise cyl 1, cyl 2, etc.

Get/set object position:

to get and set positions for these newly created objects:

world get box (num)   or     world set box (num) x y z

eg:

world get box 1    or     world set box 1 2 2 2
world get sph 1    or     world set sph 1 2 2 2 
world get cyl 1    or     world set cyl 1 2 2 2

Get/Set object rotation:

If you need to rotate the boxes or the cylinders just use the following function:

world rot (object)(num) rotx roty rotz 

(where rotx, roty and rotz are the rotations in degrees about the x, y and z axes)

To query the rotation of an object, omit the angles:

world rot (object)(num)

Change the color of objects:

If you need to change the color of the created objects just use the following function:

world col (object)(num) R G B

objects for now can be box/sbox/cyl/scyl/sph/ssph


Deleting objects in the simulator:

world del all

(this will delete all objects in the world; again, objects for now can be box/sbox/cyl/scyl/sph/ssph)

Importing 3D models into the simulator:

This section describes how to add 3D models with their textures into the iCub Simulator. To easily import 3D objects created in a modelling package (such as Blender, 3DS Max, Milkshape, Maya, etc.), we use the .x (DirectX) file format.

Creating a model with a texture in Blender:

Open Blender and create your 3D model (if you are a novice, please look at the documentation and tutorials freely available on the web). Select the model, enter Edit Mode, and click on Mesh -> UV unwrap -> unwrap; this unwraps the model so a texture can be attached. Go to the UV/Image editor, click on Image -> Open, browse for your texture image (bmp format) and load it. At this stage you can modify the texture as desired (again, if you are not familiar with how to do this, please look at the documentation and tutorials freely available on the web). Return to the 3D View and press Ctrl+T; this converts the model into triangles for easy import into the simulator. Return to Object Mode, click on File -> Export -> DirectX (.x), make sure that you are using the right-handed system by clicking on "flip z", and export to the desired location.

When you rotate or scale objects, the transformations are not automatically applied. Before exporting, select the mesh (in Object Mode), press Space and select Transform -> Clear/Apply -> Apply Scale/Rotation. This applies the transformations so the model is exported correctly in the .x file.

Run the simulator and connect to the world port:

yarp rpc /icubSim/world

Here are the new commands for importing and manipulating your newly created 3D models.

The default path to the models is set to $ICUB_ROOT/app/simConfig/models

The user can verify this path by typing:

world get mdir 

this will return the current path of the model folder

This can be changed to whatever the user wants by typing:

world set mdir full_path_to_models    

e.g. world set mdir /home/user/Desktop/Models/

To import the model into the simulator:

world mk model name_of_model.x name_of_texture.bmp Xpos Ypos Zpos    

(this will create a dynamic 3D model)

world mk smodel name_of_model.x name_of_texture.bmp Xpos Ypos Zpos   

(this will create a static 3D model)

Extra commands to manipulate model:

world get model (num)  

(this will return the X Y Z position of the 3D model)

world set model (num)  Xpos Ypos Zpos 

(this will set the model to the required X Y Z position)

world rot model (num)  rotx roty rotz 

(this will rotate the model to the required angles, where rotx, roty and rotz are the rotations in degrees about the x, y and z axes)


These commands are also valid for the static model (smodel).

Test models can be found in $ICUB_DIR/app/simConfig/models

HAND POSITIONS:

in the world port:

yarp rpc /icubSim/world

hand positions:

world get lhand 

or

world get rhand


COLLISIONS AND SELF-COLLISIONS:

Collisions and self-collisions:

In the standard regime, iCub's body parts can collide only with external objects. In addition, the plastic covers (meshes) are exempted from collision detection.

The iCub simulator was later modified, such that self-collisions for iCub's own body can be turned on, as well as collisions for covers. These options may slow down the simulator. The respective options in iCub_parts_activation.ini are:

[COLLISIONS]
self_collisions on
covers_collisions on

The self-collisions are implemented by dividing the iCub's body into several collision spaces (normally, the whole robot is one space and thus its parts are not checked for self-collisions). With "self-collisions on", the collision spaces are: head, torso, left arm, right arm, legs. Thus, internally, during collision detection in ODE, the bodies composing say the left arm space are checked against the bodies composing the other spaces. At the same time, no collisions are checked for within a space. This means that currently the two legs would not collide with each other, for example.

If covers_collisions is on, the plastic covers become proper collision bodies like the other bodies composing the robot. They are then checked for collisions with external objects and, if self_collisions is also on, they become part of the respective subspaces of the robot and can thus collide with other subspaces.

TOUCH SENSORS:

Touch sensors on fingertips only:

This is the original implementation. Corresponds to the "whole_body_skin_emul off" option under [SENSORS] in iCub_parts_activation.ini.

Six touch sensors have been added to each hand; they are compatible with the physical iCub and can be used with the iCubSkinGui. They are boolean (1 or 0) if the "pressure off" option is chosen, or pressure sensors ("pressure on"). The convention used in the compensated tactile ports of the real robot applies: higher values correspond to higher pressure.

  • last part of index finger
  • last part of middle finger
  • last part of ring finger
  • last part of little finger
  • last part of thumb
  • palm

the sensor data is streamed to these ports:

/icubSim/skin/left_hand_comp
/icubSim/skin/right_hand_comp

(in fact, data is only streamed if someone is reading from the port)

recap of the 192 sensor data sent to the ports:

index(12) middle(12) ring(12) little(12) thumb(12) empty(12) empty(12) empty(12) palm(12) palm(12) palm(12) palm(12) empty(12) empty(12) empty(12) empty(12)
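A sketch of how a client could split this vector into named regions (the function and the layout list are ours; the 16 x 12 grouping follows the recap above):

```python
# Split the 192-value tactile vector from /icubSim/skin/<side>_hand_comp
# into named regions, following the 16 groups of 12 values listed above.
LAYOUT = ["index", "middle", "ring", "little", "thumb",
          "empty", "empty", "empty",
          "palm", "palm", "palm", "palm",
          "empty", "empty", "empty", "empty"]

def split_hand_skin(values):
    assert len(values) == 192
    regions = {}
    for i, name in enumerate(LAYOUT):
        regions.setdefault(name, []).extend(values[i * 12:(i + 1) * 12])
    return regions

# Example: a reading where only the index fingertip is fully pressed.
reading = [255] * 12 + [0] * 180
regions = split_hand_skin(reading)
print(sum(regions["index"]))   # 3060 (12 taxels at full activation)
print(len(regions["palm"]))    # 48 (4 groups of 12)
```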

To run the iCubSkinGui with the simulator, you must run it with --useCalibration. eg:

iCubSkinGui --from lefthand.ini --useCalibration

to connect to it simply create a yarp read port eg:

yarp read /reader /icubSim/skin/left_hand_comp

Whole-body skin emulation:

In the iCub_parts_activation.ini config file, this corresponds to

[SENSORS]
whole_body_skin_emul on

For appropriate functionality, there should also be

[COLLISIONS] 
covers_collisions on

This implementation relies on a list of collisions (supplied by ODE) with the covers and fingertips, since these also host the skin in the real robot. At the moment the resolution is very crude for most parts: if a collision with a cover is detected, all the tactile sensors on that skin part are activated. The output values are all set to maximum activation (255) for all taxels, with the exception of non-existent taxels, which are zero-padded on the output port (see the Tactile sensors wiki). Currently, the resolution has been increased for:

  • the palms (skin version 2.1, without triangles); it is split into 5 subregions that selectively light up depending on the contact coordinates.
  • the forearms - resolution of roughly 2 triangles activated at once

This emulation relies on retrieving the contact coordinates and then comparing those with the position of the taxels in the real robot. It is thus subject to inaccuracies in the kinematics of the simulator vs. the real robot and in skin calibration of the real robot.

The iCub skin ports are emulated. Following the original setup for left and right hand, data is sent only if someone is reading. The ports created are thus:

/icubSim/skin/left_arm_comp
/icubSim/skin/left_forearm_comp
/icubSim/skin/left_hand_comp
/icubSim/skin/right_arm_comp
/icubSim/skin/right_forearm_comp
/icubSim/skin/right_hand_comp

In addition, the skin_events port is emulated.

/icubSim/skinManager/skin_events:o

This is implemented as follows: for each contact joint created by ODE, a skinContact / skinEvent is created and sent to the port (see also the Tactile sensors wiki, "High level contact data", for the details of the format). The coordinates of the contact are transformed from the simulator world coordinates to the coordinates of the respective link, to match the conventions on the real robot. The skinContactList can hence be visualized using the iCubGui. The force value and direction are set based on the contact feedback structure from ODE, but are not necessarily very meaningful (especially the value); this was reported on the ODE mailing list. For skin parts that have a higher-resolution emulation implemented, the taxel IDs field is a list of activated taxels. Otherwise, the taxel ID is FAKE_TAXEL_ID, currently set to 10000. Note also that skinEvents are generated for any bodies experiencing contact, not only those that carry skin on the real robot (covers).

Note: for skin emulation to work properly (including skin_events and the higher-resolution emulation for individual skin parts), the iCub parts under [PARTS] in iCub_parts_activation.ini need to be enabled. More specifically, for the upper-body skin emulation currently implemented, head, torso, and the left and right arms need to be on.

Pressure sensors flag

In the iCub_parts_activation.ini file, there is a flag to switch to pressure sensors. This is part of the original "touch sensors on fingertips only" implementation: an "off" setting results in boolean values, while "on" retrieves the pressure values from ODE.

In the "Whole-body skin emulation" mode, the flag has no effect. The values in the emulated compensated tactile ports are either 255 (when active) or 0 (when inactive). Actual pressure values as obtained from ODE are available in the skin_events structure.

INERTIAL SENSOR:

Streams out a sequence of inertial data taken from the head

inertial port

/icubSim/inertial

port contains a vector of 12 values:

0:2  : Euler angles roll, pitch, yaw (degrees)

Roll and pitch behave as on the real robot. Yaw is the actual yaw angle, whereas the inertial sensor on the real robot computes it with respect to the magnetic field.

3:5 :calibrated acceleration along X,Y,Z (m/s^2)

These are not free accelerations: they measure all accelerations, including the acceleration due to gravity (as on the real robot).

6:8 :calibrated rate of turn (gyro), along X,Y,Z axes (rad/s)

Angular velocities in radians per second.

9:11 :calibrated magnetic field X,Y,Z (arbitrary units)

The magnetic fields are not calculated. The simulator just sends 0.0 instead.
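A minimal client-side sketch of unpacking this 12-value vector (the names are ours and the sample values are made up for illustration):

```python
# Unpack the 12-value vector streamed on /icubSim/inertial.
def unpack_inertial(v):
    assert len(v) == 12
    return {
        "euler_deg": v[0:3],   # roll, pitch, yaw (degrees)
        "acc_mps2":  v[3:6],   # includes gravity, as on the real robot
        "gyro_rads": v[6:9],   # angular velocity (rad/s)
        "mag":       v[9:12],  # not simulated, always 0.0
    }

# Hypothetical sample reading:
sample = [0.0, 1.5, 90.0,  0.0, -9.81, 0.0,  0.0, 0.0, 0.1,  0.0, 0.0, 0.0]
data = unpack_inertial(sample)
print(data["acc_mps2"])   # [0.0, -9.81, 0.0]
```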

The inertial data have been compared with the values coming from the real robot and found to be similar.

A simple way of connecting to this port is:

yarp read /reader /icubSim/inertial

GRAB FUNCTION (MAGNET):

Attach and release an object with a fixed joint; a possibility for the "no hand" (hand turned OFF) functionality (added by popular demand).

PS: this is just a quick way to move things around without using the hand. It only works when the hand is NOT activated; it creates a fixed joint between the selected hand and the object. It will not move any fingers: just lift the arm to see the object move with it.

"grab" function:

world grab (obj) (hand) 1  -to grab
world grab (obj) (hand) 0  -to release

eg:

world grab ball left 1

to grab the newly created objects: (box)

world grab box (num) (hand) 1 or 0

eg:

world grab box 1 right 1

FACE EXPRESSIONS:

Face expressions can now be used with the simulator.

There are two ways of running the face expressions on the simulator. One is to send raw commands directly to the faceExpression module; the other is to use the emotionInterface (a more human-readable way), which is currently used by the real robot.

Emotion Interface:

Run the

  • simulator,
  • simFaceExpressions
  • emotionInterface

Connect the following ports (or use the script in $ICUB_ROOT/app/simFaceExpression/scripts):

yarp connect /face/eyelids /icubSim/face/eyelids
yarp connect /face/image/out /icubSim/texture/face
yarp connect /emotion/out /icubSim/face/raw/in

Open a writer so you can set face expressions in the simulator:

yarp write /writer /emotion/in

Face expressions are set by sending high level commands:

       set mou <cmd>   -  set an expression in the mouth subsystem
       set eli <cmd>   -  set an expression in the eye-lids subsystem
       set leb <cmd>   -  set an expression in the left eyebrow subsystem
       set reb <cmd>   -  set an expression in the right eyebrow subsystem
       set all <cmd>   -  set an expression in the whole system

The available <cmd> values are described in the file emotions.ini. You can also define your own. Here are the default ones:

  neu (neutral)
  hap (happy)
  sad (sad)
  sur (surprised)
  ang (angry)
  evi (evil)
  shy (shy)
  cun (cunning)

raw commands:

Run the simulator and connect these ports (or use the script in $ICUB_ROOT/app/simFaceExpression/scripts, without running emotionInterface):

yarp connect /face/eyelids /icubSim/face/eyelids 
yarp connect /face/image/out /icubSim/texture/face

Open a writer so you can set face expressions in the simulator:

yarp write /writer /icubSim/face/raw/in

Face expressions are set by changing hexadecimal values of the following subsystems:

left eyebrow (from L00 to L08)
right eyebrow (from R00 to R08)
mouth (from M00 to M3F)
eyelids (from S24(closed) to S48(opened))

The commands above set iCub's expression (the bracketed values show the available ranges for each subsystem). Only one command is sent at a time. The first letter of the command selects the subsystem and the rest is a hexadecimal number, which is decoded into an 8-bit binary pattern. For example, L followed by 02 sends the binary pattern (0000 0010) to the left eyebrow subsystem, which activates its 2nd bit. Similarly, L followed by 04 would activate the 3rd bit of the left eyebrow subsystem, as the binary pattern for hex 04 is (0000 0100).
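The decoding just described can be sketched in a few lines of Python (the helper names are ours):

```python
# Decode a raw face command: the first letter selects the subsystem, the
# remaining hex digits form the 8-bit pattern sent to it.
SUBSYSTEMS = {"L": "left eyebrow", "R": "right eyebrow",
              "M": "mouth", "S": "eyelids"}

def decode_raw(cmd):
    subsystem = SUBSYSTEMS[cmd[0]]
    bits = format(int(cmd[1:], 16), "08b")
    return subsystem, bits

print(decode_raw("L02"))   # ('left eyebrow', '00000010') -> 2nd bit active
print(decode_raw("L04"))   # ('left eyebrow', '00000100') -> 3rd bit active
```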

PROJECT A VIDEO STREAM ON A SCREEN

This section describes how to create a simple "screen" in the simulator that takes a video stream and displays it as a texture. This can now be turned on and off in iCub_parts_activation.ini, located in $ICUB_ROOT/app/simConfig/conf.

Setting up the port:

If you look in the $ICUB_ROOT/app/simConfig/conf directory, you'll find a file called Video.ini. Its contents are:

...
textures sky

[sky]
textureIndex 15

(the texture index must be 15 for the texture to work with the screen port /icubSim/texture)

What happens here is we set up a yarp "port" name for each texture in the simulator that you want to control via a stream.


To test these settings enter the following commands:

yarpdev --device test_grabber --name /test/video --mode ball
yarp connect /test/video /icubSim/texture/screen

At this stage you can send webcam streams or video sequences directly to the port /icubSim/texture and they will be displayed as a texture on the object.

CONFIGURING VIDEO INPUTS

As we've seen, the simulator can map live video streams onto surfaces. This feature is used to project views on a screen, as in the previous section. It is also used for generation of part of the robot's facial expression. For each numbered texture in the simulator, a port can be opened to update that texture live. This process is controlled by the Video.ini configuration file. The simulator can be configured to have two (or in fact any number of) video inputs by setting up the simulator configuration file app/simConfig/conf/Video.ini as follows:

 textures texture1 texture2
 [texture1]
 textureIndex NN1
 port /any/port/name/you/like1
 [texture2]
 textureIndex NN2
 port /any/port/name/you/like2

Where NN1 and NN2 are the indices of the surfaces you want to project onto.
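To make the file structure concrete, here is a toy parser for such a fragment (illustration only: the simulator uses YARP's own configuration reader, not this code, and the texture indices below are example values):

```python
# Parse a Video.ini-style fragment into top-level entries plus
# {group: {key: value}} sections, to see which texture index each
# port drives.
def parse_video_ini(text):
    top, groups, current = {}, {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("//"):
            continue
        if line.startswith("[") and line.endswith("]"):
            current = line[1:-1]
            groups[current] = {}
        else:
            key, _, value = line.partition(" ")
            (groups[current] if current else top)[key] = value.strip()
    return top, groups

top, groups = parse_video_ini("""
textures texture1 texture2
[texture1]
textureIndex 14
port /any/port/name/you/like1
[texture2]
textureIndex 15
port /any/port/name/you/like2
""")
print(top["textures"].split())             # ['texture1', 'texture2']
print(groups["texture2"]["textureIndex"])  # 15
```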
