From 59f2a253072693439bdf2267def80e263bd3694a Mon Sep 17 00:00:00 2001
From: Vinman
Date: Tue, 14 Nov 2023 11:13:54 +0800
Subject: [PATCH] [docs] Updated hand-eye calibration instructions

---
 ReadMe.md                                    |  8 ++++++++
 ReadMe_cn.md                                 | 10 +++++++++-
 .../launch/d435i_lite6_auto_calib.launch     | 15 ++++++++++++---
 .../launch/d435i_xarm_auto_calib.launch      | 16 +++++++++++++---
 4 files changed, 42 insertions(+), 7 deletions(-)

diff --git a/ReadMe.md b/ReadMe.md
index 7b9a66cd..156b3fec 100755
--- a/ReadMe.md
+++ b/ReadMe.md
@@ -738,6 +738,14 @@ Sample calibration result TF publication:
 [publish_handeye_tf_lite6.launch](./xarm_vision/d435i_xarm_setup/launch/publish_handeye_tf_lite6.launch)
 
+### 7.2.2 Precautions for Hand-eye Calibration:
+The arm poses that `easy_handeye` generates by default do not vary much, so the final calibration result may be inaccurate and unstable.
+In practice, you do not have to use the poses generated by `easy_handeye`. Instead, add the launch argument `freehand_robot_movement:=true` to the command above (see the example command below), move the arm to different poses yourself through the xArm Studio interface or by enabling drag teaching, and record each pose with "__Take Sample__" in the hand-eye calibration window. After collecting about 17 samples, click "__Compute__" to obtain the result. For each sample, keep the calibration board inside the camera's field of view while rotating the end effector (roll/pitch/yaw) as much as possible. The following guidelines improve accuracy:
+- The angle between the rotation axes of two consecutive movements should be as large as possible.
+- The rotation angle of each movement should be as large as possible.
+- The distance from the camera center to the calibration board should be as small as possible.
+- The distance traveled by the robot end effector in each movement should be as small as possible.
+
 ## 7.3 Vision Guided Grasping Demo:
 [***find_object_2d***](http://introlab.github.io/find-object/) is used for this demo for simple object detection and grasping. Hardware used in this part: RealSense D435i depth camera, UFACTORY camera stand and the xArm Gripper.
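For reference, a minimal sketch of the freehand procedure described in the new section 7.2.2, combining the section 7.2 calibration command with the `freehand_robot_movement` argument (the placeholders `your_xArm_DOF` and `your_xArm_IP` stand for your own values):

```bash
# Sketch: start hand-eye calibration without easy_handeye's automatic robot
# motion, then move the arm manually (xArm Studio or drag teaching), click
# "Take Sample" at each pose, and "Compute" after roughly 17 samples.
$ roslaunch d435i_xarm_setup d435i_xarm_auto_calib.launch \
    robot_dof:=your_xArm_DOF robot_ip:=your_xArm_IP \
    freehand_robot_movement:=true
```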
diff --git a/ReadMe_cn.md b/ReadMe_cn.md
index 901ca2c5..e91198a5 100755
--- a/ReadMe_cn.md
+++ b/ReadMe_cn.md
@@ -706,7 +706,7 @@ $ catkin_make
 ```bash
 $ roslaunch d435i_xarm_setup d435i_xarm_auto_calib.launch robot_dof:=your_xArm_DOF robot_ip:=your_xArm_IP
 ```
-The aruco marker used for calibration can be downloaded [here](https://chev.me/arucogen/). Remember the `marker ID` and `marker size` of the marker you downloaded, and update them in the launch file above. Follow the [official instructions](https://github.com/IFL-CAMP/easy_handeye#calibration) or other online tutorials to calibrate through the GUI. After the calibration is finished and the result is confirmed and saved, a `.yaml` result file is generated under `~/.ros/easy_handeye` by default, to be used later for coordinate transformation together with the arm. If the mount is the [camera_stand](https://www.ufactory.cc/products/xarm-camera-module-2020) provided by UFACTORY, a reference calibration result is stored in xarm_vision/d435i_xarm_setup/config/[xarm_realsense_handeyecalibration_eye_on_hand_sample_result.yaml](./xarm_vision/d435i_xarm_setup/config/xarm_realsense_handeyecalibration_eye_on_hand_sample_result.yaml).
+The aruco marker used for calibration can be downloaded [here](https://chev.me/arucogen/). Remember the `marker ID` and `marker size` of the marker you downloaded, and update them in the launch file above. Follow the [official instructions](https://github.com/IFL-CAMP/easy_handeye#calibration) or other online tutorials to calibrate through the GUI. After the calibration is finished and the result is confirmed and saved, a `.yaml` result file is generated under `~/.ros/easy_handeye` by default, to be used later for coordinate transformation together with the arm. If the mount is the [camera_stand](https://www.ufactory.cc/products/xarm-camera-module-2020) provided by UFACTORY, a reference calibration result is stored in xarm_vision/d435i_xarm_setup/config/[xarm_realsense_handeyecalibration_eye_on_hand_sample_result.yaml](./xarm_vision/d435i_xarm_setup/config/xarm_realsense_handeyecalibration_eye_on_hand_sample_result.yaml).
 
 ### 7.2.1 Hand-eye Calibration for UFACTORY Lite6:
 Please read and understand the xArm calibration example in section 7.2 above first, then use the replacement files listed below for the Lite6 calibration:
@@ -719,6 +719,14 @@ $ roslaunch d435i_xarm_setup d435i_lite6_auto_calib.launch robot_ip:=your_xArm_I
 Sample launch file for publishing the calibration result:
 [publish_handeye_tf_lite6.launch](./xarm_vision/d435i_xarm_setup/launch/publish_handeye_tf_lite6.launch)
 
+### 7.2.2 Precautions for Hand-eye Calibration:
+The arm poses that `easy_handeye` generates by default do not vary much, so the final calibration result may be inaccurate and unstable.
+In practice, you do not have to use the poses generated by `easy_handeye`. Instead, add the launch argument `freehand_robot_movement:=true` to the command above, manually move the arm to different poses through the xArm Studio interface or by enabling drag teaching, and record each pose with "__Take Sample__" in the hand-eye calibration window. After collecting about 17 samples, click "__Compute__" to obtain the result. For each sample, keep the calibration board inside the camera's field of view while rotating the end effector (roll/pitch/yaw) as much as possible.
+- The angle between the rotation axes of two consecutive movements should be as large as possible.
+- The rotation angle of each movement should be as large as possible.
+- The distance from the camera center to the calibration board should be as small as possible.
+- The distance traveled by the robot end effector in each movement should be as small as possible.
+
 ## 7.3 3D Vision Grasping Demo:
 This part provides a sample program that uses [***find_object_2d***](http://introlab.github.io/find-object/) for simple object detection and grasping. It uses the RealSense D435i depth camera, the UFACTORY camera_stand, and the official xArm Gripper.
diff --git a/xarm_vision/d435i_xarm_setup/launch/d435i_lite6_auto_calib.launch b/xarm_vision/d435i_xarm_setup/launch/d435i_lite6_auto_calib.launch
index 2f43ca91..aa0e5d09 100644
--- a/xarm_vision/d435i_xarm_setup/launch/d435i_lite6_auto_calib.launch
+++ b/xarm_vision/d435i_xarm_setup/launch/d435i_lite6_auto_calib.launch
@@ -1,6 +1,8 @@
@@ -27,10 +29,17 @@
@@ -42,7 +51,7 @@
diff --git a/xarm_vision/d435i_xarm_setup/launch/d435i_xarm_auto_calib.launch b/xarm_vision/d435i_xarm_setup/launch/d435i_xarm_auto_calib.launch
index 3ec91fed..5af175fd 100644
--- a/xarm_vision/d435i_xarm_setup/launch/d435i_xarm_auto_calib.launch
+++ b/xarm_vision/d435i_xarm_setup/launch/d435i_xarm_auto_calib.launch
@@ -2,6 +2,9 @@
@@ -28,7 +31,14 @@
@@ -51,7 +61,7 @@
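The launch-file hunks above are shown without their XML bodies; based on the README text they presumably expose a `freehand_robot_movement` argument that is passed through to easy_handeye. Under that assumption, the Lite6 variant of the freehand calibration command from section 7.2.1 would be sketched as follows (`your_xArm_IP` is a placeholder for the controller's IP address):

```bash
# Sketch: Lite6 hand-eye calibration with automatic robot motion disabled;
# jog the arm via xArm Studio or drag teaching and take samples in the GUI.
$ roslaunch d435i_xarm_setup d435i_lite6_auto_calib.launch \
    robot_ip:=your_xArm_IP freehand_robot_movement:=true
```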