Hanjie's Blog


Installation

Official build method¹

sudo apt-get install python-rosinstall
sudo apt-get install ros-indigo-libg2o ros-indigo-cv-bridge liblapack-dev libblas-dev freeglut3-dev libqglviewer-dev libsuitesparse-dev libx11-dev
mkdir ~/SLAM/Code/rosbuild_ws
cd ~/SLAM/Code/rosbuild_ws
rosws init . /opt/ros/indigo
mkdir package_dir
rosws set ~/SLAM/Code/rosbuild_ws/package_dir -t .
echo "source ~/SLAM/Code/rosbuild_ws/setup.bash" >> ~/.bashrc
bash
cd package_dir
git clone https://github.com/tum-vision/lsd_slam.git lsd_slam
rosmake lsd_slam

The above is the official method, but I ran into a compile error:

/opt/ros/indigo/include/sophus/sim3.hpp:339:5: error: passing ‘const RxSO3Type {aka const Sophus::RxSO3Group}’ as ‘this’ argument of ‘void Sophus::RxSO3GroupBase::setScale(const Scalar&) [with Derived = Sophus::RxSO3Group; Sophus::RxSO3GroupBase::Scalar = double]’ discards qualifiers [-fpermissive]
rxso3().setScale(scale);
^
make[3]: *** [CMakeFiles/lsdslam.dir/src/DepthEstimation/DepthMap.cpp.o] Error 1
make[3]: *** [CMakeFiles/lsdslam.dir/src/SlamSystem.cpp.o] Error 1

So I tried building LSD-SLAM with ROS's catkin instead.

Building LSD-SLAM with catkin² ³

mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/tum-vision/lsd_slam.git
cd lsd_slam
git checkout catkin

This switches to the catkin branch of the LSD-SLAM project. Then install the dependencies:

sudo apt-get install ros-indigo-libg2o ros-indigo-cv-bridge liblapack-dev libblas-dev freeglut3-dev libqglviewer-dev libsuitesparse-dev libx11-dev

Add the following to the package.xml in both the lsd_slam/lsd_slam_viewer and lsd_slam/lsd_slam_core folders:

<build_depend>cmake_modules</build_depend>
<run_depend>cmake_modules</run_depend>

Add the following to the CMakeLists.txt in both the lsd_slam/lsd_slam_viewer and lsd_slam/lsd_slam_core folders:

find_package(cmake_modules REQUIRED)

Also add X11 to every target_link_libraries call, for example:

target_link_libraries(lsdslam ${FABMAP_LIB} ${G2O_LIBRARIES} ${catkin_LIBRARIES} sparse cxsparse X11)

Then build:

cd ~/catkin_ws/
catkin_make

Running the offline example⁴

Download the Room Example Sequence and extract it.

Open a terminal:

roscore

Open another terminal:

cd ~/catkin_ws/
source devel/setup.sh
rosrun lsd_slam_viewer viewer

Open another terminal:

cd ~/catkin_ws/
source devel/setup.sh
rosrun lsd_slam_core live_slam image:=/image_raw camera_info:=/camera_info

Open another terminal:

cd ~/catkin_ws/
rosbag play ~/LSD_room.bag

Running LSD-SLAM with a camera

Installing the camera driver⁵

cd ~/catkin_ws/
source devel/setup.sh
cd ~/catkin_ws/src
git clone https://github.com/ktossell/camera_umd.git
cd ..
catkin_make
roscd uvc_camera/launch/
roslaunch ./camera_node.launch

Some information about the camera will be printed, for example:

opening /dev/video1
pixfmt 0 = 'YUYV' desc = 'YUYV 4:2:2'
discrete: 640x480: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 160x120: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 176x144: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 320x180: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 320x240: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 352x288: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 424x240: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 480x270: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 640x360: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 800x448: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 800x600: 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 848x480: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 960x540: 1/15 1/10 2/15 1/5
discrete: 1024x576: 1/15 1/10 2/15 1/5
discrete: 1280x720: 1/10 2/15 1/5
discrete: 1600x896: 2/15 1/5
discrete: 1920x1080: 1/5
discrete: 2304x1296: 1/2
discrete: 2304x1536: 1/2
pixfmt 1 = 'MJPG' desc = 'Motion-JPEG'
discrete: 640x480: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 160x120: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 176x144: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 320x180: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 320x240: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 352x288: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 424x240: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 480x270: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 640x360: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 800x448: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 800x600: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 848x480: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 960x540: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 1024x576: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 1280x720: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 1600x896: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
discrete: 1920x1080: 1/30 1/24 1/20 1/15 1/10 2/15 1/5
int (Brightness, 0, id = 980900): 0 to 255 (1)
int (Contrast, 0, id = 980901): 0 to 255 (1)
int (Saturation, 0, id = 980902): 0 to 255 (1)
bool (White Balance Temperature, Auto, 0, id = 98090c): 0 to 1 (1)
int (Gain, 0, id = 980913): 0 to 255 (1)
menu (Power Line Frequency, 0, id = 980918): 0 to 2 (1)
0: Disabled
1: 50 Hz
2: 60 Hz
int (White Balance Temperature, 16, id = 98091a): 2000 to 7500 (1)
int (Sharpness, 0, id = 98091b): 0 to 255 (1)
int (Backlight Compensation, 0, id = 98091c): 0 to 1 (1)
menu (Exposure, Auto, 0, id = 9a0901): 0 to 3 (1)
int (Exposure (Absolute), 16, id = 9a0902): 3 to 2047 (1)
bool (Exposure, Auto Priority, 0, id = 9a0903): 0 to 1 (1)
int (Pan (Absolute), 0, id = 9a0908): -36000 to 36000 (3600)
int (Tilt (Absolute), 0, id = 9a0909): -36000 to 36000 (3600)
int (Focus (absolute), 0, id = 9a090a): 0 to 255 (5)
bool (Focus, Auto, 0, id = 9a090c): 0 to 1 (1)

Configure the camera_node.launch file⁶, for example:

<launch>
  <node pkg="uvc_camera" type="uvc_camera_node" name="uvc_camera" output="screen">
    <param name="width" type="int" value="640" />
    <param name="height" type="int" value="480" />
    <param name="fps" type="int" value="30" />
    <param name="frame" type="string" value="wide_stereo" />

    <param name="auto_focus" type="bool" value="False" />
    <param name="focus_absolute" type="int" value="0" />
    <!-- other supported params: auto_exposure, exposure_absolute, brightness, power_line_frequency -->

    <param name="device" type="string" value="/dev/video1" />
    <param name="camera_info_url" type="string" value="file://$(find uvc_camera)/example.yaml" />
  </node>
</launch>

Note that the official program assumes a default resolution of 640×480.

Open one window and run roscore; then open another window:

cd ~/catkin_ws/
source devel/setup.sh
rosrun lsd_slam_viewer viewer

Open yet another window:

cd ~/catkin_ws/
source devel/setup.sh
roslaunch uvc_camera camera_node.launch

Open yet another window:

rosrun lsd_slam_core live_slam /image:=image_raw _calib:=<calibration_file>

For the calibration_file, you can refer to the cfg files under lsd_catkin_ws/src/lsd_slam/lsd_slam_core/calib.


  1. https://github.com/tum-vision/lsd_slam

  2. http://visbot.blogspot.jp/2014/11/tutorial-building-of-lsd-slam-on-ros.html

  3. https://github.com/tum-vision/lsd_slam/issues/206

  4. http://visbot.blogspot.jp/2014/11/tutorial-building-of-lsd-slam-on-ros.html

  5. https://defendtheplanet.net/2014/11/05/using-ros-indigo-webcam-by-the-uvc_camera-usb-video-class-package

  6. http://www.cnblogs.com/xtl9/p/4694881.html

For a project I need cv2.cv, but the newer OpenCV 3.0 has removed this legacy module, so I also need to install an older OpenCV 2.x alongside it.

Following this article¹, you can use multiple OpenCV versions side by side:

  1. Install OpenCV 2.x and OpenCV 3.x:

    brew install opencv
    brew install opencv3
    The files are installed under /usr/local/Cellar/opencv/ and /usr/local/Cellar/opencv3/ respectively.

  2. The Python modules cv2.so are located at:

    /usr/local/Cellar/opencv/2.4.13.1/lib/python2.7/site-packages/cv2.so
    /usr/local/Cellar/opencv3/3.1.0_1/lib/python2.7/site-packages/cv2.so

  3. In Python, check which version is used by default:

    import cv2
    print cv2.__version__
    # prints: 3.1.0

  4. Create an opencv2 subdirectory under /usr/local/lib/python2.7/site-packages and symlink the OpenCV 2 cv2.so into it:

    $ mkdir -p /usr/local/lib/python2.7/site-packages/opencv2
    $ ln -s /usr/local/Cellar/opencv/2.4.13.1/lib/python2.7/site-packages/cv2.so \
        /usr/local/lib/python2.7/site-packages/opencv2/cv2.so

  5. When we want to use the OpenCV 2 version in Python:

    import sys
    sys.path.insert(0, '/usr/local/lib/python2.7/site-packages/opencv2')
    import cv2
    print cv2.__version__
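To avoid repeating the sys.path manipulation in every script, it can be wrapped in a small helper. This is just a convenience sketch (the opencv2 directory is the one created in step 4, and the function name import_opencv2 is my own); it has to run before cv2 has been imported anywhere else, otherwise Python returns the cached module.

import sys

OPENCV2_DIR = '/usr/local/lib/python2.7/site-packages/opencv2'

def import_opencv2():
    """Import the OpenCV 2 build of cv2 by putting its directory first on sys.path.
    Must be called before any other `import cv2`, or the cached 3.x module is returned."""
    if OPENCV2_DIR not in sys.path:
        sys.path.insert(0, OPENCV2_DIR)
    import cv2
    return cv2

cv2 = import_opencv2()
print cv2.__version__  # expected to report the 2.x version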


  1. http://sam-low.com/vision/python/opencv/osx/2016/05/18/parallel-opencv.html

Luo Hanjie. A trajectory planning method and device for a robotic manipulator: China, 201610452034 [P/OL]. 2016-11-23 [2018-03-20]. https://patentimages.storage.googleapis.com/8a/5f/bc/325d10a3dee6fc/CN106041941A.pdf

The SCARA (Selective Compliant Articulated Robot for Assembly) manipulator is one type of robotic arm. It has four joints (Fig. 1), of which \(J_1\), \(J_2\), and \(J_4\) are revolute joints and \(J_3\) is a prismatic joint.

Fig. 1: The SCARA manipulator.
Fig. 2: The workspace of the SCARA manipulator.

Fig. 2 shows the workspace of the manipulator. During operation, many tasks require knowing the reachable boundary of the workspace. For example, in teach mode the user sends commands from a handheld device to make the arm move in a straight line along some direction until it reaches the edge of the workspace, where velocity and acceleration must drop to zero.

Traditionally this can be done by continuously checking whether the manipulator has reached its limit position. However, that requires testing every point along the path, which is inefficient and computationally expensive.

In this article we propose a geometry-based method for determining the workspace boundary of a SCARA manipulator. Given a start point and a direction, it returns the nearest workspace boundary position intersected by that ray. With this method the end point is known before the robot moves, which makes kinematic planning easier.

Mathematical model

We assign DH (Denavit-Hartenberg) coordinate frames to the SCARA manipulator as shown in Fig. 3. The rotation axis of each revolute joint \(J_i\) is parallel to its \(z_i\) axis, with link length \(\alpha _i\), \(\lbrace i\mid i \in \lbrace 1,2,4 \rbrace \rbrace\); the axis of the prismatic joint \(J_3\) is parallel to the axis of \(J_4\).

Fig. 3: DH coordinate frame assignment for the SCARA manipulator.
Fig. 4: Top view of the SCARA manipulator's workspace geometry.

Fig. 4 shows the top view of the SCARA manipulator's workspace geometry. As the figure shows, the workspace boundary is formed by four circular arcs \(\overparen{HAB}\), \(\overparen{BC}\), \(\overparen{CEG}\), and \(\overparen{GH}\), which lie on the four circles \(\odot{O_1}\), \(\odot{O_2}\), \(\odot{O_3}\), and \(\odot{O_4}\). The centers \(O_1\) and \(O_4\) coincide with the \(J_1\) axis; \(O_2\) and \(O_3\) are the positions of the \(J_2\) axis when \(J_1\) is rotated to its positive and negative limits respectively. \(j_{Max}^i\) and \(j_{Min}^i\) are the maximum ranges of motion of joint \(J_i\) in the positive and negative directions, with \(j_{Min}^i \le j_{Max}^i\) and \(\lbrace j^i \mid - \pi \le j^i \le \pi \rbrace\).

From the geometry we obtain:

Table 1: Geometric parameters of the four arcs that bound the workspace

| Arc | Circle | Center \((x_{O_i},y_{O_i})\) | Radius \(r_i\) | \(\theta _{Min}^i\) | \(\theta _{Max}^i\) |
| --- | --- | --- | --- | --- | --- |
| \(\overparen{HAB}\) | \(\odot{O_1}\) | \({[0,0]}^{^\mathrm{T}}\) | \(\alpha _1 + \alpha _2\) | \(j_{Min}^1\) | \(j_{Max}^1\) |
| \(\overparen{BC}\) | \(\odot{O_2}\) | \({[\alpha _1\cos(\theta _{Max}^1), \alpha _1\sin(\theta _{Max}^1)]}^{^\mathrm{T}}\) | \(\alpha _2\) | \(0\) | \(j_{Max}^2\) |
| \(\overparen{CEG}\) | \(\odot{O_3}\) | \({[\alpha _1\cos(\theta _{Min}^1), \alpha _1\sin(\theta _{Min}^1)]}^{^\mathrm{T}}\) | \(\alpha _2\) | \(j_{Min}^2\) | \(0\) |
| \(\overparen{GH}\) | \(\odot{O_4}\) | \({[0,0]}^{^\mathrm{T}}\) | \(\left\lvert \overrightarrow {CO_4} \right\rvert\) | \(\theta _{Min}^4\) | \(\theta _{Max}^4\) |

where:

\[\begin{aligned} \overrightarrow {CO_4} &= \left[ {\begin{array}{*{20}{c}} {\alpha _2\cos \theta _{Max}^2\cos \theta _{Max}^1 - \alpha _2\sin \theta _{Max}^2\sin \theta _{Max}^1 + \alpha _1\cos \theta _{Max}^1} \newline {\alpha _2\cos \theta _{Max}^2\sin \theta _{Max}^1 + \alpha _2\sin \theta _{Max}^2\cos \theta _{Max}^1 + \alpha _1\sin \theta _{Max}^1} \end{array}} \right] \newline &= \left[ {f_x}(\theta _{Max}^1,\theta _{Max}^2), {f_y}(\theta _{Max}^1,\theta _{Max}^2) \right]^{^\mathrm{T}} \newline \newline \theta _{Max}^4 &= \arctan({f_y}(\theta _{Max}^1,\theta _{Max}^2), {f_x}(\theta _{Max}^1,\theta _{Max}^2)) \newline \newline \theta _{Min}^4 &= \arctan({f_y}(\theta _{Min}^1,\theta _{Min}^2), {f_x}(\theta _{Min}^1,\theta _{Min}^2)) \end{aligned}\]
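As an illustration, the following Python sketch evaluates Table 1 and the expressions above, computing each circle's center, radius, and valid arc range from the link lengths and joint limits (the function and variable names are my own, not taken from the patent):

import math

def workspace_circles(a1, a2, j1_min, j1_max, j2_min, j2_max):
    """Centers, radii, and valid arc ranges of the four circles in Table 1."""

    def fx(t1, t2):
        # x coordinate of the end effector for joint angles t1, t2
        return a2 * math.cos(t1 + t2) + a1 * math.cos(t1)

    def fy(t1, t2):
        # y coordinate of the end effector for joint angles t1, t2
        return a2 * math.sin(t1 + t2) + a1 * math.sin(t1)

    # Point C: the end-effector position with both joints at their positive limits.
    cx, cy = fx(j1_max, j2_max), fy(j1_max, j2_max)

    return {
        1: {"center": (0.0, 0.0), "r": a1 + a2, "theta": (j1_min, j1_max)},
        2: {"center": (a1 * math.cos(j1_max), a1 * math.sin(j1_max)),
            "r": a2, "theta": (0.0, j2_max)},
        3: {"center": (a1 * math.cos(j1_min), a1 * math.sin(j1_min)),
            "r": a2, "theta": (j2_min, 0.0)},
        4: {"center": (0.0, 0.0), "r": math.hypot(cx, cy),
            "theta": (math.atan2(fy(j1_min, j2_min), fx(j1_min, j2_min)),
                      math.atan2(cy, cx))},
    }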

Suppose point \(s(x,y)\) lies on circle \(\odot{O_i}\); then the angle \(\theta ^i\) of point \(s\) relative to circle \(\odot{O_i}\) is:

\[\begin{equation} \label{SCARA_range_e1} \theta ^i = \begin{cases} \arctan(y, x), & \text{if i=1} \newline \arctan( - x\sin \theta _{Max}^1 + y\cos \theta _{Max}^1,x\cos \theta _{Max}^1 + y\sin \theta _{Max}^1), & \text{if i=2} \newline \arctan( - x\sin \theta _{Min}^1 + y\cos \theta _{Min}^1,x\cos \theta _{Min}^1 + y\sin \theta _{Min}^1), & \text{if i=3} \newline \arctan(y, x), & \text{if i=4} \end{cases} \end{equation}\]
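A direct transcription of Eq. (1) in Python might look as follows. Note that \(\arctan(y, x)\) above is the two-argument arctangent (atan2), and I assume the coordinates \(x, y\) are taken relative to the center of the circle in question (my reading of the figure, not stated explicitly in the text):

import math

def point_angle(i, x, y, j1_min, j1_max):
    """Angle theta^i of a point s(x, y) with respect to circle O_i, per Eq. (1).
    x, y are assumed relative to the circle's center; for circles 2 and 3 the
    point is rotated into the frame attached to J_2 at the corresponding J_1 limit."""
    if i in (1, 4):
        return math.atan2(y, x)
    t1 = j1_max if i == 2 else j1_min   # circle 2 uses theta^1_Max, circle 3 uses theta^1_Min
    return math.atan2(-x * math.sin(t1) + y * math.cos(t1),
                       x * math.cos(t1) + y * math.sin(t1))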

Suppose we have a ray \(R(u)=I+u \cdot \lVert \mathbf{n} \rVert\) with start point \(I\) and direction vector \(\mathbf{n}\), where \(\lVert \mathbf{n} \rVert\) denotes the unit vector of \(\mathbf{n}\). If the start point lies inside the workspace, the ray necessarily intersects some circle \(\odot{O_i}\) at points \(\kappa _j^{i}\); \(\kappa ^i\) denotes the set of all intersections of the ray with circle \(\odot{O_i}\), and \(\xi = \lbrace \kappa ^i \rbrace\) is the set of all intersection points.

Fig. 5 shows one such case: ray \(IP\) intersects circle \(\odot{O_1}\) at point \(N\), circle \(\odot{O_2}\) at points \(K\) and \(M\), circle \(\odot{O_4}\) at points \(J\) and \(L\), and does not intersect circle \(\odot{O_3}\). Notice that although points \(K\) and \(N\) lie on their circles, they are not on the workspace boundary arcs, so after computing the intersections we still have to check whether the angle \(\theta ^i\) satisfies \(\theta _{Min}^i \le \theta ^i \le \theta _{Max}^i\). Point \(J\) is the valid intersection nearest the ray's start point and is the point we are looking for.

Fig. 5: Ray \(IP\) intersects circle \(\odot{O_1}\) at point \(N\), circle \(\odot{O_2}\) at points \(M\) and \(K\), and circle \(\odot{O_4}\) at points \(L\) and \(J\). Point \(J\) is the nearest workspace boundary position along the ray direction.

Algorithm

Ray-circle intersection

To determine the nearest workspace boundary position intersected by the given ray, we compute the intersection set \(\kappa ^i\) of the ray with each circle. The intersections can be found with the classical parametric-equation approach; an alternative is the optimized method proposed by Akenine-Möller et al.¹ ², whose procedure is as follows:

Algorithm 1: Akenine-Möller's optimized ray-circle intersection test (pseudocode figure).

The advantage of this algorithm is that it rules out the no-intersection cases before computing the intersection itself, which reduces the amount of computation.
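A minimal Python sketch of this rejection-first intersection test (my own transcription of the idea, not the pseudocode from the figure) could look like this:

import math

def ray_circle_intersections(origin, direction, center, radius):
    """Intersection points of the ray R(u) = origin + u*direction (u >= 0,
    direction assumed to be a unit vector) with a circle, rejecting the
    no-intersection cases before taking the square root."""
    lx, ly = center[0] - origin[0], center[1] - origin[1]  # origin -> center
    s = lx * direction[0] + ly * direction[1]              # projection onto the ray
    l2 = lx * lx + ly * ly
    r2 = radius * radius
    if s < 0.0 and l2 > r2:        # circle lies behind the ray start
        return []
    m2 = l2 - s * s                # squared distance from the center to the ray
    if m2 > r2:                    # ray passes by the circle
        return []
    q = math.sqrt(r2 - m2)
    hits = []
    for u in (s - q, s + q):       # the two candidate ray parameters
        if u >= 0.0:               # keep only points actually on the ray
            hits.append((origin[0] + u * direction[0],
                         origin[1] + u * direction[1]))
    return hits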

The overall algorithm

Input: 1. the ray \(R(u)=I+u \cdot \lVert \mathbf{n} \rVert\); 2. the link lengths \(\alpha_{1}\) and \(\alpha_{2}\) of joints \(J_1\) and \(J_2\), and their ranges of motion \(j_{Max}^1\), \(j_{Min}^1\), \(j_{Max}^2\), \(j_{Min}^2\).

Output: 1. the nearest workspace boundary position \(\kappa_{intersection}\).

Algorithm 2: flowchart of the workspace boundary computation.
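As a rough end-to-end sketch of the flow in Algorithm 2, reusing the workspace_circles, point_angle, and ray_circle_intersections helpers sketched above (again my own naming and structuring, offered only as an illustration):

import math

def nearest_boundary_point(origin, direction, a1, a2,
                           j1_min, j1_max, j2_min, j2_max):
    """Nearest workspace-boundary point hit by the ray starting at `origin`
    in `direction`, or None if no valid intersection exists."""
    n = math.hypot(direction[0], direction[1])
    d = (direction[0] / n, direction[1] / n)          # unit direction vector

    best, best_dist = None, float("inf")
    circles = workspace_circles(a1, a2, j1_min, j1_max, j2_min, j2_max)
    for i, c in circles.items():
        for px, py in ray_circle_intersections(origin, d, c["center"], c["r"]):
            # Check that the hit actually lies on the boundary arc of circle i.
            theta = point_angle(i, px - c["center"][0], py - c["center"][1],
                                j1_min, j1_max)
            t_min, t_max = c["theta"]
            if not (t_min <= theta <= t_max):
                continue
            dist = math.hypot(px - origin[0], py - origin[1])
            if dist < best_dist:                      # keep the closest valid hit
                best, best_dist = (px, py), dist
    return best

Called with the arm's link lengths and joint limits, this returns the end point that the motion planner can aim for before the robot starts moving, as described above.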

  1. http://www.twinklingstar.cn/2015/2479/programmers_computational_geometry-bounding_volumes/

  2. Tomas Akenine-Möller, Eric Haines, and Naty Hoffman. Real-Time Rendering, Second Edition. A K Peters, 2002.
