Traveling trajectory prediction method and experiment of autonomous navigation tractor based on trinocular vision
Tian Guangzhao1, Gu Baoxing1※, Irshad Ali Mari2, Zhou Jun1, Wang Haiqing1
(1. College of Engineering, Nanjing Agricultural University, Nanjing 210031, China; 2. Khairpur College of Engineering and Technology, Sindh Agriculture University, Khairpur 66020, Pakistan)
為了實現(xiàn)自主導航拖拉機離開衛(wèi)星定位系統(tǒng)時能夠持續(xù)可靠工作,該文提出了基于三目視覺的拖拉機行駛軌跡預測方法。該方法將三目相機分解為長短基線2套雙目視覺系統(tǒng)分時獨立工作。通過檢測相鄰時刻農(nóng)業(yè)環(huán)境中同一特征點的坐標變化反推拖拉機在水平方向上的運動矢量,并通過灰色模型預測未來時刻的運動矢量變化,最終建立不同速度下的前進方向誤差模型。試驗結果表明:拖拉機行駛速度為0.2 m/s時,46.5 s后前進方向誤差超過0.1 m,對應行駛距離為9.3 m。行駛速度上升到0.5 m/s時,該時間和行駛距離分別降低到17.2 s和8.6 m。當行駛速度上升到0.8 m/s時,該時間和距離分別快速降低至8.5 s和6.8 m。行駛速度越高,前進方向誤差增速越高。該方法可用于短時預測拖拉機的行駛軌跡,為自主導航控制提供依據(jù)。
Keywords: tractor; autonomous navigation; machine vision; trajectory prediction; grey model
To reduce labor costs and improve working efficiency and quality, agricultural machinery with autonomous navigation is increasingly used in agricultural production. For example, autonomously navigated tractors can serve as traction machines for field seeding, fertilizing and plowing [1-7]; autonomously navigated combine harvesters can harvest wheat, rice and maize without human intervention [8-14]; and autonomously navigated rice transplanters can transplant seedlings precisely in paddy fields, greatly improving working accuracy, with an efficiency up to 50 times that of manual work [15-18].
視覺系統(tǒng)是自主導航農(nóng)業(yè)裝備的重要組成部分。視覺系統(tǒng)主要用來識別作物行、溝壟或障礙物,是農(nóng)機智能化作業(yè)的重要外界環(huán)境和自身姿態(tài)感知工具[19-21]。尤其是當GPS或北斗定位系統(tǒng)受到干擾無法正常工作時,視覺系統(tǒng)能夠進行輔助相對定位,保證導航工作能夠繼續(xù)進行[22-27]。同時,通過視覺系統(tǒng)也能夠對未來進行預測,其預測結果為導航?jīng)Q策與控制提供數(shù)據(jù)基礎。由于農(nóng)業(yè)機械導航控制具有嚴重時滯性,為了提高常規(guī)PID控制效果,文獻[28]和[29]都提到通過預測數(shù)據(jù)能夠顯著改善PID控制效果,具有很強的工程實際意義。
現(xiàn)有研究中,大多是在GPS或北斗可靠工作的前提下討論導航控制方法問題。而本文探討的問題是當GPS或北斗失效時,如何單獨利用視覺系統(tǒng)為自主導航拖拉機進行行駛軌跡預測,并提出一種基于灰色理論的軌跡預測方法。
The vision system in this study consists of a Point Grey BBX3 trinocular camera, an IEEE 1394B acquisition card and an industrial PC (IPC).
The trinocular camera consists of right, middle and left sub-cameras. The right and middle sub-cameras form a short-baseline binocular vision system, and the right and left sub-cameras form a long-baseline binocular vision system; the trinocular system is thus the superposition of these two binocular systems. The two binocular systems share the same coordinate origin and axis directions: the origin lies at the optical center of the right camera, with the x axis pointing horizontally to the right, the y axis vertically downward, and the z axis horizontally forward. To improve development efficiency, Point Grey has built the second camera's image capture and the stereo measurement functions directly into its API [30]. Users therefore do not need the traditional pipeline of binocular capture, feature point detection and matching, and disparity-based ranging; environmental depth information can be obtained from a single right-camera image.
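The binocular measurement that the camera's API performs internally follows the standard pinhole-stereo disparity relation Z = f·B/d. A minimal sketch (the focal length, baselines and disparities below are illustrative values, not the BBX3's calibrated parameters):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Standard pinhole stereo depth: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# The same point seen by both pairs: the long baseline produces a larger
# disparity for the same depth, so pixel quantisation error perturbs the
# long-baseline depth estimate less.
f = 800.0                                   # focal length in pixels (assumed)
print(depth_from_disparity(f, 0.24, 48.0))  # long-baseline pair
print(depth_from_disparity(f, 0.12, 24.0))  # short-baseline pair
```

This is also why averaging the two systems' results, as described below, is reasonable: both observe the same geometry but with different disparity resolution.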
Fig.1 Structure of the trinocular camera
The IEEE 1394B acquisition card receives the digital images returned by the camera at high speed. The IPC is the core of image processing: it controls image acquisition under program control, runs the image processing programs, and outputs the computed results.
The basic principle of tractor motion vector detection is to use the coordinate changes of the same group of static feature points in the camera coordinate system between adjacent instants to infer the tractor's motion vector. The detection flow is shown in Fig.2.
Fig.2 Flow of tractor motion vector detection
Since the two vision systems share the same coordinate origin, the coordinates of a given physical point in the two systems should in theory coincide exactly. In practice, the different baseline lengths cause slight deviations between the measurements. To obtain accurate results, the long- and short-baseline systems run the same motion detection method and their results are averaged.
The steps are as follows:
1) In the complex-background agricultural environment, the right camera captures an image; the image index is incremented by 1 and the image is stored.
2) SIFT feature point detection is performed on the right-camera image, and the image coordinates of all feature points in the environment are computed [31]. Because lens distortion is unavoidable, an approximate weight must also be estimated for each feature point; the estimation method for a given feature point is illustrated in Fig.3 and given by Eq. (1).
3) Using the feature points' image coordinates and the depth image acquired in step 1), the API functions provided with the camera convert image coordinates to camera coordinates.
4) Check whether the current image is the first one. If so, store the image coordinates, camera coordinates and weight of every valid feature point in an array, then repeat steps 1)-4). If it is not the first image, store the same data in a second array.
5) Match SIFT feature points against the previous image, and save the image coordinates of the successfully matched point pairs in an array.
Fig.3 Approximate weight calculation for feature points
6) Traverse the matched-pair array and, for each successfully matched pair, retrieve the corresponding camera coordinates and approximate weights from the arrays stored in step 4), saving them together in a result array.
where n denotes the total number of valid feature points.
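The per-frame motion estimate described in steps 1)-6) can be sketched as a weighted average over the matched static 3-D points (a minimal illustration; the function name, the simple weighted mean and the toy data are assumptions, since Eqs. (1)-(3) are not reproduced here):

```python
import numpy as np

def estimate_motion_vector(prev_pts, curr_pts, weights):
    """Estimate the tractor's horizontal motion vector from one frame pair.

    prev_pts, curr_pts : (N, 3) camera coordinates (x right, y down, z forward)
    of the same static feature points at the previous and current instants.
    weights : (N,) approximate weights compensating for lens distortion.
    """
    prev_pts = np.asarray(prev_pts, dtype=float)
    curr_pts = np.asarray(curr_pts, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                       # normalise the weights
    # Static points appear to move opposite to the camera, so negate.
    disp = -(curr_pts - prev_pts)
    motion = (w[:, None] * disp).sum(axis=0)
    return motion[[0, 2]]                 # keep horizontal components (x, z)

# Toy example: the camera moved 0.1 m forward, so every static point's
# z coordinate decreased by 0.1 m between the two frames.
prev = np.array([[0.5, 0.2, 3.0], [-0.4, 0.1, 2.5]])
curr = prev + np.array([0.0, 0.0, -0.1])
print(estimate_motion_vector(prev, curr, np.array([1.0, 1.0])))
```

Running the same routine on the long- and short-baseline point sets and averaging the two outputs mirrors the averaging step described above.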
農(nóng)用拖拉機大多數(shù)都是勻速低速作業(yè),根據(jù)其作業(yè)特點,本文設計了灰色理論的軌跡預測方案。
Fig.4 Obtaining prediction data through a sliding window
Assume that at instant k the n motion vectors inside the sliding window form the sample sequence X^(0), which takes the form of Eq. (4):

X^(0) = (x^(0)(1), x^(0)(2), ..., x^(0)(n))    (4)

To reduce the influence of disturbed data on valid data, a one-time accumulation is applied to X^(0), giving its 1-AGO sequence X^(1), as in Eq. (5):

X^(1) = (x^(1)(1), x^(1)(2), ..., x^(1)(n))    (5)

where

x^(1)(k) = Σ_{i=1}^{k} x^(0)(i), k = 1, 2, ..., n

The GM(1,1) model is then expressed as the first-order differential equation of Eq. (6):

dx^(1)/dt + a x^(1) = b    (6)

where the development coefficient a and grey input b are estimated by least squares,

[a, b]^T = (B^T B)^{-1} B^T Y    (7)

with B built from the mean-generated background values z^(1)(k) = 0.5[x^(1)(k) + x^(1)(k-1)] and Y = (x^(0)(2), ..., x^(0)(n))^T. Solving Eq. (6) gives the time response of Eq. (8); discretizing Eq. (8) yields the prediction model for instant k+1 after accumulation, as in Eq. (9):

x̂^(1)(k+1) = (x^(0)(1) - b/a) e^{-ak} + b/a    (9)
The experimental platform was a Dongfanghong SG250 tractor. The BBX3 trinocular camera was mounted horizontally at the front end of the counterweight beam on the tractor's nose, 0.6 m above the ground, as shown in Fig.5. A centimeter-level RTK-GPS system was installed on top of the tractor. The experiments were carried out on a sunny morning with good lighting, on a hard road surface with abundant sand and gravel. Vision detection and RTK-GPS detection were synchronized, both at 10 Hz. The tractor traveled in straight lines at low speeds of 0.2, 0.5 and 0.8 m/s. The IPC collected the GPS data and the vision prediction data; the measured and predicted trajectories were plotted, and valid data over the same traveling distance were extracted to analyze prediction accuracy. The IPC was an Advantech ARK3500P with an Intel Core i7-3610 CPU and 4 GB of RAM. Because the GPS uses real-time kinematic carrier-phase differential positioning with centimeter-level accuracy, its positioning data served as the reference standard for verifying the motion detection and prediction accuracy of the trinocular vision system.
Fig.5 Mounting position of the trinocular camera
During the experiments, the GPS global position at the initial instant was taken as the datum. Following the method described above, the incremental data for the next instant obtained by the vision system were added to the GPS datum to form the vision system's measurement data (also absolute coordinates). Multiple vision measurements then form the vision system's prediction data. At any instant there is inevitably some error between the vision prediction and the GPS position at that instant. This error was decomposed into the forward direction (z) and the lateral direction (x), giving the error components in the two directions.
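The decomposition of the planar prediction error into forward (z) and lateral (x) components can be illustrated as follows (hypothetical names; the heading convention, measured counter-clockwise from east, is an assumption):

```python
import math

def decompose_error(pred, gps, heading_rad):
    """Split the planar error (pred - gps) into forward (z) and lateral (x) parts.

    pred, gps   : (east, north) positions of the vision prediction and the
                  RTK-GPS fix at the same instant.
    heading_rad : tractor heading, counter-clockwise from east.
    """
    dx = pred[0] - gps[0]
    dy = pred[1] - gps[1]
    fwd = (math.cos(heading_rad), math.sin(heading_rad))  # unit forward vector
    e_z = dx * fwd[0] + dy * fwd[1]                       # forward-direction error
    e_x = -dx * fwd[1] + dy * fwd[0]                      # lateral error
    return e_z, e_x

# Tractor heading due east: prediction is 0.10 m ahead and 0.02 m to the side.
print(decompose_error((10.10, 5.02), (10.0, 5.0), 0.0))
```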
The tractor traveled in straight lines at constant speeds of 0.2, 0.5 and 0.8 m/s; the trajectory prediction results are shown in Fig.6, Fig.7 and Table 1.
In Fig.6a, 6c and 6e, the solid lines are the tractor trajectories plotted from GPS data, and the dashed lines are the trajectories predicted by the trinocular vision method described above. The predicted trajectories basically agree with the GPS-measured trajectories, but the accumulated prediction error becomes increasingly evident as the traveling distance grows.
Fig.6b, 6d and 6f show that the z-direction error is the main cause of the growing deviation between the predicted and measured trajectories; it increases steadily while oscillating. From the experimental data, quadratic polynomial models of the cumulative z-direction error were established for each speed. For travel speeds of 0.2, 0.5 and 0.8 m/s, the models are given by Eqs. (11)-(13), with R² values of 0.93, 0.97 and 0.98, respectively. In Eqs. (11)-(13), the first derivative of the error with respect to time reflects the change of the cumulative z-direction error. Since the quadratic coefficients are all positive, the first derivatives are all increasing functions, i.e. the z-direction error rate grows linearly. The computed slopes of this linear growth are 0.0002, 0.0026 and 0.0050, respectively, showing that the higher the travel speed, the faster the z-direction error grows. These models can be used to estimate the error state at the current instant.
where t denotes the traveling time and e_z the cumulative z-direction error.
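As a worked illustration of how such a quadratic error model behaves (the coefficients below are placeholders, not the fitted values of Eqs. (11)-(13)): the derivative of e_z(t) = c2·t² + c1·t + c0 is linear with slope 2·c2, which is why the error rate grows linearly with time.

```python
import numpy as np

# Hypothetical coefficients of e_z(t) = c2*t**2 + c1*t + c0. The slope of the
# error-rate line equals 2*c2; for example, the reported slope 0.0050 at
# 0.8 m/s would correspond to c2 = 0.0025.
c2, c1, c0 = 0.0025, 0.001, 0.0

t = np.linspace(0.0, 10.0, 6)
e_z = c2 * t**2 + c1 * t + c0   # cumulative forward-direction error (m)
rate = 2 * c2 * t + c1          # d(e_z)/dt, a linearly increasing rate

for ti, ei, ri in zip(t, e_z, rate):
    print(f"t={ti:5.1f} s  e_z={ei:.4f} m  rate={ri:.4f} m/s")
```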
Fig.7 shows that the x-direction error varies little at all speeds. The main reason is that the tractor traveled in a straight line, so its displacement in the x direction was very small and the accumulated error was likewise small, within ±5 cm.
Fig.7 Cumulative x-direction error at different constant speeds
Table 1 Cumulative error data in the x and z directions
Table 1 quantifies how the cumulative errors in the x and z directions vary with speed. The faster the travel speed, the faster the cumulative z-direction error rises: at 0.2 m/s it takes 46.5 s for the cumulative z-direction error to exceed 0.1 m, while at 0.8 m/s this time shortens to 8.5 s. The x-direction error shows no obvious regularity. Nonlinear fitting of the data in Table 1 gives the relationship between the growth rate of the cumulative z-direction error (in m/s) and the travel speed, as shown in Eq. (14).
Eq. (14) can be used to compute the growth rate of the cumulative z-direction error directly for different speeds.
在國內(nèi)外近期類似研究中,文獻[32]中提到采用粒子濾波方式對改裝的農(nóng)業(yè)機器人進行了60 m的直線跟蹤,橫向偏差為4±0.7 cm。文獻[33]中提到改裝后的茂源250拖拉機以0.58 m/s速度視覺導航,最大誤差18 cm,平均誤差4.8 cm。這些關于機器視覺在農(nóng)機導航上的最新研究成果與本文研究最大的區(qū)別在于研究內(nèi)容的不同。上述研究均是以機器視覺和其他傳感器聯(lián)合,進行直線跟蹤研究。而本文是研究預測軌跡與實際軌跡的偏差。由于本文是單獨通過視覺傳感器對運動軌跡進行檢測,并在此基礎上再次預測,那么累積誤差將不可避免。
誤差產(chǎn)生的原因主要包括:自然光線影響和圖像處理的時間延遲造成。為了減小累積誤差,得到更好的試驗效果,建議使用更高性能工控機和高速快門相機。
1) A trinocular vision system built by superposing long- and short-baseline binocular vision systems, combined with a grey prediction algorithm, can indeed predict the tractor's motion trajectory on the plane.
2) There is accumulated error between the trajectory predicted by the vision system and the true trajectory, caused mainly by measurement error in the forward direction.
3) The higher the travel speed, the faster the cumulative forward-direction error grows. At 0.2 m/s, the time and distance for the cumulative forward-direction error to exceed 0.1 m are 46.5 s and 9.3 m; at 0.5 m/s they decrease to 17.2 s and 8.6 m; and at 0.8 m/s they drop rapidly to 8.5 s and 6.8 m.
[1] Adam J L, Piotr M, Seweryn L, et al. Precision of tractor operations with soil cultivation implements using manual and automatic steering modes[J]. Biosystems Engineering, 2016, 145(5): 22-28.
[2] Gan-Mor S, Clark R L, Upchurch B L. Implement lateral position accuracy under RTK-GPS tractor guidance[J]. Computers and Electronics in Agriculture, 2007, 59(1/2): 31-38.
[3] Timo O, Juha B. Guidance system for agricultural tractor with four wheel steering[J]. IFAC Proceedings Volumes, 2013, 46(4): 124-129.
[4] Karimi D, Henry J, Mann D D. Effect of using GPS auto steer guidance systems on the eye-glance behavior and posture of tractor operators[J]. Journal of Agricultural Safety and Health, 2012, 18(4): 309-318.
[5] Liu Kenan, Wu Pute, Zhu Delan, et al. Autonomous navigation of solar energy canal feed sprinkler irrigation machine[J]. Transactions of the Chinese Society for Agricultural Machinery, 2016, 47(9): 141-146. (in Chinese with English abstract)
[6] Cordesses L, Cariou C, Berducat M. Combine harvester control using real time kinematic GPS[J]. Precision Agriculture, 2000, 2(2): 147-161.
[7] Jongmin C, Xiang Y, Liangliang Y, et al. Development of a laser scanner-based navigation system for a combine harvester[J]. Engineering in Agriculture, Environment and Food, 2014, 7(1): 7-13.
[8] Zhang Meina, Lü Xiaolan, Tao Jianping, et al. Design and experiment of automatic guidance control system in agricultural vehicle[J]. Transactions of the Chinese Society for Agricultural Machinery, 2016, 47(7): 42-47. (in Chinese with English abstract)
[9] Ji Changying, Zhou Jun. Current situation of navigation technologies for agricultural machinery[J]. Transactions of the Chinese Society for Agricultural Machinery, 2014, 45(9): 44-54. (in Chinese with English abstract)
[10] Zhang Man, Xiang Ming, Wei Shuang, et al. Design and implementation of a corn weeding-cultivating integrated navigation system based on GNSS and MV[J]. Transactions of the Chinese Society for Agricultural Machinery, 2015, 46(Supp.1): 8-14. (in Chinese with English abstract)
[11] Xie Bin, Li Jingjing, Lu Qianqian, et al. Simulation and experiment of virtual prototype braking system of combine harvester[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2014, 30(4): 18-24. (in Chinese with English abstract)
[12] Ren Shuguang, Xie Fangping, Wang Xiushan, et al. Gas-solid two-phase separation operation mechanism for 4LZ-0.8 rice combine harvester cleaning device[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2015, 31(12): 16-22. (in Chinese with English abstract)
[13] Jiao Youzhou, Tian Chaochao, He Chao, et al. Thermodynamic performance of waste heat collection for large combine harvester with different working fluids[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2018, 34(5): 32-38. (in Chinese with English abstract)
[14] Wei Liguo, Zhang Xiaochao, Wang Fengzhu, et al. Design and experiment of harvest boundary online recognition system for rice and wheat combine harvester based on laser detection[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2017, 33(Supp.1): 30-35. (in Chinese with English abstract)
[15] Yoshisada N, Katsuhiko T, Kentaro N, et al. A global positioning system guided automated rice transplanter[J]. IFAC Proceedings Volumes, 2013, 46(18): 41-46.
[16] Tamaki K, Nagasaka Y, Nishiwaki K, et al. A robot system for paddy field farming in Japan[J]. IFAC Proceedings Volumes, 2013, 46(18): 143-147.
[17] Hu Lian, Luo Xiwen, Zhang Zhigang, et al. Design of distributed navigation control system for rice transplanters based on controller area network[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2009, 25(12): 88-92. (in Chinese with English abstract)
[18] Hu Jingtao, Gao Lei, Bai Xiaoping, et al. Review of research on automatic guidance of agricultural vehicles[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2015, 31(10): 1-10. (in Chinese with English abstract)
[19] Song Yu, Liu Yongbo, Liu Lu, et al. Extraction method of navigation baseline of corn roots based on machine vision[J]. Transactions of the Chinese Society for Agricultural Machinery, 2017, 48(2): 38-44. (in Chinese with English abstract)
[20] Leemans V, Destain M F. Line cluster detection using a variant of the Hough transform for culture row localisation[J]. Image and Vision Computing, 2006, 24(5): 541-550.
[21] Gee C, Bossu J, Jones G, et al. Crop weed discrimination in perspective agronomic image[J]. Computers and Electronics in Agriculture, 2007, 58(1): 1-9.
[22] Jiang Guoquan, Ke Xing, Du Shangfeng, et al. Crop row detection based on machine vision[J]. Acta Optica Sinica, 2009, 29(4): 1015-1020. (in Chinese with English abstract)
[23] Han Y H, Wang Y M, Kang F. Navigation line detection based on support vector machine for automatic agriculture vehicle[C]// International Conference on Automatic Control and Artificial Intelligence (ACAI 2012), Xiamen, 2012: 1381-1385.
[24] English A, Ross P, Ball D, et al. Vision based guidance for robot navigation in agriculture[C]// 2014 IEEE International Conference on Robotics & Automation (ICRA), Hong Kong, 2014: 1693-2698.
[25] Cariou C, Lenain R, Thuilot B, et al. Motion planner and lateral-longitudinal controllers for autonomous maneuvers of a farm vehicle in headland[C]// 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, USA, 2009: 5782-5787.
[26] Lin Guichao, Zou Xiangjun, Zhang Qing, et al. Visual navigation for automatic guided vehicles based on active contour model[J]. Transactions of the Chinese Society for Agricultural Machinery, 2017, 48(2): 20-26. (in Chinese with English abstract)
[27] Xiang Ming, Wei Shuang, He Jie, et al. Development of agricultural implement visual navigation terminal based on DSP and MCU[J]. Transactions of the Chinese Society for Agricultural Machinery, 2015, 46(Supp.1): 21-26. (in Chinese with English abstract)
[28] Ren Junru. The Research and Design of Improved Predictive PID Controller[D]. Wuhan: Wuhan University of Science and Technology, 2011. (in Chinese with English abstract)
[29] Yu Tianming, Zheng Lei, Li Song. Gray prediction PID control technology of automated mechanical transmission clutch[J]. Transactions of the Chinese Society for Agricultural Machinery, 2011, 42(8): 1-6. (in Chinese with English abstract)
[30] Point Grey Research, Inc. Triclops software kit Version 3.1 user’s guide and command reference [EB/OL]. [2018-08-25]. https://www.ptgrey.com/support/downloads
[31] Chen Hanjing. Research and Application of SIFT Feature Point Technology[D]. Nanjing: Nanjing University of Science and Technology, 2017. (in Chinese with English abstract)
[32] Hiremath S, Evert F K V, Braak C T, et al. Image-based particle filtering for navigation in a semi-structured agricultural environment[J]. Biosystems Engineering, 2014, 121(5): 85-95.
[33] Shen Wenlong, Xue Jinlin, Wang Dongming, et al. Visual navigation control system of agricultural vehicle[J]. Journal of Chinese Agricultural Mechanization, 2016, 37(6): 251-254. (in Chinese with English abstract)
Traveling trajectory prediction method and experiment of autonomous navigation tractor based on trinocular vision
Tian Guangzhao1, Gu Baoxing1※, Irshad Ali Mari2, Zhou Jun1, Wang Haiqing1
(1. College of Engineering, Nanjing Agricultural University, Nanjing 210031, China; 2. Khairpur College of Engineering and Technology, Sindh Agriculture University, Khairpur 66020, Pakistan)
In order to make autonomous navigation tractors work steadily and continuously without the satellite positioning system, a traveling trajectory prediction system and method based on trinocular vision were designed in this paper. The system was composed of a trinocular camera, an IEEE 1394 acquisition card and an embedded industrial personal computer (IPC). The right and left sub-cameras constituted a binocular vision system with a long baseline, and the right and middle sub-cameras constituted another binocular vision system with a short baseline. To obtain more precise measurement results, the two binocular vision systems worked independently in a time-sharing manner. The motion vectors of the tractor, expressed as horizontal-plane data, were then calculated from the coordinate changes of feature points in the tractor's working environment. Finally, error models in the heading direction were established at different velocities, and the motion vectors of the tractor were predicted by models based on the grey method. The contrast experiments were completed with a modified Dongfanghong SG250 tractor at speeds of 0.2, 0.5 and 0.8 m/s. During the experiments, the IPC was used to collect RTK-GPS data and predict movement tracks. The RTK-GPS used in the experiments was a high-precision measuring device whose accuracy can reach 1-2 cm; its location data were therefore taken as the standard against which the data from the trinocular vision system were compared. The experimental results showed that the method could accurately predict the trajectory of the tractor on the plane, with an inevitable error mainly caused by the visual measurement error in the forward (z) direction. When the tractor travelled at 0.2 m/s, the time and distance at which the forward-direction error exceeded 0.1 m were 46.5 s and 9.3 m, respectively.
When the speed increased to 0.5 m/s, the time and distance decreased to 17.2 s and 8.6 m, respectively; when the speed increased to 0.8 m/s, they quickly decreased to 8.5 s and 6.8 m. This shows that the higher the traveling speed of the tractor, the faster the forward-direction error grew. The relationship between the forward-direction error and the traveling time was then acquired and analyzed by nonlinear data fitting. In addition, the trend of the lateral (x-direction) error, perpendicular to the forward direction, was not regular. When the speed was 0.2 m/s, the average lateral error was 0.0025 m with a standard deviation (STD) of 0.0039; at 0.5 m/s and 0.8 m/s, the average lateral errors were 0.0082 m (STD 0.0124) and 0.0036 m (STD 0.0064), respectively. The lateral error was thus very small and almost invariable, so the errors of the trinocular vision system were mainly caused by errors in the forward direction, whose root causes were natural light and the time delay of image processing. According to the experimental data and results, the system and method proposed in this paper can be used to measure and predict the traveling trajectory of a tractor in a dry agricultural environment during a short-term loss of satellite signal, and the measured and predicted data can provide temporary support for the operation of autonomous tractors.
tractor; automatic guidance; machine vision; trajectory prediction; gray model
doi: 10.11975/j.issn.1002-6819.2018.19.005
CLC number: S219.1    Document code: A    Article ID: 1002-6819(2018)-19-0040-06
Received: 2018-06-13    Revised: 2018-08-27
Funding: Fundamental Research Funds for the Central Universities (KYGX201701); National Natural Science Foundation of China (31401291); Natural Science Foundation of Jiangsu Province (BK20140729)
Tian Guangzhao, lecturer, Ph.D., research interests: navigation and control of agricultural machinery. Email: tgz@njau.edu.cn
※Corresponding author: Gu Baoxing, lecturer, Ph.D., research interests: intelligent agricultural equipment. Email: gbx@njau.edu.cn
Tian Guangzhao, Gu Baoxing, Irshad Ali Mari, Zhou Jun, Wang Haiqing. Traveling trajectory prediction method and experiment of autonomous navigation tractor based on trinocular vision [J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2018, 34(19): 40-45. (in Chinese with English abstract) doi:10.11975/j.issn.1002-6819.2018.19.005 http://www.tcsae.org