
Development of a sensor fusion method for crop row tracking operations

Published online by Cambridge University Press:  01 June 2017

B. Benet*, R. Lenain and V. Rousseau
Affiliation: IRSTEA, UR TSCF, 9 avenue Blaise Pascal, 63178 Aubière, France

Abstract

A sensor fusion method was developed to track crop rows for various crops and vegetation levels. The approach combines a laser sensor, an inertial measurement unit and a color camera to extract, in real time, a set of points corresponding to the crop rows while eliminating noise such as grass or leaves in the environment. After a Hough transform or Least Squares (LS) technique is applied to obtain the geometric parameters of the crop line, automatic control is used to perform the crop row tracking operation with the desired lateral deviation, taking into account the robot angular deviation and the temporal aspect, so that the task is carried out accurately and without oscillations. The results showed the robustness of the fusion method in providing stable autonomous navigation for crop row tracking, particularly in vineyards, with many perturbations such as bumps, holes and mud, at speeds between 1 and 2 m s-1. The mean lateral error between the desired and obtained trajectories varied between 0.10 and 0.40 m, depending on speed and perturbations.
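As a rough illustration of the line-extraction and deviation-computation step described above, the sketch below fits a crop-row line to candidate points with a least-squares regression and derives the lateral and angular deviations a row-following controller would use. It is not the authors' implementation; the robot-frame convention, the point data and the desired offset are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): fit a crop-row line to
# candidate points with least squares, then derive the lateral and angular
# deviations for a row-following controller. Frame convention, point data
# and desired offset are illustrative assumptions.
import numpy as np

def fit_row_line(points):
    """Fit x = a*y + b to row points (robot frame: x lateral, y forward)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([y, np.ones_like(y)])
    (a, b), *_ = np.linalg.lstsq(A, x, rcond=None)
    return a, b  # slope and lateral intercept of the row line

def row_following_errors(points, desired_offset):
    """Return (lateral error in m, angular error in rad) w.r.t. the fitted row."""
    a, b = fit_row_line(points)
    lateral_error = b - desired_offset   # offset of the row at the robot origin
    angular_error = np.arctan(a)         # heading of the row relative to the robot
    return lateral_error, angular_error

# Example: noisy points along a row located 0.8 m to the right of the robot
rng = np.random.default_rng(0)
y = np.linspace(0.5, 4.0, 40)
pts = np.column_stack([0.8 + 0.02 * y + rng.normal(0, 0.03, y.size), y])
print(row_following_errors(pts, desired_offset=0.8))
```

In practice the fitted errors would feed the steering law at each cycle; a Hough transform could replace the least-squares fit when several candidate rows or heavy outliers are present.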

Type
Data analysis and Geostatistics
Copyright
© The Animal Consortium 2017 

