Hao Gan, Department of Agricultural and Biological Engineering, University of Florida, USA

Contact: [email protected] | 18659748379 | 18659744513

Introduction

Yield mapping is considered an initial step in applying precision agriculture technologies. Although many yield mapping systems have been developed for agronomic crops, mapping the yield of tree crops remains a difficult task. In Florida, efforts have been made toward developing citrus yield mapping methods. In this study, an autonomous immature citrus yield mapping system was developed. The long-term goal of this study was to develop an autonomous yield mapping system for immature green citrus fruit. Specifically, there were four sub-objectives:

1. Create an imaging platform that associates images with GPS locations automatically (a minimal sketch of this pairing step follows this list);
2. Train deep learning models that can detect immature citrus fruit fast enough for real-time application;
3. Develop fruit tracking algorithms for counting fruit numbers from videos;
4. Develop an unmanned ground vehicle platform that carries the imaging platform and provides autonomous navigation in a citrus grove.
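The poster does not describe how images and GPS fixes are paired, so the following is only a minimal sketch of sub-objective 1. It assumes an OpenCV camera stream and a placeholder read_gps_fix() helper standing in for a real GPS driver; all names and the CSV layout are illustrative, not part of the published system.

```python
# Minimal, illustrative sketch of sub-objective 1: pairing captured frames with
# GPS locations. read_gps_fix() is a placeholder, NOT the authors' code; replace
# it with a real GPS driver (e.g., an NMEA parser) in an actual system.
import csv
import time

import cv2


def read_gps_fix():
    """Placeholder returning (lat, lon); a dummy fix is used for illustration."""
    return 0.0, 0.0


def capture_geotagged_frames(camera_index=0, out_csv="image_gps_log.csv", n_frames=10):
    cap = cv2.VideoCapture(camera_index)
    with open(out_csv, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image_file", "timestamp", "lat", "lon"])
        for i in range(n_frames):
            ok, frame = cap.read()
            if not ok:
                break
            lat, lon = read_gps_fix()                        # latest GPS fix
            name = f"frame_{i:05d}.jpg"
            cv2.imwrite(name, frame)                         # save the image
            writer.writerow([name, time.time(), lat, lon])   # image-location record
    cap.release()
```

Each row of the log then ties one saved frame to the location where it was taken, which is the association a yield map is built from.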

1 System Overview

The system could detect fruit and create yield maps at early growth stages of citrus fruit so that farmers could apply site-specific management based on the maps. There were two sub-systems: a navigation system and an imaging system. Robot Operating System (ROS) was the backbone for developing the navigation system using an unmanned ground vehicle. An inertial measurement unit (IMU), wheel encoders and a GPS were integrated using an extended Kalman filter to provide reliable and accurate localization information. A LiDAR and a Kinect camera were added to support simultaneous localization and mapping (SLAM) algorithms for autonomous navigation in open fields and citrus groves. A multimodal imaging system, which consisted of two color cameras and a thermal camera, was carried by the vehicle for video acquisition. Multimodal registration and information fusion algorithms were developed to combine information from the color and thermal cameras. Faster R-CNN and optical flow algorithms were applied to detect and count the fruit in real time. Figure 2 shows an overview of the system design. The system is expected to improve the accuracy of fruit detection to approximately 90%.
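The overview above states that Faster R-CNN detections and optical flow were combined to detect and count fruit in real time, but the counting logic is not given. The sketch below shows one plausible scheme, assuming per-frame detections are already available as point locations: pyramidal Lucas-Kanade optical flow (OpenCV) propagates tracked fruit centers into the next frame, and a detection far from every propagated point is counted as a new fruit. The distance threshold and function names are assumptions, not the authors' algorithm.

```python
# Illustrative sketch only: propagate tracked fruit centers with Lucas-Kanade
# optical flow and count a detection as a *new* fruit when it is far from every
# tracked point. Detections (e.g., from Faster R-CNN) are assumed to be given.
import numpy as np
import cv2

DIST_THRESH = 30.0  # pixels; illustrative threshold for matching detections to tracks


def update_count(prev_gray, curr_gray, tracked_pts, detections, total_count):
    """tracked_pts: (N, 2) float32 array of fruit centers in the previous frame.
    detections:  (M, 2) array of detected fruit centers in the current frame."""
    if len(tracked_pts) > 0:
        p0 = tracked_pts.reshape(-1, 1, 2).astype(np.float32)
        # Propagate previous fruit centers into the current frame.
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, p0, None)
        moved = p1.reshape(-1, 2)[status.ravel() == 1]
    else:
        moved = np.empty((0, 2), dtype=np.float32)

    new_tracks = list(moved)
    for det in detections:
        # A detection not explained by an existing track is treated as a new fruit.
        if len(moved) == 0 or np.min(np.linalg.norm(moved - det, axis=1)) > DIST_THRESH:
            total_count += 1
            new_tracks.append(np.asarray(det, dtype=np.float32))

    return np.array(new_tracks, dtype=np.float32).reshape(-1, 2), total_count
```

In practice the matching threshold would need to be tuned to the camera geometry and the vehicle's driving speed.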

2 Autonomous Navigation System

Autonomous navigation, as one of the robot's abilities, could save the labor of driving vehicles and provide accurate and consistent localization for performing farm operations. The goal was to develop a navigation system that could guide a field robot to travel from a farm station to a citrus grove and visit each tree autonomously, with obstacle avoidance ability. The system was developed with the concept of future smart farms in mind, in which the internet of things and big data analysis would be implemented. The sensor data and the robot's status were monitored and recorded from the farm station in real time. The system also enabled the computer at the farm station to interrupt and control the robot manually in emergency scenarios. The communication between the robot and the farm station was established over a Wi-Fi network, to which more sensing systems could be added easily.
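The poster does not specify the messaging layer used for this station-robot link (the navigation stack itself is ROS-based, where topics would normally carry such traffic). As a hedged illustration only, the sketch below uses plain UDP and JSON from the Python standard library; the message fields, address and port are invented for the example.

```python
# Hedged illustration of robot <-> farm-station telemetry over Wi-Fi using UDP + JSON.
# Field names, port, and addresses are invented for the example; the real system
# is ROS-based and would typically use ROS topics/services instead.
import json
import socket
import time

STATION_ADDR = ("192.168.1.10", 9000)   # assumed farm-station IP and port


def publish_status(sock, lat, lon, battery, state):
    """Robot side: send one status/telemetry message to the farm station."""
    msg = {"t": time.time(), "lat": lat, "lon": lon,
           "battery": battery, "state": state}
    sock.sendto(json.dumps(msg).encode("utf-8"), STATION_ADDR)


def check_for_override(sock):
    """Robot side: non-blocking check for a manual-override command (e.g. 'STOP')."""
    sock.setblocking(False)
    try:
        data, _ = sock.recvfrom(1024)
        return json.loads(data.decode("utf-8")).get("command")
    except BlockingIOError:
        return None
```

The same pattern generalizes to additional sensing systems joining the Wi-Fi network: each new node only needs to publish its own status messages to the station.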

3 Multi-modal Imaging System

Immature green citrus fruit detection, which aims to provide valuable information for citrus yield mapping at earlier growth stages, is difficult because the fruit and leaf colors are very similar. This study combined color and thermal images for immature green fruit detection. A novel image registration method was developed for combining color and thermal images and matching fruit in both images. A new Color-Thermal Combined Probability (CTCP) algorithm was created to effectively fuse information from the color and thermal images and classify potential image regions into fruit and non-fruit classes. Algorithms were also developed to integrate image registration, information fusion, and fruit classification and detection into a single step for real-time processing. The fusion of the color and thermal images effectively improved immature green citrus fruit detection.
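The poster names the Color-Thermal Combined Probability (CTCP) algorithm but does not give its formula. The sketch below therefore shows only a generic way to fuse a color-based and a thermal-based fruit probability for a candidate region, here as a weighted combination with a decision threshold; the weights, threshold, and function names are assumptions for illustration, not the published CTCP definition.

```python
# Generic color-thermal probability fusion for candidate fruit regions.
# NOTE: this is NOT the published CTCP formula, which the poster does not give;
# the weighted combination and threshold below are illustrative assumptions.
import numpy as np

W_COLOR, W_THERMAL = 0.6, 0.4   # assumed fusion weights
FRUIT_THRESHOLD = 0.5           # assumed decision threshold


def fused_fruit_probability(p_color, p_thermal):
    """Combine per-region probabilities from the color and thermal classifiers."""
    return W_COLOR * p_color + W_THERMAL * p_thermal


def classify_regions(p_color, p_thermal):
    """Vectorized classification of candidate regions into fruit / non-fruit."""
    p = fused_fruit_probability(np.asarray(p_color), np.asarray(p_thermal))
    return p >= FRUIT_THRESHOLD   # boolean mask: True = fruit, False = non-fruit


# Example: a region scored 0.8 by the color model but only 0.3 by thermal
# gets a fused probability of 0.6 and is kept as fruit.
print(classify_regions([0.8, 0.2], [0.3, 0.1]))   # -> [ True False]
```

The point of such a fusion step is that a region rejected by one modality (for example, a green leaf cluster with a fruit-like color score) can still be filtered out by the other, which is how the thermal channel confirms or rejects the Faster R-CNN candidates in Figure 6.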

4 Final Remarks

This study focused on developing an autonomous robotic system for yield mapping of specialty crops. Specifically, it developed and integrated systems in the following areas: autonomous field navigation, machine vision with various imaging technologies, photogrammetry, and wireless communication. The overarching goal is to develop a commercial robot for rapid and accurate yield mapping in fruit orchards before harvest. The successful development of such a system will greatly reduce the labor required for data collection and improve its accuracy. The system can be integrated with other mobile and in-situ sensors to form the agricultural internet of things (IoT). Breakthroughs in the area of agricultural robots will be critical for promoting IoT systems and big data analysis for next-generation agricultural production systems.

Figure 1. The yield mapping robot featured in a citrus grove.

Figure 2. Overview of the immature citrus yield mapping system design. The system includes an autonomous platform and an imaging platform.

Figure 3. Machine vision for real-time central-line (orange solid line) detection and determination of the robot's relative positions and orientations.

Figure 4. Developed program that visualizes the Google map, the GPS locations (the yellow line) and the live video stream (top-left corner of the map) from the Kinect.

Figure 6. The multi-modal imaging system (left) and two examples (right) showing the detection results after information fusion. Images (a) & (d) show the results from Faster R-CNN detection. Images (b) & (e) are the corresponding thermal images. Images (c) & (f) are the detection results after information fusion. In (c) & (f), red circles are confirmed fruit locations and blue circles are rejected fruit locations.