Early Pest Detection in Greenhouses
Vincent Martin, Sabine Moisan
INRIA Sophia Antipolis Méditerranée, Pulsar project-team, France
2/12
Motivation: reduce pesticide use
• Agricultural issues:
  • Temperature and hygrometric conditions inside a greenhouse favor frequent and rapid attacks by bioaggressors (insects, spider mites, fungi).
  • It is difficult to know the starting time and location of such attacks.
  • Reduce the time overhead of workers in charge of greenhouse biological monitoring.
  • Better understand pest population behaviors.
• Computer vision issues:
  • Automatically identify and count pest populations to allow rapid decisions.
  • Improve and accumulate knowledge of greenhouse attack history.
3/12
DIViNe¹: A Decision Support System
¹ Detection of Insects by a Video Network
Identification and counting of pests
Manual method vs. DIViNe system:
• Result delivery: up to 2 days (manual) vs. near real-time (DIViNe)
• Advantages: manual method offers discrimination capacity; DIViNe is an autonomous system with temporal sampling at low cost
• Disadvantages: manual method needs a specialized operator (taxonomist) and trades precision against time; DIViNe handles only predefined insect types and requires video camera installation
4/12
Proposed Approach
Pipeline: Intelligent Acquisition → Detection → Classification → Tracking → Behaviour Recognition
• Intelligent Acquisition produces image sequences with moving objects
• Detection extracts regions of interest
• Classification yields pest identification and counting results
• Tracking produces pest trajectories
• Behaviour Recognition infers scenarios (laying, predation…)
Current work: acquisition, detection, classification. Future work: tracking, behaviour recognition.
Automatic vision system for in situ, non-invasive, and early detection
• Based on a video sensor network
• In line with cognitive vision research (machine learning, a priori knowledge…)
5/12
First DIViNe Prototype
• Network of 5 wireless video cameras (≈ 400 € each), protected against water projection and direct sun.
• Installed in a 130 m² greenhouse at CREAT, planted with 3 varieties of roses.
• Observing sticky traps continuously during daylight.
• High image resolution (1600×1200 pixels) at up to 10 frames per second.
6/12
Intelligent Acquisition Module
• Scheduled image sequence acquisition:
  • at specific time intervals,
  • on motion detection.
• Remote tuning of each sensor's settings (resolution, frame rate).
• Storage and retrieval of relevant video data.
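The two acquisition triggers above (timer and motion) can be combined in a simple policy. A minimal sketch, assuming frame differencing for the motion trigger; the threshold values are illustrative assumptions, not the prototype's actual parameters:

```python
import numpy as np

MOTION_THRESHOLD = 0.02  # fraction of changed pixels (assumed value)

def motion_detected(prev_frame: np.ndarray, frame: np.ndarray,
                    pixel_delta: int = 25) -> bool:
    """Trigger when enough pixels differ between consecutive frames."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.mean(diff > pixel_delta)
    return changed > MOTION_THRESHOLD

def should_acquire(now: float, last_acquisition: float,
                   interval_s: float, motion: bool) -> bool:
    """Acquire on schedule (every interval_s seconds) or on motion."""
    return motion or (now - last_acquisition) >= interval_s
```

In practice the interval and pixel thresholds would be among the sensor settings tuned remotely.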
7/12
Detection Module
• Handle illumination changes due to the sun's movement, shadows, reflections…
• Adapt algorithms to deal with different image contexts:
[Video clip: cloudy context with reflections and low contrast; sunny context with shadows and high contrast]
8/12
Detection Module: Preliminary Results
[Images: sticky trap acquisition; detection without context adaptation vs. with context adaptation]
• Weakly supervised learning to acquire context knowledge from global image characteristics
• Context identification for background model selection
[video clip]
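Context identification from global image characteristics can be sketched as nearest-prototype classification, where each prototype is learned offline and selects a background model. The feature choice (mean brightness, contrast) and the prototype values below are illustrative assumptions, not the learned models from the paper:

```python
import numpy as np

# Assumed context prototypes, learned offline from labelled sequences;
# feature vector = [mean brightness, contrast (std-dev)]. Values illustrative.
CONTEXT_PROTOTYPES = {
    "cloudy_low_contrast": np.array([90.0, 15.0]),
    "sunny_high_contrast": np.array([170.0, 60.0]),
}

def global_features(image: np.ndarray) -> np.ndarray:
    """Global image characteristics used for context identification."""
    return np.array([image.mean(), image.std()])

def identify_context(image: np.ndarray) -> str:
    """Nearest-prototype context label; each label would select its
    own background model for detection."""
    f = global_features(image)
    return min(CONTEXT_PROTOTYPES,
               key=lambda c: float(np.linalg.norm(f - CONTEXT_PROTOTYPES[c])))
```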
9/12
Classification Module: Preliminary Results
Regions labeled according to insect types based on a visual concept ontology:
Domain Class WhiteFly SuperClass Bioagressor {
  ShapeConcept Descriptors:
    circularity     [0.20 0.50 0.60]
    excentricity    [0.10 0.20 0.40 0.50]
    rectangularity  [0.50 0.60 0.80 0.85]
    elongation      [0.30 0.35 0.70 0.80]
    convexity       [0.70 0.75 1.00 1.10]
    compacity       [0.10 0.25 0.90 1.00]
  ColorConcept Descriptors:
    saturation      [0.0 0.0 0.2 0.3]
    lightness       [120 130 240 260]
    hue             [80 90 170 180]
  SizeConcept Descriptors:
    area            [0.5 0.6 1.2 1.3]
    length          [0.6 0.8 2.5 3.5]
    width           [0.2 0.3 1.0 1.3]
}
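One plausible reading of the four-value brackets is a trapezoidal membership interval [a b c d] per descriptor. Under that assumption (it is an interpretation, not stated on the slide), matching a detected region against the ontology could look like the sketch below; the descriptor subset is copied from the WhiteFly class above:

```python
def trapezoid(x: float, a: float, b: float, c: float, d: float) -> float:
    """Trapezoidal membership: 0 outside [a, d], 1 on [b, c], linear ramps."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Subset of the WhiteFly descriptors from the ontology above
# (circularity has only three values on the slide, so it is omitted here).
WHITEFLY = {
    "excentricity": (0.10, 0.20, 0.40, 0.50),
    "elongation":   (0.30, 0.35, 0.70, 0.80),
    "saturation":   (0.0,  0.0,  0.2,  0.3),
}

def match_score(region: dict) -> float:
    """Minimum membership across descriptors (a fuzzy AND)."""
    return min(trapezoid(region[k], *bounds) for k, bounds in WHITEFLY.items())
```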
10/12
Conclusion and Future Work
• A greenhouse equipped with a video camera network
• A software prototype:
  • Intelligent image acquisition
  • Pest detection (few species)
• Future work:
  • Detect more species
• Observe directly on plant organs (e.g. spider mites)
• Behaviour recognition
• Integrated biological sensor
See http://www-sop.inria.fr/pulsar/projects/bioserre/
11/12
Behaviour Recognition Module
Behavior description is based on a generic declarative language relying on a video event ontology. Scenario models build on the concepts of states and events related to objects of interest:
• state = a spatiotemporal property valid at a given instant and stable over a time interval.
• event = a meaningful change of state.
• scenario = a combination of primitive states and events using logical, spatial, or temporal constraints between objects, events, and states.
Laying scenario example:
state: insideZone( Insect, Zone )
event: exitZone( Insect, Zone )
state: rotating( Insect )

scenario: WhiteflyPivoting( Insect whitefly, Zone z ) {
  A: insideZone( whitefly, z ) // B: rotating( whitefly );
  constraints: duration( A ) > duration( B );
}

scenario: EggAppearing( Insect whitefly, Insect egg, Zone z ) {
  insideZone( whitefly, z ) then insideZone( egg, z );
}

main scenario: Laying( Insect whitefly, Insect egg, Zone z ) {
  WhiteflyPivoting( whitefly, z ) //
  loop EggAppearing( egg, z ) until exitZone( whitefly, z );
  then send( "Whitefly is laying in " + z.name );
}
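A scenario like Laying could be evaluated by a recogniser consuming a temporally ordered stream of primitive detections. The Python sketch below is only an illustration of that idea, not the actual recognition engine, and it assumes the sub-scenarios have already been recognised as composite events:

```python
def recognise_laying(events):
    """events: iterable of (name, zone) primitives in temporal order.
    Returns zone names where a complete Laying scenario was observed:
    a whitefly pivots in a zone, at least one egg appears there,
    and the whitefly then exits the zone."""
    pivoting_zone = None
    eggs = 0
    laid = []
    for name, zone in events:
        if name == "WhiteflyPivoting":
            pivoting_zone, eggs = zone, 0
        elif name == "EggAppearing" and zone == pivoting_zone:
            eggs += 1
        elif name == "exitZone" and zone == pivoting_zone:
            if eggs > 0:
                laid.append(zone)  # the "send" action in the scenario
            pivoting_zone = None
    return laid
```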
12/12
Plant Organs Monitoring
• Issues:
  • Plant motion estimation (plus the need for autofocus sensors)
  • Non-planar field of view
  • Choice of sensor positions
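Plant motion estimation could start from a coarse translation estimate between consecutive frames. A brute-force block-matching sketch under that assumption (real plant motion is non-rigid, so this is only a first step, not the project's method):

```python
import numpy as np

def estimate_shift(prev: np.ndarray, curr: np.ndarray, max_shift: int = 5):
    """Exhaustive integer (dy, dx) translation estimate between two
    grayscale frames, minimising mean squared error on the interior."""
    best, best_err = (0, 0), np.inf
    h, w = prev.shape
    m = max_shift
    a = prev[m:h - m, m:w - m].astype(float)
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            b = curr[m + dy:h - m + dy, m + dx:w - m + dx].astype(float)
            err = np.mean((a - b) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best
```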