Sensing for Robotics & Control – Remote Sensors R. R. Lindeke, Ph.D


Page 1:

Sensing for Robotics & Control – Remote Sensors

R. R. Lindeke, Ph.D

Page 2:

Remote Sensing Systems:

Radar – uses long wavelength microwaves for point or field detection

Speed and range analysis

Trajectory analysis

Sonar – uses high energy/high frequency sound waves to detect range or create images in “conductive media”

Vision Systems – operations in visible or near-visible light regimes; use structured light and high-contrast environments to control data-mapping problems

Page 3:

Parts of The Remote Sensors – field sensors

Source information is a structured illumination system (high contrast)

Receiver is a Field detector – in machine vision it is typically a CCD or CID

CCD is a charge-coupled device – where a detector (phototransistor) stores charge on a capacitor which is regularly sampled/harvested through a field sweep using a “rastering” technique (at about 60 Hz)

CID is a charge injected device where each pixel can be randomly sampled at variable times and rates

Image analyzers that examine the raw field image and apply enhancement and identification algorithms to the data

Page 4:

The Vision System issues:

Blurring of moving objects – a result of the data capture rastering through the receiver’s 2-D array: the sampling system lags the real world as it processes field by field, with parts of the information being harvested while adjacent pixels continue to change in time

This limits the speed of response, the speed of objects, and system through-put rates

Contrast Enhancements are developed by examining extrema in the field of view, applicable at each pixel:

I_enh = (I_rec − I_min) / (I_max − I_min)
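The enhancement above maps each pixel through the field extrema so the darkest pixel becomes 0.0 and the brightest 1.0; a minimal Python sketch (the function name and sample values are illustrative):

```python
def enhance_contrast(image):
    """Min-max contrast stretch: rescale pixels so the darkest value
    in the field maps to 0.0 and the brightest to 1.0."""
    flat = [p for row in image for p in row]
    i_min, i_max = min(flat), max(flat)
    span = i_max - i_min
    return [[(p - i_min) / span for p in row] for row in image]

# A low-contrast 8-bit field whose values only span 100..150.
field = [[100, 125], [140, 150]]
enhanced = enhance_contrast(field)
```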

Page 5:

Some Additional Issues:

Must beware of ‘Bloom’ in the image – Bloom is a problem when a high-intensity pixel overflows into adjacent pixels, increasing or changing the size of an information set

Also consider Lensing and Operational errors:

Vignetting – lenses transmit more effectively in the center than at their edges, leading to intensity issues across the field of the image even without changes in the image field information itself

Blur – caused by lack of full-field focus

Distortion – parabolic and geometric changes due to lens shape errors

Motion Blur – moving images “smeared” over many pixels in capture (for a CCD system we typically sample 3 to 5 fields to build a stable image, limiting one to about 12 to 20 stable images/sec)

Page 6:

Data Handling Issues:

Typical field camera (780 × 640, or 499,200 pixels/image) with 8-bit color channels – meaning 3 separate 8-bit words (24-bit color) per pixel (32-bit color typically includes a saturation or brightness byte too)

Data in one field image as captured during each rastering sweep: 499,200 pixels × 3 color bytes = 1,497,600 bytes/image (≈ 1.43 MBytes)

In a minute: 1,497,600 × 60 fr/s × 60 s/min ≈ 5.39 GBytes (raw – i.e. without compression or processing) (≈ 323 Gigs/hour of video information)
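A quick back-of-the-envelope check in Python, assuming 3 color bytes per pixel and 60 frames per second:

```python
pixels_per_image = 780 * 640                  # 499,200 pixels
bytes_per_image = pixels_per_image * 3        # 3 color bytes per pixel
bytes_per_minute = bytes_per_image * 60 * 60  # 60 fr/s * 60 s/min
gb_per_hour = bytes_per_minute * 60 / 1e9     # raw, uncompressed
```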

Page 7:

Helping with this ‘Data Bloat’

Do we really need Color? – If not, the data is reduced by a factor of 3

Do we really need “shades”? – If not, the data set drops by a factor of 8 – but this requires ‘thresholding’ of the data field

Thresholding is used to construct ‘bit maps’: after sampling of test cases, set a level of pixel intensity at or above which a pixel is 1 or ‘on’, while below this level of intensity the pixel is 0 or ‘off’, regardless of image difficulties and material variations

Consideration is reduced to 1 bit rather than the 8 to 24 bits in the original field of view!
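The thresholding rule above fits in a few lines of Python (the function name and sample values are illustrative):

```python
def threshold_to_bitmap(image, level):
    """Reduce a gray-scale field to a 1-bit map: pixels at or above
    `level` become 1 ('on'); all others become 0 ('off')."""
    return [[1 if p >= level else 0 for p in row] for row in image]

gray = [[12, 200, 180],
        [40, 130, 25]]
bitmap = threshold_to_bitmap(gray, 128)
```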

Page 8:

Analyzing the Images

Do we really need the entire field – or just the important parts?

– But this requires post processing to analyze what is in the ‘thresholded’ image

Image processing is designed to build or “Grow” field maps of the important parts of an image for identification purposes

These field maps then must be analyzed by applications that can make decisions using some form of intelligence as applied to the field data sets

Page 9:

Image Building

First we enhance the data array

Then we digitize (threshold) the array

Then we look for image edges – an edge is where the pixel value changes from 0 to 1 or 1 to 0!

Raw image before thresholding and image analysis:
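A minimal Python sketch of the edge-finding step, treating an edge as any pixel whose thresholded value differs from the pixel to its left or the pixel above (names are illustrative):

```python
def edge_map(bitmap):
    """Mark every pixel where the thresholded value changes from 0 to 1
    or 1 to 0 relative to its left or upper neighbor."""
    rows, cols = len(bitmap), len(bitmap[0])
    edges = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            if c > 0 and bitmap[r][c] != bitmap[r][c - 1]:
                edges[r][c] = 1  # horizontal 0<->1 transition
            if r > 0 and bitmap[r][c] != bitmap[r - 1][c]:
                edges[r][c] = 1  # vertical 0<->1 transition
    return edges

bitmap = [[0, 0, 0],
          [0, 1, 0],
          [0, 0, 0]]
edges = edge_map(bitmap)
```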

Page 10:

Working the Array – hardware and software

Bottles for selection, after reorganizing the image field

Field image after thresholding:

Page 11:

After Threshold

The final image is a series of On and Off Pixels (the light and dark parts of the 2nd Image as seen on the previous slide)

The image is then scanned to detect edges in the information

One popular method uses an algorithm, “GROW”, that searches the data array (of 1’s and 0’s) to map out changes in pixel value


Page 12:

Using Grow Methods

We begin a directed Scan – once a state level change is discovered we stop the directed scan and look around the changed pixel to see if it is just a bit error

If it is next to changed bits in all “new” directions, we start exploring for edges by stepping forward from the 1st bit and stepping back and forth about the change line as it “circumvents” the part

The algorithm is then said to “grow” shapes from full arrays, but without exhaustive enumeration!
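A simplified Python sketch of the GROW idea – not the original algorithm, just the directed-scan / bit-error-check / boundary-growing structure described above (all names are illustrative):

```python
def grow_boundary(bitmap):
    """Directed scan to the first 'on' pixel, skip isolated bit errors,
    then grow the connected boundary of the shape rather than
    exhaustively enumerating the full array."""
    rows, cols = len(bitmap), len(bitmap[0])

    def neighbors(r, c):
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr or dc:
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        yield rr, cc

    def on_boundary(r, c):
        # An 'on' pixel touching the array edge or an 'off' pixel.
        if bitmap[r][c] != 1:
            return False
        if r in (0, rows - 1) or c in (0, cols - 1):
            return True
        return any(bitmap[rr][cc] == 0 for rr, cc in neighbors(r, c))

    # Directed scan, row by row, until a state-level change is found.
    for r in range(rows):
        for c in range(cols):
            if bitmap[r][c] != 1:
                continue
            # Bit-error check: a changed pixel with no changed
            # neighbors is treated as noise; the scan resumes.
            if not any(bitmap[rr][cc] == 1 for rr, cc in neighbors(r, c)):
                continue
            # Grow outward along the boundary from the first real hit.
            edge, stack = set(), [(r, c)]
            while stack:
                p = stack.pop()
                if p in edge or not on_boundary(*p):
                    continue
                edge.add(p)
                stack.extend(neighbors(*p))
            return sorted(edge)
    return []

shape = [[0, 0, 0, 0, 0],
         [0, 1, 1, 1, 0],
         [0, 1, 1, 1, 0],
         [0, 1, 1, 1, 0],
         [0, 0, 0, 0, 0]]
outline = grow_boundary(shape)
```

Only the 8 perimeter pixels of the 3 × 3 block are returned; the interior pixel is never enumerated as part of the edge.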

Page 13:

So let’s see if it works:


Page 14:

Once Completed:

An image must be compared to standard shapes

The image can be analyzed to find centers, sizes or other shape information

After analysis is completed, objects can then be handled and/or sorted

Page 15:

Sorting Routines:

Based on Conditional Probabilities:

This is a measure of the probability that x is a member of class i (w_i), given the class-conditional probabilities of x for each of the several classes in the study (the w_j’s):

p(w_i | x) = p(x | w_i) · p(w_i) / Σ_j p(x | w_j) · p(w_j)
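With the class-conditional densities in hand, the conditional probability above is just a normalized weighting; a minimal Python sketch (equal priors assumed, and the density values below are illustrative):

```python
def posterior(px_given_w, priors=None):
    """Conditional probability of each class i given measurement x:
        p(w_i|x) = p(x|w_i) p(w_i) / sum_j p(x|w_j) p(w_j)
    Equal priors are assumed when none are supplied."""
    n = len(px_given_w)
    if priors is None:
        priors = [1.0 / n] * n
    weighted = [p * w for p, w in zip(px_given_w, priors)]
    total = sum(weighted)
    return [v / total for v in weighted]

# Illustrative class-conditional densities for classes A, B, C.
post = posterior([0.002442, 8.876e-8, 0.3645])
```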

Page 16:

Typically a Gaussian Approximation is Assumed:

We perform the characteristic measurement (with the Vision System)

We compute the conditional probability that X fits each class j – that with the highest value is accepted as a best fit (if the classes are each mutually exclusive)

p(x | w_j) = (1/√(2π)) · e^(−Z²/2), where Z = (x − x̄_j) / σ_j

Page 17:

Let’s Examine The Use Of This Technique:

Step 1: Present a “Training Set” to the camera system, including representative sizes and shapes of each type across its acceptable sizes

Step 2: For each potential class, using its learned values, compute a mean and standard deviation

Step 3: Present unknowns to the trained system and make measurements – compute the appropriate dimensions and the conditional probability for each potential class

Step 4: Assign the unknown to the class having the highest conditional probability – if the value is above a threshold of acceptability
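The four steps can be sketched as a tiny train/classify pair in Python (the training measurements and acceptance threshold below are hypothetical):

```python
import math

def train(training_sets):
    """Steps 1-2: from measured training values per class, compute
    each class's mean and (sample) standard deviation."""
    classes = {}
    for name, xs in training_sets.items():
        n = len(xs)
        mean = sum(xs) / n
        var = sum((x - mean) ** 2 for x in xs) / (n - 1)
        classes[name] = (mean, math.sqrt(var))
    return classes

def classify(x, classes, accept=1e-3):
    """Steps 3-4: compute p(x|w_j) = (1/sqrt(2*pi)) * exp(-Z^2/2) with
    Z = (x - mean_j)/sigma_j, and assign the class with the highest
    value -- if it clears the threshold of acceptability."""
    best, best_p = None, 0.0
    for name, (mean, sigma) in classes.items():
        z = (x - mean) / sigma
        p = math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)
        if p > best_p:
            best, best_p = name, p
    return best if best_p >= accept else None

# Hypothetical body-diagonal training measurements for two classes.
classes = train({"A": [3.598, 3.606, 3.614],
                 "B": [3.808, 3.816, 3.824]})
label = classify(3.607, classes)
```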

Page 18:

Using Vision to Determine Class & Quality

A system to sort by “Body Diagonals” (BD) for a series of Rectangular pieces:

A is 2 ± .01” × 3 ± .01”

B is 2 ± .01” × 3.25 ± .01”

C is 1.75 ± .01” × 3.25 ± .01”

Body Diagonals with part dimensions at acceptable limits:

– A: √(1.99² + 2.99²) to √(2.01² + 3.01²) → 3.592 to 3.619 (mean is 3.606”)

– B: √(1.99² + 3.24²) to √(2.01² + 3.26²) → 3.802 to 3.830 (mean is 3.816”)

– C: √(1.74² + 3.24²) to √(1.76² + 3.26²) → 3.678 to 3.705 (mean is 3.691”)
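These limits are just the Pythagorean diagonal evaluated at the tolerance extremes; a quick Python check:

```python
import math

# Nominal width x height for each rectangular class, tolerance +/- 0.01".
specs = {"A": (2.00, 3.00), "B": (2.00, 3.25), "C": (1.75, 3.25)}
tol = 0.01

bd = {}
for name, (w, h) in specs.items():
    low = math.hypot(w - tol, h - tol)   # smallest acceptable diagonal
    high = math.hypot(w + tol, h + tol)  # largest acceptable diagonal
    bd[name] = (low, high, (low + high) / 2.0)
```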

Page 19:

Computing Class Variances:

Can use Range techniques: find range of samples for a class then using a statistic d2 to compute σ: σclassi = (Rsample)/d2

Can also compute an estimate of σclass using sample standard deviations and a c4 statistic: σclassi = (ssample)/c4

c4 or d2 are available in any good engineering statistics text! – see handout

Page 20:

Computing Variances:

Here, using estimates from ideal values of the BD range, σ_class i is, from “range”:

σ_A = 0.0270 / 1.128 = 0.02394

σ_B = 0.02751 / 1.128 = 0.02439

σ_C = 0.02709 / 1.128 = 0.02402

(d2 = 1.128 applies if the sample size is 2 – it changes based on sample size)
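Assuming the range values above and d2 = 1.128 (sample size of 2), each estimate reduces to a single division:

```python
# sigma_class = R_sample / d2, with d2 = 1.128 for a sample size of 2.
d2 = 1.128
ranges = {"A": 0.0270, "B": 0.02751, "C": 0.02709}
sigma = {name: r / d2 for name, r in ranges.items()}
```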

Page 21:

Fitting an unknown:

Unknown Body Diagonal is measured at 3.681”

Compute Z and Cond. Probability (each class)

From our analysis we would tentatively place the Unknown in Class C – but more likely we would place it in a hand inspect bin!

Z_A = (3.681 − 3.606) / 0.02349 = 3.1928;  p(x|w_A) = (1/√(2π)) · e^(−(3.1928)²/2) = 0.002442

Z_B = (3.681 − 3.816) / 0.02439 = −5.5351;  p(x|w_B) = (1/√(2π)) · e^(−(−5.5351)²/2) = 8.876 × 10⁻⁸

Z_C = (3.681 − 3.6912) / 0.02401 = −0.4248;  p(x|w_C) = (1/√(2π)) · e^(−(−0.4248)²/2) = 0.3645
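The three computations can be verified in a few lines of Python, using the class means and σ estimates from the preceding slides:

```python
import math

def cond_prob(x, mean, sigma):
    """p(x|w) = (1/sqrt(2*pi)) * exp(-Z**2/2), Z = (x - mean)/sigma."""
    z = (x - mean) / sigma
    return z, math.exp(-z * z / 2.0) / math.sqrt(2.0 * math.pi)

x = 3.681  # measured body diagonal of the unknown part
scores = {}
for name, mean, sigma in (("A", 3.606, 0.02349),
                          ("B", 3.816, 0.02439),
                          ("C", 3.6912, 0.02401)):
    scores[name] = cond_prob(x, mean, sigma)

best = max(scores, key=lambda k: scores[k][1])
```

Class C wins by a wide margin, matching the tentative assignment above.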