

A 3D machine vision system based on a television camera and a laser for controlling the large dimensions of large-scale parts is considered. The laser ensures structured illumination of the surface of the object, making it possible to obtain the three-dimensional coordinates of representative points of the measurement object using a triangulation method and a parametric model. An approach to calibration of the model by means of a special target is presented.

Key words: image processing, laser, television camera, triangulation, transformation of coordinates.

Controlling the large dimensions of large-scale parts in an on-line regime constitutes a critical problem, particularly for such branches of industry as automobile construction and heavy machine-building.

In automobile construction, the quality of an automobile and its operating characteristics are related directly to the precision with which its basic units, such as the chassis, are manufactured and assembled. In this case, a large volume of measurements must be conducted directly on the conveyor belt, rapidly, and for 100% of the parts. In fact, some authors have attributed the lag in quality of American-made automobiles, and of automobiles produced in certain other countries, behind Japanese-made vehicles to the absence of precisely this type of control [1].

A number of different sensors, both contact and contactless, may be used in systems for measuring large dimensions for the purpose of determining three-dimensional coordinates. In contact sensors, which have entered into widespread use in coordinate-measuring machines [2], position is established at the moment of contact with the object. The presence of physical contact with the object, however, represents a substantial drawback; among the important advantages of contactless sensors over contact sensors we may cite their higher speed, flexibility, and reliability.

Progress in the field of machine vision systems has made it possible to apply these devices as contactless sensors for the solution of analogous problems, ensuring sufficiently high speed and precision without any contact with the unit, and has also made it possible to “build” the system into the production process [3]. In this case, different variants of the system are possible. One approach is based on stereoscopic vision, in which information obtained from two television cameras is used to obtain the three-dimensional coordinates of the check points [4]. Systems that incorporate an independent range finder in addition to a television camera have also been developed; however, these systems do not possess sufficiently high precision and are used chiefly where high precision is not required, for example, in the sensory systems of robots [5].

Still another approach is based on the construction of a system using structured illumination and a triangulation principle. Such a system usually comprises a television camera as the image receiver and a laser diode as the source of structured illumination. The camera supports the production of an image in an ordinary 2D regime (two-dimensional coordinates) as well as information on the third coordinate (distance), obtained from the image of the laser illumination (usually a line) by the triangulation principle.

Measurement Techniques, Vol. 47, No. 12, 2004

CONSTRUCTION OF SYSTEM FOR CONTROLLING THE DIMENSIONS OF LARGE-SCALE PARTS BASED ON A 3D MACHINE VISION SYSTEM

I. I. Dunin-Barkovskii UDC 620.1.08:531.7

Translated from Izmeritel’naya Tekhnika, No. 12, pp. 24–26, December, 2004. Original article submitted June 10, 2004.

0543-1972/04/4712-1168 ©2004 Springer Science+Business Media, Inc.


The basic units of an automobile possess quite large dimensions (up to several meters); hence it is necessary either to move the sensor over significant distances in space or to use several sensors, one for each of the measurement points. An industrial robot [6, 7] or a coordinate-measuring machine [8] is used to move the sensor.

Model of Laser Sensor. The model of the laser sensor is constructed from the physical principles underlying its operation and is based on geometric optics (Fig. 1). The sensor consists of a laser light source with an objective that ensures a linear form of the beam (an image in the form of a line) and a television camera whose objective's optical axis is directed at an angle α to the plane of the laser beam. Let us list the basic parameters of the model: the working distance L0, i.e., the distance from the object-side focal point of the television camera objective to the reference point (the point of intersection of the optical axis of the television camera objective with the plane of the laser beam); the approach angle α of the sensor, i.e., the angle between the optical axis of the television camera objective and the plane of the laser beam; the coefficient Kc of the camera, which depends on the dimensions of the CCD matrix, the number of its elements (pixels) along the horizontal and the vertical, and the focal distance of the camera objective; and the turning angle ϕ of the sensor about an axis passing through the reference point.

Let us define the coordinate system of the laser sensor with its origin at the reference point (the point of intersection of the optical axis of the television camera objective with the plane of the laser beam), the Z-axis directed towards the laser objective, the X-axis perpendicular to the plane of the laser beam, and the Y-axis lying in the plane of the laser beam. All measurements will be carried out in this system.

Basic Operating Principle of Laser Sensor. Using the formula of a thin lens and other relationships of geometric optics [9], we obtain an expression that relates the physical dimension of the object, the dimension of its image, and the basic parameters of the camera objective:

h = HKpƒ(L0 + ƒ)/[L0(l + ƒ)],

where h is the dimension of the image of the object; H, the physical dimension of the object; Kp, the proportionality factor between pixels and millimeters; L0, the distance from the object-side focal point of the objective to an object in sharp focus; ƒ, the focal distance of the objective; and l, the distance from the object-side focal point of the objective to the object. The coefficient Kp depends on the number of pixels along the horizontal and the vertical and on the physical dimensions of the CCD matrix:

Kp = Kpv = mv/CCDv = Kph = mh/CCDh,

where mv and mh are the numbers of pixels along the vertical and the horizontal, respectively, and CCDv and CCDh are the dimensions of the CCD matrix along the vertical and the horizontal, respectively.

Usually the coefficients along the vertical and the horizontal are equal. For example, for the SONY XC-ES50 television camera, with a CCD matrix measuring 1.27 cm (6.4 × 4.8 mm) and 576 × 768 pixels, the proportionality factor is determined as

Kp = Kph = 768/6.4 = Kpv = 576/4.8 = 120 pixels/mm.
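As a numerical check on this imaging relation, a short Python sketch follows. The parameter values (ƒ = 25 mm, Kp = 120 pixels/mm, working distance L0 = 107.56 mm) are those quoted elsewhere in the article; the function names are illustrative only.

```python
from math import isclose

# Imaging model of the sensor: h = H * Kp * f * (L0 + f) / (L0 * (l + f)),
# with the camera coefficient Kc = Kp * f.

def image_size_px(H_mm, l_mm, L0_mm=107.56, f_mm=25.0, Kp=120.0):
    """Image size h (pixels) of an object of physical size H_mm (mm)
    at distance l_mm from the object-side focal point of the objective."""
    return H_mm * Kp * f_mm * (L0_mm + f_mm) / (L0_mm * (l_mm + f_mm))

def object_size_mm(h_px, l_mm, L0_mm=107.56, f_mm=25.0, Kp=120.0):
    """Inverse of image_size_px: recover the physical size from pixels."""
    Kc = Kp * f_mm
    return h_px * L0_mm * (l_mm + f_mm) / (Kc * (L0_mm + f_mm))

# Pixel scale of the SONY XC-ES50 example: 768 px / 6.4 mm = 576 px / 4.8 mm.
assert 768 / 6.4 == 576 / 4.8 == 120.0

# At the reference distance l = L0 the model reduces to h = H * Kc / L0.
h = image_size_px(10.0, 107.56)
assert isclose(h, 10.0 * 120.0 * 25.0 / 107.56, rel_tol=1e-12)
assert isclose(object_size_mm(h, 107.56), 10.0, rel_tol=1e-12)
```

The inverse function is what the sensor actually uses: it converts a measured pixel extent back into millimeters at a known distance.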


Fig. 1. Layout of laser sensor.


The expression for the dimension of the image of the object may be rewritten in the following form:

h = HKpƒ(L0 + ƒ)/[L0(l + ƒ)] = HKc(L0 + ƒ)/[L0(l + ƒ)],     (1)

where Kc = Kpƒ.

Determining the Position of a Plane in Space Using the Laser Sensor. To determine the position of a plane arbitrarily oriented in space in the coordinate system of the laser sensor, it is necessary to use at least two positions of the sensor with different turning angles ϕ. To simplify the computations, we select ϕ1 = 0° and ϕ2 = 90° (Fig. 2). In the former case, we find the position of the line YPZP (cf. Fig. 2a).

In parametric form, the plane is specified by the coordinates XP, YP, and ZP, whence the parametric equation of the plane assumes the form

x/XP + y/YP + z/ZP = 1,

where x, y, and z are three-dimensional coordinates.

Let us determine the coefficients XP, YP, and ZP by analyzing Fig. 3. We write out the following relationships (cf. Fig. 3a):

l = L0 – ZPcosα;  H = ZPsinα.

Fig. 2. Determining the position of a plane arbitrarily oriented in space for different turning angles ϕ of the sensor: a) ϕ = 0°; b) ϕ = 90°.

Fig. 3. Plane ZOY (a) and image of laser line (b).


Substituting these expressions into (1), we obtain

h = KcZPsinα(L0 + ƒ)/[L0(L0 + ƒ – ZPcosα)],

whence we find

ZP1 = hIXL0(L0 + ƒ)/[hIXL0cosα + Kc(L0 + ƒ)sinα].

The distance from the point YP to the object-side focal point of the camera objective is equal to L0, hence

YP = hIYL0(L0 + ƒ)/[Kc(L0 + ƒ)] = hIYL0/Kc.

The position of the laser line in the plane relative to the origin of the coordinate system of the laser sensor is determined by YP and ZP. To find the position and orientation in space of the entire plane, an analysis of yet another image is needed, one that may be obtained by rotating the sensor by an angle ϕ = 90° (cf. Fig. 3b).

The expression for ZP will assume the analogous form

ZP2 = hIX2L0(L0 + ƒ)/[hIX2L0cosα + Kc(L0 + ƒ)sinα],

whence ZP is determined as ZP = (ZP1 + ZP2)/2.

For XP, we obtain the following expression:

XP = hIY2L0/Kc.

The values of the coordinates XP, YP, and ZP entirely determine the plane situated arbitrarily in space. The slope of the plane is

ϕ = arctan(ZP/YP).

Calibration of Laser Sensor. Most of the parameters of the model of the laser sensor, such as α, Kc, ϕ, and ƒ, are known with a high degree of precision; however, the working distance L0 is unknown and must be determined. Direct measurement of L0 is a difficult task, hence we apply a procedure using a calibration target, which is shown in Fig. 4.

The calibration target used as the reference standard is a precisely manufactured object with faces situated at different distances from the datum surface. The dimensions of the calibration target are shown in Fig. 4. The step of the target faces, adopted as the reference dimension in conducting the calibration, amounts to 5 mm ± 1 µm.

Fig. 4. Calibration target.
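The plane-position relations derived above lend themselves to a direct numerical check. The sketch below implements the forward model and its inversion; the sensor parameters are those quoted in the article, but the test offsets ZP and the function names are illustrative.

```python
from math import sin, cos, atan, degrees, radians, isclose

F = 25.0            # focal distance f, mm
KC = 3000.0         # camera coefficient Kc = Kp * f
L0 = 107.56         # working distance, mm
ALPHA = radians(30.2)

def h_from_zp(zp):
    """Forward model: image coordinate hIX (pixels) of the laser line for a
    plane crossing the Z-axis at ZP (mm); here l = L0 - ZP*cos(alpha)."""
    return KC * zp * sin(ALPHA) * (L0 + F) / (L0 * (L0 + F - zp * cos(ALPHA)))

def zp_from_h(h_ix):
    """Inverse: ZP = hIX*L0*(L0 + f) / (hIX*L0*cos(a) + Kc*(L0 + f)*sin(a))."""
    return h_ix * L0 * (L0 + F) / (h_ix * L0 * cos(ALPHA) + KC * (L0 + F) * sin(ALPHA))

def yp_from_h(h_iy):
    """YP = hIY * L0 / Kc: the laser line lies at the reference distance L0."""
    return h_iy * L0 / KC

# The inversion recovers ZP exactly (up to floating-point rounding).
for zp_true in (5.0, 12.5, 20.0):
    assert isclose(zp_from_h(h_from_zp(zp_true)), zp_true, rel_tol=1e-12)

# Slope of the plane in the ZOY section: phi = arctan(ZP / YP).
assert isclose(degrees(atan(10.0 / 10.0)), 45.0)
```

The round trip confirms that the quoted inversion formula is the exact algebraic solution of the forward model, not an approximation.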


When the laser beam illuminates the calibration target, several bright lines become visible on the image obtained by the television camera. The distances between these lines may be used for purposes of calibration. The target must be placed at the coordinate origin of the laser sensor, and the image of the line on the datum surface of the target must lie at the center of the image. When measurements are conducted in the operating mode, the object (if it is the dimensions of an automobile chassis that are to be controlled, the surfaces of the flanges and the spherical bearings serve as the “object”) is also found near the coordinate origin of the sensor. This is important for ensuring that the conditions under which calibration is performed are identical to the conditions typical of the operating mode of the system, and also for ensuring the reliability of the calibration data. This is achieved by precise positioning of the calibration target along the Z-axis of the sensor.

Using (1) and the fact that the dimension and distance for the ith laser line on the target are connected by the relationships

Hi = idZsinα;  li = L0 + idZcosα,

where dZ is the step of the target faces, we obtain

hi = KcidZsinα(L0 + ƒ)/[L0(L0 + idZcosα + ƒ)].

The quadratic equation for L0 has only the single positive root

L0 = {KcidZsinα – hIi(idZcosα + ƒ) + √([KcidZsinα – hIi(idZcosα + ƒ)]² + 4hIiƒKcidZsinα)}/(2hIi),

where hIi is the measured distance of the ith line from the center of the image.

For the laser sensor being considered here, the basic parameters of the model are as follows: ƒ = 25 mm, α = 30.2°, and Kc = Kpƒ = 3000.

The results of calibration of the laser sensor are presented in Table 1. Excluding line 2, which produces a large error due to the small distance, we calculate the mean value L0 = 107.56 mm.

TABLE 1. Results of Calibration of Laser Sensor

Parameter                       Line 1   Line 2   Line 3   Line 4   Line 5   Line 6
X-coordinate, pixels            384.05   451.42   515.52   575.87   632.56   685.62
Distance from center, pixels      0       67.37   131.47   191.82   248.51   301.57
Calculated value of L0, mm        –      108.486  107.768  107.488  107.426  107.568

Thus, a 3D machine vision system may be used as a sensor for high-precision measurements of large-scale parts. To achieve the required precision, calibration of the sensor is necessary; calibration may be performed using a calibration target.

REFERENCES

1. R. K. Gilbert, Quality Magazine, 18 (August 2002).
2. N. N. Markov and G. M. Ganevskii, Construction, Design, and Operation of Measurement Instruments and Devices [in Russian], Mashinostroenie, Moscow (1981).
3. I. I. Dunin-Barkovskii, Prib. Sist. Upravl., No. 8, 12 (1994).
4. F. Quafe, Interaction of Robots with the Environment [Russian translation], Mir, Moscow (1985).


5. I. I. Dunin-Barkovskii and V. A. Klevalin, Mikroprotses. Sredst. Sist., No. 3, 15 (1990).
6. K. Park et al., “Intelligent robots and systems,” IEEE/RSJ Intern. Conf. (1992), Vol. 3, p. 2080.
7. W. Brunk, “Geometric control by industrial robots,” Proc. 2nd Intern. Conf. on Robot Vision and Sensory Controls, Stuttgart (1982), p. 223.
8. T.-S. Shen, J. Huang, and C.-H. Meng, “Multiple-sensor integration for rapid and high-precision coordinate metrology,” IEEE/ASME Intern. Conf. on Advanced Intelligent Mechatronics, Atlanta (1999), p. 908.
9. G. S. Landsberg, Optika [in Russian], Nauka, Moscow (1976).
