8/8/2019 2005p3ps055-Welding Research Institute
A REPORT
ON
LASER VISION BASED SEAM TRACKING SYSTEM FOR WELDING
AUTOMATION
BY
Name of the Student: 1) Adithya Reddy Gangidi I.D. No.: 1) 2005P3PS055
AT
BHARAT HEAVY ELECTRICALS LTD, TRICHY
A Practice School-I station of
BIRLA INSTITUTE OF TECHNOLOGY AND SCIENCE, PILANI
JULY, 2007
A REPORT
ON
LASER VISION BASED SEAM TRACKING SYSTEM FOR WELDING AUTOMATION
BY
Names of the Student I.D. No. Discipline
1) Adithya Reddy Gangidi 2005P3PS055 Electrical & Electronics
Prepared in partial fulfillment of the
Practice School-I Course
AT
BHARAT HEAVY ELECTRICALS LTD., TRICHY
A Practice School-I station of
BIRLA INSTITUTE OF TECHNOLOGY AND SCIENCE, PILANI
JULY 2007
BIRLA INSTITUTE OF TECHNOLOGY AND SCIENCE
PILANI (RAJASTHAN)
Practice School Division
Station: Bharat Heavy Electricals Ltd. Centre: Trichy
Duration: 7 weeks Date of Start: 24th May, 2007
Date of Submission: 15th July, 2007
Title of the Project: Laser vision based seam tracking system for welding automation
ID No.: 2005P3PS055
Name: Adithya Reddy Gangidi
Discipline: B.E. (Hons.) Electrical and Electronics
Name of Guide: Dr. S. Manoharan
Designation of Guide: Senior DGM, WRI, BHEL
Name of the PS Faculty: Dr. Srinivasan
Key Words: Welding automation, Image processing, Machine Vision
Project Areas: Machine Vision
Abstract: PC based machine vision systems can be used for online control of the welding
process. In this project, a seam tracking system which uses a high intensity laser source, a
camera and a PC has been developed. The system takes images at regular intervals of time and estimates the edge gap and root gap of a groove. The position of the edge centre is thus
calculated and feedback is given to the torch so as to keep it at the edge centre. Measures to
further reduce the processing time are proposed.
Signature of Students Signature of PS faculty
Date: Date:
ABSTRACT:
Visual checking of weld joints by operators is no longer sufficiently reliable or cost effective.
Seam tracking technologies that use machine vision are often cost effective and they enhance
the productivity of the welding process. The project deals with designing such a seam
tracking system.
PC based machine vision systems can be used for online control of the welding process. In this
project, a seam tracking system which uses a high intensity laser source, a camera and a PC has
been developed. The system takes images at regular intervals of time and estimates the edge
gap and root gap of a groove. The position of the edge centre is thus calculated and feedback is
given to the torch so as to keep it at the edge centre. Measures to further reduce the processing
time are proposed.
The images are processed using MATLAB 7.0 software and the corresponding feedback is
given to the stepper motor. Programming is done with the image acquisition, image processing and
data acquisition toolboxes of MATLAB. The motor is interfaced to the parallel or serial port
through an interface circuit. The system is made user friendly through a graphical user
interface. Image acquisition and control are achieved in 0.41 seconds (on a PC with a 1.66 GHz
Core Duo processor), which is fast enough for the application.
CONTENTS:
Topic Page No.
A. Cover Page 1
B. Title page 2
C. Certificate 3
D. Acknowledgement 4
E. Abstract 5
F. Contents 6
1. Introduction 7
2. MATLAB - Overview 8
3. Apparatus Used 10
4. The Algorithm 12
5. Implementation of Algorithm in MATLAB 12
6. Software for Stepper Motor Control 17
7. Hardware Interface between Stepper Motor and PC 17
8. Conclusion 22
9. References 23
10. Appendix 24
11. Index 31
1. INTRODUCTION:
The use of robots in the manufacturing industry has increased rapidly during the past decade. Arc
welding is an actively growing area and many new procedures have been developed for use
with new lightweight, high strength alloys. One of the basic requirements for such
applications is seam tracking. Seam tracking is required because of the inaccuracies in joint
fit-up and positioning, warpage, and distortion of the workpiece caused by thermal
expansion and stresses during welding. These effects cannot be compensated for by most
robotic welding machines, especially those using open-loop control, and frequently lead to
poor welding quality. As a consequence, to maintain weld integrity, automated seam tracking
based on real-time feedback is required.
Robotic welding sensors, researched during the past few years, have used a variety of
different techniques for seam tracking. The most commonly used techniques include acoustic,
magnetic, electrical, and mechanical methods. The electrical through-the-arc sensor based on
the welding arc properties is the dominant method, where the current (voltage) through the
weld arc is used to control the position of the welding torch. However, the preferred systems
are based on optical or visual sensors.
These vision based seam trackers have many advantages. The sensor system is less sensitive
to electrical and magnetic interferences from the welding arc compared to the traditional
through-the-arc method. Also the vision sensor can provide more information about the joint
than merely the seam position. It is possible for the same sensor system to achieve seam
tracking and also obtain dimensional parameters about the seam, during a single pass.
With the sensor mounted on the welding torch, the seam tracking and control can be online.
The problems encountered by early vision system applications were the speed of processing
and the cost of the hardware needed to implement such systems. These have since become less
important factors, as developments in parallel processing have
decreased both the cost of the vision system hardware and the processing time.
PC based machine vision systems can be used for online control of the welding process. In this
project, a seam tracking system which uses a high intensity laser source, a camera and a PC has
been developed. The system takes images at regular intervals of time and estimates the edge
gap and root gap of a groove. The position of the edge centre is thus calculated and feedback is
given to the torch so as to keep it at the edge centre. The MATLAB platform is used for coding.
2. MATLAB:
2.1 Introduction:
MATLAB is a high-performance language for technical computing. It integrates
computation, visualization, and programming in an easy-to-use environment where problems
and solutions are expressed in familiar mathematical notation.
Typical uses include
Math and computation
Algorithm development
Data acquisition
Modeling, simulation, and prototyping
Data analysis, exploration, and visualization
Scientific and engineering graphics
Application development, including graphical user interface building
MATLAB is an interactive system whose basic data element is an array that does not require
dimensioning. This allows you to solve many technical computing problems, especially those
with matrix and vector formulations, in a fraction of the time it would take to write a program
in a scalar non-interactive language such as C or FORTRAN.
The name MATLAB stands for matrix laboratory. MATLAB was originally written to
provide easy access to matrix software developed by the LINPACK and EISPACK projects.
MATLAB has evolved over a period of years with input from many users. In university
environments, it is the standard instructional tool for introductory and advanced courses in
mathematics, engineering, and science. In industry, MATLAB is the tool of choice for high-
productivity research, development, and analysis.
MATLAB features a family of add-on application-specific solutions called toolboxes. Very
important to most users of MATLAB, toolboxes allow you to learn and apply specialized
technology. Toolboxes are comprehensive collections of MATLAB functions (M-files) that
extend the MATLAB environment to solve particular classes of problems. Areas in which
toolboxes are available include signal processing, image acquisition, image processing,
data acquisition, control systems, neural networks, fuzzy logic, wavelets, simulation, and
many others.
2.2 The MATLAB System:
The MATLAB system consists of five main parts:
1. Development Environment: This is the set of tools and facilities that help you use
MATLAB functions and files. Many of these tools are graphical user interfaces. It includes
the MATLAB desktop and Command Window, a command history, an editor and debugger,
and browsers for viewing help, the workspace, files, and the search path.
2. The MATLAB Mathematical Function Library: This is a vast collection of computational
algorithms ranging from elementary functions, like sum, sine, cosine, and complex
arithmetic, to more sophisticated functions like matrix inverse, matrix eigenvalues, Bessel
functions, and fast Fourier transforms.
3. The MATLAB Language: This is a high-level matrix/array language with control flow
statements, functions, data structures, input/output, and object-oriented programming
features.
4. Graphics: MATLAB has extensive facilities for displaying vectors and matrices as
graphs, as well as annotating and printing these graphs. It includes high-level functions for
two-dimensional and three-dimensional data visualization, image processing, animation, and
presentation graphics.
5. The MATLAB Application Program Interface (API): This is a library that allows you to
write C and FORTRAN programs that interact with MATLAB.
3. APPARATUS:
3.1 Apparatus used:
Laser line source
Solid state camera
PC with MATLAB 7.0
Electronic components for interfacing the motor to PC
Stepper motor
1394 cable, parallel or serial port connector
Fig 1: Apparatus of the system with interconnections
3.2 Description of apparatus arrangement:
A laser beam is projected onto a measurement surface, where it is scattered from the surface and
its image is detected by an optical detector, a camera. The images contain the detail of the groove
measurements. The PC processes the images, infers the details and gives feedback to the motor.
3.3 Laser Source:
A monochromatic laser source is used so that the ambient light and weld torch light will not
have much effect on the source light. Thus information at a precise point can be obtained
correctly.
3.4 Solid state camera:
A Panasonic solid state camera is used for image acquisition. It has a maximum
resolution of 2048 x 1536; however, we grab images for our purpose at a resolution of
640 x 480. It is connected to the PC through a frame grabber interface. Alternatively, a 1394 cable
can be used to interface the camera with the PC.
3.5 PC with MATLAB 7.0:
The PC used is a standalone computer (not connected to the LAN) with a Pentium 4
processor and MATLAB 7.0 installed. MATLAB is the computational tool. The image
acquisition, image processing and data acquisition toolboxes of MATLAB are used for
programming in the current context.
3.6 Interface between PC and stepper motor:
The electronic components are soldered on a PCB so as to make a secure interface between
the PC's port and the stepper motor. The interface is connected to the port through a male
connector. A detailed description of the circuit diagram is given later.
3.7 Stepper motor:
The stepper motor used is rated at 6 V with a full step angle of 1.8 degrees. The four output lines of the
interface are taken in by the stepper motor. It corrects the torch position so as to align it
with the edge centre.
4. ALGORITHM:
Calibrate the image with the help of a set of blocks of known dimensions.
Get the image and store it into an array variable.
Convert the true color image into gray scale image.
Select a region of interest for further processing.
Filter the image by removing noise.
Detect the edge of the images using edge detection operator.
Once edge detection is done, execute the MATLAB code to find the various
parameters required.
Calculate the pixel values of edge and root centers.
Estimate the amount of deviation in successive values of edge center using the
calibration measurements.
Give corresponding feedback to the motor so as to correct the deviation.
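The loop formed by these steps (measure, compare, correct) can be sketched as a toy simulation. The sketch below is illustrative Python only, not the project's MATLAB code, and the seam positions are made-up pixel values:

```python
def track(seam_positions, gain=1.0):
    """Toy closed loop: each cycle measures the seam centre from the image,
    compares it with the current torch position, and moves the torch by the
    deviation (the job the stepper motor does in the real system)."""
    torch = seam_positions[0]          # torch starts on the seam
    history = []
    for seam in seam_positions:
        deviation = seam - torch       # what the image processing reports
        torch += gain * deviation      # feedback given to the motor
        history.append(torch)
    return history

# With unity gain the torch ends every cycle back on the drifting seam centre.
print(track([100, 102, 105, 103]))
```

With a gain below one the torch would lag the seam; the real system instead converts the full measured deviation into motor steps each cycle.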
5. IMPLEMENTATION OF ALGORITHM USING MATLAB:
5.1 Calibration:
The result of processing the acquired image is the deviation of the edge centre value in terms of
the number of pixels. But this can't be directly given as an input to the feedback mechanism, so
we need to know the length that each pixel corresponds to. For this purpose, before the process
starts, we place a specimen of known length and measure the number of pixels it
corresponds to. This is repeated for different lengths and the results are combined to
obtain a calibration curve.
The figure below (fig. 2) shows the calibration process. The acquired image of a
specimen of known length is processed and converted to a binary image. From the profile of the edge we get
the length in terms of pixels. The process is repeated for various lengths and thus a
calibration graph is made. A graphical user interface communicates between the user and
the MATLAB platform.
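For a straight-line calibration through the origin, the curve collapses to a single pixels-per-mm factor. A minimal Python sketch with hypothetical readings (the actual calibration in this project is done in MATLAB through the GUI):

```python
def pixels_per_mm(samples):
    """Least-squares slope through the origin of measured pixel counts
    versus known specimen lengths: the calibration factor."""
    num = sum(px * mm for mm, px in samples)
    den = sum(mm * mm for mm, _ in samples)
    return num / den

# Hypothetical readings: (known length in mm, measured length in pixels).
samples = [(5.0, 40), (10.0, 80), (20.0, 160)]
scale = pixels_per_mm(samples)
print(scale)          # calibration factor in pixels per mm
print(24 / scale)     # a 24-pixel deviation expressed in mm
```

Dividing any later pixel deviation by this factor gives the physical correction to hand to the feedback mechanism.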
Figure 2: MATLAB window showing the calibration process. The difference in the points of sudden rise and fall in the graph gives the length of the specimen in terms of pixels.
5.2 Image acquisition:
The image is taken with the camera and transferred to the MATLAB platform with the
help of the winvideo adaptor available in recent versions of MATLAB. In order to get
images into the platform, we need to create a video input object in MATLAB.
imaqhwinfo is the function used to list the adaptors available on the system.
vid = videoinput('winvideo',1,'RGB24_352x288') creates a video input object, taking the
adaptor name, device ID and the format as inputs. Once the object is created, we use
i = getsnapshot(vid); to take a snapshot and store it in the array variable i.
5.3 Conversion to gray scale image:
RGB images are converted to grayscale by eliminating the hue and saturation information
while retaining the luminance. This is achieved with the command rgb2gray(RGB).
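rgb2gray weights the R, G and B channels by the standard NTSC luminance coefficients; a single-pixel Python illustration of the same formula:

```python
def rgb_to_gray(r, g, b):
    """Luminance weighting used by rgb2gray:
    0.2989*R + 0.5870*G + 0.1140*B, rounded to an integer gray level."""
    return round(0.2989 * r + 0.5870 * g + 0.1140 * b)

print(rgb_to_gray(255, 255, 255))  # white stays at the top of the scale: 255
print(rgb_to_gray(255, 0, 0))      # pure red maps to a fairly dark gray
print(rgb_to_gray(0, 255, 0))      # green dominates the luminance
```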
5.4 Selection of region of interest:
Processing unnecessary regions of the image will increase the processing time. Further, as our
region of interest remains almost constant, we can crop the region of interest and do the
processing on the selected region.
This can be done with the function imcrop(I, [region of interest]). Alternatively, if we are
sufficiently sure of the region of interest, we can acquire only that specific region of the image.
This substantially brings down the processing time and can be done by setting the
ROIPosition property of the video input object, e.g. vid.ROIPosition = [ ].
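The gain from cropping is easy to see by counting pixels. A Python sketch of a rectangular crop on a row-major image (imcrop and ROIPosition do this natively in MATLAB; the frame size and ROI below are just examples):

```python
def crop_roi(image, x, y, width, height):
    """Return the width-by-height region of interest whose top-left corner
    is at column x, row y, mirroring imcrop's [x y width height] rectangle."""
    return [row[x:x + width] for row in image[y:y + height]]

# A 640x480 frame cut down to a 200x100 ROI: roughly 15x fewer pixels to process.
frame = [[0] * 640 for _ in range(480)]
roi = crop_roi(frame, 50, 100, 200, 100)
print(len(roi), len(roi[0]))
```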
5.5 Filtering the image:
Care has to be taken that the image is free from noise (due to surface reflections) before
performing edge detection. Thus filtering is a very important and sensitive step in the
algorithm.
Fig 3: Difference between the unfiltered and filtered images after edge detection
5.6 Edge detection:
Now we need to carry out edge detection on the filtered image so that we get a clear profile
of the focused light. This helps us in getting the required parameters.
In MATLAB, edge detection is carried out with the statement s = edge(bw,'canny');. The
inputs to the edge detection function are the input image array and the name of the algorithm.
The Canny detection algorithm suits our purpose.
Fig.4: MATLAB user window showing the original image, the image after filtering, the image after
edge detection, and the measured values of edge and root gap.
5.7 Parameter evaluation:
The output of edge detection is a binary image with the required edges detected. Now
we need an exact graph of the edge for parameter evaluation. The parameter ultimately
required from each image is the location of the edge centre.
For graphing the edge, we search for a white pixel in each column of pixels and store the row
number of the white pixel in an array. In our case we begin the search from the bottom of a
column and continue until we find the first white pixel, in the n-th row; it is this n that we
store into an array variable. The array variable then readily graphs the edge profile.
At the edges there is a large variation in the slope of the graph. This fact helps us evaluate the pixel
values of the two edges and take their mean to get the pixel value of the edge centre. This
procedure is repeated for all the images. Looping constructs like for ... end and conditional
statements like if ... end are used for this purpose.
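The column scan and the slope test translate directly into any language; below is a Python sketch of the same idea on a synthetic binary image and profile (the 5-column span and jump threshold of 3 mirror the constants used in the appendix code):

```python
def edge_profile(s):
    """For each column of binary image s (list of rows), record the height of
    the lowest white pixel measured from the bottom: the laser-line profile."""
    rows, cols = len(s), len(s[0])
    t = [0] * cols
    for j in range(cols):
        for i in range(rows - 1, -1, -1):   # search upward from the bottom
            if s[i][j] == 1:
                t[j] = rows - i
                break
    return t

def edge_centre(t, span=5, jump=3):
    """The first column where the profile drops by more than `jump` over
    `span` columns is one edge; the last column where it rises likewise is
    the other. Their midpoint is the edge centre."""
    left = right = None
    for p in range(span, len(t) - span):
        if t[p] - t[p + span] > jump and left is None:
            left = p
        if t[p] - t[p - span] > jump:
            right = p
    return (left + right) / 2

# Synthetic profile: a flat surface with a 20-column groove in the middle.
profile = [10] * 20 + [2] * 20 + [10] * 20
print(edge_centre(profile))   # -> 29.5, the middle of the groove
```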
5.8 Feedback:
For every image, the edge centre value is compared to that of the previous image, and the
difference in values, converted with the help of the calibration data, is given as a correction to the feedback
mechanism. The stepper motor, by moving in either the clockwise or anticlockwise direction,
corrects the torch position. Thus the torch position is always aligned with the edge centre.
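Combining the calibration factor with the motor's travel per step gives the correction as a signed step count. A Python sketch (the 8 px/mm and 0.05 mm/step figures are hypothetical, not measured values from this project):

```python
def correction_steps(new_centre_px, old_centre_px, px_per_mm, mm_per_step):
    """Convert the pixel deviation between successive edge-centre values into
    a signed number of stepper steps; the sign selects the direction."""
    deviation_mm = (new_centre_px - old_centre_px) / px_per_mm
    return round(deviation_mm / mm_per_step)

# Hypothetical figures: 8 px/mm from calibration, 0.05 mm of torch travel per step.
print(correction_steps(172, 160, 8.0, 0.05))   # positive: step one way
print(correction_steps(160, 172, 8.0, 0.05))   # negative: step the other way
```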
6. SOFTWARE FOR STEPPER MOTOR CONTROL:
For controlling the stepper motor through the port, an output object needs to be created in
MATLAB using parport = digitalio('parallel','LPT1');
The stepper motor moves clockwise when the port outputs [1 1 0 0; 0 1 1 0; 0 0 1 1; 1 0 0 1]; that is,
four sets of four-bit data are sent out in sequential order, as shown in fig. 5. If the order is
reversed, the motor moves anticlockwise.
For output and input through the port, putvalue(parport,pval); and gval = getvalue(parport);
are the functions used. Between successive outputs, the gap should be a minimum of one
millisecond. This can be achieved with the pause or tic ... toc commands. The complete code is given in the
appendix.
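The excitation order is the whole trick: stepping forward through the four patterns turns the motor one way, stepping backward turns it the other. A small Python model of just the pattern ordering (putvalue, getvalue and the millisecond pause are MATLAB/hardware concerns, not modelled here):

```python
STEPS = [(1, 1, 0, 0), (0, 1, 1, 0), (0, 0, 1, 1), (1, 0, 0, 1)]

def drive_sequence(n_steps, clockwise=True):
    """Return the list of 4-bit patterns to put on the port, cycling through
    the full-step table forward (clockwise) or backward (anticlockwise)."""
    idx, out = 0, []
    for _ in range(n_steps):
        out.append(STEPS[idx])
        idx = (idx + 1) % 4 if clockwise else (idx - 1) % 4
    return out

print(drive_sequence(5))          # the 5th step wraps back to (1, 1, 0, 0)
print(drive_sequence(2, False))   # reversed order for anticlockwise rotation
```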
7. HARDWARE INTERFACE BETWEEN STEPPER MOTOR AND PC:
7.1. Stepper motor:
A stepper motor is a digital motor: it moves in discrete steps as it traverses
through 360 degrees, and so has the advantage of precise rotation. The stepper motor
we are using has a step angle of 1.8 degrees. It has four input wires which need to be
energized in a particular order so as to move the motor in the desired fashion.
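A 1.8-degree full step fixes the angular resolution; the arithmetic in Python:

```python
FULL_STEP_ANGLE = 1.8   # degrees per full step, from the motor's rating

def steps_for_angle(angle_deg):
    """Number of full steps needed to rotate through a given angle."""
    return round(angle_deg / FULL_STEP_ANGLE)

print(steps_for_angle(360))   # 200 full steps per revolution
print(steps_for_angle(90))    # a quarter turn takes 50 steps
```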
Fig.5: Stepper motor inputs. Putting (1100), (0110), (0011) and (1001) on the four wires in this order
gives clockwise rotation. One output, for example (0011), moves the motor by one step.
7.2. The interface circuit components and their purpose:
The following hardware is required for the interface circuit:
1. Parallel or serial port connector: These are male connectors; the colour configuration helps us identify the appropriate wire and connect it to the buffer.
2. RS-232 to TTL level converter: This is the MAX-232 converter. We need it when
we use the serial port to give the feedback to the stepper motor. As all the electronic
components follow TTL level logic, the RS-232 output needs to be converted to
TTL level.
Fig.6: The pin configuration of MAX232 converter
3. Serial to parallel converter IC EDE1400: If the serial port is used, the serial data is first
converted to parallel data using the IC EDE1400. The rest of the circuit remains the same for
serial and parallel port connections.
Fig.7: Serial to parallel converter IC.
4. Four 1K resistors: R accounts for the potential drop and acts as an extra resistance which guarantees that the transistor does not conduct when no signal is fed.
5. Four TIP 120 transistors: When the input is on, there is a base current in the transistor which gets amplified to produce the collector current. When the collector current reaches a threshold level, the stepper coil is in the 1 state.
6. Four 1N4001 diodes: If the coil is energized with DC, a diode is installed across
the coil to dissipate the energy from the collapsing magnetic field at deactivation, which
would otherwise generate a voltage spike and might damage circuit components.
Fig.8: Charging and discharging of the coil, before and after placing the diode
7. 5 V source for biasing: For biasing the transistors we need a constant voltage source; this source serves the purpose.
7.3 The circuit diagram:
Fig.9: The interface circuit between parallel port and the stepper motor.
For the serial port interface, the data has to be converted to parallel with the EDE1400 after it is
converted to TTL level logic using the MAX-232.
With the help of the interface circuit, by controlling the state of the parallel port pins we can obtain
voltages of 0 and +V (6 V) in the desired sequence and thus control the rotation of the motor.
8. CONCLUSION:
In this project, a seam tracking system which uses a high intensity laser source, a camera and a
PC has been developed. The system is made user friendly through a graphical user interface.
Image acquisition, processing and control are achieved in 0.41 seconds (on a PC with a 1.66
GHz Core Duo processor), which is fast enough for the application. Parallel programming
can be used to further reduce the processing time.
9. REFERENCES:
1. SMITH, J.S., and LUCAS, J.: 'A vision-based seam tracker for butt-plate TIG
welding', J. Phys. E: Sci. Instrum., 1989, 22, pp. 739-744
2. UMEAGUKWU, C., and McCORMICK, J.: 'Investigation of an array technique for
robotic seam tracking of weld joints', IEEE Trans. Ind. Electron., 1991, 38, (3), pp.
223-229
3. CHAMBERS, S.P., SMITH, J.S., and LUCAS, J.: 'A real time vision system for
industrial control using transputers', Electrotechnol. (IEEIE), February/March 1991,
pp. 32-37
4. WIESE, D.R.: 'Laser triangulation sensors: A good choice for high speed
inspection', I&CS Control Technol. Eng. Mngmt., 1989, 62, (9), pp. 21-29
5. MATLAB Help files
10. APPENDIX: MATLAB CODE:
1. Initialization of video: contents of file initial.m
%---------------------initialization of video for calibration---------
vid=videoinput('winvideo',1,'RGB24_352x288')
%creation of video input objects
preview(vid)
%previews the information shot
2. Calibration: contents of file calibration.m
%---------------------once the arrangement is ready, through the GUI, execute this code
p= getsnapshot(vid);
% gets the snap
figure,imview(p);
% for viewing pixel numbers, this will help in cropping
pe=imcrop(p)
% manual cropping by user
p1=rgb2gray(pe);
%converting normal image into gray scale
background=imopen(p1,strel('square',95));
%SE = strel('square',W) creates a square structuring element whose width is W pixels.
%IM2 = imopen(IM,SE)----------->
%performs morphological opening on the grayscale IM with the structuring element SE
p4=imsubtract(p1,background);
%This command gives the difference of respective co-ordinates of both images
p5=imadjust(p4,stretchlim(p4),[0 1]);
h=ones(5,5)/25;
p6=imfilter(p5,h);
% filters an image with a 5-by-5 filter with equal weights, called an averaging filter
level=graythresh(p6);
%level = graythresh(I) computes a global threshold (level) that can be used to convert an
%intensity image to a binary image
bw=im2bw(p6,.48);
%converts normal image to binary image
s=edge(bw,'canny');
%carries out edge detection using the specified algorithm
imview(s)
%----------------------code to plot the edge into array
t=0;
c=0; %count of white pixels found in the current column
trow=0;
trow1=0;
[r1,c1]=size(s);
for j=1:c1
n=0;
for i=1:r1
if s(i,j)==1
c=c+1;
if c==1
trow=i;
tcol=j;
elseif c==2
trow1=i;
tcol1=j;
end
c=1;
end
end
distance(j)=r1-trow1;
t(1,j)=[distance(j)];
c=0;
end
%---------------------close the preview window
closepreview;
3. Image processing and measurement: contents of measurement.m file:
%initialise the video with the initial.m file; you now have the image as array i
old=(p1+p4)/2; %edge centre value from the previous frame (must be lowercase: MATLAB is case sensitive)
i1=rgb2gray(i);
i2=imcrop(i1);
background=imopen(i2,strel('square',95));
i4=imsubtract(i2,background);
i5=imadjust(i4,stretchlim(i4),[0 1]);
h=ones(5,5)/25;
i6=imfilter(i5,h);
level=graythresh(i6);
bw=im2bw(i6,.48);
s=edge(bw,'canny');
imview(s)
t=0;
c=0; %count of white pixels found in the current column
trow=0;
trow1=0;
[r1,c1]=size(s);
for j=1:c1
n=0;
for i=1:r1
if s(i,j)==1
c=c+1;
if c==1
trow=i;
tcol=j;
elseif c==2
trow1=i;
tcol1=j;
end
c=1;
end
end
distance(j)=r1-trow1;
t(1,j)=[distance(j)];
c=0;
end
k=0;
l=0;
for p=6:(c1-6)
if (t(p)-t(p+5))>3
k=k+1;
if k==1
p1=p;
end
p2=p+5;
end
if (t(p)-t(p-5))>3
l=l+1;
if l==1
p3=p-5;
end
p4=p;
end
end
new=(p1+p4)/2
4. Stepper motor control: contents of stepper.m
steps=[1 1 0 0; 0 1 1 0; 0 0 1 1; 1 0 0 1];
i=1;
k=1;
parport=digitalio('parallel','LPT1');
%creates a digital I/O object for the parallel port LPT1
line=addline(parport,0:3,'out');
%declares the output lines of parallel port
deviation=new-old
direction=sign(deviation)
%depending upon the sign of the deviation we give the input to the motor, to move it clockwise
%or anticlockwise
if direction==1
for j=1:deviation
pval=steps(i,:);
i=i+1;
if i>4
i=1;
end
putvalue(parport,pval);
%puts the row of the steps matrix specified by pval on the parallel port
gval = getvalue(parport);
tic
pause(0.001);
toc
end
end
if direction==-1
for j=1:(-deviation)
pval=steps(i,:);
i=i-1;
if i<1
i=4;
end
putvalue(parport,pval);
pause(0.001);
end
end
11. INDEX:
A
Acquisition 9
B
Biasing 20
C
Calibration 13, 14
Canny 17, 22
E
Edge detection 13, 15
Edge center 13, 17
G
Gray-scale image 13, 15, 24
L
Laser source 8, 11
M
MATLAB 6, 9
P
Parallel port 6
Parallel processing 6
S
Serial port 11
T
Tool-box 9