
COMPUTER GRAPHICS AND IMAGE PROCESSING 5, 68-105 (1976)

Experiments on Picture Representation Using Regular Decomposition

ALLEN KLINGER AND CHARLES R. DYER

Computer Science Department, School of Engineering and Applied Science, University of California, Los Angeles, California 90024

Communicated by A. Rosenfeld

Received December 23, 1974

The problem of building a computer-searchable data representation for a complex image and the effect of representation on algorithms for scene segmentation into regions is considered. A regular decomposition of picture area into successively smaller quadrants is defined, which involves logarithmic search. This hierarchical search and resulting picture representation are shown to enable rapid access of image data without regard to position, efficient storage, and approximate structural descriptions of constituent patterns. Examples involving solid geometrical objects and alphabetic characters are given.

1. INTRODUCTION

Processing pictures by computer involves two central problems, segmentation and recognition. The segmentation task involves identifying subsets or extracting objects of interest from a scanned (digitized) picture. Implementation of this phase (preprocessing) means defining a process or set of processes to reduce the amount of picture data, since digitized pictures contain far too many points for a meaningful description of a scene [8]. This report focuses on the development of a preprocessing technique which is general enough to cover a large class of problems, and yet easy to implement in terms of computational complexity, storage requirements, and extensibility.

To reduce the difficulty of subsequent recognition procedures, information must be discarded which is irrelevant to the current goal of classification. This data reduction can be achieved in two ways:

1. by restricting the problem domain to a limited number of possible inputs, and

2. through data reduction heuristics designed to eliminate particular kinds of irrelevant information.

Copyright © 1976 by Academic Press, Inc. All rights of reproduction in any form reserved.

Pattern recognition problems are goal-directed problem-solving activities which share the property of data "over-richness" and the need for a useful data representation with other areas of complex computer decision-making (artificial intelligence). In the pictorial domain, all decision activities involve search. One of us [1, 38] proposed to facilitate two-dimensional search by a regular decomposition procedure to delete large noninformative areas of a picture or other array. In this paper we show how the resulting condensed picture can be:

1. evaluated by a top-down approach using a new algorithm which retains more marginal boundary data,

2. processed from a resulting data structure into recognition-oriented units such as isolated objects or properties (symmetry, orientation).

Harmon [36] has shown that area-partitioned information is sufficient for people in the recognition of human faces. The human visual system has many low-level operations which occur for all contexts [9, 28]. These "front-end" operations reduce the combinatorics, and hence complexity, of human scene analysis by eliminating all but a relatively few areas in a search for objects in a scene. A subsidiary purpose of this paper is to show that the decomposition scheme of geographic partitioning yields similar front-end context-independent qualitative information regarding picture characteristics such as pattern orientations, sizes, and shapes.

Separation of objects from their background is basic in pattern recognition and scene analysis [8, 14, 29]. A large class of algorithms employs edge detection and line fitting procedures for direct object identification [30-32]. Others have used region-growing methods and subsequent identification of separate objects [19, 33-35]. However, such algorithms seem to suffer in two prominent ways. First, they involve a complex and exhaustive point-by-point search of the entire picture domain to identify objects in a scene. Second, these preprocessing techniques involve "bottom-up" segmentation: the use of raster point values to build up a global picture description. Local noise and other aberrations greatly influence the control and efficiency of such algorithms [19].

Regular decomposition has been employed by researchers in computer graphics, scene analysis, architectural design [6], and pattern recognition. Warnock's [3-5] hidden surface elimination algorithm subdivides successively finer picture squares while searching for areas which are simple enough to display graphically. SRI's mobile automaton [7] utilizes a "grid model" which similarly subdivides the automaton's visual environment to an arbitrary degree of precision for determining the feasibility of a proposed journey path.

To motivate the decomposition scheme we will discuss processing a digitized aerial photograph. Large areas of the original scene are rolling green hills with little information content. Most information is in a portion, such as a town, which can be viewed as a complex subproblem consisting of subsets which may be objects such as buildings, swimming pools, and parking lots. After these are assimilated the processing can move to other areas of the picture looking for subordinate structures such as an intercity road. This way of processing the photograph spends little or no time looking at simple areas. Complex areas define the contents of the photograph and are analyzable as subproblems, each requiring a solution. These subproblems are reduced into further subproblems until they are either solved or a time limit is reached.

The body of this presentation is devoted to the description of the improved regular-decomposition image-processing algorithm. It is used to obtain informative (complex) portions of a picture in several examples, for which hierarchical picture representations were obtained. This data structure and its potential for efficient handling of scene processing functions are discussed. Specific labels are defined to aid in traversing the data structure (essentially a tree of quadrants of the original picture, each possibly subdivided into four successors). Label values indicate the extent of informative parts of the picture within a given quadrant. This information is used to declare some subquadrants of adjacent quadrants to be neighbor quadrants (possibly containing extensions of the object in the quadrant which is to be kept at finer picture decompositions, i.e., deeper in the tree). Heuristics for searching neighbor quadrants for added linkage information are described, as are methods for merging quadrants into regions to approximate structural scene relationships (e.g., orientation). Finally, computational examples are discussed and measurements of storage, computation time, and object detection and separation ability are given.

2. REGULAR DECOMPOSITION

The concept of the decomposition algorithm is:

1. Represent a digitized picture as spatial subsets of different size marked either "informative for scene description" or "noninformative."

2. Discard picture elements (pixels) that belong to "noninformative" subsets.

Note that this contrasts with procedures which test whether picture elements are part of an object.

Initially, the entire digitized picture (a two-dimensional array of gray level or light intensity values) is a quadrant. Three possibilities exist when the algorithm looks at a quadrant:

1. Nothing informative is contained there. (This will be the case for large homogeneous areas. Such areas may be eliminated from the data structure without loss of picture information.)

2. A large amount of information is found in the quadrant. (Many lines, vertices, and regions with diverse textures are found. Picture elements in the quadrant should be saved in the data structure of the reduced picture.)

3. An intermediate amount of information is present (not enough to make a definite decision).

If the algorithm fails to make a decision about a picture quadrant, it is subdivided and then each of the four subquadrants is processed by the same procedure. The subdivision process is applied recursively until either no failures occur or else the quadrant size becomes equal to the smallest resolvable point of the picture (one pixel).

The process of regular decomposition is thus a logarithmic search for picture areas where there is "informative" data present. The algorithm builds a tree by hierarchically examining a picture's contents. Each area is assigned an importance based on how informative it is judged to be. Measures of information may be coarse (total intensity or total local pseudogradient) or may involve complex calculations or previously stored templates. We will discuss functions for discriminating picture information after an example which illustrates the regular decomposition process.

Fig. 2.1. Diagonally oriented object.

Figure 2.2 shows the regular decomposition tree which results from a picture containing a diagonally oriented object located along the smallest squares in Fig. 2.1. Dashed lines in Fig. 2.2 represent regions whose pixels are eliminated, and these are shown as nonsubdivided areas in Fig. 2.1. The decimal labels of nodes in Fig. 2.2 correspond to picture areas (squares of different sizes in Fig. 2.1) in a way that is discussed in detail in Section 3.

The choice of an appropriate discrimination function is clearly one critical issue in this decomposition process. If the function is relatively simple, then few subquadrants may be processed and only a small amount of information discarded. If the function is complex, then more quadrants may be processed and computation per quadrant may be high. However, there cannot be a general method for defining area importance. Informative values must depend on context and also on the required picture description [8]: importance must be defined in both syntactic and semantic terms. Two simple methods have been tested so far which utilize only syntactic information.

1. Thresholding the picture itself. Intensity of grey level is the coarse picture importance parameter. [This is the natural approach in several cases: automatic character recognition (black characters, white page); chromosome analysis (stained chromosomes darker than their background); reconnaissance photographs (clouds whiter than terrain).]

Fig. 2.2. Regular decomposition tree for Fig. 2.1.


Fig. 3.1. A 4 × 4 matrix A.

2. Edge and curve detection. The number of edges is the picture importance parameter. (For pictures not composed of line drawings, well-known methods, including calculating and thresholding the local pseudogradient [8], can be used to isolate edges.)

A quadrant being subdivided has previously been labeled "intermediate amount of information." Such a quadrant is treated as a new unknown picture which is to be partitioned. The importance of all four subquadrants is computed relative to their parent quadrant. With intensity as the coarse picture parameter, we define the relative importance d(x, y) of a subquadrant x within a quadrant y as

d(x, y) = (intensity of subquadrant x) / (intensity of quadrant y).

Thus, d(x, y) represents the proportion of information of quadrant y found in subquadrant x. (A measure of importance relative to the surrounding area is convenient since it is independent of the depth in the tree.) The discrimination function we will use will be two thresholds of the relative importance of quadrant x in picture y. In terms of the threshold values w1 and w2, a quadrant with:

w2 ≤ d(x, y) ≤ 1.0 will be termed informative;

w1 ≤ d(x, y) < w2 will be termed not sure;

0.0 ≤ d(x, y) < w1 will be termed noninformative.

The thresholds w1 and w2 will be established before processing picture elements in this work. However, a natural extension would be to make them adaptive to the picture content.
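The two-threshold classification just described can be sketched as follows (a modern illustration, not the authors' program; the names follow the text):

```python
def classify(d, w1, w2):
    """Classify the relative importance d(x, y) of a subquadrant.

    w1 < w2 are fixed thresholds chosen before processing, as in the text.
    """
    if w2 <= d <= 1.0:
        return "informative"
    if w1 <= d < w2:
        return "not sure"
    return "noninformative"  # 0.0 <= d < w1
```

For example, with w1 = 0.1 and w2 = 0.5, a subquadrant holding 80% of its parent's intensity is informative and kept as a leaf, while one holding 30% is "not sure" and is subdivided further.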

In the following, we use a function μ to include qualitative information, prior knowledge about the domain, and contextual clues. This function is called the information measure and takes values of d(x, y) into {informative, not sure, noninformative}.

3. TREE STRUCTURES

3.1. Trees and Digitized Pictures

Fig. 3.2. Tree of matrix A decomposed by rows.

Fig. 3.3. Tree partition of matrix A by areas.

Digitized pictures are commonly represented as two-dimensional arrays, where each element of the array contains some information about the corresponding area of the image space being viewed. It is well known that such an array can be thought of as a special case of a tree structure [37]. For example, Figs. 3.1 and 3.2 give two representations of a 4 × 4 matrix, the latter explicitly displaying the relationships of elements in the same row in a tree. However, this tree does not display all the structure of the matrix (column relationships omitted), and a similar tree based on column relations omits row information. Since picture elements are usually obtained from horizontal raster scan lines, row-oriented data structures and algorithms utilizing line by line "slices" are common in picture processing [13, 19, 40]. However, general properties of picture classes are unlikely to be represented in linear (row) form [17], since key geometrical, topological, structural, and metric constraints are usually involved. Because a data structure should reflect prior knowledge about properties contained within the data base, representations which facilitate computer search for picture properties in areas should be preferable. A tree structure which directly reflects areas (rather than rows) for the above 4 × 4 matrix is shown in Figs. 3.3 and 3.4. This structure can be conveniently represented for computation by the Dewey decimal (library classification) notation:

1 A
1.1 upper left corner
1.1.1 A(1, 1)
1.1.2 A(1, 2)
1.1.3 A(2, 1)
1.1.4 A(2, 2)
1.2 upper right corner
⋮
1.4.4 A(4, 4)

Fig. 3.4. One level of regular decomposition.

An image can be represented by a tree containing only nodes where a subquadrant has been found to be important or nonterminal (not nonimportant). Hierarchic levels within the tree contain information regarding the structure of patterns in the image (e.g., see Figs. 2.1 and 2.2, where the "upper-left-corner diagonal pattern" can be recognized when node P.A is reached and the presence of only the successors P.A.A and P.A.D noted). Component parts of patterns may be identical (e.g., letters "P" and "B") so that tree partial similarities may be a useful recognition aid. This leads to a decision to build algorithms which traverse a reduced picture's tree by preorder [37]. Visiting the nodes of the tree in preorder permits each node to be examined before all of its successors. Since successors represent smaller picture zones, the tree and a preorder traversal algorithm break the overall scene analysis task into the solution of a series of subproblems.

3.2 Trees and Regular Decomposition

The data structure of Fig. 3.3 is a regular decomposition of the picture area by successive partitioning into quadrants. The result is a tree where each father node has at most four successor nodes. These are ordered arbitrarily by: upper-left, upper-right, lower-left, and lower-right. Following the notation of [1, 38], these subquadrant areas will be called A, B, C, and D, respectively, and the alphabetic label attached via the Dewey system notation to locate the successors of a node. This is illustrated in Fig. 3.4 for a single level of decomposition and in Fig. 3.5 for six levels, and formalized by the following definitions.

DEFINITION 1. A Q-tree is a finite set of nodes which is either empty or consists of a quadrant and at most four disjoint Q-trees.

This recursive definition of Q-tree is analogous to Knuth's definition of binary tree [37], except here each node has exactly four subtrees.
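Definition 1 translates directly into a recursive data type; a minimal sketch (the field names are illustrative, not from the paper):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class QTree:
    """A Q-tree: a quadrant and at most four disjoint successor Q-trees,
    stored in the fixed order A, B, C, D (upper-left, upper-right,
    lower-left, lower-right); an empty successor Q-tree is None."""
    label: str
    children: List[Optional["QTree"]] = field(default_factory=lambda: [None] * 4)

    def is_leaf(self) -> bool:
        """True when all four successor Q-trees are empty."""
        return all(c is None for c in self.children)
```

A node with every successor slot empty is a leaf quadrant; the set of all such leaves forms the reduced picture defined below.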

Fig. 3.5. Complete regular decomposition tree to level 6: leaf nodes are pixels if P is digitized to a 64 × 64 array. The number of nodes at level i is 4^i; the maximum tree depth for a 64 × 64 array is 6; the total number of nodes in the tree is Σ_{i=0}^{6} 4^i = 5461.


TABLE 1
Picture Quadrant Contents Locating Procedure

[Procedure CONVERT(p, m, n), which computes the absolute position in the picture of the quadrant named by a Dewey decimal label; the listing is illegible in this copy.]

DEFINITION 2. A quadrant is the root of a Q-tree. Quadrants are nodes which can be labeled by a set of properties C.

DEFINITION 3. A picture is a quadrant labeled P.

DEFINITION 4. A leaf quadrant is a quadrant whose successor Q-trees are all empty.

DEFINITION 5. A reduced picture is the set of all leaf quadrants in a Q-tree whose root is labeled P.

DEFINITION 6. A regular decomposition of a picture P is a Q-tree with P at the root and the reduced picture of P at the leaves.

Informally, a quadrant is an image area, rectangular or square, characterized by relative position, size, and intensity. The absolute location of a quadrant in the picture is obtainable from its Dewey decimal label [39, 43]: see the procedure in Table 1; however, passing down properties C as the tree is established is easier and quicker for computing quadrant position:
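Since the CONVERT procedure of Table 1 is garbled in this copy, here is a hypothetical reconstruction of the computation it describes: recovering absolute quadrant bounds from a Dewey decimal label. The function name and the "P.A.D" label format are assumptions; the floor/ceiling split follows Rule 1 below.

```python
import math

def locate(label, x_min, x_max, y_min, y_max):
    """Absolute bounds of the quadrant named by a Dewey decimal label,
    e.g. "P.A.D".  A/B/C/D are upper-left, upper-right, lower-left,
    lower-right; floor and ceiling of the midpoint keep the
    subquadrants disjoint."""
    for letter in label.split(".")[1:]:      # skip the root "P"
        x_mid = (x_min + x_max) / 2
        y_mid = (y_min + y_max) / 2
        if letter in ("A", "C"):             # left half
            x_max = math.floor(x_mid)
        else:                                # right half
            x_min = math.ceil(x_mid)
        if letter in ("A", "B"):             # upper half
            y_max = math.floor(y_mid)
        else:                                # lower half
            y_min = math.ceil(y_mid)
    return x_min, x_max, y_min, y_max
```

For the 4 × 4 matrix of Fig. 3.1 (indices 1-4), "P.A" yields the upper-left 2 × 2 block and "P.A.D" its single lower-right element.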

C = ⟨Intensity, x_min, x_max, y_min, y_max⟩.

The properties in C can be recomputed for the four successor subquadrants by Rule 1, which divides each side of the quadrant in half and assigns new vertices to the subquadrants using the functions floor (⌊·⌋) and ceiling (⌈·⌉) to obtain integer values of boundary line positions and hence disjoint subquadrants (see Fig. 3.6).


Fig. 3.6. Boundaries of four subquadrants of quadrant α defined by Rule 1.

RULE 1. Let α be the label of a quadrant whose properties are

C_α = ⟨I_α, x_α min, x_α max, y_α min, y_α max⟩.

Write x_mid = (x_α min + x_α max)/2 and y_mid = (y_α min + y_α max)/2. Then the labels and properties of its four subquadrants are

1. α.A: C_α.A = ⟨I_α.A, x_α min, ⌊x_mid⌋, y_α min, ⌊y_mid⌋⟩;

2. α.B: C_α.B = ⟨I_α.B, ⌈x_mid⌉, x_α max, y_α min, ⌊y_mid⌋⟩;

3. α.C: C_α.C = ⟨I_α.C, x_α min, ⌊x_mid⌋, ⌈y_mid⌉, y_α max⟩;

4. α.D: C_α.D = ⟨I_α.D, ⌈x_mid⌉, x_α max, ⌈y_mid⌉, y_α max⟩;

where each intensity I_α.A, …, I_α.D is the sum Σ_i Σ_j I(i, j) of picture-element intensities over the corresponding subquadrant's x and y bounds.


The program which performs regular decomposition implements the definitions: a picture P to be processed begins as an m × n digitized image; Rule 1 is applied to it recursively and the discrimination function of Section 2 is used to decide which, if any, of the four successor Q-trees should be included. If μ(d(x, y)) = noninformative, then y's successor Q-tree with root labeled x will be empty. If μ(d(x, y)) = informative, then y's successor Q-tree with root labeled x consists of the single leaf quadrant x. If μ(d(x, y)) = not sure, then y's successor Q-tree with root labeled x is a Q-tree consisting of at least one node; here, Rule 1 will be applied to quadrant x as it was to its parent quadrant y. The final result is a tree structure with P as the root node, "not sure" quadrants making up the nonterminal nodes and "informative" quadrants comprising the leaf nodes. Picture elements of P within the boundaries of the leaf nodes are retained. These constitute the reduced picture.
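A compact sketch of this program with intensity as the coarse importance parameter (a reconstruction, not the authors' code; the picture is assumed to be a list of rows of gray levels, and w1, w2 are the thresholds of Section 2):

```python
def intensity(picture, x0, x1, y0, y1):
    """Total gray-level intensity over the inclusive bounds."""
    return sum(picture[y][x] for y in range(y0, y1 + 1)
                             for x in range(x0, x1 + 1))

def decompose(picture, label, x0, x1, y0, y1, parent_intensity, w1, w2):
    """Apply Rule 1 recursively; return a tree node (nested dicts),
    or None for a discarded "noninformative" quadrant."""
    quad = intensity(picture, x0, x1, y0, y1)
    if label != "P":                    # the root picture is always partitioned
        d = quad / parent_intensity if parent_intensity else 0.0
        if d < w1:
            return None                 # noninformative: empty successor Q-tree
        if d >= w2 or (x0 == x1 and y0 == y1):
            return {"label": label, "bounds": (x0, x1, y0, y1), "children": []}
    # "not sure": subdivide into A, B, C, D; xm + 1 / ym + 1 play the role of
    # the ceiling in Rule 1, keeping the subquadrants disjoint
    node = {"label": label, "bounds": (x0, x1, y0, y1), "children": []}
    xm, ym = (x0 + x1) // 2, (y0 + y1) // 2
    subquads = [("A", x0, xm, y0, ym), ("B", xm + 1, x1, y0, ym),
                ("C", x0, xm, ym + 1, y1), ("D", xm + 1, x1, ym + 1, y1)]
    for name, a, b, c, e in subquads:
        child = decompose(picture, label + "." + name, a, b, c, e, quad, w1, w2)
        if child is not None:
            node["children"].append(child)
    if label != "P" and not node["children"]:
        return None                     # no active successors: drop this node
    return node
```

Calling `decompose(picture, "P", 0, n - 1, 0, m - 1, total, w1, w2)` returns the tree of the reduced picture; its leaf nodes bound the retained picture elements.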

3.3 Unconditional Partitioning to a Tree Level

Digitized pictures containing on the order of 10^6 picture elements (1024 × 1024 array), which cannot be stored in fast memory, can be processed by unconditional decomposition followed by sequential processing of all the subpictures so obtained. The preceding techniques, including the discrimination function, can then be applied to each subpicture. For example, let P be a 1000 × 1000 array, where each picture element is stored in two bytes, so that to store P, 2000K bytes of storage are needed. For 200K of fast memory, an unconditional partition of P to level 5 (P.A.A.A.A.A … P.D.D.D.D.D) yields 1024 subpictures which could each be regularly decomposed. A global picture description could then be built up using the subpictures' trees.
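The storage arithmetic above can be checked directly (a sketch; 1K = 1024 bytes is assumed for the fast-memory bound):

```python
elements = 1000 * 1000                  # picture elements in P
bytes_for_P = 2 * elements              # 2 bytes each: 2,000,000 bytes (~2000K)
subpictures = 4 ** 5                    # unconditional partition to level 5
bytes_each = bytes_for_P / subpictures  # roughly 1953 bytes per subpicture
fits = bytes_each < 200 * 1024          # comfortably within 200K of fast memory
```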

Preprocessing to obtain the subpictures via the unconditional partitioning is necessary to obtain area coherence for subsequent processing. Although a conventional raster scan data-base could be more easily partitioned into "line subpictures" to fit into fast memory, picture information is not linear. Quadrant subpictures represent compact areas of picture elements. There, points generally have all their eight-neighbor points as well. (For example, in a picture containing m × n points, only 2(m + n) − 4 of these do not have all eight-neighbor points present, while no points in a single raster scan line have all eight.)
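The parenthetical count can be verified by brute force (a quick sketch; m, n ≥ 2 is assumed for the closed form):

```python
def without_full_neighborhood(m, n):
    """Count points of an m x n array missing at least one of their
    eight neighbors, i.e. the border points."""
    return sum(1 for i in range(m) for j in range(n)
               if i in (0, m - 1) or j in (0, n - 1))
```

This agrees with 2(m + n) − 4, and for a single raster line (m = 1) every point is a border point, matching the text's final claim.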

3.4 Advantages of Regular Decomposition

The advantages of regular decomposition in image processing are:

1. Via unconditional partitioning, pictures which are physically too big to store in fast memory at one time can be processed as a sequence of subpictures extracted during preprocessing.

2. Regular decomposition enables addressing for rapid access to any geographical part of the image.

3. Regular decomposition retains explicitly in the data structure a hierarchical description of picture patterns, elements, and their relationships. Hence, this scheme may also be used in conjunction with syntactic pattern recognition algorithms [22-25].


4. Representations permit recursive analysis of subpictures.

5. The decomposition algorithm contains major routines (traversal, tree-creation) which are independent of image class. Small changes can adapt regular decomposition to widely different types of pictures.

6. The resultant tree data structure distinguishes object from nonobject (or background) and enables processing to locate separate objects.

4. NEIGHBOR QUADRANTS

The motivation for introducing neighbor quadrants is due to two observations regarding the regular decomposition scheme. First, decomposition into quadrants is arbitrary. Picture areas and the dividing lines imposed may combine to slice single objects into fragments which could be separately "noninformative." This was observed in tests by Omolayole [41], where absence of some edges found in the original image led to incorrect linkage information about objects. Second, to detect and extract objects from background (distinguish relevant data), the discrimination function uses prespecified parameters. For some values a reduced picture with incomplete information may result (i.e., objects present in the original image are unrecognizable or completely absent in the reduced picture). Search of neighbor areas should deal effectively with both factors and yield improved reduced pictures. The regular decomposition algorithm has been defined as a decision process involving quadrant α and its four subquadrants (α.A, α.B, α.C, and α.D). A tree node α remains "active" if at least one successor subquadrant is added (found "not noninformative"), and this is evaluated by two-level processing. The algorithm resembles a front-end "field of vision" which can be bolstered by neighbor quadrants via a three-level algorithm.

Table 2 shows quadrant configurations (i.e., the informative or "not sure" subquadrants) and the corresponding neighbor subquadrants which were used in experiments discussed in Section 6. Other sets of templates could be used to detect specific shapes. The following example motivates our selection of these neighbor subquadrants. Refer to quadrant configuration 5 in Table 2 and let α be the enclosing quadrant. Then,

d(α.A, α) ≥ w1,

d(α.B, α) < w1,

d(α.C, α) < w1,

d(α.D, α) ≥ w1,

indicating a general diagonal orientation of the underlying object(s). This suggests that the neighbor subquadrants α.B.C and α.C.B are likely areas to search for additional picture information, since these fill in the arbitrary partitioning of α into only two subquadrants α.A and α.D. This is shown in Table 2, by cross-hatching, where the level-three border-softening quadrants may become "active" if any or all of them pass a revised evaluation test. The neighbor quadrants added to the tree must have a dummy father node inserted (their true ancestor node must be "noninformative," hence not contained in the tree).


TABLE 2
A Set of Neighbor Subquadrant Templates Suggested by the Coarse Structural Configuration of Quadrants

[For each quadrant configuration, the table shows the corresponding neighbor subquadrants by cross-hatching; the graphical content is not reproducible here.]

The revised function for determining neighbor subquadrant importance is defined relative to both α.A and α.D in this example (configuration 5, Table 2). Since successor nodes contain information regarding orientation within a quadrant, the discrimination function μ for neighbor quadrants is the average importance value of "informative" and "not sure" quadrants. This yields the relative informative value of the area and enables rating neighbor quadrants by cross-quadrant information. A more refined technique would be to evaluate only those local pixels which are on certain borders of quadrants, as indicated by the specific quadrant configuration. However, the increased computation time required to improve the decision process did not seem warranted and consequently this global (averaging) method was used. For example, with intensity the coarse picture parameter, we found

d(α.B.C, {α.A, α.D}) = (intensity of neighbor subquadrant α.B.C) / [(intensity of α.A + intensity of α.D) / 2].

Because subquadrants contain fragmentary information which involves linkage of quadrants, the thresholds w1 and w2 in the discrimination function μ should be lowered:

μ(d(x, {y1, …, yn})) = informative, if w2 − ε ≤ d(x, {y1, …, yn}) ≤ 1.0;

μ(d(x, {y1, …, yn})) = not sure, if w1 − ε ≤ d(x, {y1, …, yn}) < w2 − ε;

μ(d(x, {y1, …, yn})) = noninformative, if 0.0 ≤ d(x, {y1, …, yn}) < w1 − ε;

where ε is the desired percentage decrease in the threshold constants w1 and w2. After each application of Rule 1, the appropriate template in Table 2 is found by the program and the neighbor quadrants inspected. If d(x, {y1, …, yn}) ≥ w1 − ε, then that subquadrant is added to the tree in the normal manner as either a "not sure" or "informative" quadrant with a "dummy" father node. Subsequently, this neighborhood quadrant will be treated no differently from a quadrant obtained directly by Rule 1 during further decomposition. Different values of the threshold parameters w1, w2, and ε were tested for several image classes and the experimental results are discussed in Section 6.
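The lowered-threshold test for neighbor subquadrants can be sketched as follows (a reconstruction with illustrative names; w1, w2, and eps follow the text):

```python
def neighbor_importance(neighbor_intensity, active_intensities):
    """d(x, {y1, ..., yn}): intensity of a neighbor subquadrant relative to
    the average intensity of the "informative"/"not sure" quadrants that
    suggested it (e.g. alpha.A and alpha.D in configuration 5 of Table 2)."""
    return neighbor_intensity / (sum(active_intensities) / len(active_intensities))

def classify_neighbor(d, w1, w2, eps):
    """Section 2's discrimination function with both thresholds lowered by eps."""
    if w2 - eps <= d <= 1.0:
        return "informative"
    if w1 - eps <= d < w2 - eps:
        return "not sure"
    return "noninformative"
```

With eps = 0, this reduces to the original two-threshold test, so a single routine can serve both levels of the algorithm.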

5. REGION APPROXIMATION USING QUADRANT CONNECTIVITY

Once the picture tree structure has been built, further processing can obtain a description of objects or connected areas in the scene in terms of a specific subtree. A single object in the image is now covered by one or more adjacent leaf quadrants. Thus, the problem of recognizing objects reduces initially to the problem of deciding whether two leaf quadrants belong or do not belong to the same object. Further processing is necessary when a single leaf quadrant covers several objects.

To obtain leaf quadrants which could be from the same object it is necessary to "prune" the tree. The reduced tree must retain only subtrees whose nodes correspond to spatially connected quadrants. This is done by first sorting to obtain geographically separate groups of leaf quadrants; this follows directly from the actual configuration of subtrees in the data structure, as can be seen from the example of two objects and their resulting tree structure (three distinct subtrees) of Fig. 5.1. After the sort groups the leaf quadrants, further processing can be done within a group, or on two related (e.g., by symmetry) groups.


FIG. 5.1. Tree structure of two regions: three subtrees which can be merged.

Implementation of the pruning procedure to group leaf quadrants which are geographically connected was done by quadrant rather than by using the specific tree configurations. Quadrants are connected into groups using the eight-neighbor connectedness criterion discussed by Rosenfeld [10, 11, 26]. The following definitions provide a formal description of the quadrant accumulation method actually implemented in the program.

DEFINITION 7. Given a point P0(i, j), the eight-point neighborhood of P0 is {(i − 1, j − 1), (i − 1, j), (i − 1, j + 1), (i, j − 1), (i, j + 1), (i + 1, j − 1), (i + 1, j), (i + 1, j + 1)}.

In Fig. 5.2, P1, P2, ..., P8 represent these points.

DEFINITION 8. Two quadrants Q1 and Q2 are connected if there exist any two points P1 ∈ Q1 and P2 ∈ Q2 such that P1 is contained in the eight-point neighborhood of P2.

DEFINITION 9. Let X = {Q1, Q2, ..., Qn} be a set of extracted picture quadrants. Then a region of quadrants (or region) R is a subset of X such that R = {Qi1, Qi2, ..., Qim}, where {i1, i2, ..., im} ⊆ {1, 2, ..., n} and (∀j: 0 < j ≤ m) ((∃k: 0 < k ≤ m) (ij ≠ ik, Qij connected to Qik)) ∧ (¬∃P ∈ X − R) (P connected to Qij).

Quadrants are connected into regions by checking the border points around each quadrant. Using the vertex information contained in C for each quadrant (X min, X max, Y min, Y max), straightforward application of Definitions 8 and 9 to the set of extracted picture quadrants yields connected picture regions from leaf quadrants.

Assume that regular decomposition resulted in a tree structure containing quadrant leaves Q1, Q2, ..., Qm, where m ≥ 1. Then after merging connected quadrants there are regions R1, R2, ..., Rn, where m ≥ n ≥ 1, and these regions define the gross topological picture patterns [27].
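Definitions 8 and 9 can be sketched directly from the stored vertex information of each quadrant. The Python fragment below is illustrative only (the original implementation was in PL/C, and these names are assumptions): it treats each quadrant as a bounding box (x_min, x_max, y_min, y_max) in pixel coordinates and merges eight-connected quadrants into regions.

```python
def connected(q1, q2):
    """Definition 8 for axis-aligned quadrants: two quadrants are
    eight-connected iff their bounding boxes touch or overlap within
    one pixel in both coordinates (a corner touch counts)."""
    (ax0, ax1, ay0, ay1), (bx0, bx1, by0, by1) = q1, q2
    return (ax0 <= bx1 + 1 and bx0 <= ax1 + 1 and
            ay0 <= by1 + 1 and by0 <= ay1 + 1)

def regions(quads):
    """Definition 9: merge connected quadrants into maximal regions
    R1, ..., Rn (n <= m), returned as lists of quadrants."""
    groups = []
    for q in quads:
        # Collect every existing group this quadrant touches ...
        touching = [g for g in groups if any(connected(q, p) for p in g)]
        merged = [q]
        # ... and fuse them all into one region containing q.
        for g in touching:
            merged.extend(g)
            groups.remove(g)
        groups.append(merged)
    return groups

# Two adjacent 4x4 leaf quadrants plus one isolated quadrant:
quads = [(0, 3, 0, 3), (4, 7, 0, 3),   # share a border: one region
         (16, 19, 16, 19)]             # isolated: second region
print(len(regions(quads)))             # -> 2
```

Checking only bounding-box proximity is equivalent to checking border points here, because every pixel of a leaf quadrant lies inside its stored vertex rectangle.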

    P1 P2 P3
    P4 P0 P5
    P6 P7 P8

FIG. 5.2. Eight neighbors of P0.


6. EXAMPLES OF APPLICATIONS

A number of pictures were "hand-digitized" in 64 × 64 and 32 × 32 arrays of grey levels, ranging from 0 to 9, to illustrate the features of regular decomposition. Programming was done in PL/C at the UCLA Campus Computing Network and UCLA Health Sciences Computing Facility; the program listing is in [44]. This section reports the results of computational experiments designed to determine the success and efficiency of the ideas presented here over different images and parameter values. The reduced pictures discussed here are shown in the Appendix and were produced by a line printer using overprinting. The examples presented here are block scenes and alphabetic characters. Intensity was the coarse picture importance parameter used [38]. Five parameters, w1, w2, ε, with/without neighborhood quadrants, and the starting level in the tree for applying the discrimination function, were varied in the experiments.

The program efficiency is represented by the tree size and density (number of nodes present divided by number of nodes in a complete tree of the same depth), data reduction (percentage of picture elements deleted), and information-saved

TABLE 3

Computational Statistics

Image       Fig.  Nbr.  w1   w2   eps  Lvl  Max.   Max. no.  No. of  No. of  % pic.  % pic.     % obj.  Compact.
                                            depth  of nodes  nodes   leaves  area    intensity  lost    ratio
letters     8b    yes   .10  .25  .05  1    4      341       19      12      30      100        0       .20
letters     8c    no    .10  .25  .05  1    2      21        7       4       25      82         18      .18
letters     8d    yes   .10  .25  .05  2    4      341       35      20      27      100        0       .16
letters     8e    yes   .10  .50  .05  1    5      1365      87      48      14      100        0       .03
letters     8f    yes   .20  .40  .10  1    5      1365      21      12      28      96         4       .18
blocks      4b    yes   .10  .25  .05  1    5      1365      21      12      38      100        0       .28
blocks      4c    no    .10  .25  .05  1    3      85        15      8       36      94         10      .28
blocks      4d    yes   .10  .25  .05  2    5      1365      40      21      22      100        0       .13
blocks      4e    yes   .10  .50  .05  1    7      21845     51      28      32      100        0       .22
blocks      4f    yes   .20  .40  .10  1    6      5461      25      14      33      100        0       .23
polyhedron  1b    yes   .10  .25  .05  1    4      341       35      22      53      99         0       .15
polyhedron  1c    no    .10  .25  .05  1    3      85        23      14      50      93         4       .15
lines       11b   yes   .10  .25  .05  1    2      21        17      12      75      93         7       .34
lines       11c   no    .10  .25  .05  1    2      21        17      12      75      93         7       .34
lines       11d   yes   .10  .15  .05  2    4      341       141     —       75      100        —       .20

(Entries illegible in the source are marked "—".)


TABLE 3a

Corresponding figure   w2     Maximum tree depth   % increase in depth
Letters 8d             0.25   4                    25.0
Letters 8e             0.50   5
Blocks 4d              0.25   5                    40.0
Blocks 4e              0.50   7

(percentage of object elements retained) statistics. The experimental results and parameter values are given for all examples tested in Table 3, and the following discussion refers to spot entries in that table. An explanation of Table 3 follows and some sections of the table are included in the text.

Columns 1 and 2 in Table 3 label the specific pictorial example shown in the Appendix; there the input and extracted pictures are shown. Columns 3-7 give the values of the five program parameters. Columns 8-11 give information about the picture structure resulting from regular decomposition: the maximum tree depth of the structure (col. 8); the maximum number of nodes which could be in a complete Q-tree of that depth, computed as 1 + 4 + 4^2 + ... + 4^n, where n = maximum tree depth (col. 9); the actual number of nodes in the picture structure (col. 10); and the number of those that make up the extracted picture (col. 11). All the blank parts of the extracted pictures are points which have been eliminated from the picture structure by decomposition. Column 12 gives the percentage of picture area saved and column 13 tells what percentage of the original image intensity has been kept. These two statistics summarize the data reduction capabilities of the algorithm. Column 14 describes approximation error computed as the percentage of object points lost in the reduced picture (i.e., the percentage of picture points, part of some object, which have been eliminated from the data structure). ["Object-information-lost" approximation errors and "noninformative-pixels-saved" approximation errors are combined in column 15 (see below).]
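The complete Q-tree size in column 9 is a geometric sum, which makes the tabulated values easy to verify; the tree-density statistic mentioned earlier follows directly from it. A minimal Python sketch (function names are illustrative):

```python
def complete_tree_nodes(n):
    """Nodes in a complete Q-tree of depth n:
    1 + 4 + 4^2 + ... + 4^n = (4^(n+1) - 1) / 3."""
    return (4 ** (n + 1) - 1) // 3

def tree_density(actual_nodes, max_depth):
    """Nodes present divided by nodes in a complete tree of same depth."""
    return actual_nodes / complete_tree_nodes(max_depth)

# Reproduces the column-9 values appearing in Table 3:
for depth in (4, 5, 6, 7):
    print(depth, complete_tree_nodes(depth))   # 341, 1365, 5461, 21845

# e.g. the letters-8b structure: 19 nodes at depth 4
print(round(tree_density(19, 4), 3))           # -> 0.056
```

The closed form shows why a sparse picture is stored so cheaply: a depth-7 complete tree has 21,845 nodes, of which the blocks-4e structure keeps only 51.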

To include both types of approximation errors we have defined a "picture compaction ratio." This ratio is obtained by adding the number of picture points

TABLE 3b

Corresponding   Neighbor   % object    No. of   % picture area deleted
figure          quads      area lost   nodes    (storage saved)
Letters 8b      yes        0.0         19       70.0
Letters 8c      no         18.0        7        75.0
Blocks 4b       yes        0.0         21       62.0
Blocks 4c       no         10.0        15       64.0
Polyhedron 1b   yes        0.0         35       47.0
Polyhedron 1c   no         4.0         23       50.0


TABLE 3c

Corresponding figure   w2 − w1   Max. tree depth   No. of nodes
Letters 8b             0.15      4                 19
Letters 8f             0.20      5                 21
Letters 8e             0.40      5                 87
Blocks 4b              0.15      5                 21
Blocks 4f              0.20      6                 25
Blocks 4e              0.40      7                 51

which should have been saved (but were not) since they are part of some object, plus the number of points which are really noninformative but have been kept in the extracted picture, divided by the total number of points in the original image. The experimental results are listed in column 15. A perfect extracted picture would have no informative points lost and no noninformative points saved, yielding a 0.0 compaction ratio. The worst structure would contain only the noninformative points and none of the informative ones, yielding a ratio of 1.0. The discussion of the figures themselves follows.
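The picture compaction ratio defined above can be sketched in a few lines of Python (names are illustrative; the point counts would come from comparing the extracted picture against the original):

```python
def compaction_ratio(object_points_lost, noise_points_kept, total_points):
    """Combined approximation error of column 15: informative points
    wrongly discarded plus noninformative points wrongly kept, over
    the total number of points in the original image."""
    return (object_points_lost + noise_points_kept) / total_points

# A perfect extraction on a 64x64 image loses nothing and keeps no noise:
print(compaction_ratio(0, 0, 64 * 64))                # -> 0.0

# Worst case for an image with 500 object points: all 500 lost and
# all 3596 background points kept.
print(compaction_ratio(500, 64 * 64 - 500, 64 * 64))  # -> 1.0
```

Both error types are weighted equally here, which is why a structure can score well on column 13 (intensity kept) yet poorly on column 15 if it retains large blank areas.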

The ordering of figures in each appended example is: the input digitized image, a series of extracted pictures which use various values of the five parameters, the geographically separate regions, and finally the leaf quadrants that make up each region, for one extracted picture.

Figure 1a shows a 64 × 64 image of a polyhedron. Figures 1b and 1c show the extracted picture quadrants after decomposition with and without neighbor quadrants, respectively. Table 6.1 shows the considerable reduction in the size of the tree and the marked increase in the number of informative edge points of the polyhedron that were lost as a result of not checking neighbor quadrants.

Figure 2 shows the quadrants and their properties, C, that make up the single region in the extracted picture of Fig. 1b. The trace of decomposition given in Fig. 3 illustrates the recursive algorithm and the addition of neighbor quadrants into the tree building process.

Figure 4a shows solid objects of the type commonly used in scene analysis experiments [31]. Figures 4b-4f illustrate the variety of extracted pictures we can obtain by varying the parameters. Figures 4b and 4c used identical parameter values except that 4b inspected neighbor quadrants while 4c did not. The utility of this heuristic is clear as 6% more of the picture intensity and 10% more of

TABLE 3d

                    Start level   Max. tree depth   No. of nodes
Letters (Fig. 8b)   1             4                 19
Letters (Fig. 8d)   2             4                 35
Blocks (Fig. 4b)    1             5                 21
Blocks (Fig. 4d)    2             5                 40


TABLE 6.1

Neighbor Quadrant Utility for Polyhedron

                                       No. of nodes   % picture         % object
                                       in tree        intensity kept    area lost
Without neighbor quadrants (Fig. 1c)   23             93                4
With neighbor quadrants (Fig. 1b)      35             99                0

the object area were saved by checking neighbors. Figures 4b and 4d used identical parameter values except that 4d started the discrimination function at level 2 of the tree instead of level 1. Starting at a lower tree depth forces the program to look at finer areas; consequently, a more exact description is obtained, at the cost of checking and keeping a considerably larger tree. Table 6.2 supports this conclusion.

Figure 4e has an enlarged "not sure" zone and consequently the search is forced much deeper into the tree in order to find "informative" areas (max depth = 7 instead of only 5 for Fig. 4b). Figure 4f illustrates the increasing of the ε parameter from the other figures and the slight effect it also has on increasing the maximum search depth in the tree (max depth = 6). Figures 5-7 show the quadrants and their properties, C, that define the three regions in the picture.

Figure 8a shows the digitized picture of the alphabetic letters B and O. Figures 8b and 8c demonstrate graphically the need for a neighbor quadrant search. Figure 8b is the extracted picture of 8a with neighbor quadrants, and the letters B and O have been saved with no information lost. Figure 8c is without neighbor quadrants and 18% of the intensity associated with the two letters has been deleted, resulting in what now appears as the letters E and C.

Table 6.3 further compares experimental statistics associated with the neighbor quadrants parameter.

Figure 8d starts the discrimination function at level 2 and hence produces a much "fuller" tree (35 nodes to 19 for Fig. 8b) while not gaining much in terms of the picture compaction ratio (0.16 to 0.20 for Fig. 8b). Being more selective in deciding what is "informative" by increasing w2 to 0.50, as we have done in Fig. 8e, produces a remarkably good picture compaction ratio of 0.03 while not discarding any of the informative picture elements, but quadrupling the tree size (87 nodes to 19 for Fig. 8b). Figure 8f shows the effect of making it relatively easy to discard information by letting w1 = 0.20, while making the ε parameter a relatively "loose" 0.10 to pick up neighbor quadrants. The results are almost

TABLE 6.2

Effect of Unconditional Partitioning to a Level for Blocks Image

                 No. of   % picture   % object    Picture
Starting level   nodes    area kept   area lost   compaction ratio
1 (Fig. 4b)      21       38          0           0.28
2 (Fig. 4d)      40       22          0           0.13


TABLE 6.3

Neighbor Quadrant Utility for Letters Image

                                       Maximum      No. of   % picture   % object
                                       tree depth   nodes    area kept   area lost
Without neighbor quadrants (Fig. 8c)   2            7        25          18
With neighbor quadrants (Fig. 8b)      4            19       30          0

identical to those for Fig. 8b in structure (21 nodes, 12 leaves in Fig. 8f and 19 nodes, 12 leaves in Fig. 8b) and in success (0.18 picture compaction ratio in Fig. 8f and 0.20 in Fig. 8b).

Table 6.4 compares the sensitivity of the tree size with storage reduction and object retention for varying threshold values.

7. DISCUSSION OF RESULTS

The feasibility and efficiency of building picture tree structures using regular decomposition by quadrants has been shown. The running time of the algorithm is determined by the complexity of the scene rather than by the picture size. Consequently, regular decomposition is most practical for applications involving sparse picture information. This includes such large domains as character recognition, chromosome analysis, time-series analysis, and line drawings.

Because the algorithm is quite general, it can be used to obtain a convenient data structure for a wide range of different pattern recognition and scene analysis tasks and picture types. Numerical parameters and a discrimination function must be specified and these can be tuned to the class of pictures under consideration.

The initial motivation for adding neighbor quadrant search was to "soften" the regular decomposition boundary lines. The added search generally improves algorithm performance (though at a considerable cost in increased computing time). For example, in the letters image of Table 6.4, varying parameters had little or no effect on object retention: the use of neighbor quadrant search balances out significant degradation from wide variation of threshold values. Thus, neighbor search could enable use of a single set of parameter values to preprocess a data set.

TABLE 6.4

Sensitivity of Threshold Values and Their Effects on Storage Reduction and Object Retention for Letters

Threshold values      Maximum      No. of   % storage   % object    Picture      Corresponding
w1     w2     eps     tree depth   nodes    reduction   area lost   compaction   figure
                                                                    ratio
0.10   0.25   0.05    4            19       70          0           0.20         8b
0.10   0.50   0.05    5            87       86          0           0.03         8e
0.20   0.40   0.10    5            21       72          4           0.18         8f


The experimental results reveal the following parameter sensitivities (these are supported by condensed tables, derived from the appended Table 3).

1. If w2 is increased from 0.25 to 0.50, then the maximum tree depth searched is increased substantially: 25%-40%. (Intuitively, this is as expected since increasing w2 means more selectivity in choosing "informative" quadrants.) (See Table 3a.)

2. Including neighbor quadrants in the extracted picture structure decreased the percentage of object area lost (the effect was a range of from 4% to 18%); increased the tree size (range, 40%-170%); and the effect was only to increase overall storage requirements slightly (from 0% to 5%). (See Table 3b.)

3. Increasing w2 − w1 from 0.15 to 0.40 resulted in an increase in the maximum depth of the tree of from 25% to 40% and an increase in the number of nodes in the picture structure of from 100% to 300%. (This is intuitively appealing since we are widening the "not sure" zone in the discrimination function, forcing more quadrants to be subdivided.) (See Table 3c.)

4. Unconditional partitioning to one lower tree level increases the tree size 80%-700%. (See Table 3d.)

8. CONCLUSIONS AND FUTURE WORK

The program discussed in this paper used a top-down recursive partitioning of picture area into successively finer quadrants to obtain resulting tree structures for several types of images. The trees contain information on such key global properties as symmetry, shape, and orientation of constituent objects or patterns; experiments were done to merge areas, using the tree to locate the objects. Quantitative results were obtained from alphabetic letters, blocks in a scene, and the simple structures of a polyhedron. We showed that segmentation errors caused by regular decomposition can be overcome in all cases by a relatively simple algorithm which uses the same type of concept. Neighbor quadrants were defined and, at the cost of increased processing time, the percentage of intensity of the picture kept after decomposition using the neighbor algorithm improved.

While much needs to be done to develop a working system for image processing from these concepts, the work presented here makes it likely that such a development should take place. Several refinements to be presented elsewhere should continue the process of building a theoretical basis for a practical image processing system. Future work includes examination of tree structures to determine whether translation and rotation of an object can be overcome by processing the data obtained by regular decomposition.

APPENDIX: SAMPLE PROGRAM RUNS

ACKNOWLEDGMENTS

This research was sponsored by the Air Force Office of Scientific Research, Air Force Systems Command, USAF, under Grant No. AFOSR-72-2384. The United States Government is authorized to reproduce and distribute reprints for Governmental purposes notwithstanding any copyright notation hereon.

The authors express their appreciation for this support.


[Line-printer output omitted.]

FIG. 1a. Digitized picture of a polyhedron.


[Line-printer output omitted.]

FIG. 1b. Retained polyhedron, neighbor algorithm. Parameters: w1 = 0.10, w2 = 0.25, ε = 0.05, level = 1, neighbor = yes. Percentage of picture intensity remaining after preprocessing = 99%; percentage of picture area remaining = 53%.


[Line-printer output omitted.]

FIG. 1c. Retained polyhedron picture. Parameters: w1 = 0.10, w2 = 0.25, ε = 0.05, level = 1, neighbors = no. Percentage of picture intensity remaining after preprocessing = 93%; percentage of picture area remaining = 50%.


[Line-printer output omitted: the listing gives, for each quadrant of region 1, its intensity and vertex coordinates (X MIN, X MAX, Y MIN, Y MAX).]

FIG. 2. Region description after regular decomposition of Fig. 1b. (Note: blank areas have been eliminated from the picture structure.) Percentage of picture intensity remaining after preprocessing = 99%; percentage of picture area remaining = 53%.


[Trace listing omitted: each step records a quadrant, its subquadrants (each marked CANDIDATE, DELETED, or LEAF), and its neighbors.]

FIG. 3. Trace of regular decomposition for Fig. 1.


[Fig. 4a omitted: digitized ASCII rendering not reproducible from the scan.]

Fig. 4a. Digitized picture of a blocks world scene.


[Fig. 4b omitted: digitized ASCII rendering not reproducible from the scan.]

Fig. 4b. Retained blocks scene: neighbor algorithm. Parameters: w1 = 0.10, w2 = 0.25, ε = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 38%.
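Each caption in this series reports two figures of merit: the percentage of total picture intensity retained after preprocessing, and the percentage of picture area retained. Assuming a boolean mask marking the cells kept (the names below are illustrative, not from the paper), the two percentages can be computed as:

```python
def retention_stats(picture, kept):
    """Return (intensity %, area %) retained, given a picture (lists of
    intensities) and a same-shaped boolean mask of cells kept."""
    total_intensity = sum(v for row in picture for v in row)
    kept_intensity = sum(v for prow, krow in zip(picture, kept)
                         for v, k in zip(prow, krow) if k)
    total_area = sum(len(row) for row in picture)
    kept_area = sum(1 for krow in kept for k in krow if k)
    return (100.0 * kept_intensity / total_intensity,
            100.0 * kept_area / total_area)

# All of the intensity can survive in a fraction of the area, as in
# Fig. 4b (100% of intensity retained in 38% of the area):
picture = [[0, 5], [5, 0]]
kept = [[False, True], [True, False]]
print(retention_stats(picture, kept))   # (100.0, 50.0)
```

The pair of numbers captures the tradeoff the experiments vary: higher thresholds prune more area but risk discarding intensity that belongs to objects.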


[Fig. 4c omitted: digitized ASCII rendering not reproducible from the scan.]

Fig. 4c. Retained blocks scene: original algorithm. Parameters: w1 = 0.10, w2 = 0.25, ε = 0.05, level = 1, neighbors = no. Percentage of picture intensity remaining after preprocessing = 94%; percentage of picture area remaining = 36%.


[Fig. 4d omitted: digitized ASCII rendering not reproducible from the scan.]

Fig. 4d. Retained blocks scene: different starting level. Parameters: w1 = 0.10, w2 = 0.25, ε = 0.05, level = 2, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 22%.


[Fig. 4e omitted: digitized ASCII rendering not reproducible from the scan.]

Fig. 4e. Retained blocks scene: threshold 2 raised. Parameters: w1 = 0.10, w2 = 0.50, ε = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 32%.


~," X

v, X • X : . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . -

X •

Y X ,: X

e X K . . . . X

K X • •

• . . . . •

.K . . . . . . . . . . . . . . . . X X ....... MMMMN .... X

X ..... M M .I MI,.] W f~ .... X

x - - ~ ~,~W W W W ~ - - ~ - X

K -- % :~; ,% ~.W~r WW W ~ -- --~- X

"X . . . . . . - - ~,, Y. ~ % @@ @ @ @ @ -:::'~-- X

x --- %~.,% Y.@~ @ ....... X

X ......... -- . . . . . X

................ X ,• . . . . . . . . . . . . . . : . . . . . .................. ,. - ; : . . __ .~ ___ ~ - j - - =.:.~. _ ~ ~ -

l ................

"K . . . . . . . . . . . . - - " ~~ ~ ' - " z - "Y'C ; " - '--" " = " - -~X " X . . . . . . . . . . . . . . . . X 'K . . . . . . . . . . . . . . . . X 'K . . . . . . . . . . . . . . - - -X

X . . . . . . . . . . . . . . . . X X . . . . . . . . . . . - . . . . X X . . . . . . . . . . . . . . . . X X . . . . . . . . . . . . . . . . . . . . . . . . . X x x X x x X . . . . . . . . . . . . . . . . . X x x X • X

• X

X , X x X

x . . . . . . . . . . . . . . . . X �9 x

~ X . . . �9 , _ . . . x

:< y XX X ~ x;~. x ;,, x'x',';x x x x x '< X >-Xx XXX X XXXX XXXXXX XX• XX XXXX,X XX X XXXXX X XXX XX XXX

- ~ % , , , ~ . . . . . . . . . . . . . .

- ~.' !,,"4 ~ W . . . . . . . . . '~ . . . .

- " : F T -? ' - - - - . . . . . . . . . . . . . .

. . . . . . . . �9 . . . ' . . . . . . . . . . . . . .

. . . . . . . . . . . . . . : & . . . " . . . . . .

FIG, 4f. l~etained blocks scene: bo th thresholds r~ised. Parameters: wl = 0.20, w 2 = 0.40, = 0.10, level = 1, neighbors = yes. Percentage of picture intelL~i(~y rem~Uning after preprocess-

ing = 100%; percentage of picture a.re~ remaining = 3 3 ~ .


Figs. 5, 6, 7. Regions found in Fig. 4b.

[Fig. 5 omitted: region rendering not reproducible from the scan.]

Fig. 5. Region 1 description after regular decomposition of Fig. 4b.
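Figures 5 through 7 each show one region extracted from the retained blocks of Fig. 4b. One plausible way to recover such regions, sketched here as an assumption rather than the authors' documented procedure, is to group retained cells that share an edge (4-adjacency) by flood fill:

```python
def regions(kept):
    """Group edge-adjacent True cells of a boolean grid into regions,
    returned as lists of (row, col) coordinates."""
    seen = set()
    out = []
    for i, row in enumerate(kept):
        for j, k in enumerate(row):
            if not k or (i, j) in seen:
                continue
            region, stack = [], [(i, j)]
            seen.add((i, j))
            while stack:                       # iterative flood fill
                y, x = stack.pop()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < len(kept) and 0 <= nx < len(kept[ny])
                            and kept[ny][nx] and (ny, nx) not in seen):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            out.append(region)
    return out

# Two separate groups of retained cells yield two regions:
grid = [[True, True, False],
        [False, False, False],
        [False, False, True]]
print([sorted(r) for r in regions(grid)])   # [[(0, 0), (0, 1)], [(2, 2)]]
```

In the paper's setting the cells would be retained quadtree blocks rather than pixels, so adjacency tests operate on block boundaries, but the grouping idea is the same.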


[Fig. 6 omitted: region rendering not reproducible from the scan.]

Fig. 6. Region 2 description after regular decomposition of Fig. 4b.


[Fig. 7 omitted: region rendering not reproducible from the scan.]

Fig. 7. Region 3 description after regular decomposition of Fig. 4b.


[Figs. 8a and 8b omitted: digitized ASCII renderings not reproducible from the scan.]

Fig. 8a. Digitized picture of alphabetic characters B and O.

Fig. 8b. Retained alphabetic characters: neighbor algorithm. Parameters: w1 = 0.10, w2 = 0.25, ε = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 30%.

" f i l e EXT~-tAr . .T,?!E, I ' ~ I , ~ T U C ' K " Y I ! E v x ' r F . ~ & C ' I K O I ' ] ' , : 1 . : 'r, '

X • • 2 1 5 X X X ~ X ~ , . • 2 1 5 2 1 5 ~ • X y • X • 2 1 5 X X X X • 2 1 5 ;,:~ X %,.~ x x > • X , ~ • X.'<X x . . . . . . . . . . . . . . . . X X . . . . . . . . X X . . . . . . . . . . . . . . . . x X - - - 7 . . . . . )

X - - :~ 4.: 1~ ~ I,:"4,: . . . . ~ - - x X # :): ;~ u ~ : ) : - - - 1:~ >:

• :~-~ # - - - 9 - - ' % : ~ # } % r ,~ X - - : ~ * - - t # . . . . . . I 1 : } r :} 'I: . . . . •

A

• ~::'~ . . . . . . .x ; x - - 4~ # ~ . - } : ,~ # - - - 4 * . . . . . . r~ : ) : - - >. x--##* . . . . =~:~ .... , X X--***---r . . . . I~* .... :~ ~--- X x - - ;',::L~ . . . . ~ :~ ~ - - u x x - - ~ . ~ , - - - # . . . . .%:~ - ~ _ _ , . ~ ~ _ _ _ •

x--~--~= --- % ~I~-I~, x X-- ~=$--~, . . . . . , ##, ~.# .... x

x . . . . . . . . . . . . . . . . :,, x . . . . . . . . x • . . . . . . . . . . . . . . . . x X . . . . . . . . • X X X •

• x X • X X X x X X X y x X X k • X X :,, X k X :,,: X X X X x X X x X ,'.; X X X > X X X X X X

X X :,< X X X :,, X X X X X ~ X x X X X X X X ~t X X X A X X • X X X X;" ~ Y.:~ X X X X XX X X X X:4',(XXXXXXXXXXXX~Xt4X~,X;~AX;~:,;XX,~;,,XXXX

Fie. 8c. P~e~ahmd alphabetic characters--original algorithm. Parameters: w~ = 0.]0, w~ = 0.25, e --- 0.05, level = 1, neighbors = no. percentage of picture i~Lensity remai,i,~g after preprocess- ing = 82%; percentage of picblre area remainh~g = 25%.

Fro. 8d. Retained alphabetiu characters--different stm'ting level, l~aralnet.ers: w, = 0.10, w2 = 0.25, e = 0.05, level = 2, neighbors -- yes. Percentage of picture lid;easily remaining afl;er prepror = 100~; percentage of picture :wea remaining = 2 7 ~ .


[Figs. 8e and 8f omitted: digitized ASCII renderings not reproducible from the scan.]

Fig. 8e. Retained alphabetic characters: threshold 2 raised. Parameters: w1 = 0.10, w2 = 0.50, ε = 0.05, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 100%; percentage of picture area remaining = 14%.

Fig. 8f. Retained alphabetic characters: both thresholds raised. Parameters: w1 = 0.20, w2 = 0.40, ε = 0.10, level = 1, neighbors = yes. Percentage of picture intensity remaining after preprocessing = 96%; percentage of picture area remaining = 28%.


REFERENCES

1. A. Klinger, Data structures and pattern recognition, in Proceedings of the First International Joint Conference on Pattern Recognition, Washington, D.C., 73CHO 821-9C, IEEE, New York, 1973.

2. A. Klinger, Pattern recognition programs with level adaptation, in Proceedings of the 1973 IEEE Conference on Decision and Control, San Diego, 73CHO 806-0SMC, IEEE, New York, December 1973.

3. J. E. Warnock, A hidden surface algorithm for computer generated halftone pictures, Computer Science Department, University of Utah, TR 4-15, June 1969.

4. I. E. Sutherland, R. F. Sproull, and R. A. Schumacker, A characterization of ten hidden-surface algorithms, ACM Computing Surveys 6, No. 1, March 1974.

5. W. M. Newman and R. F. Sproull, Principles of Interactive Computer Graphics, McGraw-Hill, New York, 1973.

6. C. M. Eastman, Representations for space planning, Comm. ACM 13, No. 4, April 1970.

7. C. A. Rosen and N. J. Nilsson, Application of intelligent automata to reconnaissance, SRI Project 5953, December 1967.

8. A. Rosenfeld, Picture Processing by Computer, Academic Press, New York, 1969.

9. A. Rosenfeld, Non-purposive perception in computer vision, Computer Science Center, University of Maryland, TR-219, 1973.

10. A. Rosenfeld, Adjacency in digital pictures, Computer Science Center, University of Maryland, TR-203, October 1972.

11. A. Rosenfeld, Figure extraction, in Automatic Interpretation and Classification of Images (A. Grasselli, Ed.), Academic Press, New York, 1969.

12. K. S. Fu and B. K. Bhargava, Tree systems for syntactic pattern recognition, IEEE Trans. Computers C-22, No. 12, December 1973.

13. R. D. Merrill, Representation of contours and regions for efficient computer search, Comm. ACM 16, No. 2, February 1973.

14. A. Rosenfeld, Progress in picture processing: 1969–1971, ACM Computing Surveys 5, No. 2, June 1973.

15. C. M. Eastman and C. I. Yessios, An efficient algorithm for finding the union, intersection and differences of spatial domains, Department of Computer Science, Carnegie-Mellon University, September 1972.

16. C. M. Eastman, Heuristic algorithms for automated space planning, in Proceedings of the Second International Joint Conference on Artificial Intelligence, Imperial College, London, 1971.

17. U. Montanari, Networks of constraints: Fundamental properties and applications to picture processing, Department of Computer Science, Carnegie-Mellon University, January 1971.

18. H. Y. F. Feng and T. Pavlidis, Analysis of complex shapes in terms of simpler ones: Feature generation for syntactic pattern recognition, Department of Electrical Engineering, Princeton University, TR-149, April 1974.

19. H. Y. F. Feng and T. Pavlidis, The generation of polygonal outlines of objects from gray level pictures, Department of Electrical Engineering, Princeton University, TR-150, April 1974.

20. R. A. Kirsch, Resynthesis of biological images from tree-structured decomposition data, in Graphic Languages (F. Nake and A. Rosenfeld, Eds.), North-Holland, Amsterdam, 1972.

21. J. Freeman, The modelling of spatial relations, Computer Science Center, University of Maryland, TR-281, December 1973.

22. O. Firschein and M. A. Fischler, Describing and abstracting pictorial structures, Pattern Recognition 3, No. 4, November 1971.

23. O. Firschein and M. A. Fischler, A study in descriptive representation of pictorial data, Pattern Recognition 4, No. 4, December 1972.

24. T. Pavlidis, Analysis of set patterns, Pattern Recognition 1, November 1968.

25. T. Pavlidis, Structural pattern recognition: Primitives and juxtaposition relations, in Frontiers of Pattern Recognition (S. Watanabe, Ed.), Academic Press, New York, 1972.

26. A. Rosenfeld, Connectivity in digital pictures, J. Assoc. Comput. Mach. 17, No. 1, January 1970.

27. J. P. Mylopoulos and T. Pavlidis, On the topological properties of quantized spaces. I. The notion of dimension. II. Connectivity and order of connectivity, J. Assoc. Comput. Mach. 18, No. 2, April 1971.

28. R. L. Gregory, Eye and Brain, McGraw-Hill, New York, 1973.

29. R. O. Duda and P. E. Hart, Pattern Classification and Scene Analysis, Wiley, New York, 1973.

30. L. G. Roberts, Machine perception of three dimensional solids, in Optical and Electro-Optical Information Processing (Tippett, Ed.), MIT Press, Cambridge, Mass., 1965.

31. A. Guzman, Computer recognition of three dimensional objects in a visual scene, Department of Electrical Engineering, Massachusetts Institute of Technology, MAC-TR-59, 1968.

32. G. Falk, Computer interpretation of imperfect line data as in a three dimensional scene, Department of Computer Science, Stanford University, AIM 139, 1970.

33. M. Minsky and S. Papert, Project MAC Progress Report IV, MIT Press, Cambridge, Mass., 1967.

34. C. R. Brice and C. L. Fennema, Scene analysis using regions, Artificial Intelligence J. 1, No. 3, 1970.

35. Y. Yakimovsky, Scene analysis using a semantic base for region growing, Department of Computer Science, Stanford University, AIM 209, 1973.

36. L. D. Harmon, The recognition of faces, Scientific American 229, No. 5, November 1973.

37. D. E. Knuth, The Art of Computer Programming, Vol. 1: Fundamental Algorithms, Addison-Wesley, Menlo Park, Calif., 1973.

38. A. Klinger, Patterns and search statistics, in Optimizing Methods in Statistics (J. S. Rustagi, Ed.), Academic Press, New York, 1971.

39. M. Rhodes, private communication, 1974.

40. A. Klinger, A. Kochman, and N. Alexandridis, Computer analysis of chromosome patterns: Feature encoding for flexible decision making, IEEE Trans. Computers C-20, No. 9, September 1971.

41. J. Omolayole, private communication, 1974.

42. C. Dyer, private communication, 1974.

43. A. Klinger, Regular decomposition and picture structure, in Proceedings of the 1974 International Conference on Systems, Man, and Cybernetics, Dallas, Texas, 74CHO 908-4SMC, IEEE, New York, 1974.

44. A. Klinger and C. R. Dyer, Experiments on picture representation using regular decomposition, Computer Science Department, University of California, Los Angeles, UCLA-ENG-7494, December 1974.