matlab_w7



8/10/2019 matlab_w7

    HERIOT-WATT UNIVERSITY

    DEPARTMENT OF COMPUTING AND ELECTRICAL ENGINEERING

B35SD2 Matlab tutorial: Image modelling using Markov Random Fields

Objectives:

We will demonstrate how Markov Random Fields (MRF) and fractals can be used effectively in image processing for modelling, both for synthesis and analysis.

During this session you will learn and practise:
1. Using MRF models
2. Autobinomial models

Resources required:

In order to carry out this session you will need to download images and Matlab files from the following location:

http://www.cee.hw.ac.uk/.../WWW/Teaching/B35SD2/B35SD2.html

Please download the following functions and script:

generate_MRF.m

generate_MRF_quick.m

generate_MRF_binomial.m

denoise_MRF.m

Image Modelling:

As you will have seen in the lecture room, image modelling plays an important role in modern image processing. It is commonly used in image analysis for texture segmentation, classification and synthesis (in the games industry, for instance). It is also in common use in multimedia applications for compression purposes. Due to the diversity of image types, it is impossible to have one universal model covering all the possible types, even when limiting ourselves to simple texture cases. Various models have been proposed, and each has its advantages and drawbacks. For textures, co-occurrence matrices are a standard choice, but they are in general difficult to compute and difficult to use for modelling.

We will concentrate here on MRF and fractal models, which have been widely and very successfully used in recent years. Fractals have been used for compression, for instance.

    Ma"+!, Ra)(!% 9#'l(* /

Markov random fields belong to the statistical models of images. Each pixel of an image can be viewed as a random variable. An image can, very generally, be described as the realisation of an n-by-m-dimensional random variable with an (in general unknown) probability density function (pdf). Given that n and m are the dimensions of the image, each component of the vector corresponds to a pixel in the image. This is called a random field.
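A random field with fully independent pixels is trivial to sample, since every pixel can be drawn on its own. A minimal Python sketch (the tutorial's own code is Matlab; the uniform marginal over k = 4 grey levels and the image size are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A random field with independent pixels: every pixel is drawn from the
# same marginal distribution, assumed here to be uniform over k grey levels.
n, m, k = 64, 64, 4
image = rng.integers(0, k, size=(n, m))

print(image.shape)   # (64, 64)
```

Such a field has no spatial structure at all; the rest of this section is about introducing correlation between pixels.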

To generate a sample image from a given random field we want to sample p(X), where X is the n×m random field. If we assume that all the pixels are uncorrelated, we can then simulate



(or sample) p(X) as

p(X) = ∏_{i=1}^{n×m} p(x_i)

where the x_i are the pixels of the image. This will not

however enable us to generate texture images, as correlation between pixels is an essential part of texture modelling. To introduce correlation in its most general form, we have to assume that all the pixels in the image are correlated, and we have to consider all the pixels at the same time. This is impossible in practice due to the size of the sampling space (2^(n×m) for a binary image, k^(n×m) for a k

grey level image). If we now assume that a pixel depends only on its neighbours, we have what is called a Markov Random Field. These are much easier to manipulate, as changing one pixel in the image only affects its neighbours. They are a very powerful tool for image analysis and synthesis. For more information on Markov Random Fields please consult your notes or any relevant book, as the theory is out of the scope of this tutorial. I strongly recommend the tutorial from Charles Bouman that can be found at: http://dynamo.ecn.purdue.edu/~bouman/publications/mrf_tutorial/view.pdf

In brief, a random field is a Markov random field if P(X=x) has a Gibbs distribution. This

means that P(X=x) can be expressed as:

P(X = x) = (1/Z) e^(−U(x)) = (1/Z) e^(−Σ_c V_c(x))

where U(x) is an energy term depending on the vicinity of x. This can take any form, and two of them are seen in this tutorial: the binomial form seen in class and the Potts model. In the case of the binomial model, the conditional probability of a pixel can be calculated directly from its neighbourhood and then sampled directly. This is done in the program generate_MRF_binomial.m, which corresponds to the model seen in class. In most cases, however, we cannot find Z in the expression of P(X=x) and cannot therefore use it directly. In this case the following procedure is followed:

- Create a random image of n grey levels.

- Iteratively change each pixel and calculate the likelihood (energy) associated with the change. The choice of the neighbourhood function is up to the user and defines the specificity of a given random field.

- Keep a change if the likelihood is smaller.

- Have a probability of keeping the change if the likelihood is higher. This probability ensures convergence to the global minimum of energy.

- Stop after n iterations or when no more changes are accepted.
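The "likelihood (energy)" used in these steps can be made concrete with the Potts model mentioned earlier, whose clique potentials simply penalise unlike neighbours. A Python sketch (the function name, the 4-neighbour cliques and β = 1 are illustrative assumptions):

```python
import numpy as np

def potts_energy(x, beta=1.0):
    """Potts energy U(x) = beta * (number of unlike 4-neighbour pairs)."""
    horiz = np.sum(x[:, :-1] != x[:, 1:])   # horizontal neighbour pairs
    vert = np.sum(x[:-1, :] != x[1:, :])    # vertical neighbour pairs
    return beta * (horiz + vert)

# A uniform image has zero energy; a 4x4 checkerboard makes all 24 of its
# neighbour pairs disagree, which is the maximum possible energy.
uniform = np.zeros((4, 4), dtype=int)
checker = np.indices((4, 4)).sum(axis=0) % 2
print(potts_energy(uniform))   # 0.0
print(potts_energy(checker))   # 24.0
```

Low energy corresponds to smooth, probable images; the sampling loop above accepts changes that lower this quantity.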

You can see this as an analogy with physics. Imagine you heat a crystal until it melts and becomes liquid (our initial random state). Let it now cool down: as the liquid cools down, the crystal structure comes back (our MRF term) and slowly imposes a structure onto the initially random arrangement of molecules (our pixels).

Try this now using the generate_MRF and generate_MRF_quick programs. After having tried with the current parameters, edit them, analyse them and try to modify the mask (energy function) and see the effect on the generated image. As an indication, a directional mask will produce a directional texture.

As you can see, changing the MRF terms (parameters) in the mask changes the image generated. The same parameters also generate images that "look the same". This can be used in texture synthesis. However, you should soon realise that it is not easy to find the right terms to generate a specific image type. What is normally done is the following: from a set of images we want to analyse, we extract the parameters of the MRF (estimation phase). Once this is done, we can generate similar images. We can also just send the parameters of, say, a texture and regenerate it at the other end from the parameters, realising huge compression efficiency.



Another area of application of MRFs is denoising. If you know the parameters of an MRF image (YOU DO NOT NEED TO KNOW THE IMAGE ITSELF) you can very effectively de-noise it, as we will show now. First create an image using an isotropic MRF model:

Set mask = [1/3 1/3 1/3; 1/3 1/3 1/3; 1/3 1/3 1/3] in generate_MRF, then do:

ima = generate_MRF([32 32], 2);

Now do:

denoise_MRF(ima);

This program first adds random noise to the image and then tries to recover the original image from the noisy version, assuming it does not know anything about the original image apart from its Markov parameters. You should recover the original image. Please note that the program starts from the noisy image and does not know the original one.

How is this done? Well, we now observe data Y corrupted by noise n as Y = X + n. We know that X is a Markov random field, and we want to recover X. Therefore we want to find the most probable X given the observed data Y: P(X|Y). Using Bayes' rule, we can rewrite this as P(X|Y) ∝ P(Y|X)P(X).

We know P(X) based on our Markovian model, and P(Y|X) = P(Y=X+n|X) = P(n=Y−X). If we have a model for the noise (here we have assumed Gaussian noise), we can evaluate P(n) and therefore evaluate P(X|Y) for each possible value of X and choose the most probable. Again, in practice a noise model is assumed (say Gaussian) and the MRF parameters and noise parameters are iteratively estimated (with the EM technique, for instance) from the data (our Y here), alleviating the need for knowledge of the MRF parameters.
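The MAP estimation just described can be sketched with a simple ICM (iterated conditional modes) update. Note that denoise_MRF is the Matlab program to actually use; the Potts prior, the Gaussian data term and all parameter values below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def icm_denoise(y, levels=2, beta=2.0, sigma=0.4, iters=5):
    """Maximise P(X|Y) proportional to P(Y|X) P(X), pixel by pixel (sketch).

    P(Y|X) is the assumed Gaussian noise model; P(X) is a Potts prior.
    """
    x = np.rint(np.clip(y, 0, levels - 1)).astype(int)  # start from noisy data
    n, m = y.shape
    labels = np.arange(levels)
    for _ in range(iters):
        for i in range(n):
            for j in range(m):
                nbrs = [x[(i - 1) % n, j], x[(i + 1) % n, j],
                        x[i, (j - 1) % m], x[i, (j + 1) % m]]
                # Posterior energy = data term (Gaussian) + prior term (Potts).
                prior = np.array([sum(l != v for v in nbrs) for l in labels])
                cost = (y[i, j] - labels) ** 2 / (2 * sigma ** 2) + beta * prior
                x[i, j] = labels[np.argmin(cost)]
    return x

# Noisy binary test image: left half 0, right half 1, plus Gaussian noise.
clean = np.zeros((16, 16), dtype=int)
clean[:, 8:] = 1
noisy = clean + rng.normal(0.0, 0.4, clean.shape)
restored = icm_denoise(noisy)
print(np.mean(restored == clean) > 0.9)
```

The data term pulls each pixel towards the observed value while the prior pulls it towards its neighbours, which is exactly the P(Y|X)P(X) trade-off above.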
