Research Article

A Descent Conjugate Gradient Algorithm for Optimization Problems and Its Applications in Image Restoration and Compression Sensing

Junyue Cao 1,2 and Jinzhao Wu 3,4

1 Chengdu Institute of Computer Application, Chinese Academy of Sciences, Chengdu, China
2 University of the Chinese Academy of Sciences, Beijing, China
3 School of Computer and Electronic Information, Guangxi University, Nanning 530004, China
4 Guangxi Key Laboratory of Hybrid Computation and IC Design Analysis, Guangxi University for Nationalities, Nanning 530006, China

Correspondence should be addressed to Jinzhao Wu; tony_gx@aliyun.com

Received 7 July 2020; Accepted 25 August 2020; Published 29 September 2020

Guest Editor: Wenjie Liu

Copyright © 2020 Junyue Cao and Jinzhao Wu. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

It is well known that the nonlinear conjugate gradient algorithm is one of the effective algorithms for optimization problems, since it has low storage requirements and a simple structure. This motivates us to make a further study and design a modified conjugate gradient formula for the optimization model. The proposed conjugate gradient algorithm possesses several properties: (1) the search direction uses not only the gradient value but also the function value; (2) the presented direction has both the sufficient descent property and the trust region feature; (3) the proposed algorithm is globally convergent for nonconvex functions; (4) experiments on image restoration problems and compression sensing demonstrate the performance of the new algorithm.

1. Introduction

Consider the following model defined by

\[ \min \{ f(x) \mid x \in \mathbb{R}^n \}, \tag{1} \]

where f: \mathbb{R}^n \to \mathbb{R} is a continuous function. The above problem (1) arises in many applied fields, such as economics, biology, and engineering. It is well known that the nonlinear conjugate gradient (CG) method is one of the most effective methods for (1). The CG algorithm has the following iterative formula:

\[ x_{k+1} = x_k + \alpha_k d_k, \quad k = 0, 1, 2, \ldots, \tag{2} \]

where \alpha_k denotes the steplength, x_k is the kth iterate, and d_k is the search direction defined by

\[ d_k = \begin{cases} -g_k + \beta_k d_{k-1}, & \text{if } k \ge 1, \\ -g_k, & \text{if } k = 0, \end{cases} \tag{3} \]

where g_k = \nabla f(x_k) is the gradient and \beta_k is a scalar that determines different CG algorithms ([1–7], etc.). The Polak–Ribière–Polak (PRP) formula [6, 7] is one of the well-known nonlinear CG formulas:

\[ \beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{\|g_{k-1}\|^2}, \tag{4} \]

where g_{k-1} = \nabla f(x_{k-1}) and \|\cdot\| is the Euclidean norm. The PRP method has been studied by many scholars, and many results have been obtained (see [7–12], etc.), since the PRP algorithm has superior numerical performance but weak convergence theory. At present, under the weak Wolfe–Powell (WWP) inexact line search and for nonconvex functions, the global convergence of the PRP algorithm is still open, and it is one of the well-known open problems in optimization. Based on the PRP formula, many modified nonlinear CG formulas have been designed ([13–16], etc.) because many scholars want to exploit its excellent numerical behavior. Recently, Yuan et al. [17] opened up a new way by modifying the WWP line search technique and partly proved the global convergence of the PRP algorithm. Further results have been obtained by this technique (see [18–20], etc.). It has been shown that nonlinear CG algorithms can be applied to nonlinear equations, nonsmooth optimization, and image restoration problems (see [21–24], etc.).

Hindawi, Mathematical Problems in Engineering, Volume 2020, Article ID 6157294, 9 pages, https://doi.org/10.1155/2020/6157294
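As background for the discussion above, the generic CG iteration (2)–(3) with the PRP scalar (4) can be sketched as follows. This is an illustrative sketch, not code from any of the cited works; the Armijo backtracking parameters and the steepest-descent restart safeguard are our own assumptions:

```python
import numpy as np

def prp_cg(f, grad, x0, eps=1e-6, max_iter=500,
           sigma0=1.0, sigma=0.5, delta=1e-4):
    """PRP conjugate gradient method: iteration (2)-(3) with scalar (4)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:
            break
        # backtracking Armijo line search: alpha = sigma0 * sigma**i
        alpha = sigma0
        while f(x + alpha * d) > f(x) + delta * alpha * (g @ d):
            alpha *= sigma
        x_new = x + alpha * d                # iteration (2)
        g_new = grad(x_new)
        # PRP scalar (4): beta_k = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d = -g_new + beta * d                # direction (3)
        if g_new @ d >= 0:                   # safeguard: restart when not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

Without the restart safeguard, the PRP direction can fail to be a descent direction for nonconvex f; this is exactly the difficulty that the direction proposed later in this paper avoids by construction.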

We all know that the sufficient descent property, defined by

\[ d_k^T g_k \le -c \|g_k\|^2, \quad c > 0, \tag{5} \]

plays an important role in the convergence analysis of CG methods (see [13, 14, 24], etc.), where c > 0 is a constant. There is another crucial condition, the nonnegativity of the scalar, \beta_k \ge 0, that was pointed out by Powell [10] and further emphasized in global convergence analyses [11, 12]. Thus, under the assumption of the sufficient descent condition and the WWP technique, a modified PRP formula \beta_k^{PRP+} = \max\{0, \beta_k^{PRP}\} was presented by Gilbert and Nocedal [13], and its global convergence for nonconvex functions was established. All of these observations tell us that both property (5) and \beta_k \ge 0 are very important in CG algorithms. To obtain one of these conditions or both of them, many scholars have made further studies and obtained many interesting results. Yu [25] presented a modified PRP nonlinear CG formula:

\[ \beta_k^{mPRP1} = \frac{g_{k+1}^T y_k}{\|g_k\|^2} - \mu \frac{\|y_k\|^2 g_{k+1}^T d_k}{\|g_k\|^4}, \tag{6} \]

where \mu > 1/4 is a positive constant and y_k = g_{k+1} - g_k; this formula has property (5) with c = 1 - 1/(4\mu). Yuan [12] proposed a further formula defined by

\[ \beta_k^{mPRP2} = \frac{g_{k+1}^T y_k}{\|g_k\|^2} - \min \left\{ \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \; \mu \frac{\|y_k\|^2 g_{k+1}^T d_k}{\|g_k\|^4} \right\}, \tag{7} \]

which possesses not only property (5) with c = 1 - 1/(4\mu) but also the nonnegativity \beta_k^{mPRP2} \ge 0. To obtain a greater descent, a three-term FR CG formula was given by Zhang et al. [26]:

\[ d_{k+1} = -\theta_k g_{k+1} + \beta_k^{FR} d_k, \quad \theta_k = \frac{d_k^T y_k}{\|g_k\|^2}, \tag{8} \]

which satisfies (5) with c = 1. Dai and Tian [27] gave another CG direction:

\[ d_{k+1} = -\left( 1 + \beta_k \frac{d_k^T g_{k+1}}{\|g_{k+1}\|^2} \right) g_{k+1} + \beta_k d_k, \quad k \ge 0, \qquad d_0 = -g_0, \tag{9} \]

which also possesses (5) with c = 1. The global convergence of the above CG method was proved by Dai and Tian [27] for \beta_k taken as the modified PRP formulas of Yuan [12] and Yu [25]. However, they did not analyze the case of nonconvex functions under the efficient Armijo line search. One of the main reasons lies in the lack of the trust region feature. To overcome this, we [28] proposed a CG direction designed by

\[ d_{k+1} = -g_{k+1} + \frac{\beta_k d_k - \beta_k \left( d_k^T g_{k+1} / \|g_{k+1}\|^2 \right) g_{k+1}}{c_k}, \quad k \ge 0, \qquad d_0 = -g_0, \tag{10} \]

where c_k = |\beta_k| \|d_k\| / \|g_{k+1}\|; this direction possesses not only (5) with c = 1 but also the trust region property. It has been shown that a CG formula will have better numerical performance if it exploits not only the gradient value information but also the function value [29]. This motivates us to present a CG formula based on (10), designed by

\[ d_{k+1} = -g_{k+1} + \frac{\beta_k^* d_k - \beta_k^* \left( d_k^T g_{k+1} / \|g_{k+1}\|^2 \right) g_{k+1}}{c_k^*}, \quad k \ge 0, \qquad d_0 = -g_0, \tag{11} \]

where c_k^* = |\beta_k^*| \|d_k\| / \|g_{k+1}\| and

\[ \beta_k^* = \beta_k^{mPRP2*} = \frac{g_{k+1}^T y_k^*}{\|g_k\|^2} - \min \left\{ \frac{g_{k+1}^T y_k^*}{\|g_k\|^2}, \; \mu \frac{\|y_k^*\|^2 g_{k+1}^T d_k}{\|g_k\|^4} \right\}, \]

with y_k^* = y_k + \rho_k s_k, \rho_k = \max\{\varrho_k, 0\} / \|s_k\|^2, \varrho_k = 2[f(x_k) - f(x_{k+1})] + [g_{k+1} + g_k]^T s_k, and s_k = x_{k+1} - x_k. The modified vector y_k^* [30] has been proved to possess good properties in both theory and experiments. Yuan et al. [29] used it in a CG formula and obtained good results. These achievements inspire us to propose the new CG direction (11). This paper possesses the following features:

(i) The sufficient descent property and the trust region feature are obtained.

(ii) The new direction uses not only the gradient value but also the function value.

(iii) The given algorithm has global convergence under the Armijo line search for nonconvex functions.

(iv) Experiments on image restoration problems and compression sensing are done to test the performance of the new algorithm.
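For concreteness, direction (11) together with the auxiliary quantities y_k^*, \beta_k^*, and c_k^* can be computed as in the following sketch. The function name is our own, and the fallback to the steepest descent direction when \beta_k^* = 0 (where c_k^* is undefined) is an assumption made for illustration:

```python
import numpy as np

def direction_11(f_k, f_k1, g_k, g_k1, d_k, x_k, x_k1, mu=300.0):
    """Compute d_{k+1} of (11) from the data at x_k and x_{k+1} = x_k1."""
    s = x_k1 - x_k                            # s_k
    y = g_k1 - g_k                            # y_k
    # modified vector y*_k = y_k + rho_k s_k (cf. [30])
    var = 2.0 * (f_k - f_k1) + (g_k1 + g_k) @ s
    y_star = y + (max(var, 0.0) / (s @ s)) * s
    gk2 = g_k @ g_k
    # beta*_k: the mPRP2-type scalar (7) evaluated with y*_k
    t = (g_k1 @ y_star) / gk2
    beta = t - min(t, mu * (y_star @ y_star) * (g_k1 @ d_k) / gk2 ** 2)
    if beta == 0.0:                           # degenerate case (assumed fallback)
        return -g_k1
    c = abs(beta) * np.linalg.norm(d_k) / np.linalg.norm(g_k1)
    proj = ((d_k @ g_k1) / (g_k1 @ g_k1)) * g_k1
    return -g_k1 + (beta * d_k - beta * proj) / c
```

Whatever the input data, the returned direction satisfies g_{k+1}^T d_{k+1} = -\|g_{k+1}\|^2 and \|d_{k+1}\| \le 3 \|g_{k+1}\|, i.e., exactly relations (12) and (13) of Theorem 1 below.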

The next section states the given algorithm. The convergence analysis is given in Section 3, and experiments are reported in Section 4. The last section gives the conclusion.

2. Algorithm

Based on the discussion in the above section, the CG algorithm is listed as Algorithm 1.


Theorem 1. Let the direction d_k be defined by (11). Then there exists a positive constant \beta > 0 satisfying

\[ d_k^T g_k = -\|g_k\|^2, \quad \forall k \ge 0, \tag{12} \]

\[ \|d_k\| \le \beta \|g_k\|, \quad \forall k \ge 0. \tag{13} \]

Proof. By (11), we directly get (12) and (13) for k = 0, since d_0 = -g_0. For k \ge 0, using (11) again, we have

\[
\begin{aligned}
g_{k+1}^T d_{k+1} &= g_{k+1}^T \left[ -g_{k+1} + \frac{\beta_k^{mPRP2*} d_k - \beta_k^{mPRP2*} \left( d_k^T g_{k+1} / \|g_{k+1}\|^2 \right) g_{k+1}}{c_k^*} \right] \\
&= -g_{k+1}^T g_{k+1} + \frac{\beta_k^{mPRP2*} g_{k+1}^T d_k - \beta_k^{mPRP2*} \left( d_k^T g_{k+1} / \|g_{k+1}\|^2 \right) g_{k+1}^T g_{k+1}}{\left| \beta_k^{mPRP2*} \right| \left( \|d_k\| / \|g_{k+1}\| \right)} \\
&= -\|g_{k+1}\|^2 + \frac{\beta_k^{mPRP2*} g_{k+1}^T d_k - \beta_k^{mPRP2*} d_k^T g_{k+1}}{\left| \beta_k^{mPRP2*} \right| \left( \|d_k\| / \|g_{k+1}\| \right)} \\
&= -\|g_{k+1}\|^2,
\end{aligned} \tag{14}
\]

so (12) is true. By (11) again, we can get

\[
\begin{aligned}
\|d_{k+1}\| &= \left\| -g_{k+1} + \frac{\beta_k^{mPRP2*} d_k - \beta_k^{mPRP2*} \left( d_k^T g_{k+1} / \|g_{k+1}\|^2 \right) g_{k+1}}{c_k^*} \right\| \\
&\le \|g_{k+1}\| + \frac{\left| \beta_k^{mPRP2*} \right| \|d_k\| + \left| \beta_k^{mPRP2*} \right| \|d_k\| \left( \|g_{k+1}\| / \|g_{k+1}\|^2 \right) \|g_{k+1}\|}{\left| \beta_k^{mPRP2*} \right| \left( \|d_k\| / \|g_{k+1}\| \right)} \\
&= \|g_{k+1}\| + \frac{2 \left| \beta_k^{mPRP2*} \right| \|d_k\|}{\left| \beta_k^{mPRP2*} \right| \left( \|d_k\| / \|g_{k+1}\| \right)} \\
&= 3 \|g_{k+1}\|,
\end{aligned} \tag{15}
\]

Initial step: given any initial point x_0 \in \mathbb{R}^n and positive constants \epsilon \in (0, 1), \sigma_0 > 0, \delta \in (0, 1), \mu > 1/4, and \sigma \in (0, 1), set d_0 = -g_0 = -\nabla f(x_0) and k = 0.
Step 1: stop if \|g_k\| \le \epsilon.
Step 2: find \alpha_k = \sigma_0 \sigma^{i_k} such that
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g(x_k)^T d_k,
where i_k \in \{0, 1, 2, \ldots\} is the smallest nonnegative integer satisfying this inequality.
Step 3: set x_{k+1} = x_k + \alpha_k d_k.
Step 4: stop if \|g_{k+1}\| \le \epsilon.
Step 5: compute d_{k+1} by (11).
Step 6: set k = k + 1 and go to Step 2.

ALGORITHM 1: Three-term conjugate gradient algorithm.


which implies that (13) holds by choosing \beta \in [3, +\infty). We complete the proof.

Remark 1. Relation (13) is the so-called trust region feature, and the above theorem tells us that direction (11) has not only the sufficient descent property but also the trust region feature. Both relations (12) and (13) make the proof of the global convergence of Algorithm 1 easy to establish.
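Algorithm 1 can be sketched in code as follows. This is an illustrative reimplementation, not the authors' MATLAB code; the fallback d_{k+1} = -g_{k+1} when \beta_k^* = 0 and the iteration cap are our own safeguards, and the default parameters follow the choices reported in Section 4:

```python
import numpy as np

def algorithm1(f, grad, x0, eps=1e-6, sigma0=0.1, sigma=0.5,
               delta=0.9, mu=300.0, max_iter=20000):
    """Three-term CG method of Algorithm 1 with direction (11)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial step: d_0 = -g_0
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:         # Step 1
            break
        # Step 2: find alpha_k = sigma0 * sigma**i_k (Armijo condition)
        alpha = sigma0
        while f(x + alpha * d) > f(x) + delta * alpha * (g @ d):
            alpha *= sigma
        x1 = x + alpha * d                   # Step 3
        g1 = grad(x1)
        if np.linalg.norm(g1) <= eps:        # Step 4
            return x1
        # Step 5: direction (11) built from y*_k
        s, y = x1 - x, g1 - g
        var = 2.0 * (f(x) - f(x1)) + (g1 + g) @ s
        y_star = y + (max(var, 0.0) / (s @ s)) * s
        t = (g1 @ y_star) / (g @ g)
        beta = t - min(t, mu * (y_star @ y_star) * (g1 @ d) / (g @ g) ** 2)
        if beta != 0.0:
            c = abs(beta) * np.linalg.norm(d) / np.linalg.norm(g1)
            d = -g1 + (beta * d - beta * ((d @ g1) / (g1 @ g1)) * g1) / c
        else:
            d = -g1                          # assumed fallback when c*_k is undefined
        x, g = x1, g1                        # Step 6
    return x
```

Because every direction satisfies (12), the Armijo backtracking in Step 2 always terminates, which is what the convergence proof in the next section exploits.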

3. Global Convergence

For nonconvex functions, the global convergence of Algorithm 1 is established under the following assumptions.

Assumption 1. Assume that the function f(x) has at least one stationary point x^*, namely, g(x^*) = 0 is true. Suppose that the level set L_0 = \{x \mid f(x) \le f(x_0)\} is bounded.

Assumption 2. The function f(x) is twice continuously differentiable and bounded below, and its gradient g(x) is Lipschitz continuous; namely, there exists a positive constant L > 0 such that

\[ \|g(x) - g(y)\| \le L \|x - y\|, \quad \forall x, y \in \mathbb{R}^n. \tag{16} \]

Now we prove the global convergence of Algorithm 1.

Theorem 2. Let Assumptions 1 and 2 hold. Then we get

\[ \lim_{k \to \infty} \|g_k\| = 0. \tag{17} \]

Proof. Using (12) and Step 2 of Algorithm 1, we obtain

\[ f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g(x_k)^T d_k = f(x_k) - \delta \alpha_k \|g(x_k)\|^2, \tag{18} \]

which means that the sequence \{f(x_k)\} is decreasing and the relation

\[ \delta \alpha_k \|g(x_k)\|^2 \le f(x_k) - f(x_k + \alpha_k d_k) \tag{19} \]

is true. Summing the above inequalities for k = 0 to \infty and using Assumption 1, we deduce that

\[ \sum_{k=0}^{\infty} \delta \alpha_k \|g(x_k)\|^2 \le f(x_0) - f_{\infty} < +\infty \tag{20} \]

holds. Thus, we have

\[ \lim_{k \to \infty} \alpha_k \|g(x_k)\|^2 = 0. \tag{21} \]

This implies that

\[ \lim_{k \to \infty} \|g(x_k)\| = 0 \tag{22} \]

or

\[ \lim_{k \to \infty} \alpha_k = 0. \tag{23} \]

If (22) holds, the proof of this theorem is complete. Assuming instead that (23) is true, we aim to get (17). Since the stepsize \alpha_k satisfies the line search inequality of Step 2 of Algorithm 1 while \alpha_k^* = \alpha_k / \sigma does not, we have

\[ f(x_k + \alpha_k^* d_k) > f(x_k) + \delta \alpha_k^* d_k^T g(x_k). \tag{24} \]

By (12) and (13) and the well-known mean value theorem, we obtain

\[
\begin{aligned}
f(x_k + \alpha_k^* d_k) - f(x_k) &= \alpha_k^* d_k^T g(x_k) + (\alpha_k^*)^2 O(\|d_k\|^2) \\
&= -\frac{\alpha_k}{\sigma} \|g(x_k)\|^2 + \frac{\alpha_k^2}{\sigma^2} O(\|d_k\|^2) \\
&> \delta \alpha_k^* d_k^T g(x_k) = -\frac{\delta \alpha_k}{\sigma} \|g(x_k)\|^2,
\end{aligned} \tag{25}
\]

which implies that

\[ \alpha_k > \frac{\sigma (1 - \delta) \|g(x_k)\|^2}{O(\|d_k\|^2)} \ge \sigma (1 - \delta) O\!\left( \frac{1}{\beta^2} \right) > 0 \tag{26} \]

is true. This contradicts (23), so only relation (22) can hold. We complete the proof.

Remark 2. The proof of global convergence is very simple, since the defined direction (11) has not only the good sufficient descent property (12) but also the perfect trust region feature (13).

4. Numerical Results

The numerical experiments for image restoration problems and compression sensing are done with Algorithm 1 and the normal PRP algorithm, respectively. All codes are run on a PC with an Intel(R) Core(TM) i7-7700T CPU at 2.9 GHz, 16.00 GB of RAM, and the Windows 10 operating system, and are written in MATLAB R2014a. The parameters are chosen as \sigma = 0.5, \sigma_0 = 0.1, \delta = 0.9, and \mu = 300.

4.1. Image Restoration Problems. Let x be the true image with M \times N pixels and (i, j) \in A = \{1, 2, \ldots, M\} \times \{1, 2, 3, \ldots, N\}; at a pixel location (i, j), x_{ij} denotes the gray level of x. Then, a set \mathcal{N} is defined by

\[ \mathcal{N} := \{ (i, j) \in A \mid \bar{\zeta}_{ij} \ne \zeta_{ij}, \; \zeta_{ij} = s_{\min} \text{ or } s_{\max} \}, \tag{27} \]

which is the index set of the noise candidates. Suppose that \zeta is the observed noisy image of x corrupted by salt-and-pepper noise, and let \phi_{ij} = \{(i, j - 1), (i, j + 1), (i - 1, j), (i + 1, j)\} be the neighborhood of (i, j). The image \bar{\zeta} is obtained by applying an adaptive median filter to the noisy image \zeta; s_{\max} denotes the maximum value of a noisy pixel, and s_{\min} denotes the minimum value of a noisy pixel. The following conclusions can be obtained: (i) if (i, j) \in \mathcal{N}, then \zeta_{ij} must


be restored; otherwise, a pixel (i, j) is identified as uncorrupted and its original value is kept, which means that w_{ij}^* = \zeta_{ij}, where w_{ij}^* is the corresponding element of the image w denoised by the two-phase method. (ii) If (i, j) \notin \mathcal{N} holds, w_{mn}^* = \zeta_{mn} is set, and \zeta_{mn} is restored if (m, n) \in \phi_{ij} \cap \mathcal{N}. Chan et al. [31] presented a function f_\alpha without a nonsmooth term and minimized it to obtain the restored images; it has the following form:

\[ f_\alpha(w) = \sum_{(i,j) \in \mathcal{N}} \left\{ \sum_{(m,n) \in \phi_{ij} \setminus \mathcal{N}} \psi_\alpha (w_{ij} - \zeta_{mn}) + \frac{1}{2} \sum_{(m,n) \in \phi_{ij} \cap \mathcal{N}} \psi_\alpha (w_{ij} - w_{mn}) \right\}, \tag{28} \]

Figure 1: Restoration of Barbara, Man, Baboon, and Lena by Algorithm 1 and the PRP algorithm. From left to right: a noisy image with 20% salt-and-pepper noise and the restorations obtained by minimizing (28) with the PRP algorithm and Algorithm 1.


where \alpha is a constant and \psi_\alpha is an even edge-preserving potential function. The numerical performance of f_\alpha is noteworthy [32, 33].
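As an illustration of this two-phase setup, the sketch below detects the noise candidates of (27) and evaluates the smooth functional (28). Two simplifications are assumed for brevity: a plain 3 × 3 median filter stands in for the adaptive median filter, and the even edge-preserving potential is taken as \psi_\alpha(t) = (t^2 + \alpha)^{1/2}, one common choice from the literature [32]; neither is claimed to be the exact setup of the experiments.

```python
import numpy as np

def noise_candidates(zeta, smin=0, smax=255):
    """Boolean mask of the index set N in (27); a plain 3x3 median
    filter is used in place of the adaptive median filter."""
    M, N = zeta.shape
    p = np.pad(zeta, 1, mode='edge')
    # stack the nine 3x3-shifted views and take the pixelwise median
    stack = np.stack([p[i:i + M, j:j + N] for i in range(3) for j in range(3)])
    med = np.median(stack, axis=0)
    return (med != zeta) & ((zeta == smin) | (zeta == smax))

def f_alpha(w, zeta, mask, alpha=1.0):
    """Evaluate (28) with psi_alpha(t) = sqrt(t**2 + alpha)."""
    psi = lambda t: np.sqrt(t * t + alpha)
    M, N = zeta.shape
    total = 0.0
    for (i, j) in zip(*np.nonzero(mask)):
        for (m, n) in ((i, j - 1), (i, j + 1), (i - 1, j), (i + 1, j)):
            if 0 <= m < M and 0 <= n < N:
                if mask[m, n]:
                    total += 0.5 * psi(w[i, j] - w[m, n])   # noisy neighbor
                else:
                    total += psi(w[i, j] - zeta[m, n])      # clean neighbor
    return total
```

Restoring a corrupted pixel toward the values of its clean neighbors lowers f_\alpha, which is what Algorithm 1 or the PRP algorithm does by minimizing (28) over the pixels in \mathcal{N}.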

We choose Barbara (512 × 512), Man (256 × 256), Baboon (512 × 512), and Lena (256 × 256) as the test images. The well-known PRP CG algorithm (PRP algorithm) is also run for comparison with Algorithm 1. The detailed performances are shown in Figures 1 and 2.

Figures 1 and 2 show that both algorithms (Algorithm 1 and the PRP algorithm) successfully solve these image restoration problems, and the results are good.

To compare their performances directly, the restoration quality is assessed by the peak signal-to-noise ratio (PSNR) defined in [34–36], which is computed and listed in Table 1. From the values in Table 1, we can see that Algorithm 1 is competitive with the PRP algorithm, since its PSNR values are only slightly lower than those of the PRP algorithm.
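The PSNR values in Table 1 follow the standard definition, PSNR = 10 log10(peak² / MSE) (cf. [34]); a minimal sketch, assuming 8-bit images with peak value 255:

```python
import numpy as np

def psnr(x, x_true, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(peak**2 / MSE)."""
    diff = np.asarray(x, dtype=float) - np.asarray(x_true, dtype=float)
    mse = np.mean(diff ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

A larger PSNR indicates a restoration closer to the true image.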

4.2. Compressive Sensing. In this section, the following compressive sensing images are tested: Phantom (256 × 256), Fruits (256 × 256), and Boat (256 × 256). These three images are treated as 256 column vectors, and the size of the

Figure 2: Restoration of Barbara, Man, Baboon, and Lena by Algorithm 1 and the PRP algorithm. From left to right: a noisy image with 40% salt-and-pepper noise and the restorations obtained by minimizing (28) with the PRP algorithm and Algorithm 1.


observation matrix is 100 × 256. The well-known Fourier transform technique is used, and the measurements are taken in the Fourier domain.
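The measurement model just described can be sketched as follows; the random choice of sampled frequencies and the sparse test signal are illustrative assumptions, and the reconstruction step (minimizing a sparsity-regularized objective with Algorithm 1 or the PRP algorithm) is omitted:

```python
import numpy as np

# Each 256-dimensional signal slice x is observed through m = 100
# randomly selected rows of the 256-point unitary DFT matrix.
rng = np.random.default_rng(1)
n, m = 256, 100
rows = rng.choice(n, size=m, replace=False)      # sampled frequencies
F = np.fft.fft(np.eye(n)) / np.sqrt(n)           # unitary DFT matrix
A = F[rows, :]                                   # 100 x 256 observation matrix
x = np.zeros(n)
x[[10, 50, 200]] = [1.0, -2.0, 0.5]              # sparse test signal
y = A @ x                                        # Fourier-domain measurements
```

With m < n, the system y = A x is underdetermined, and sparsity of x is what makes recovery from the 100 Fourier measurements possible.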

Figures 3–5 show that both algorithms work well on these images and can successfully recover them.

5. Conclusion

This paper studies unconstrained optimization problems by designing a CG algorithm. The given method possesses not only the sufficient descent property but also


Figure 3: Phantom: (a) the original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.


Figure 4: Fruits: (a) the original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.

Table 1: PSNR of Algorithm 1 and the PRP algorithm.

20% noise       Barbara    Man        Baboon     Lena       Average
Algorithm 1     31.115     38.0355    29.4393    41.0674    34.9143
PRP algorithm   31.1118    37.9583    29.4534    41.356     34.969

40% noise       Barbara    Man        Baboon     Lena       Average
Algorithm 1     27.5415    34.0063    25.8947    36.6496    31.0230
PRP algorithm   27.6153    34.5375    25.8571    36.701     31.1777


Figure 5: Boat: (a) the original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.


the trust region feature. The global convergence is proved in a simple way. The image restoration and compressive sensing problems are tested to show that the proposed algorithm performs comparably to the normal PRP algorithm. In the future, we will focus on the following aspects: (i) we believe there are many other efficient CG algorithms that can be successfully used for image restoration problems and compressive sensing; (ii) more experiments will be done to test the performance of the new algorithm.

Data Availability

All data are included in the paper.

Conflicts of Interest

The authors declare that there are no potential conflicts of interest.

Acknowledgments

The authors would like to thank the funding agencies for their support. This work was supported by the National Natural Science Foundation of China under Grant no. 61772006, the Science and Technology Program of Guangxi under Grant no. AB17129012, the Science and Technology Major Project of Guangxi under Grant no. AA17204096, the Special Fund for Scientific and Technological Bases and Talents of Guangxi under Grant no. 2016AD05050, and the Special Fund for Bagui Scholars of Guangxi.

References

[1] Y. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, pp. 177–182, 2000.

[2] R. Fletcher, Practical Methods of Optimization, John Wiley and Sons, New York, NY, USA, 2nd edition, 1987.

[3] R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.

[4] M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.

[5] Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.

[6] B. T. Polyak, "The conjugate gradient method in extremal problems," Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.

[7] E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.

[8] Y. Dai, "Convergence properties of the BFGS algorithm," SIAM Journal on Optimization, vol. 13, no. 3, pp. 693–701, 2002.

[9] Y. Dai, "Analysis of conjugate gradient methods," Ph.D. thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, 1997.

[10] M. J. D. Powell, "Nonconvex minimization calculations and the conjugate gradient method," in Lecture Notes in Mathematics, vol. 1066, pp. 122–141, Springer-Verlag, Berlin, Germany, 1984.

[11] M. J. D. Powell, "Convergence properties of algorithms for nonlinear optimization," SIAM Review, vol. 28, no. 4, pp. 487–500, 1986.

[12] G. Yuan, "Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems," Optimization Letters, vol. 3, no. 1, pp. 11–21, 2009.

[13] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.

[14] W. W. Hager and H. Zhang, "A new conjugate gradient method with guaranteed descent and an efficient line search," SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005.

[15] W. W. Hager and H. Zhang, "Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent," ACM Transactions on Mathematical Software, vol. 32, no. 1, pp. 113–137, 2006.

[16] Z. Wei, S. Yao, and L. Liu, "The convergence properties of some new conjugate gradient methods," Applied Mathematics and Computation, vol. 183, no. 2, pp. 1341–1350, 2006.

[17] G. Yuan, Z. Wei, and X. Lu, "Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search," Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017.

[18] X. Li, S. Wang, Z. Jin, and H. Pham, "A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models," Mathematical Problems in Engineering, vol. 2018, Article ID 4729318, 11 pages, 2018.

[19] G. Yuan, Z. Sheng, B. Wang, W. Hu, and C. Li, "The global convergence of a modified BFGS method for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 327, pp. 274–294, 2018.

[20] G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.

[21] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2020.

[22] G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.

[23] G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.

[24] G. Yuan, Z. Meng, and Y. Li, "A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations," Journal of Optimization Theory and Applications, vol. 168, no. 1, pp. 129–152, 2016.

[25] G. Yu, "Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems," Doctoral thesis, Sun Yat-Sen University, Guangzhou, China, 2007.

[26] L. Zhang, W. Zhou, and D. Li, "Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search," Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006.


[27] Z.-F. Dai and B.-S. Tian, "Global convergence of some modified PRP nonlinear conjugate gradient methods," Optimization Letters, vol. 5, no. 4, pp. 615–630, 2011.

[28] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2019.

[29] G. Yuan, Z. Wei, and Q. Zhao, "A modified Polak-Ribière-Polyak conjugate gradient algorithm for large-scale optimization problems," IIE Transactions, vol. 46, no. 4, pp. 397–413, 2014.

[30] G. Yuan and Z. Wei, "Convergence analysis of a modified BFGS method on convex minimizations," Computational Optimization and Applications, vol. 47, no. 2, pp. 237–255, 2010.

[31] R. H. Chan, C. W. Ho, C. Y. Leung, and M. Nikolova, "Minimization of detail-preserving regularization functional by Newton's method with continuation," in Proceedings of the IEEE International Conference on Image Processing, pp. 125–128, Genova, Italy, September 2005.

[32] J. F. Cai, R. H. Chan, and B. Morini, "Minimization of an edge-preserving regularization functional by conjugate gradient type methods," in Image Processing Based on Partial Differential Equations, pp. 109–122, Springer, Berlin, Germany, 2007.

[33] Y. Dong, R. H. Chan, and S. Xu, "A detection statistic for random-valued impulse noise," IEEE Transactions on Image Processing, vol. 16, no. 4, pp. 1112–1120, 2007.

[34] A. Bovik, Handbook of Image and Video Processing, Academic Press, New York, NY, USA, 2000.

[35] F. Rahpeymaii, K. Amini, T. Allahviranloo, and M. R. Malkhalifeh, "A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations," Calcolo, vol. 56, 2019.

[36] G. Yu, J. Huang, and Y. Zhou, "A descent spectral conjugate gradient method for impulse noise removal," Applied Mathematics Letters, vol. 23, no. 5, pp. 555–560, 2010.


Page 2: ADescentConjugateGradientAlgorithmforOptimization ...Polak–Ribi`ere–Polak(PRP)formula[6,7]isoneofthewell-knownnonlinearCGformulaswith βPRP k gT k g k−g −1 g k−1 2, (4) whereg

is one of the well-known open problems in optimizationfields Based on the PRP formula many modified nonlinearCG formulas are done ([13ndash16] etc) because many scholarswant to use the perfect numerical attitude of it RecentlyYuan et al [17] open up a new way by modifying the WWPline search technique and partly proved the global con-vergence of the PRP algorithm Further results are obtained(see [18ndash20] etc) by this technique It has been proved thatthe nonlinear CG algorithms can be used in nonlinearequations nonsmooth optimization and image restorationproblems (see [21ndash24] etc)

We all know that the sufficient descent propertydesigned by

dTk gk le minus c gk

2 cgt 0 (5)

plays an important role for convergence analysis in CGmethods (see [13 14 24] etc) where cgt 0 is a constant)ere is another crucial condition about the scalar βk ge 0 thathas been pointed out by Powell [10] and further emphasizedin the global convergence [11 12] )us under the as-sumption of the sufficient descent condition and the WWPtechnique a modified PRP formula βPRP+

k max 0 βPRPk1113966 1113967 ispresented by Gilbert and Nocedal [13] and its globalconvergence for nonconvex functions is established All ofthese observations tell us that both property (5) and βk ge 0are very important in the CG algorithms To get one of theconditions or both of them many scholars made a furtherstudy and got many interesting results Yu [25] presented amodified PRP nonlinear CG formula designed by

βmPRP1k

gTk+1yk

gk

minus μyk

g

Tk+1dk

gk

4 (6)

where μgt (14) is a positive constant and yk gk+1 minus gkwhich has property (5) with c 1 minus (14μ) Yuan [12]proposed a further formula defined by

βmPRP2k

gTk+1yk

gk

2 minus min

gTk+1yk

gk

2 μ

yk

g

Tk+1dk

gk

4

⎧⎨

⎫⎬

⎭ (7)

which possesses not only property (5) with c 1 minus (14μ)

but also the scalar βmPRP2k ge 0 To get a greater drop a three-

term FR CG formula is given by Zhang et al [26]

dk+1 minus θkgk+1 + βFRk dk

θk d

Tk yk

gk

2

(8)

where it has (5) with c minus 1 Dai and Tian [27] gave anotherCG direction designed by

dk+1

minus 1 + βk

dTk gk+1

gk

2

⎛⎝ ⎞⎠gk+1 + βkdk if kge 0

minus gk if k 0

⎧⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎩

(9)

which also possesses (5) with c minus 1)e global convergenceof the above CG method is proved by Dai and Tian [27] forβk βYuanPRPk and βk βYuPRPk For nonconvex functions andthe effective Armijo line search they did not analyze themOne the main reasons lies in the trust region feature Toovercome it we [28] proposed a CG formula designed by

dk+1

minus gk if k 0

minus gk+1 +βkdk minus βk d

Tk gk+1 gk+1

2

1113874 1113875gk+1

ck

if kge 0

⎧⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎩

(10)

where ck (|βk|dkgk+1) which possesses not only (5)with c minus 1 but also the trust region property It has beenproved that the CG formula will have better numericalperformance if it possesses not only the gradient value in-formation but also the function value [29] )is motivates usto present a CG formula based on (10) designed by

dk+1

minus gk if k 0

minus gk+1 +βlowastk dk minus βlowastk d

Tk gk+1 gk+1

2

1113874 1113875gk+1

clowastk

if kge 0

⎧⎪⎪⎪⎪⎪⎨

⎪⎪⎪⎪⎪⎩

(11)

where clowastk (|βlowastk |dkgk+1) βlowastk βmPRP2lowastk (gT

k+1ylowastk

gk 2) minus min (gTk+1ylowastk gk2) μ(ylowastk gT

k+1dkgk4)1113966 1113967 andylowastk yk + ρk with ρk (max ϱk 01113864 1113865sk2) and ϱk

2[f(xk) minus f(xk+1)] + [gk+1 + gk]Tsk and sk xk+1 minus xk)e new vector ylowastk [30] has been proved that it has somegood properties in theory and experiment Yuan et al [29]use it in the CG formula and get some good results )eseachievements inspire us to propose the new CG direction(11) and this paper possesses the following features

(i) The sufficient descent property and the trust region feature are obtained.

(ii) The new direction possesses not only the gradient value but also the function value.

(iii) The given algorithm has global convergence under the Armijo line search for nonconvex functions.

(iv) Experiments on image restoration problems and compression sensing are done to test the performance of the new algorithm.

The next section states the given algorithm. The convergence analysis is given in Section 3, and experiments are done in Section 4. The last section presents the conclusion.

2. Algorithm

Based on the discussion in the above section, the CG algorithm is listed as Algorithm 1.


Theorem 1. The direction $d_k$ is defined by (11); then there exists a positive $\beta > 0$ satisfying
\[
d_k^{T} g_k = -\|g_k\|^2, \quad \forall k \ge 0, \tag{12}
\]
\[
\|d_k\| \le \beta \|g_k\|, \quad \forall k \ge 0. \tag{13}
\]

Proof. By (11), we directly get (12) and (13) for $k = 0$ with $\beta = 1$. If $k \ge 0$, using (11) again, we have

\[
\begin{aligned}
g_{k+1}^{T} d_{k+1} &= g_{k+1}^{T}\left[-g_{k+1} + \frac{\beta_k^{mPRP2*} d_k - \beta_k^{mPRP2*}\left(d_k^{T} g_{k+1}/\|g_{k+1}\|^2\right) g_{k+1}}{c_k^{*}}\right]\\
&= -g_{k+1}^{T} g_{k+1} + \frac{\beta_k^{mPRP2*} g_{k+1}^{T} d_k - \beta_k^{mPRP2*}\left(d_k^{T} g_{k+1}/\|g_{k+1}\|^2\right) g_{k+1}^{T} g_{k+1}}{\left|\beta_k^{mPRP2*}\right| \|d_k\|/\|g_{k+1}\|}\\
&= -\|g_{k+1}\|^2 + \frac{\beta_k^{mPRP2*} g_{k+1}^{T} d_k - \beta_k^{mPRP2*} d_k^{T} g_{k+1}}{\left|\beta_k^{mPRP2*}\right| \|d_k\|/\|g_{k+1}\|}\\
&= -\|g_{k+1}\|^2;
\end{aligned} \tag{14}
\]

then (12) is true. By (11) again, we can get

\[
\begin{aligned}
\|d_{k+1}\| &= \left\|-g_{k+1} + \frac{\beta_k^{mPRP2*} d_k - \beta_k^{mPRP2*}\left(d_k^{T} g_{k+1}/\|g_{k+1}\|^2\right) g_{k+1}}{c_k^{*}}\right\|\\
&\le \|g_{k+1}\| + \frac{\left|\beta_k^{mPRP2*}\right|\|d_k\| + \left|\beta_k^{mPRP2*}\right|\|d_k\|\left(\|g_{k+1}\|\,\|g_{k+1}\|/\|g_{k+1}\|^2\right)}{\left|\beta_k^{mPRP2*}\right|\|d_k\|/\|g_{k+1}\|}\\
&= \|g_{k+1}\| + \frac{2\left|\beta_k^{mPRP2*}\right|\|d_k\|\,\|g_{k+1}\|}{\left|\beta_k^{mPRP2*}\right|\|d_k\|}\\
&= 3\|g_{k+1}\|,
\end{aligned} \tag{15}
\]

Initial step: given any initial point $x_0 \in R^n$ and positive constants $\epsilon \in (0, 1)$, $\sigma_0 > 0$, $\delta \in (0, 1)$, $\mu > 1/4$, $\sigma \in (0, 1)$, set $d_0 = -g_0 = -\nabla f(x_0)$ and $k = 0$.
Step 1: stop if $\|g_k\| \le \epsilon$ is true.
Step 2: find $\alpha_k = \sigma_0 \sigma^{i_k}$ such that
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g(x_k)^{T} d_k,
\]
where $i_k \in \{0, 1, 2, \ldots\}$ is the smallest nonnegative integer satisfying this inequality.
Step 3: set $x_{k+1} = x_k + \alpha_k d_k$.
Step 4: stop if $\|g_{k+1}\| \le \epsilon$ holds.
Step 5: compute $d_{k+1}$ by (11).
Step 6: set $k = k + 1$ and go to Step 2.

Algorithm 1: Three-term conjugate gradient algorithm.
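A minimal sketch of Algorithm 1 in Python with NumPy follows (the paper's experiments use MATLAB). The backtracking Armijo rule and direction (11) match the steps above; the safeguard against vanishingly small steps is our addition, not part of the paper.

```python
import numpy as np

def cg_algorithm1(f, grad, x0, eps=1e-6, sigma0=0.1, sigma=0.5,
                  delta=0.9, mu=300.0, max_iter=500):
    """Sketch of Algorithm 1: three-term CG with Armijo backtracking.
    Parameter names and defaults mirror the choices reported in Section 4."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) <= eps:                 # Step 1
            break
        fx, gd = f(x), g @ d
        alpha = sigma0                               # Step 2: alpha = sigma0 * sigma**i
        while f(x + alpha * d) > fx + delta * alpha * gd:
            alpha *= sigma
            if alpha < 1e-16:                        # safeguard (our addition)
                break
        x_new = x + alpha * d                        # Step 3
        g_new = grad(x_new)
        if np.linalg.norm(g_new) <= eps:             # Step 4
            x, g = x_new, g_new
            break
        # Step 5: direction (11) with beta*_k and y*_k from Section 1
        s, y = x_new - x, g_new - g
        varrho = 2.0 * (fx - f(x_new)) + (g_new + g) @ s
        y_star = y + (max(varrho, 0.0) / (s @ s)) * s
        gk2 = g @ g
        t1 = (g_new @ y_star) / gk2
        t2 = mu * (y_star @ y_star) * (g_new @ d) / gk2 ** 2
        beta = t1 - min(t1, t2)
        gn = np.linalg.norm(g_new)
        if beta != 0.0:
            c = abs(beta) * np.linalg.norm(d) / gn
            d = -g_new + (beta * d - beta * ((d @ g_new) / gn ** 2) * g_new) / c
        else:
            d = -g_new                               # degenerate case (assumed)
        x, g = x_new, g_new                          # Step 6
    return x
```

For example, `cg_algorithm1(lambda x: float(x @ x), lambda x: 2.0 * x, np.array([3.0, -2.0]))` drives the iterates of the simple quadratic $f(x) = \|x\|^2$ to the origin.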


which implies that (13) holds by choosing $\beta \in [3, +\infty)$. We complete the proof.

Remark 1. Relation (13) is the so-called trust region feature, and the above theorem tells us that direction (11) has not only the sufficient descent property but also the trust region feature. Both relations (12) and (13) make the proof of the global convergence of Algorithm 1 easy to establish.

3. Global Convergence

For nonconvex functions, the global convergence of Algorithm 1 is established under the following assumptions.

Assumption 1. The function $f(x)$ has at least one stationary point $x^{*}$; namely, $g(x^{*}) = 0$ holds. The level set $L_0 = \{x \mid f(x) \le f(x_0)\}$ is bounded.

Assumption 2. The function $f(x)$ is twice continuously differentiable and bounded below, and its gradient $g(x)$ is Lipschitz continuous; that is, there exists a positive constant $L > 0$ such that
\[
\|g(x) - g(y)\| \le L \|x - y\|, \quad \forall x, y \in R^n. \tag{16}
\]

Now, we prove the global convergence of Algorithm 1.

Theorem 2. Let Assumptions 1 and 2 hold. Then we get
\[
\lim_{k \to \infty} \|g_k\| = 0. \tag{17}
\]

Proof. Using (12) and Step 2 of Algorithm 1, we obtain
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g(x_k)^{T} d_k = f(x_k) - \delta \alpha_k \|g(x_k)\|^2, \tag{18}
\]

which means that the sequence $\{f(x_k)\}$ is decreasing and the relation
\[
\delta \alpha_k \|g(x_k)\|^2 \le f(x_k) - f(x_k + \alpha_k d_k) \tag{19}
\]

is true. Summing these inequalities for $k$ from $0$ to $\infty$ and using Assumption 1, we deduce that
\[
\sum_{k=0}^{\infty} \delta \alpha_k \|g(x_k)\|^2 \le f(x_0) - f_{\infty} < +\infty \tag{20}
\]

holds. Thus, we have
\[
\lim_{k \to \infty} \alpha_k \|g(x_k)\|^2 = 0. \tag{21}
\]

This implies that
\[
\lim_{k \to \infty} \|g(x_k)\| = 0 \tag{22}
\]
or
\[
\lim_{k \to \infty} \alpha_k = 0. \tag{23}
\]

If (22) holds, the proof of this theorem is complete. Assuming instead that (23) is true, we aim to get (17). Since the stepsize $\alpha_k$ satisfies the line search rule in Step 2 of Algorithm 1, for $\alpha_k^{*} = \alpha_k/\sigma$ we have
\[
f(x_k + \alpha_k^{*} d_k) > f(x_k) + \delta \alpha_k^{*} d_k^{T} g(x_k). \tag{24}
\]

By (12) and (13) and the well-known mean value theorem, we obtain
\[
\begin{aligned}
f(x_k + \alpha_k^{*} d_k) - f(x_k) &= \alpha_k^{*} d_k^{T} g(x_k) + (\alpha_k^{*})^2 O\left(\|d_k\|^2\right)\\
&= -\frac{\alpha_k}{\sigma}\|g(x_k)\|^2 + \frac{\alpha_k^2}{\sigma^2} O\left(\|d_k\|^2\right)\\
&> \delta \alpha_k^{*} d_k^{T} g(x_k) = -\frac{\delta \alpha_k}{\sigma}\|g(x_k)\|^2,
\end{aligned} \tag{25}
\]

which implies that
\[
\alpha_k > \frac{\sigma(1 - \delta)\|g(x_k)\|^2}{O\left(\|d_k\|^2\right)} \ge \sigma(1 - \delta)\, O\!\left(\frac{1}{\beta^2}\right) \tag{26}
\]

is true. This contradicts (23), so only relation (22) holds. We complete the proof.

Remark 2. The proof of the global convergence is very simple, since the defined direction (11) has not only the good sufficient descent property (12) but also the trust region feature (13).

4. Numerical Results

Numerical experiments on image restoration problems and compression sensing are done with Algorithm 1 and the normal PRP algorithm, respectively. All codes are run on a PC with an Intel(R) Core(TM) i7-7700T CPU at 2.9 GHz, 16.00 GB of RAM, and the Windows 10 operating system, and are written in MATLAB R2014a. The parameters are chosen as $\sigma = 0.5$, $\sigma_0 = 0.1$, $\delta = 0.9$, and $\mu = 300$.

4.1. Image Restoration Problems. Let $x$ be the true image with $M \times N$ pixels and $(i, j) \in A = \{1, 2, \ldots, M\} \times \{1, 2, \ldots, N\}$. At a pixel location $(i, j)$, $x_{ij}$ denotes the gray level of $x$. Suppose that $\zeta$ is the observed noisy image of $x$ corrupted by salt-and-pepper noise, let $\phi_{ij} = \{(i, j - 1), (i, j + 1), (i - 1, j), (i + 1, j)\}$ be the neighborhood of $(i, j)$, and let $\bar{\zeta}$ be the image obtained by applying an adaptive median filter to the noisy image $\zeta$, where $s_{\max}$ denotes the maximum of a noisy pixel and $s_{\min}$ denotes the minimum of a noisy pixel. Then, the index set of the noise candidates is defined by
\[
N := \left\{(i, j) \in A \mid \bar{\zeta}_{ij} \ne \zeta_{ij},\ \zeta_{ij} = s_{\min} \text{ or } s_{\max}\right\}. \tag{27}
\]
The following conclusions can be obtained: (i) if $(i, j) \in N$, then $\zeta_{ij}$ must be restored; a pixel $(i, j) \notin N$ is identified as uncorrupted, and its original value is kept, which means that $w_{ij}^{*} = \zeta_{ij}$, where $w_{ij}^{*}$ is the corresponding element of the image $w$ denoised by the two-phase method; (ii) if $(i, j) \notin N$ holds, $w_{mn}^{*} = \zeta_{mn}$ is set, and $\zeta_{mn}$ is restored if $(m, n) \in \phi_{ij} \cap N$. Chan et al. [31] presented the function $f_\alpha$ without a nonsmooth term and minimized it to obtain the restored images; it has the following form:

\[
f_\alpha(w) = \sum_{(i,j) \in N} \left\{ \sum_{(m,n) \in \phi_{ij} \setminus N} \psi_\alpha\left(w_{ij} - \zeta_{mn}\right) + \frac{1}{2} \sum_{(m,n) \in \phi_{ij} \cap N} \psi_\alpha\left(w_{ij} - w_{mn}\right) \right\}, \tag{28}
\]

Figure 1: Restoration of Barbara, Man, Baboon, and Lena by Algorithm 1 and the PRP algorithm. From left to right: a noisy image with 20% salt-and-pepper noise and restorations obtained by minimizing $f_\alpha$ with the PRP algorithm and Algorithm 1.


where $\alpha$ is a constant and $\psi_\alpha$ is an even edge-preserving potential function. The numerical performance of $f_\alpha$ is noteworthy [32, 33].
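A sketch of evaluating (28) in Python follows (the paper's code is MATLAB). The specific potential $\psi_\alpha(t) = \sqrt{t^2 + \alpha}$ is one common even edge-preserving choice and is our assumption here, since the paper only requires $\psi_\alpha$ to be even and edge-preserving; the default $\alpha$ is likewise illustrative.

```python
import numpy as np

def psi(t, alpha=1e-2):
    # An even, edge-preserving potential (an illustrative assumption).
    return np.sqrt(t * t + alpha)

def f_alpha(w, zeta, noise_mask, alpha=1e-2):
    """Evaluate the smooth restoration functional (28) for a candidate image w,
    observed image zeta, and a boolean mask marking the noise candidates N."""
    M, N = zeta.shape
    total = 0.0
    nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1)]   # 4-neighborhood phi_ij
    for i in range(M):
        for j in range(N):
            if not noise_mask[i, j]:
                continue                         # sum runs over (i, j) in N only
            for di, dj in nbrs:
                m, n = i + di, j + dj
                if not (0 <= m < M and 0 <= n < N):
                    continue
                if noise_mask[m, n]:
                    total += 0.5 * psi(w[i, j] - w[m, n], alpha)   # noisy neighbor
                else:
                    total += psi(w[i, j] - zeta[m, n], alpha)      # clean neighbor
    return total
```

The factor 1/2 on the noisy-neighbor term compensates for each noisy pair being visited twice in the outer sum.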

We choose Barbara (512 × 512), Man (256 × 256), Baboon (512 × 512), and Lena (256 × 256) as the test images. The well-known PRP CG algorithm (PRP algorithm) is also run for comparison with Algorithm 1. The detailed performances are shown in Figures 1 and 2.

Figures 1 and 2 tell us that both algorithms (Algorithm 1 and the PRP algorithm) successfully solve these image restoration problems, and the results are good.

To compare their performances directly, the restoration quality is assessed by the peak signal-to-noise ratio (PSNR) defined in [34–36], which is computed and listed in Table 1. From Table 1, we can see that Algorithm 1 is competitive with the PRP algorithm, since its average PSNR value is only slightly lower than that of the PRP algorithm.
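For reference, the PSNR values reported in Table 1 follow the standard definition for 8-bit images; a minimal sketch (not the paper's exact code):

```python
import numpy as np

def psnr(restored, original, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    mse = np.mean((restored.astype(float) - original.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

Higher PSNR indicates a restoration closer to the true image; note that the formula is undefined (infinite) when the two images are identical.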

4.2. Compressive Sensing. In this section, the following compressive sensing images are tested: Phantom (256 × 256), Fruits (256 × 256), and Boat (256 × 256). These three images are treated as 256-vectors, and the size of the

Figure 2: Restoration of Barbara, Man, Baboon, and Lena by Algorithm 1 and the PRP algorithm. From left to right: a noisy image with 40% salt-and-pepper noise and restorations obtained by minimizing $f_\alpha$ with the PRP algorithm and Algorithm 1.


observation matrix is 100 × 256. The Fourier transform is used, and the measurements are taken in the Fourier domain.
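The measurement step described above can be sketched as follows. Selecting random rows of a unitary DFT matrix (and the seed) is an illustrative assumption, since the paper does not specify how the 100 × 256 observation matrix is constructed:

```python
import numpy as np

def partial_fourier_measure(x, m, seed=0):
    """Keep m rows of the unitary DFT of a length-n signal, giving an
    m x n complex observation matrix A and the measurements y = A x."""
    n = x.size
    rng = np.random.default_rng(seed)
    rows = rng.choice(n, size=m, replace=False)   # sampled Fourier frequencies
    F = np.fft.fft(np.eye(n)) / np.sqrt(n)        # unitary DFT matrix
    A = F[rows, :]                                # m x n observation matrix
    return A, A @ x
```

Because the DFT matrix is unitary, the selected rows are orthonormal, i.e. $A A^{H} = I_m$, a convenient property for recovery algorithms.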

Figures 3–5 show that these two algorithms work well on these images and can successfully recover them.

5. Conclusion

This paper studies unconstrained optimization problems by designing a CG algorithm. The given method possesses not only the sufficient descent property but also


Figure 3: Phantom: (a) the original image; (b) the image recovered by Algorithm 1; (c) the image recovered by the PRP algorithm.


Figure 4: Fruits: (a) the original image; (b) the image recovered by Algorithm 1; (c) the image recovered by the PRP algorithm.

Table 1: PSNR of Algorithm 1 and the PRP algorithm.

20% noise      Barbara   Man      Baboon   Lena     Average
Algorithm 1    31.1150   38.0355  29.4393  41.0674  34.9143
PRP algorithm  31.1118   37.9583  29.4534  41.3560  34.9690

40% noise      Barbara   Man      Baboon   Lena     Average
Algorithm 1    27.5415   34.0063  25.8947  36.6496  31.0230
PRP algorithm  27.6153   34.5375  25.8571  36.7010  31.1777


Figure 5: Boat: (a) the original image; (b) the image recovered by Algorithm 1; (c) the image recovered by the PRP algorithm.


the trust region feature. The global convergence is proved in a simple way. Image restoration and compressive sensing problems are tested to show that the proposed algorithm is competitive with the normal algorithm. In the future, we will focus on the following aspects: (i) we believe there are many effective CG algorithms that can be successfully applied to image restoration problems and compressive sensing; (ii) more experiments will be done to test the performance of the new algorithm.

Data Availability

All data are included in the paper.

Conflicts of Interest

There are no potential conflicts of interest.

Acknowledgments

The authors would like to thank the support of the funds. This work was supported by the National Natural Science Foundation of China under Grant no. 61772006, the Science and Technology Program of Guangxi under Grant no. AB17129012, the Science and Technology Major Project of Guangxi under Grant no. AA17204096, the Special Fund for Scientific and Technological Bases and Talents of Guangxi under Grant no. 2016AD05050, and the Special Fund for Bagui Scholars of Guangxi.

References

[1] Y. Dai and Y. Yuan, "A nonlinear conjugate gradient with a strong global convergence properties," SIAM Journal on Optimization, vol. 10, pp. 177–182, 2000.

[2] R. Fletcher, Practical Methods of Optimization, John Wiley and Sons, New York, NY, USA, 2nd edition, 1987.

[3] R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.

[4] M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.

[5] Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.

[6] B. T. Polak, "The conjugate gradient method in extreme problems," Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.

[7] E. Polak and G. Ribiere, "Note sur la convergence de methodes de directions conjuguees," Revue française d'informatique et de recherche opérationnelle, Série rouge, vol. 3, no. 16, pp. 35–43, 1969.

[8] Y. Dai, "Convergence properties of the BFGS algorithm," SIAM Journal on Optimization, vol. 13, no. 3, pp. 693–701, 2002.

[9] Y. Dai, "Analysis of conjugate gradient methods," Ph.D. thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, 1997.

[10] M. J. D. Powell, "Nonconvex minimization calculations and the conjugate gradient method," Lecture Notes in Mathematics, vol. 1066, pp. 122–141, Springer-Verlag, Berlin, Germany, 1984.

[11] M. J. D. Powell, "Convergence properties of algorithms for nonlinear optimization," SIAM Review, vol. 28, no. 4, pp. 487–500, 1986.

[12] G. Yuan, "Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems," Optimization Letters, vol. 3, no. 1, pp. 11–21, 2009.

[13] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.

[14] W. W. Hager and H. Zhang, "A new conjugate gradient method with guaranteed descent and an efficient line search," SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005.

[15] W. W. Hager and H. Zhang, "Algorithm 851," ACM Transactions on Mathematical Software, vol. 32, no. 1, pp. 113–137, 2006.

[16] Z. Wei, S. Yao, and L. Liu, "The convergence properties of some new conjugate gradient methods," Applied Mathematics and Computation, vol. 183, no. 2, pp. 1341–1350, 2006.

[17] G. Yuan, Z. Wei, and X. Lu, "Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search," Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017.

[18] X. Li, S. Wang, Z. Jin, and H. Pham, "A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models," Mathematical Problems in Engineering, vol. 2018, Article ID 4729318, 11 pages, 2018.

[19] G. Yuan, Z. Sheng, B. Wang, W. Hu, and C. Li, "The global convergence of a modified BFGS method for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 327, pp. 274–294, 2018.

[20] G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribiere-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.

[21] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2020.

[22] G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.

[23] G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.

[24] G. Yuan, Z. Meng, and Y. Li, "A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations," Journal of Optimization Theory and Applications, vol. 168, no. 1, pp. 129–152, 2016.

[25] G. Yu, "Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems," Doctoral thesis, Sun Yat-Sen University, Guangzhou, China, 2007.

[26] L. Zhang, W. Zhou, and D. Li, "Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search," Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006.

[27] Z.-F. Dai and B.-S. Tian, "Global convergence of some modified PRP nonlinear conjugate gradient methods," Optimization Letters, vol. 5, no. 4, pp. 615–630, 2011.

[28] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2019.

[29] G. Yuan, Z. Wei, and Q. Zhao, "A modified Polak-Ribiere-Polyak conjugate gradient algorithm for large-scale optimization problems," IIE Transactions, vol. 46, no. 4, pp. 397–413, 2014.

[30] G. Yuan and Z. Wei, "Convergence analysis of a modified BFGS method on convex minimizations," Computational Optimization and Applications, vol. 47, no. 2, pp. 237–255, 2010.

[31] R. H. Chan, C. W. Ho, C. Y. Leung, and M. Nikolova, "Minimization of detail-preserving regularization functional by Newton's method with continuation," in Proceedings of the IEEE International Conference on Image Processing, pp. 125–128, Genova, Italy, September 2005.

[32] J. F. Cai, R. H. Chan, and B. Morini, "Minimization of an edge-preserving regularization functional by conjugate gradient type methods," in Image Processing Based on Partial Differential Equations, pp. 109–122, Springer, Berlin, Germany, 2007.

[33] Y. Dong, R. H. Chan, and S. Xu, "A detection statistic for random-valued impulse noise," IEEE Transactions on Image Processing, vol. 16, no. 4, pp. 1112–1120, 2007.

[34] A. Bovik, Handbook of Image and Video Processing, Academic Press, New York, NY, USA, 2000.

[35] F. Rahpeymaii, K. Amini, T. Allahviranloo, and M. R. Malkhalifeh, "A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations," Calcolo, vol. 56, 2019.

[36] G. Yu, J. Huang, and Y. Zhou, "A descent spectral conjugate gradient method for impulse noise removal," Applied Mathematics Letters, vol. 23, no. 5, pp. 555–560, 2010.


Page 3: ADescentConjugateGradientAlgorithmforOptimization ...Polak–Ribi`ere–Polak(PRP)formula[6,7]isoneofthewell-knownnonlinearCGformulaswith βPRP k gT k g k−g −1 g k−1 2, (4) whereg

Theorem 1 e direction dk is defined by (11) then thereexists a positive βgt 0 satisfying

dTk gk minus gk

2 forallkge 0 (12)

dk

le β gk

forallkge 0 (13)

Proof By (11) we directly get (12) and (13) for k 0 withβ minus 1 If kge 0 using (11) again we have

gTk+1dk+1 g

Tk+1 minus gk+1 +

βmPRP2lowastk dk minus βmPRP2lowast

k dTk gk+1 gk+1

2

1113874 1113875gk

clowastk

⎡⎢⎢⎢⎢⎢⎢⎢⎢⎢⎢⎣⎤⎥⎥⎥⎥⎥⎥⎥⎥⎥⎥⎦

minus gTk+1gk+1 +

βmPRP2lowastk g

Tk+1dk minus βmPRP2lowast

k dTk gk+1 gk+1

2

1113874 1113875gTk+1gk+1

βmPRP2lowastk

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

gk+1

1113874 1113875

minus gk+1

2

+βmPRP2lowast

k gTk+1dk minus βmPRP2lowast

k dTk gk+1

βmPRP2lowastk

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

gk+1

1113874 1113875

minus gk+1

2

(14)

then (12) is true By (11) again we can get

dk+1

minus gk+1 + βmPRP2lowastk dk minus βmPRP2lowast

k

dTk gk+1

gk+1

2gk+1c

lowastk

le gk+1

+βmPRP2lowast

k

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

+ βmPRP2lowast

k

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

gk+1

gk+1

2

1113874 1113875 gk+1

βmPRP2lowastk

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

gk+1

1113874 1113875

gk+1

+2 βmPRP2lowast

k

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

βmPRP2lowastk

11138681113868111386811138681113868

11138681113868111386811138681113868 dk

gk+1

1113874 1113875

3 gk+1

(15)

Initial step given any initial point x0 isin Rn and positive constants ϵ isin (0 1) σ0 gt 0 δ isin (0 1) μgt (14) σ isin (0 1) set d0 minus g0

minus nablaf(x0) and k 0Step 1 stop if gkle ε is trueStep 2 find αk σ0σik such that

f(xk + αkdk)lef(xk) + δαkg(xk)Tdk

where ik min 0 1 2 satisfying the equation in Step 2 mentioned in Algorithm 1Step 3 set xk+1 xk + αkdkStep 4 stop if gk+1le ε holdsStep 5 compute dk by (11)Step 6 set k k + 1 and go to Step 2

ALGORITHM 1 )ree-term conjugate gradient algorithm

Mathematical Problems in Engineering 3

which implies that (13) holds by choosing β isin [3 +infin) Wecomplete the proof

Remark 1 )e relation (13) is the so-called trust regionfeature and the above theorem tells us that direction (11) hasnot only the sufficient descent property but also the trustregion feature Both these relations (12) and (13) will makethe proof of the global convergence of Algorithm 1 be easy tobe established

3 Global Convergence

For the nonconvex functions the global convergence ofAlgorithm 1 is established under the following assumptions

Assumption 1 Assume that the function f(x) has at least astationary point xlowast namely g(xlowast) 0 is true Supposethat the level set defined by L0 x | f(x)lef(x0)1113864 1113865 isbounded

Assumption 2 )e function f(x) is twice continuouslydifferentiable and bounded below and its g(x) is Lipschitzcontinuous We also assume that there exists a positiveconstant Lgt 0 such that

g(x) minus g(y)leLx minus y x y isin Rn (16)

Now we prove the global convergence of Algorithm 1

Theorem 2 Let Assumption 1 be true en we get

limk⟶infin gk

0 (17)

Proof Using (12) and the Step 2 of Algorithm 1 we obtain

f xk + αkdk( 1113857lef xk( 1113857 + δαkg xk( 1113857Tdk ltfk minus δαk g xk( 1113857

2

(18)

which means that the sequence f(xk)1113864 1113865 is descent and thefollowing relation

δαk g xk( 1113857

2 lef xk( 1113857 minus f xk + αkdk( 1113857 (19)

is true For k 0 toinfin by summing the above inequalitiesand Assumption 1 we deduce that

1113944

infin

k0δαk g xk( 1113857

2 lef x0( 1113857 minus finfin lt +infin (20)

holds )us we have

limk⟶infinαk g xk( 1113857

2

0 (21)

)is implies that

limk⟶infin g xk( 1113857

0 (22)

or

limk⟶infinαk 0 (23)

Suppose that (22) holds the proof of this theorem iscomplete Assuming that (23) is true we aim to get (17) Letthe stepsize αk satisfy the equation in Step 2 in Algorithm 1for αlowastk (αkσ) we have

f xk + αlowastk dk( 1113857gtf xk( 1113857 + δαlowastk dTk g xk( 1113857 (24)

By (12) and (13) and the well-known mean value the-orem we obtain

f xk + αlowastk dk( 1113857 minus f xk( 1113857 αlowastk dTk g xk( 1113857 + αlowastk( 1113857

2O dk

2

1113874 1113875

minusαk

σg xk( 1113857

2

+α2kσ2

O dk

2

1113874 1113875

gt δαlowastk dTk g xk( 1113857

minus δαk

σg xk( 1113857

2

(25)

which implies that

αk gtσ(1 minus δ) g xk( 1113857

2

O dk

2

1113874 1113875

ge σ(1 minus δ)O1β

1113888 1113889 (26)

is true )is is a contradiction to (23) )en only relation(22) holds We complete the proof

Remark 2 We can see that the proof process of the globalconvergence is very simple since the defined direction (11)has not only the good sufficient descent property (12) butalso the perfect trust region feature (13)

4 Numerical Results

)e numerical experiments for image restoration problemsand compression sensing will be done by Algorithm 1 andthe normal PRP algorithm respectively All codes are run ona PC with an Intel (R) Core (TM) i7-7700T CPU 29GHz1600GB of RAM and the Windows 10 operating systemand written by MATLAB r2014a )e parameters are chosenas σ 05 σ0 01 δ 09 and μ 300

41 Image Restoration Problems Setting x be the true imagewhich has M times N pixels and (i j) isin A 1 2 M

times 1 2 3 N At a pixel location (i j) xij denotes thegray level of x )en defining a set N by

N ≔ (i j) isin A | ζ ij ne ζ ij ζ ij smin or smax1113966 1113967 (27)

which is the index set of the noise candidates Suppose that ζis the observed noisy image of x corrupted by salt-and-pepper noise we let ϕij (i j minus 1) (i j+1113864 1) (i minus 1 j) (i +

1 j) be the neighborhood of (i j) By applying an adaptivemedian filter to the noisy image y ζ is defined by the imageobtained smax denotes the maximum of a noisy pixel andsmin denotes the minimum of a noisy pixel )e followingconclusions can be obtained (i) if (i j) isin N then ζ ij must

4 Mathematical Problems in Engineering

be restored A pixel (i j) is identified as uncorrupted and itsoriginal value is kept which means that wlowastij ζ ij with theelement wlowastij of the denoised image w by the two-phasemethod (ii) If (i j) notin N holdswlowastmn ζmn is stetted and ζmn

is restored if (m n) isin ϕij capN Chan et al [31] presented thenew function fα and minimized it for the restored imageswithout a nonsmooth term which has the following form

fα(w) 1113944

(ij)isinN1113944

(mn)isinϕij∖Nψα

wij minus ζmn1113872 1113873 +12

1113944(mn)isinϕij capN

ψαwij minus ζmn1113872 1113873

⎧⎪⎨

⎪⎩

⎫⎪⎬

⎪⎭ (28)

Figure 1 Restoration of Barbara man Baboon and Lena by Algorithm 1 and the PRP algorithm From left to right a noisy image with 20salt-and-pepper noise and restorations obtained by minimizing z with the PRP algorithm and Algorithm 1

Mathematical Problems in Engineering 5

where α is a constant and ψα is an even edge-preservingpotential function )e numerical performance of fα isnoteworthy [32 33]

We choose Barbara (512 times 512) man (256 times 256) Ba-boon (512 times 512) and Lena (256 times 256) as the tested im-ages )e well-known PRP CG algorithm (PRP algorithm) isalso done to compare with Algorithm 1 )e detailed per-formances are listed in Figures 1 and 2

Figures 1 and 2 tell us that these two algorithms (Al-gorithm 1 and the PRP algorithm) are successful to solvethese image restoration problems and the results are good

To directly compare their performances the restorationperformance is assessed by applying the peak signal-to-noiseratio (PSNR) defined in [34ndash36] which is computed andlisted in Table 1 From the value of Table 1 we can see thatAlgorithm 1 is competitive to the PRP algorithm since itsPSNR value is less than that of the PRP algorithm

42 Compressive Sensing In this section the followingcompressive sensing images are tested Phantom(256 times 256) Fruits (256 times 256) and Boat (256 times 256))esethree images are treated as 256 vectors and the size of the

Figure 2 Restoration of Barbara man Baboon and Lena by Algorithm 1 and the PRP algorithm From left to right a noisy image with 40salt-and-pepper noise and restorations obtained by minimizing z with the PRP algorithm and Algorithm 1

6 Mathematical Problems in Engineering

observation matrix is 100 times 256 )e so-called Fouriertransform technology is used and the measurement is theFourier domain

Figures 3ndash5 turn out that these two algorithms work wellfor these figures and they can successfully solve them

5 Conclusion

)is paper by designing a CG algorithm studies the un-constrained optimization problems )e given methodpossesses not only the sufficient descent property but also

(a) (b) (c)

Figure 3 Phantom (a) the general images (b) the recovered images by Algorithm 1 and (c) the recovered images by the PRP algorithm

(a) (b) (c)

Figure 4 Fruits (a) the general images (b) the recovered images by Algorithm 1 and the (c) recovered images by the PRP algorithm

Table 1 PSNR Algorithm 1 and the PRP algorithm

20 noise Barbara Man Baboon Lena AverageAlgorithm 1 31115 380355 294393 410674 349143PRP algorithm 311118 379583 294534 41356 3496940 noise Barbara Man Baboon Lena AverageAlgorithm 1 275415 340063 258947 366496 310230PRP algorithm 276153 345375 258571 36701 311777

(a) (b) (c)

Figure 5 Boat (a) the general images (b) the recovered images by Algorithm 1 and (c) the recovered images by the PRP algorithm

Mathematical Problems in Engineering 7

the trust region feature )e global convergence is proved bya simple way )e image restoration problems and com-pressive sensing problems are tested to show that the pro-posed algorithm is better than the normal algorithm In thefuture we will focus on the following aspects to be paidattention (i) we believe there are many perfect CG algo-rithms which can be successfully used for image restorationproblems and compressive sensing (ii) more experimentswill be done to test the performance of the new algorithm

Data Availability

All data are included in the paper

Conflicts of Interest

)ere are no potential conflicts of interest

Acknowledgments

)e authors would like to thank the support of the funds)is work was supported by the National Natural ScienceFoundation of China under Grant no 61772006 the Scienceand Technology Program of Guangxi under Grant noAB17129012 the Science and Technology Major Project ofGuangxi under Grant no AA17204096 the Special Fund forScientific and Technological Bases and Talents of Guangxiunder Grant no 2016AD05050 and the Special Fund forBagui Scholars of Guangxi

References

[1] Y Dai and Y Yuan ldquoA nonlinear conjugate gradient with astrong global convergence propertiesrdquo SIAM Journal onOptimization vol 10 pp 177ndash182 2000

[2] R Fletcher Practical Methods of Optimization John Wileyand Sons New York NY USA 2nd edition 1987

[3] R Fletcher and C M Reeves ldquoFunction minimization byconjugate gradientsrdquo e Computer Journal vol 7 no 2pp 149ndash154 1964

[4] M R Hestenes and E Stiefel ldquoMethods of conjugate gradientsfor solving linear systemsrdquo Journal of Research of the NationalBureau of Standards vol 49 no 6 pp 409ndash436 1952

[5] Y Liu and C Storey ldquoEfficient generalized conjugate gradientalgorithms part 1 theoryrdquo Journal of Optimization eoryand Applications vol 69 no 1 pp 129ndash137 1991

[6] B T Polak ldquo)e conjugate gradient method in extremeproblemsrdquo Computational Mathematics and MathematicalPhysics vol 9 no 4 pp 94ndash112 1969

[7] E Polak and G Ribiere ldquoNote sur la convergence demethodes de directions conjugueesrdquo Revue franccedilaise drsquoin-formatique et de recherche operationnelle Serie rouge vol 3no 16 pp 35ndash43 1969

[8] Y Dai ldquoConvergence properties of the BFGS algorithmrdquoSIAM Journal on Optimization vol 13 no 3 pp 693ndash7012002

[9] Y Dai ldquoAnalysis of conjugate gradient methodsrdquo PhD)esis Institute of Computational Mathematics and Scien-tificEngineering Computing Chese Academy of SciencesBeijing China 1997

[10] M. J. D. Powell, "Nonconvex minimization calculations and the conjugate gradient method," Lecture Notes in Mathematics, vol. 1066, pp. 122–141, Springer-Verlag, Berlin, Germany, 1984.

[11] M. J. D. Powell, "Convergence properties of algorithms for nonlinear optimization," SIAM Review, vol. 28, no. 4, pp. 487–500, 1986.

[12] G. Yuan, "Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems," Optimization Letters, vol. 3, no. 1, pp. 11–21, 2009.

[13] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.

[14] W. W. Hager and H. Zhang, "A new conjugate gradient method with guaranteed descent and an efficient line search," SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005.

[15] W. W. Hager and H. Zhang, "Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent," ACM Transactions on Mathematical Software, vol. 32, no. 1, pp. 113–137, 2006.

[16] Z. Wei, S. Yao, and L. Liu, "The convergence properties of some new conjugate gradient methods," Applied Mathematics and Computation, vol. 183, no. 2, pp. 1341–1350, 2006.

[17] G. Yuan, Z. Wei, and X. Lu, "Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search," Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017.

[18] X. Li, S. Wang, Z. Jin, and H. Pham, "A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models," Mathematical Problems in Engineering, vol. 2018, Article ID 4729318, 11 pages, 2018.

[19] G. Yuan, Z. Sheng, B. Wang, W. Hu, and C. Li, "The global convergence of a modified BFGS method for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 327, pp. 274–294, 2018.

[20] G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribiere-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.

[21] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2020.

[22] G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.

[23] G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.

[24] G. Yuan, Z. Meng, and Y. Li, "A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations," Journal of Optimization Theory and Applications, vol. 168, no. 1, pp. 129–152, 2016.

[25] G. Yu, "Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems," Doctoral thesis, Sun Yat-Sen University, Guangzhou, China, 2007.

[26] L. Zhang, W. Zhou, and D. Li, "Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search," Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006.


[27] Z.-F. Dai and B.-S. Tian, "Global convergence of some modified PRP nonlinear conjugate gradient methods," Optimization Letters, vol. 5, no. 4, pp. 615–630, 2011.

[28] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2020.

[29] G. Yuan, Z. Wei, and Q. Zhao, "A modified Polak-Ribiere-Polyak conjugate gradient algorithm for large-scale optimization problems," IIE Transactions, vol. 46, no. 4, pp. 397–413, 2014.

[30] G. Yuan and Z. Wei, "Convergence analysis of a modified BFGS method on convex minimizations," Computational Optimization and Applications, vol. 47, no. 2, pp. 237–255, 2010.

[31] R. H. Chan, C. W. Ho, C. Y. Leung, and M. Nikolova, "Minimization of detail-preserving regularization functional by Newton's method with continuation," in Proceedings of IEEE International Conference on Image Processing, pp. 125–128, Genova, Italy, September 2005.

[32] J. F. Cai, R. H. Chan, and B. Morini, "Minimization of an edge-preserving regularization functional by conjugate gradient type methods," in Image Processing Based on Partial Differential Equations, pp. 109–122, Springer, Berlin, Germany, 2007.

[33] Y. Dong, R. H. Chan, and S. Xu, "A detection statistic for random-valued impulse noise," IEEE Transactions on Image Processing, vol. 16, no. 4, pp. 1112–1120, 2007.

[34] A. Bovik, Handbook of Image and Video Processing, Academic Press, New York, NY, USA, 2000.

[35] F. Rahpeymaii, K. Amini, T. Allahviranloo, and M. R. Malkhalifeh, "A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations," Calcolo, vol. 56, 2019.

[36] G. Yu, J. Huang, and Y. Zhou, "A descent spectral conjugate gradient method for impulse noise removal," Applied Mathematics Letters, vol. 23, no. 5, pp. 555–560, 2010.


which implies that (13) holds by choosing $\beta \in [3, +\infty)$. We complete the proof.

Remark 1. The relation (13) is the so-called trust-region feature, and the above theorem tells us that direction (11) has not only the sufficient descent property but also the trust-region feature. Both relations (12) and (13) make the proof of the global convergence of Algorithm 1 easy to establish.
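In their usual form (the excerpt does not restate (12) and (13)), the sufficient descent property reads $g_k^T d_k \le -c_1 \|g_k\|^2$ and the trust-region feature reads $\|d_k\| \le c_2 \|g_k\|$ for positive constants $c_1, c_2$. A minimal numerical checker for the two properties, with hypothetical constants, might look as follows:

```python
import numpy as np

def has_sufficient_descent(g, d, c1=1.0):
    # Sufficient descent property: g_k^T d_k <= -c1 * ||g_k||^2.
    return float(g @ d) <= -c1 * float(g @ g) + 1e-12

def has_trust_region_feature(g, d, c2=3.0):
    # Trust-region feature: ||d_k|| <= c2 * ||g_k||.
    return np.linalg.norm(d) <= c2 * np.linalg.norm(g) + 1e-12

# The steepest-descent direction d = -g satisfies both with c1 = c2 = 1.
g = np.array([3.0, -4.0])
d = -g
print(has_sufficient_descent(g, d), has_trust_region_feature(g, d))  # True True
```

Any direction passing both checks stays well scaled relative to the gradient, which is what keeps the convergence analysis short.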

3. Global Convergence

For nonconvex functions, the global convergence of Algorithm 1 is established under the following assumptions.

Assumption 1. Assume that the function $f(x)$ has at least one stationary point $x^*$; namely, $g(x^*) = 0$ holds. Suppose that the level set defined by $L_0 = \{x \mid f(x) \le f(x_0)\}$ is bounded.

Assumption 2. The function $f(x)$ is twice continuously differentiable and bounded below, and its gradient $g(x)$ is Lipschitz continuous; that is, there exists a positive constant $L > 0$ such that

$\|g(x) - g(y)\| \le L \|x - y\|, \quad \forall x, y \in \mathbb{R}^n. \quad (16)$

Now, we prove the global convergence of Algorithm 1.

Theorem 2. Let Assumptions 1 and 2 hold. Then, we get

$\lim_{k \to \infty} \|g_k\| = 0. \quad (17)$

Proof. Using (12) and Step 2 of Algorithm 1, we obtain

$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g(x_k)^T d_k \le f(x_k) - \delta \alpha_k \|g(x_k)\|^2, \quad (18)$

which means that the sequence $\{f(x_k)\}$ is descending and that the relation

$\delta \alpha_k \|g(x_k)\|^2 \le f(x_k) - f(x_k + \alpha_k d_k) \quad (19)$

is true. Summing these inequalities for $k = 0$ to $\infty$ and using Assumption 1, we deduce that

$\sum_{k=0}^{\infty} \delta \alpha_k \|g(x_k)\|^2 \le f(x_0) - f_{\infty} < +\infty \quad (20)$

holds. Thus, we have

$\lim_{k \to \infty} \alpha_k \|g(x_k)\|^2 = 0. \quad (21)$

This implies that

$\lim_{k \to \infty} \|g(x_k)\| = 0 \quad (22)$

or

$\lim_{k \to \infty} \alpha_k = 0. \quad (23)$

If (22) holds, the proof of this theorem is complete. Assuming instead that (23) is true, we aim to get (17). Since the stepsize $\alpha_k$ satisfies the line search condition in Step 2 of Algorithm 1, for $\alpha_k^* = \alpha_k / \sigma$ we have

$f(x_k + \alpha_k^* d_k) > f(x_k) + \delta \alpha_k^* d_k^T g(x_k). \quad (24)$

By (12) and (13) and the well-known mean value theorem, we obtain

$f(x_k + \alpha_k^* d_k) - f(x_k) = \alpha_k^* d_k^T g(x_k) + (\alpha_k^*)^2 O(\|d_k\|^2) \le -\frac{\alpha_k}{\sigma} \|g(x_k)\|^2 + \frac{\alpha_k^2}{\sigma^2} O(\|d_k\|^2)$

and, by (24),

$f(x_k + \alpha_k^* d_k) - f(x_k) > \delta \alpha_k^* d_k^T g(x_k) \ge -\frac{\delta \alpha_k}{\sigma} \|g(x_k)\|^2, \quad (25)$

which implies that

$\alpha_k > \frac{\sigma (1 - \delta) \|g(x_k)\|^2}{O(\|d_k\|^2)} \ge \sigma (1 - \delta) O\!\left(\frac{1}{\beta}\right) \quad (26)$

is true. This contradicts (23). Then, only relation (22) holds. We complete the proof.

Remark 2. We can see that the proof of the global convergence is very simple, since the defined direction (11) has not only the good sufficient descent property (12) but also the trust-region feature (13).

4 Numerical Results

)e numerical experiments for image restoration problemsand compression sensing will be done by Algorithm 1 andthe normal PRP algorithm respectively All codes are run ona PC with an Intel (R) Core (TM) i7-7700T CPU 29GHz1600GB of RAM and the Windows 10 operating systemand written by MATLAB r2014a )e parameters are chosenas σ 05 σ0 01 δ 09 and μ 300

4.1. Image Restoration Problems. Let $x$ be the true image with $M \times N$ pixels, and let $(i, j) \in A := \{1, 2, \ldots, M\} \times \{1, 2, \ldots, N\}$. At a pixel location $(i, j)$, $x_{ij}$ denotes the gray level of $x$. Then, a set $\mathcal{N}$ is defined by

$\mathcal{N} := \{(i, j) \in A \mid \bar{\zeta}_{ij} \ne \zeta_{ij}, \ \zeta_{ij} = s_{\min} \ \text{or} \ s_{\max}\}, \quad (27)$

which is the index set of the noise candidates. Suppose that $\zeta$ is the observed noisy image of $x$ corrupted by salt-and-pepper noise, and let $\phi_{ij} = \{(i, j-1), (i, j+1), (i-1, j), (i+1, j)\}$ be the neighborhood of $(i, j)$. By applying an adaptive median filter to the noisy image $\zeta$, the image $\bar{\zeta}$ is obtained; $s_{\max}$ denotes the maximum gray level of a noisy pixel, and $s_{\min}$ denotes the minimum. The following conclusions can be obtained: (i) if $(i, j) \in \mathcal{N}$, then $\zeta_{ij}$ must be restored; a pixel $(i, j) \notin \mathcal{N}$ is identified as uncorrupted, and its original value is kept, which means that $w^*_{ij} = \zeta_{ij}$, where $w^*_{ij}$ is the corresponding element of the denoised image $w^*$ obtained by the two-phase method; (ii) if $(m, n) \notin \mathcal{N}$ holds, $w^*_{mn} = \zeta_{mn}$ is set, and $\zeta_{mn}$ is restored if $(m, n) \in \phi_{ij} \cap \mathcal{N}$. Chan et al. [31] presented the function $f_\alpha$, without a nonsmooth term, and minimized it to obtain the restored images; it has the following form:

$f_\alpha(w) = \sum_{(i,j) \in \mathcal{N}} \left[ \sum_{(m,n) \in \phi_{ij} \setminus \mathcal{N}} \psi_\alpha\left(w_{ij} - \zeta_{mn}\right) + \frac{1}{2} \sum_{(m,n) \in \phi_{ij} \cap \mathcal{N}} \psi_\alpha\left(w_{ij} - w_{mn}\right) \right], \quad (28)$

Figure 1: Restoration of Barbara, man, Baboon, and Lena by Algorithm 1 and the PRP algorithm. From left to right: a noisy image with 20% salt-and-pepper noise and the restorations obtained by minimizing (28) with the PRP algorithm and Algorithm 1.


where $\alpha$ is a constant and $\psi_\alpha$ is an even edge-preserving potential function. The numerical performance of $f_\alpha$ is noteworthy [32, 33].
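As a concrete illustration, the functional (28) can be evaluated directly. The sketch below assumes the common edge-preserving choice $\psi_\alpha(t) = \sqrt{t^2 + \alpha}$ (this excerpt does not fix $\psi_\alpha$), uses the 4-neighborhood above, and builds the candidate set by the min/max test of (27) with a plain 3x3 median filter standing in for the adaptive one:

```python
import numpy as np

def psi(t, alpha=1e-2):
    # Even, smooth edge-preserving potential (an assumed choice of psi_alpha).
    return np.sqrt(t * t + alpha)

def noise_candidates(zeta, s_min=0, s_max=255):
    """Index set N of (27): extreme-valued pixels changed by the median filter."""
    M, N = zeta.shape
    zbar = zeta.astype(float).copy()
    for i in range(M):
        for j in range(N):
            patch = zeta[max(0, i - 1):i + 2, max(0, j - 1):j + 2]
            zbar[i, j] = np.median(patch)
    extreme = (zeta == s_min) | (zeta == s_max)
    return {(i, j) for i in range(M) for j in range(N)
            if extreme[i, j] and zbar[i, j] != zeta[i, j]}

def f_alpha(w, zeta, cand, alpha=1e-2):
    """Evaluate the smooth restoration functional (28) at the image w."""
    M, N = zeta.shape
    total = 0.0
    for (i, j) in cand:
        for (m, n) in ((i, j - 1), (i, j + 1), (i - 1, j), (i + 1, j)):
            if not (0 <= m < M and 0 <= n < N):
                continue  # neighbor outside the image
            if (m, n) in cand:
                total += 0.5 * psi(w[i, j] - w[m, n], alpha)
            else:
                total += psi(w[i, j] - zeta[m, n], alpha)
    return total

# Usage: one salt pixel in a flat 5x5 image is detected and penalized.
zeta = np.full((5, 5), 100, dtype=int)
zeta[2, 2] = 255
cand = noise_candidates(zeta)
val = f_alpha(zeta.astype(float), zeta, cand)
```

Minimizing `f_alpha` over the candidate pixels, for instance with the CG scheme of this paper, yields the restored image.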

We choose Barbara (512 x 512), man (256 x 256), Baboon (512 x 512), and Lena (256 x 256) as the test images. The well-known PRP CG algorithm (PRP algorithm) is also run for comparison with Algorithm 1. The detailed performances are shown in Figures 1 and 2.

Figures 1 and 2 show that both algorithms (Algorithm 1 and the PRP algorithm) successfully solve these image restoration problems, and the results are good.

To compare their performances directly, the restoration quality is assessed by the peak signal-to-noise ratio (PSNR) defined in [34–36], which is computed and listed in Table 1. From the values in Table 1, we can see that Algorithm 1 is competitive with the PRP algorithm, since its average PSNR value is only slightly lower than that of the PRP algorithm.
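For 8-bit images, the PSNR used in such comparisons is typically computed as $10 \log_{10}(255^2 / \mathrm{MSE})$; a minimal sketch:

```python
import numpy as np

def psnr(restored, original, peak=255.0):
    # Peak signal-to-noise ratio in dB; higher means a better restoration.
    mse = np.mean((restored.astype(float) - original.astype(float)) ** 2)
    return 10.0 * np.log10(peak * peak / mse)

a = np.zeros((4, 4))
b = np.full((4, 4), 5.0)  # MSE = 25, so PSNR = 10*log10(255^2/25) ~ 34.15 dB
print(round(psnr(b, a), 2))
```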

Figure 2: Restoration of Barbara, man, Baboon, and Lena by Algorithm 1 and the PRP algorithm. From left to right: a noisy image with 40% salt-and-pepper noise and the restorations obtained by minimizing (28) with the PRP algorithm and Algorithm 1.

4.2. Compressive Sensing. In this section, the following compressive sensing images are tested: Phantom (256 x 256), Fruits (256 x 256), and Boat (256 x 256). These three images are treated as collections of 256-dimensional vectors, and the size of the observation matrix is 100 x 256. The so-called Fourier transform technology is used, and the measurements are taken in the Fourier domain.

Figures 3–5 show that both algorithms work well on these images and can successfully recover them.
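The measurement setup described above, with each 256-dimensional column sampled by a 100 x 256 observation matrix built from randomly selected rows of the discrete Fourier transform (the exact construction is not given in this excerpt, so the rows, seed, and sparsity level below are illustrative assumptions), can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 256, 100

# Partial DFT observation matrix: m randomly chosen rows of the n x n DFT.
F = np.fft.fft(np.eye(n)) / np.sqrt(n)
rows = rng.choice(n, size=m, replace=False)
A = F[rows, :]                       # 100 x 256 measurement matrix

x = np.zeros(n)                      # a sparse test signal (one image column)
x[rng.choice(n, size=8, replace=False)] = rng.standard_normal(8)
y = A @ x                            # compressed measurements in the Fourier domain

print(A.shape, y.shape)              # (100, 256) (100,)
```

A CG-type solver then recovers x (or the image column) from the underdetermined measurements y.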

Table 1: PSNR values of Algorithm 1 and the PRP algorithm.

20% noise        Barbara    Man        Baboon     Lena       Average
Algorithm 1      31.1150    38.0355    29.4393    41.0674    34.9143
PRP algorithm    31.1118    37.9583    29.4534    41.3560    34.9690
40% noise        Barbara    Man        Baboon     Lena       Average
Algorithm 1      27.5415    34.0063    25.8947    36.6496    31.0230
PRP algorithm    27.6153    34.5375    25.8571    36.7010    31.1777

Figure 3: Phantom: (a) the original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.

Figure 4: Fruits: (a) the original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.

Figure 5: Boat: (a) the original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.

5. Conclusion

By designing a CG algorithm, this paper studies unconstrained optimization problems. The given method possesses not only the sufficient descent property but also the trust-region feature. The global convergence is proved in a simple way. Image restoration and compressive sensing problems are tested to show that the proposed algorithm is competitive with the normal algorithm. In the future, we will focus on the following aspects: (i) we believe there are many effective CG algorithms that can be successfully applied to image restoration problems and compressive sensing; (ii) more experiments will be done to test the performance of the new algorithm.

Data Availability

All data are included in the paper.

Conflicts of Interest

The authors declare that there are no potential conflicts of interest.

Acknowledgments

The authors would like to thank the support of the funds. This work was supported by the National Natural Science Foundation of China under Grant no. 61772006, the Science and Technology Program of Guangxi under Grant no. AB17129012, the Science and Technology Major Project of Guangxi under Grant no. AA17204096, the Special Fund for Scientific and Technological Bases and Talents of Guangxi under Grant no. 2016AD05050, and the Special Fund for Bagui Scholars of Guangxi.

References

[1] Y. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, pp. 177–182, 2000.

[2] R. Fletcher, Practical Methods of Optimization, John Wiley and Sons, New York, NY, USA, 2nd edition, 1987.

[3] R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.

[4] M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.

[5] Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.

[6] B. T. Polyak, "The conjugate gradient method in extreme problems," Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.

[7] E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue française d'informatique et de recherche opérationnelle, Série rouge, vol. 3, no. 16, pp. 35–43, 1969.

[8] Y. Dai, "Convergence properties of the BFGS algorithm," SIAM Journal on Optimization, vol. 13, no. 3, pp. 693–701, 2002.

[9] Y. Dai, "Analysis of conjugate gradient methods," Ph.D. thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, 1997.


Page 5: ADescentConjugateGradientAlgorithmforOptimization ...Polak–Ribi`ere–Polak(PRP)formula[6,7]isoneofthewell-knownnonlinearCGformulaswith βPRP k gT k g k−g −1 g k−1 2, (4) whereg

be restored A pixel (i j) is identified as uncorrupted and itsoriginal value is kept which means that wlowastij ζ ij with theelement wlowastij of the denoised image w by the two-phasemethod (ii) If (i j) notin N holdswlowastmn ζmn is stetted and ζmn

is restored if (m n) isin ϕij capN Chan et al [31] presented thenew function fα and minimized it for the restored imageswithout a nonsmooth term which has the following form

fα(w) 1113944

(ij)isinN1113944

(mn)isinϕij∖Nψα

wij minus ζmn1113872 1113873 +12

1113944(mn)isinϕij capN

ψαwij minus ζmn1113872 1113873

⎧⎪⎨

⎪⎩

⎫⎪⎬

⎪⎭ (28)

Figure 1 Restoration of Barbara man Baboon and Lena by Algorithm 1 and the PRP algorithm From left to right a noisy image with 20salt-and-pepper noise and restorations obtained by minimizing z with the PRP algorithm and Algorithm 1

Mathematical Problems in Engineering 5

where α is a constant and ψα is an even edge-preservingpotential function )e numerical performance of fα isnoteworthy [32 33]

We choose Barbara (512 times 512) man (256 times 256) Ba-boon (512 times 512) and Lena (256 times 256) as the tested im-ages )e well-known PRP CG algorithm (PRP algorithm) isalso done to compare with Algorithm 1 )e detailed per-formances are listed in Figures 1 and 2

Figures 1 and 2 tell us that these two algorithms (Al-gorithm 1 and the PRP algorithm) are successful to solvethese image restoration problems and the results are good

To directly compare their performances the restorationperformance is assessed by applying the peak signal-to-noiseratio (PSNR) defined in [34ndash36] which is computed andlisted in Table 1 From the value of Table 1 we can see thatAlgorithm 1 is competitive to the PRP algorithm since itsPSNR value is less than that of the PRP algorithm

42 Compressive Sensing In this section the followingcompressive sensing images are tested Phantom(256 times 256) Fruits (256 times 256) and Boat (256 times 256))esethree images are treated as 256 vectors and the size of the

Figure 2 Restoration of Barbara man Baboon and Lena by Algorithm 1 and the PRP algorithm From left to right a noisy image with 40salt-and-pepper noise and restorations obtained by minimizing z with the PRP algorithm and Algorithm 1

6 Mathematical Problems in Engineering

observation matrix is 100 times 256 )e so-called Fouriertransform technology is used and the measurement is theFourier domain

Figures 3ndash5 turn out that these two algorithms work wellfor these figures and they can successfully solve them

5 Conclusion

)is paper by designing a CG algorithm studies the un-constrained optimization problems )e given methodpossesses not only the sufficient descent property but also

(a) (b) (c)

Figure 3 Phantom (a) the general images (b) the recovered images by Algorithm 1 and (c) the recovered images by the PRP algorithm

(a) (b) (c)

Figure 4 Fruits (a) the general images (b) the recovered images by Algorithm 1 and the (c) recovered images by the PRP algorithm

Table 1 PSNR Algorithm 1 and the PRP algorithm

20 noise Barbara Man Baboon Lena AverageAlgorithm 1 31115 380355 294393 410674 349143PRP algorithm 311118 379583 294534 41356 3496940 noise Barbara Man Baboon Lena AverageAlgorithm 1 275415 340063 258947 366496 310230PRP algorithm 276153 345375 258571 36701 311777

(a) (b) (c)

Figure 5 Boat (a) the general images (b) the recovered images by Algorithm 1 and (c) the recovered images by the PRP algorithm

Mathematical Problems in Engineering 7

the trust region feature )e global convergence is proved bya simple way )e image restoration problems and com-pressive sensing problems are tested to show that the pro-posed algorithm is better than the normal algorithm In thefuture we will focus on the following aspects to be paidattention (i) we believe there are many perfect CG algo-rithms which can be successfully used for image restorationproblems and compressive sensing (ii) more experimentswill be done to test the performance of the new algorithm

Data Availability

All data are included in the paper

Conflicts of Interest

)ere are no potential conflicts of interest

Acknowledgments

)e authors would like to thank the support of the funds)is work was supported by the National Natural ScienceFoundation of China under Grant no 61772006 the Scienceand Technology Program of Guangxi under Grant noAB17129012 the Science and Technology Major Project ofGuangxi under Grant no AA17204096 the Special Fund forScientific and Technological Bases and Talents of Guangxiunder Grant no 2016AD05050 and the Special Fund forBagui Scholars of Guangxi

References

[1] Y Dai and Y Yuan ldquoA nonlinear conjugate gradient with astrong global convergence propertiesrdquo SIAM Journal onOptimization vol 10 pp 177ndash182 2000

[2] R Fletcher Practical Methods of Optimization John Wileyand Sons New York NY USA 2nd edition 1987

[3] R Fletcher and C M Reeves ldquoFunction minimization byconjugate gradientsrdquo e Computer Journal vol 7 no 2pp 149ndash154 1964

[4] M R Hestenes and E Stiefel ldquoMethods of conjugate gradientsfor solving linear systemsrdquo Journal of Research of the NationalBureau of Standards vol 49 no 6 pp 409ndash436 1952

[5] Y Liu and C Storey ldquoEfficient generalized conjugate gradientalgorithms part 1 theoryrdquo Journal of Optimization eoryand Applications vol 69 no 1 pp 129ndash137 1991

[6] B T Polak ldquo)e conjugate gradient method in extremeproblemsrdquo Computational Mathematics and MathematicalPhysics vol 9 no 4 pp 94ndash112 1969

[7] E Polak and G Ribiere ldquoNote sur la convergence demethodes de directions conjugueesrdquo Revue franccedilaise drsquoin-formatique et de recherche operationnelle Serie rouge vol 3no 16 pp 35ndash43 1969

[8] Y Dai ldquoConvergence properties of the BFGS algorithmrdquoSIAM Journal on Optimization vol 13 no 3 pp 693ndash7012002

[9] Y Dai ldquoAnalysis of conjugate gradient methodsrdquo PhD)esis Institute of Computational Mathematics and Scien-tificEngineering Computing Chese Academy of SciencesBeijing China 1997

[10] M J D Powell ldquoNonconvex minimization calculations andthe conjugate gradient methodrdquo Lecture Notes in

Mathematics vol 1066 pp 122ndash141 Spinger-Verlag BerlinGermany 1984

[11] M J D Powell ldquoConvergence properties of algorithms fornonlinear optimizationrdquo SIAM Review vol 28 no 4pp 487ndash500 1986

[12] G Yuan ldquoModified nonlinear conjugate gradient methodswith sufficient descent property for large-scale optimizationproblemsrdquo Optimization Letters vol 3 no 1 pp 11ndash21 2009

[13] J C Gilbert and J Nocedal ldquoGlobal convergence properties ofconjugate gradient methods for optimizationrdquo SIAM Journalon Optimization vol 2 no 1 pp 21ndash42 1992

[14] W W Hager and H Zhang ldquoA new conjugate gradientmethod with guaranteed descent and an efficient line searchrdquoSIAM Journal on Optimization vol 16 no 1 pp 170ndash1922005

[15] W W Hager and H Zhang ldquoAlgorithm 851rdquo ACM Trans-actions on Mathematical Software vol 32 no 1 pp 113ndash1372006

[16] Z Wei S Yao and L Liu ldquo)e convergence properties ofsome new conjugate gradient methodsrdquo Applied Mathematicsand Computation vol 183 no 2 pp 1341ndash1350 2006

[17] G Yuan Z Wei and X Lu ldquoGlobal convergence of BFGS andPRP methods under a modified weak Wolfe-Powell linesearchrdquo Applied Mathematical Modelling vol 47 pp 811ndash825 2017

[18] X Li S Wang Z Jin and H Pham ldquoA conjugate gradientalgorithm under Yuan-Wei-Lu line search technique forlarge-scale minimization optimization modelsrdquo Mathemati-cal Problems in Engineering vol 2018 Article ID 472931811 pages 2018

[19] G Yuan Z Sheng B Wang W Hu and C Li ldquo)e globalconvergence of a modified BFGS method for nonconvexfunctionsrdquo Journal of Computational and Applied Mathe-matics vol 327 pp 274ndash294 2018

[20] G Yuan Z Wei and Y Yang ldquo)e global convergence of thePolak-Ribiere-Polyak conjugate gradient algorithm underinexact line search for nonconvex functionsrdquo Journal ofComputational and Applied Mathematics vol 362 pp 262ndash275 2019

[21] J Cao and J Wu ldquoA conjugate gradient algorithm and itsapplications in image restorationrdquo Applied NumericalMathematics vol 152 pp 243ndash252 2020

[22] G Yuan T Li and W Hu ldquoA conjugate gradient algorithmfor large-scale nonlinear equations and image restorationproblemsrdquo Applied Numerical Mathematics vol 147pp 129ndash141 2020

[23] G Yuan J Lu and Z Wang ldquo)e PRP conjugate gradientalgorithm with a modified WWP line search and its appli-cation in the image restoration problemsrdquo Applied NumericalMathematics vol 152 pp 1ndash11 2020

[24] G Yuan Z Meng and Y Li ldquoA modified Hestenes and Stiefelconjugate gradient algorithm for large-scale nonsmoothminimizations and nonlinear equationsrdquo Journal of Optimi-zation eory and Applications vol 168 no 1 pp 129ndash1522016

[25] G Yu ldquoNonlinear self-scaling conjugate gradient methods forlarge-scale optimization problemsrdquo)esis of Doctors DegreeSun Yat-Sen University Guangzhou China 2007

[26] L Zhang W Zhou and D Li ldquoGlobal convergence of amodified Fletcher-Reeves conjugate gradient method withArmijo-type line searchrdquo Numerische Mathematik vol 104no 4 pp 561ndash572 2006

8 Mathematical Problems in Engineering

[27] Z-F Dai and B-S Tian ldquoGlobal convergence of somemodified PRP nonlinear conjugate gradient methodsrdquo Op-timization Letters vol 5 no 4 pp 615ndash630 2011

[28] J Cao and J Wu ldquoA conjugate gradient algorithm and itsapplications in image restorationrdquo Applied NumericalMathematics vol 152 pp 243ndash252 2019

[29] G Yuan Z Wei and Q Zhao ldquoA modified Polak-Ribiere-Polyak conjugate gradient algorithm for large-scale optimi-zation problemsrdquo IIE Transactions vol 46 no 4 pp 397ndash4132014

[30] G Yuan and Z Wei ldquoConvergence analysis of a modifiedBFGS method on convex minimizationsrdquo ComputationalOptimization and Applications vol 47 no 2 pp 237ndash2552010

[31] R H Chan C W Ho C Y Leung and M NikolovaldquoMinimization of detail-preserving regularization functionalby Newtonrsquos method with continuationrdquo in Proceedings ofIEEE International Conference on Image Processing pp 125ndash128 Genova Italy September 2005

[32] J F Cai R H Chan and B Morini ldquoMinimization of anedge-preserving regularization functional by conjugate gra-dient types methodsrdquo in Image Processing Based on PartialDifferential Equations pp 109ndash122 Springer BerlinGermany 2007

[33] Y Dong R H Chan and S Xu ldquoA detection statistic forrandom-valued impulse noiserdquo IEEE Transactions on ImageProcessing vol 16 no 4 pp 1112ndash1120 2007

[34] A Bovik Handbook of Image and Video Processing Aca-demic New York NY USA 2000

[35] F Rahpeymaii K Amini T Allahviranloo andM R Malkhalifeh ldquoA new class of conjugate gradientmethods for unconstrained smooth optimization and absolutevalue equationsrdquo Calcolo vol 56 2019

[36] G Yu J Huang and Y Zhou ldquoA descent spectral conjugategradient method for impulse noise removalrdquo AppliedMathematics Letters vol 23 no 5 pp 555ndash560 2010

Mathematical Problems in Engineering 9

Page 6: ADescentConjugateGradientAlgorithmforOptimization ...Polak–Ribi`ere–Polak(PRP)formula[6,7]isoneofthewell-knownnonlinearCGformulaswith βPRP k gT k g k−g −1 g k−1 2, (4) whereg

where α is a constant and ψα is an even edge-preservingpotential function )e numerical performance of fα isnoteworthy [32 33]

We choose Barbara (512 times 512) man (256 times 256) Ba-boon (512 times 512) and Lena (256 times 256) as the tested im-ages )e well-known PRP CG algorithm (PRP algorithm) isalso done to compare with Algorithm 1 )e detailed per-formances are listed in Figures 1 and 2

Figures 1 and 2 tell us that these two algorithms (Al-gorithm 1 and the PRP algorithm) are successful to solvethese image restoration problems and the results are good

To directly compare their performances the restorationperformance is assessed by applying the peak signal-to-noiseratio (PSNR) defined in [34ndash36] which is computed andlisted in Table 1 From the value of Table 1 we can see thatAlgorithm 1 is competitive to the PRP algorithm since itsPSNR value is less than that of the PRP algorithm

42 Compressive Sensing In this section the followingcompressive sensing images are tested Phantom(256 times 256) Fruits (256 times 256) and Boat (256 times 256))esethree images are treated as 256 vectors and the size of the

Figure 2 Restoration of Barbara man Baboon and Lena by Algorithm 1 and the PRP algorithm From left to right a noisy image with 40salt-and-pepper noise and restorations obtained by minimizing z with the PRP algorithm and Algorithm 1

6 Mathematical Problems in Engineering

observation matrix is 100 times 256 )e so-called Fouriertransform technology is used and the measurement is theFourier domain

Figures 3ndash5 turn out that these two algorithms work wellfor these figures and they can successfully solve them

5 Conclusion

)is paper by designing a CG algorithm studies the un-constrained optimization problems )e given methodpossesses not only the sufficient descent property but also

(a) (b) (c)

Figure 3 Phantom (a) the general images (b) the recovered images by Algorithm 1 and (c) the recovered images by the PRP algorithm

(a) (b) (c)

Figure 4 Fruits (a) the general images (b) the recovered images by Algorithm 1 and the (c) recovered images by the PRP algorithm

Table 1 PSNR Algorithm 1 and the PRP algorithm

20 noise Barbara Man Baboon Lena AverageAlgorithm 1 31115 380355 294393 410674 349143PRP algorithm 311118 379583 294534 41356 3496940 noise Barbara Man Baboon Lena AverageAlgorithm 1 275415 340063 258947 366496 310230PRP algorithm 276153 345375 258571 36701 311777


Figure 5: Boat. (a) The original image, (b) the image recovered by Algorithm 1, and (c) the image recovered by the PRP algorithm.


the trust region feature. The global convergence is proved in a simple way. Image restoration problems and compressive sensing problems are tested to show that the proposed algorithm performs better than the classical algorithm. In the future, we will focus on the following aspects: (i) we believe that many other effective CG algorithms can be successfully applied to image restoration problems and compressive sensing; (ii) more experiments will be done to test the performance of the new algorithm.

Data Availability

All data are included in the paper.

Conflicts of Interest

The authors declare that there are no potential conflicts of interest.

Acknowledgments

The authors would like to thank the funds for their support. This work was supported by the National Natural Science Foundation of China under Grant no. 61772006, the Science and Technology Program of Guangxi under Grant no. AB17129012, the Science and Technology Major Project of Guangxi under Grant no. AA17204096, the Special Fund for Scientific and Technological Bases and Talents of Guangxi under Grant no. 2016AD05050, and the Special Fund for Bagui Scholars of Guangxi.

References

[1] Y. Dai and Y. Yuan, "A nonlinear conjugate gradient method with a strong global convergence property," SIAM Journal on Optimization, vol. 10, pp. 177–182, 2000.

[2] R. Fletcher, Practical Methods of Optimization, John Wiley & Sons, New York, NY, USA, 2nd edition, 1987.

[3] R. Fletcher and C. M. Reeves, "Function minimization by conjugate gradients," The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.

[4] M. R. Hestenes and E. Stiefel, "Methods of conjugate gradients for solving linear systems," Journal of Research of the National Bureau of Standards, vol. 49, no. 6, pp. 409–436, 1952.

[5] Y. Liu and C. Storey, "Efficient generalized conjugate gradient algorithms, part 1: theory," Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.

[6] B. T. Polyak, "The conjugate gradient method in extreme problems," USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.

[7] E. Polak and G. Ribière, "Note sur la convergence de méthodes de directions conjuguées," Revue Française d'Informatique et de Recherche Opérationnelle, Série Rouge, vol. 3, no. 16, pp. 35–43, 1969.

[8] Y. Dai, "Convergence properties of the BFGS algorithm," SIAM Journal on Optimization, vol. 13, no. 3, pp. 693–701, 2002.

[9] Y. Dai, "Analysis of conjugate gradient methods," Ph.D. thesis, Institute of Computational Mathematics and Scientific/Engineering Computing, Chinese Academy of Sciences, Beijing, China, 1997.

[10] M. J. D. Powell, "Nonconvex minimization calculations and the conjugate gradient method," in Lecture Notes in Mathematics, vol. 1066, pp. 122–141, Springer-Verlag, Berlin, Germany, 1984.

[11] M. J. D. Powell, "Convergence properties of algorithms for nonlinear optimization," SIAM Review, vol. 28, no. 4, pp. 487–500, 1986.

[12] G. Yuan, "Modified nonlinear conjugate gradient methods with sufficient descent property for large-scale optimization problems," Optimization Letters, vol. 3, no. 1, pp. 11–21, 2009.

[13] J. C. Gilbert and J. Nocedal, "Global convergence properties of conjugate gradient methods for optimization," SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.

[14] W. W. Hager and H. Zhang, "A new conjugate gradient method with guaranteed descent and an efficient line search," SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005.

[15] W. W. Hager and H. Zhang, "Algorithm 851: CG_DESCENT, a conjugate gradient method with guaranteed descent," ACM Transactions on Mathematical Software, vol. 32, no. 1, pp. 113–137, 2006.

[16] Z. Wei, S. Yao, and L. Liu, "The convergence properties of some new conjugate gradient methods," Applied Mathematics and Computation, vol. 183, no. 2, pp. 1341–1350, 2006.

[17] G. Yuan, Z. Wei, and X. Lu, "Global convergence of BFGS and PRP methods under a modified weak Wolfe-Powell line search," Applied Mathematical Modelling, vol. 47, pp. 811–825, 2017.

[18] X. Li, S. Wang, Z. Jin, and H. Pham, "A conjugate gradient algorithm under Yuan-Wei-Lu line search technique for large-scale minimization optimization models," Mathematical Problems in Engineering, vol. 2018, Article ID 4729318, 11 pages, 2018.

[19] G. Yuan, Z. Sheng, B. Wang, W. Hu, and C. Li, "The global convergence of a modified BFGS method for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 327, pp. 274–294, 2018.

[20] G. Yuan, Z. Wei, and Y. Yang, "The global convergence of the Polak-Ribière-Polyak conjugate gradient algorithm under inexact line search for nonconvex functions," Journal of Computational and Applied Mathematics, vol. 362, pp. 262–275, 2019.

[21] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2020.

[22] G. Yuan, T. Li, and W. Hu, "A conjugate gradient algorithm for large-scale nonlinear equations and image restoration problems," Applied Numerical Mathematics, vol. 147, pp. 129–141, 2020.

[23] G. Yuan, J. Lu, and Z. Wang, "The PRP conjugate gradient algorithm with a modified WWP line search and its application in the image restoration problems," Applied Numerical Mathematics, vol. 152, pp. 1–11, 2020.

[24] G. Yuan, Z. Meng, and Y. Li, "A modified Hestenes and Stiefel conjugate gradient algorithm for large-scale nonsmooth minimizations and nonlinear equations," Journal of Optimization Theory and Applications, vol. 168, no. 1, pp. 129–152, 2016.

[25] G. Yu, "Nonlinear self-scaling conjugate gradient methods for large-scale optimization problems," Ph.D. thesis, Sun Yat-Sen University, Guangzhou, China, 2007.

[26] L. Zhang, W. Zhou, and D. Li, "Global convergence of a modified Fletcher-Reeves conjugate gradient method with Armijo-type line search," Numerische Mathematik, vol. 104, no. 4, pp. 561–572, 2006.

[27] Z.-F. Dai and B.-S. Tian, "Global convergence of some modified PRP nonlinear conjugate gradient methods," Optimization Letters, vol. 5, no. 4, pp. 615–630, 2011.

[28] J. Cao and J. Wu, "A conjugate gradient algorithm and its applications in image restoration," Applied Numerical Mathematics, vol. 152, pp. 243–252, 2020.

[29] G. Yuan, Z. Wei, and Q. Zhao, "A modified Polak-Ribière-Polyak conjugate gradient algorithm for large-scale optimization problems," IIE Transactions, vol. 46, no. 4, pp. 397–413, 2014.

[30] G. Yuan and Z. Wei, "Convergence analysis of a modified BFGS method on convex minimizations," Computational Optimization and Applications, vol. 47, no. 2, pp. 237–255, 2010.

[31] R. H. Chan, C. W. Ho, C. Y. Leung, and M. Nikolova, "Minimization of detail-preserving regularization functional by Newton's method with continuation," in Proceedings of the IEEE International Conference on Image Processing, pp. 125–128, Genova, Italy, September 2005.

[32] J. F. Cai, R. H. Chan, and B. Morini, "Minimization of an edge-preserving regularization functional by conjugate gradient type methods," in Image Processing Based on Partial Differential Equations, pp. 109–122, Springer, Berlin, Germany, 2007.

[33] Y. Dong, R. H. Chan, and S. Xu, "A detection statistic for random-valued impulse noise," IEEE Transactions on Image Processing, vol. 16, no. 4, pp. 1112–1120, 2007.

[34] A. Bovik, Handbook of Image and Video Processing, Academic Press, New York, NY, USA, 2000.

[35] F. Rahpeymaii, K. Amini, T. Allahviranloo, and M. R. Malkhalifeh, "A new class of conjugate gradient methods for unconstrained smooth optimization and absolute value equations," Calcolo, vol. 56, 2019.

[36] G. Yu, J. Huang, and Y. Zhou, "A descent spectral conjugate gradient method for impulse noise removal," Applied Mathematics Letters, vol. 23, no. 5, pp. 555–560, 2010.
