UPSI Digital Repository (UDRep)

Type: Article
Subject: Q Science (General)
ISSN: 1992-9978
Main Author: Nurul Akmal binti Mohamed
Title: An improved version of Rivaie-Mohd-Ismail-Leong conjugate gradient method with application in image restoration
Place of Production: Tanjung Malim
Publisher: Fakulti Sains dan Matematik
Year of Publication: 2023
Notes: IAENG International Journal of Applied Mathematics
Corporate Name: Universiti Pendidikan Sultan Idris

Abstract:
This paper focuses on modifying the existing Conjugate Gradient (CG) method of Rivaie, Mustafa, Ismail and Leong (RMIL). The RMIL technique has been the subject of previous studies aimed at enhancing its effectiveness. In this study, a new CG search direction, IRMIL, is presented. This new variant combines a scaled negative gradient, which acts as the initial direction, with a third-term parameter. The paper proves that IRMIL satisfies the sufficient descent criteria, and the method exhibits global convergence under both exact and strong Wolfe line searches. The method's efficacy is assessed using two distinct methodologies. The first involves numerical tests on conventional Unconstrained Optimisation (UO) problems; these show that, while the IRMIL method performs very similarly to other existing CG methods under exact line search, it excels under strong Wolfe line search and converges more quickly. In the second, the IRMIL method is applied to image restoration problems. Overall, the IRMIL method shows excellent potential in terms of both theoretical properties and numerical efficiency. © 2023 International Association of Engineers. All Rights Reserved.
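The abstract does not give the IRMIL update formula, so the following is only a minimal sketch of the CG skeleton it describes, assuming the classical RMIL coefficient beta_k = g_k^T (g_k - g_{k-1}) / ||d_{k-1}||^2, a steepest-descent initial direction, and a strong Wolfe line search via scipy.optimize.line_search. The function name cg_rmil_sketch and the quadratic test problem are illustrative assumptions, not the authors' implementation.

import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def cg_rmil_sketch(f, grad, x0, tol=1e-6, max_iter=500):
    """Illustrative nonlinear CG loop with an RMIL-type coefficient.
    Not the paper's IRMIL method; it only sketches the skeleton the
    abstract describes (descent directions plus Wolfe line search)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                   # line search failed: restart from -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # RMIL-type coefficient, truncated at zero as a descent safeguard
        beta = max(g_new @ (g_new - g) / (d @ d), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# usage on a simple ill-conditioned quadratic (assumed test problem)
if __name__ == "__main__":
    A = np.diag([1.0, 100.0])
    f = lambda x: 0.5 * x @ A @ x
    grad = lambda x: A @ x
    print(cg_rmil_sketch(f, grad, np.array([3.0, -4.0])))

Swapping in the IRMIL search direction from the paper (scaled negative gradient as initial direction plus a third-term parameter) would only change the beta and d updates inside this loop; the Wolfe line search and restart safeguard are standard for this family of methods.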

References

P. Wolfe, “Convergence Conditions For Ascent Methods. II: Some Corrections,” SIAM Review, vol. 13, no. 2, pp. 185–188, 1971.

Z. Zhang and X. Chen, “A Conjugate Gradient Method For Distributed Optimal Control Problems With Nonhomogeneous Helmholtz Equation,” Applied Mathematics and Computation, vol. 402, p. 126019, 2021.

M. Hestenes and E. Stiefel, “Methods of Conjugate Gradients for Solving Linear Systems,” J. Res. Nat. Bur. Standards, vol. 49, pp. 409–436, 1952.

R. Fletcher and C. M. Reeves, “Function Minimization by Conjugate Gradients,” The Computer Journal, vol. 7, no. 2, pp. 149–154, 1964.

E. Polak and G. Ribière, “Note sur la convergence de méthodes de directions conjuguées,” ESAIM: Mathematical Modelling and Numerical Analysis - Modélisation Mathématique et Analyse Numérique, vol. 3, no. R1, pp. 35–43, 1969.

B. T. Polyak, “The Conjugate Gradient Method In Extremal Problems,” USSR Computational Mathematics and Mathematical Physics, vol. 9, no. 4, pp. 94–112, 1969.

J. W. Daniel, “The Conjugate Gradient Method For Linear And Nonlinear Operator Equations,” SIAM Journal on Numerical Analysis, vol. 4, no. 1, pp. 10–26, 1967.

R. Fletcher, “Practical Methods of Optimization,” New York, vol. 80, p. 4, 1987.

Y. Liu and C. Storey, “Efficient Generalized Conjugate Gradient Algorithms, Part 1: Theory,” Journal of Optimization Theory and Applications, vol. 69, no. 1, pp. 129–137, 1991.

Y. Dai and Y. Yuan, “A Nonlinear Conjugate Gradient Method With A Strong Global Convergence Property,” SIAM Journal on Optimization, vol. 10, no. 1, pp. 177–182, 1999.

W. W. Hager and H. Zhang, “A Survey Of Nonlinear Conjugate Gradient Methods,” Pacific Journal of Optimization, vol. 2, no. 1, pp. 35–58, 2006.

 ——, “A New Conjugate Gradient Method With Guaranteed Descent And An Efficient Line Search,” SIAM Journal on Optimization, vol. 16, no. 1, pp. 170–192, 2005.

Y. Dai and Y. Yuan, “Nonlinear Conjugate Gradient Methods,” Shanghai Science and Technology Publisher, Shanghai, 2000.

M. Rivaie, M. Mamat, L. W. June, and I. Mohd, “A New Conjugate Gradient Coefficient For Large Scale Nonlinear Unconstrained Optimization,” International Journal of Mathematical Analysis, vol. 6, no. 23, pp. 1131–1146, 2012.

——, “A New Class Of Nonlinear Conjugate Gradient Coefficients With Global Convergence Properties,” Applied Mathematics and Computation, vol. 218, no. 22, pp. 11323–11332, 2012.

N. Mohamed, “An Extension of RMIL and Hybrid Conjugate Gradient Method with Global Convergence,” Ph.D. dissertation, Universiti Sultan Zainal Abidin, 2017.

M. Rivaie, M. Mamat, and A. Abashar, “A New Class Of Nonlinear Conjugate Gradient Coefficients With Exact And Inexact Line Searches,” Applied Mathematics and Computation, vol. 268, pp. 1152–1163, 2015.

N. Zullpakkal, N. ‘Aini, N. H. A. Ghani, N. S. Mohamed, N. Idalisa, and M. Rivaie, “Covid–19 Data Modelling Using Hybrid Conjugate Gradient Method,” Journal of Information and Optimization Sciences, vol. 43, no. 4, pp. 837–853, 2022. [Online]. Available: https://doi.org/10.1080/02522667.2022.2060610

N. S. Mohamed, M. Mamat, M. Rivaie, and S. M. Shaharudin, “A New Hybrid Coefficient Of Conjugate Gradient Method,” Indonesian Journal of Electrical Engineering and Computer Science, vol. 18, no. 3, pp. 1454–1463, 2020.

M. Malik, M. Mamat, S. S. Abas, I. M. Sulaiman et al., “A New Coefficient of the Conjugate Gradient Method with the Sufficient Descent Condition and Global Convergence Properties.” Engineering Letters, vol. 28, no. 3, pp. 704–714, 2020.

A. V. Mandara, M. Mamat, M. Waziri, and M. A. Mohamed, “A Class Of Conjugate Gradient Parameters And Its Global Convergence For Solving Unconstrained Optimization,” Far East Journal of Mathematical Sciences (FJMS), vol. 106, pp. 43–58, 2018.

Z. Dai, “Comments On A New Class Of Nonlinear Conjugate Gradient Coefficients With Global Convergence Properties,” Applied Mathematics and Computation, vol. 276, pp. 297–300, 2016.

J. C. Gilbert and J. Nocedal, “Global Convergence Properties Of Conjugate Gradient Methods For Optimization,” SIAM Journal on Optimization, vol. 2, no. 1, pp. 21–42, 1992.

O. O. O. Yousif, “The Convergence Properties Of RMIL+ Conjugate Gradient Method Under The Strong Wolfe Line Search,” Applied Mathematics and Computation, vol. 367, p. 124777, 2020.

N. Zull, “New Conjugate Gradient Methods using Strong Wolfe Line Search for Estimating Dividend Rate,” Ph.D. dissertation, Universiti Sultan Zainal Abidin, 2019.

X. Jiang and J. Jian, “Improved Fletcher–Reeves and Dai–Yuan Conjugate Gradient Methods With The Strong Wolfe Line Search,” Journal of Computational and Applied Mathematics, vol. 348, pp. 525–534, 2019.

L. Zhang, “An Improved Wei–Yao–Liu Nonlinear Conjugate Gradient Method For Optimization Computation,” Applied Mathematics and Computation, vol. 215, no. 6, pp. 2269–2274, 2009.

F. Rahpeymaii, K. Amini, T. Allahviranloo, and M. R. Malkhalifeh, “A New Class Of Conjugate Gradient Methods For Unconstrained Smooth Optimization And Absolute Value Equations,” Calcolo, vol. 56, no. 1, pp. 1–28, 2019.

W. Khadijah, M. Rivaie, and M. Mamat, “A Three–Term Conjugate Gradient Method Under The Strong-Wolfe Line Search,” in AIP Conference Proceedings, vol. 1870. AIP Publishing LLC, 2017, p. 040056.

I. M. Sulaiman, M. Mamat, A. E. Owoyemi, P. L. Ghazali, M. Rivaie, and M. Malik, “The Convergence Properties Of Some Descent Conjugate Gradient Algorithms For Optimization Models,” Journal of Mathematics and Computer Science, vol. 22, no. 3, pp. 204–215, 2021. [Online]. Available: http://dx.doi.org/10.22436/jmcs.022.03.02

