THE GRADIENT (STEEPEST DESCENT) METHOD

Authors

  • A.I. Ismoilov
  • Odina Isroiljon qizi Ismoiljonova

Keywords:

Gradient method, steepest descent, iterative method, symmetric matrix, positive definiteness, functional, optimization, Python, algorithm, error vector.

Abstract

This paper discusses the steepest descent algorithm of the gradient method as applied to solving systems of linear algebraic equations. The procedure for finding the solution of the system by minimizing a functional is explained step by step. The theoretical foundations of the iterative approach based on the gradient and error vectors are presented. As an example, a system of equations in four variables is solved in the Python programming environment, and the convergence rate and accuracy of the algorithm are demonstrated. The results show that the gradient method is a simple and efficient computational tool suitable for solving large-scale linear systems.
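For illustration, the following is a minimal Python sketch of the steepest descent iteration the abstract describes: solving A x = b for a symmetric positive definite matrix A by minimizing the functional F(x) = 0.5 x^T A x - b^T x along the residual direction. The function name steepest_descent, the tolerance, and the 4x4 test system are illustrative assumptions, not the code or example used in the paper itself.

import numpy as np

def steepest_descent(A, b, x0=None, tol=1e-10, max_iter=10_000):
    # Solve A x = b for symmetric positive definite A by minimizing
    # F(x) = 0.5 * x^T A x - b^T x along the negative gradient (the residual).
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float)
    for k in range(max_iter):
        r = b - A @ x                  # residual = negative gradient of F at x
        rr = r @ r
        if np.sqrt(rr) < tol:          # stop once the residual norm is small enough
            return x, k
        alpha = rr / (r @ (A @ r))     # exact line-search step length
        x = x + alpha * r              # move along the steepest descent direction
    return x, max_iter

# Hypothetical 4x4 symmetric positive definite system (not taken from the paper).
A = np.array([[4.0, 1.0, 0.0, 0.0],
              [1.0, 3.0, 1.0, 0.0],
              [0.0, 1.0, 2.0, 1.0],
              [0.0, 0.0, 1.0, 3.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])

x, iters = steepest_descent(A, b)
print("solution:", x, "iterations:", iters)
print("check A @ x:", A @ x)

Under these assumptions the iteration converges because A is symmetric positive definite; the number of iterations printed gives a rough sense of the convergence rate discussed in the paper.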


Published

2025-05-15

How to Cite

THE GRADIENT (STEEPEST DESCENT) METHOD. (2025). ОБРАЗОВАНИЕ НАУКА И ИННОВАЦИОННЫЕ ИДЕИ В МИРЕ, 69(3), 265-272. https://scientific-jl.com/obr/article/view/13771