Iterative Algorithms I

Ioannis K. Argyros
Cameron University, Department of Mathematical Sciences, Lawton, OK, USA

Á. Alberto Magreñán
Universidad Internacional de La Rioja, Departamento de Matemáticas, La Rioja, Spain

Series: Mathematics Research Developments
BISAC: MAT003000

eBook: $270.00
Digitally watermarked, DRM-free. Immediate eBook download after purchase.
Details

Iterative methods have long been studied for problems whose solutions cannot be expressed in closed form. These methods differ widely: some behave differently depending on the function to which they are applied, some achieve higher orders of convergence, some have large convergence domains, some require no derivative evaluations, and some are optimal. It should come as no surprise, therefore, that researchers frequently develop new iterative methods.

Once a new iterative method appears, researchers study it from several angles: convergence conditions, real dynamics, complex dynamics, optimal order of convergence, and so on. These questions motivated the authors to study the most widely used classical methods, such as Newton’s method, Halley’s method, and their derivative-free alternatives.
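As a point of reference for readers new to the area, a minimal sketch of the classical Newton iteration mentioned above is given below for a scalar equation f(x) = 0. The function names and tolerances are illustrative, not taken from the book.

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until |f(x_n)| < tol."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x -= fx / df(x)  # quadratic convergence near a simple root
    return x

# Example: solve x^2 - 2 = 0 starting from x0 = 1.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
print(root)  # close to sqrt(2)
```

Derivative-free alternatives such as the secant and Steffensen methods, treated in several chapters below, replace df with a divided-difference approximation.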

Concerning the convergence of iterative methods, the best-known conditions are those introduced by Kantorovich, whose theory has allowed many researchers to extend and experiment with these conditions. In recent years, many authors have studied modifications of these conditions related, for example, to center-Lipschitz conditions, omega-conditions, and convergence in Hilbert spaces.
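For orientation (a standard textbook statement, summarized here rather than quoted from the book), the classical Newton–Kantorovich semilocal condition assumes bounds at the starting point $x_0$ for an operator $F$:

```latex
% Bounds assumed at the starting point x_0:
\|F'(x_0)^{-1}\| \le \beta, \qquad
\|F'(x_0)^{-1} F(x_0)\| \le \eta, \qquad
\|F'(x) - F'(y)\| \le L \|x - y\|.
% Kantorovich condition guaranteeing convergence of Newton's method:
h = L \beta \eta \le \tfrac{1}{2}.
```

The modifications mentioned above typically replace the global Lipschitz constant $L$ with a center-Lipschitz constant measured from $x_0$, which can enlarge the convergence domain.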

In this monograph, the authors present their work of the past decade on the convergence and dynamics of iterative methods. It is the natural outgrowth of their related publications in these areas. Chapters are self-contained and can be read independently. Moreover, an extensive list of references is given in each chapter so that readers can pursue the underlying ideas. For these reasons, the authors believe that several advanced courses can be taught from this book.

The book’s results are expected to find applications in many areas of applied mathematics, engineering, computer science, and real-world problems. As such, this monograph is suitable for researchers, graduate students, and seminar instructors in these subjects. The authors believe it would also make an excellent addition to all science and engineering libraries. (Imprint: Nova)

Preface

Chapter 1. Secant-type Methods

Chapter 2. Efficient Steffensen-type Algorithms for Solving Nonlinear Equations

Chapter 3. On the Semilocal Convergence of Halley's Method Under a Center-Lipschitz Condition on the Second Fréchet Derivative

Chapter 4. An Improved Convergence Analysis of Newton's Method for Twice Fréchet Differentiable Operators

Chapter 5. Expanding the Applicability of Newton's Method Using Smale's α-theory

Chapter 6. Newton-type Methods on Riemannian Manifolds Under Kantorovich-type Conditions

Chapter 7. Improved Local Convergence Analysis of Inexact Gauss-Newton Like Methods

Chapter 8. Expanding the Applicability of Lavrentiev Regularization Methods for Ill-posed Problems

Chapter 9. A Semilocal Convergence for a Uniparametric Family of Efficient Secant-like Methods

Chapter 10. On the Semilocal Convergence of a Two-step Newton-like Projection Method for Ill-posed Equations

Chapter 11. New Approach to Relaxed Proximal Point Algorithms Based on A-maximal

Chapter 12. Newton-type Iterative Methods for Nonlinear Ill-posed Hammerstein-type Equations

Chapter 13. Enlarging the Convergence Domain of Secant-like Methods for Equations

Chapter 14. Solving Nonlinear Equations System via an Efficient Genetic Algorithm with Symmetric and Harmonious Individuals

Chapter 15. On the Semilocal Convergence of Modified Newton-Tikhonov Regularization Method for Nonlinear Ill-posed Problems

Chapter 16. Local Convergence Analysis of Proximal Gauss-Newton Method for Penalized Nonlinear Least Squares Problems

Chapter 17. On the Convergence of a Damped Newton Method with Modified Right-hand Side Vector

Chapter 18. Local Convergence of Inexact Newton-like Method Under Weak Lipschitz Conditions

Chapter 19. Expanding the Applicability of Secant Method with Applications

Chapter 20. Expanding the Convergence Domain for Chun-Stanica-Neta Family of Third Order Methods in Banach Spaces

Chapter 21. Local Convergence of Modified Halley-like Methods with less Computation of Inversion

Chapter 22. Local Convergence for an Improved Jarratt-type Method in Banach Space

Chapter 23. Enlarging the Convergence Domain of Secant-like Methods for Equations

Index
