
Conjugate Gradient Algorithms in Nonconvex Optimization

By: Radoslaw Pytlak

eText | 18 November 2008

At a Glance

eText


$269.01

Conjugate direction methods were proposed in the early 1950s. With the development of powerful computers, attempts were made to lay the foundations for the mathematical aspects of computations that could take advantage of these machines. This monograph gives an overview of the standard conjugate gradient algorithms, but it goes beyond that treatment and can therefore be regarded as an extension of the methods originally proposed. The book pays particular attention to preconditioned versions of the method, and since limited-memory quasi-Newton algorithms are preconditioned conjugate gradient algorithms when applied to quadratics, these variable metric techniques are also discussed.
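To illustrate the class of methods the book covers, here is a minimal sketch of a nonlinear conjugate gradient iteration (the Fletcher–Reeves variant with a simple backtracking line search). This is an illustrative example, not the book's own algorithms; the function names and parameters are assumptions for the sketch.

```python
import numpy as np

def fletcher_reeves_cg(f, grad, x0, tol=1e-6, max_iter=200):
    """Minimal nonlinear conjugate gradient (Fletcher-Reeves) sketch.

    f: objective function, grad: its gradient, x0: starting point.
    Uses a backtracking (Armijo) line search; names/tolerances are
    illustrative choices, not taken from the monograph.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Restart with steepest descent if d is not a descent direction
        if g.dot(d) >= 0:
            d = -g
        # Backtracking (Armijo) line search along d
        alpha, c = 1.0, 1e-4
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves update of the conjugacy parameter beta
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic f(x) = ½xᵀAx − bᵀx this iteration reduces to the classical linear conjugate gradient method (up to the inexact line search), which is the setting in which limited-memory quasi-Newton methods coincide with preconditioned conjugate gradients.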