From: Karl Meerbergen (Karl.Meerbergen_at_[hidden])
Date: 2006-10-24 15:25:50
Nonlinear CG definitely exists: it is an optimization method. As far as I
recall, there are conditions on the Hessian evaluated at the optimal point
(i.e. where the gradient is zero).
When the initial guess is close enough to the optimum and the Hessian is
positive definite, convergence is guaranteed. I do not recall the details,
but I can look it up if you're interested.
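To make the idea concrete, here is a minimal sketch of one common nonlinear CG variant (Fletcher-Reeves with a backtracking Armijo line search). This is not Fred's method, just an illustration; the function names, tolerances, and step-size constants are my own choices.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=200):
    """Fletcher-Reeves nonlinear CG with backtracking (Armijo) line search.

    Minimizes a smooth function f starting from x0. Convergence is only
    guaranteed near a minimum where the Hessian is positive definite.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g                                # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search: halve alpha until the Armijo
        # sufficient-decrease condition holds.
        alpha = 1.0
        for _ in range(50):
            if f(x + alpha * d) <= f(x) + 1e-4 * alpha * g.dot(d):
                break
            alpha *= 0.5
        x = x + alpha * d
        g_new = grad(x)
        beta = g_new.dot(g_new) / g.dot(g)  # Fletcher-Reeves coefficient
        d = -g_new + beta * d               # new search direction
        g = g_new
    return x
```

On a strictly convex quadratic this reduces (with exact line search) to the linear CG method discussed below in the thread.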
On Tuesday 24 October 2006 20:16, Gunter Winkler wrote:
> Hi Fred and Karl,
> Karl Meerbergen schrieb:
> > I vaguely recall that conjugate gradients is an optimization method. But
> > I do not recall the details for nonlinear problems. Convergence is only
> > guaranteed under specific mathematical assumptions, as is also the case
> > for linear systems.
> Yes. Linear CG is a method to find (the unique) vector x that minimizes
> a (scalar) quadratic function
> f(x) := 1/2 <x, Ax> - <b, x> -> min
> ( <a,b> is any inner product, such that <x, Ax> > 0 for all x<>0 )
> Although I must admit that I have never heard of "the nonlinear CG". There
> are lots of gradient-based methods for nonlinear minimization.
> Fred, can you explain your method?
> ublas mailing list
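For reference, the linear CG that Gunter describes (minimizing f(x) = 1/2 <x, Ax> - <b, x> for symmetric positive definite A, equivalently solving Ax = b) can be sketched as follows. This is a textbook illustration, not uBLAS code; the names are my own.

```python
import numpy as np

def linear_cg(A, b, x0=None, tol=1e-10, max_iter=None):
    """Conjugate gradients for A x = b with A symmetric positive definite.

    Equivalently minimizes f(x) = 1/2 <x, Ax> - <b, x>, whose unique
    minimizer is the solution of A x = b.
    """
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x                       # residual = -grad f(x)
    d = r.copy()                        # first search direction
    rs = r.dot(r)
    for _ in range(max_iter or n):
        Ad = A @ d
        alpha = rs / d.dot(Ad)          # exact minimizer of f along d
        x += alpha * d
        r -= alpha * Ad
        rs_new = r.dot(r)
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs) * d       # next A-conjugate direction
        rs = rs_new
    return x
```

In exact arithmetic this terminates in at most n iterations; positive definiteness of A (the condition <x, Ax> > 0 for all x != 0 above) is what makes each step length alpha well defined.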
-- 
==============================================
Karl Meerbergen
Free Field Technologies
Axis Park Louvain-la-Neuve
rue Emile Francqui, 1
B-1435 Mont-Saint-Guibert - BELGIUM
mailto:Karl.Meerbergen_at_[hidden]
http://www.fft.be
==============================================