## Octave – Nonlinear equations III – 113

Finding minima (and maxima)

Often it is useful to find the minimum value of a function rather than just the zeroes where it crosses the x-axis. `fminbnd` is designed for the simpler, but very common, case of a univariate function where the interval to search is bounded. For unbounded minimization of a function with potentially many variables, use `fminunc` or `fminsearch`. The two functions use different internal algorithms, and some knowledge of the objective function is required to choose between them. For functions which can be differentiated, `fminunc` is appropriate. For functions with discontinuities, or for which a gradient search would fail, use `fminsearch`. See Optimization for minimization in the presence of constraint functions. Note that searches can be made for maxima by simply inverting the objective function (`Fto_max = -Fto_min`).
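As a quick sketch of the sign-flip trick for maxima (the function `f` below is illustrative, not from the original text; its maximum is at `x = 1`):

```octave
% Find the maximum of f(x) = x*exp(-x) on [0, 4] by minimizing -f.
f = @(x) x .* exp (-x);
[xmax, fneg] = fminbnd (@(x) -f (x), 0, 4);
fmax = -fneg;   % undo the sign flip to recover the maximum value
printf ("maximum %g at x = %g\n", fmax, xmax);
```

The minimizer returns the negated objective value, so remember to flip the sign of `fval` back when reporting the maximum.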

Function File: `[x, fval, info, output] = fminbnd (fun, a, b, options)`
Find a minimum point of a univariate function.
`fun` should be a function handle or name. `a`, `b` specify a starting interval. `options` is a structure specifying additional options. Currently, `fminbnd` recognizes these options: `"FunValCheck"`, `"OutputFcn"`, `"TolX"`, `"MaxIter"`, `"MaxFunEvals"`. For a description of these options, see `optimset`.
On exit, the function returns `x`, the approximate minimum point and `fval`, the function value thereof.

`info` is an exit flag that can have these values:

• `1` The algorithm converged to a solution.
• `0` Maximum number of iterations or function evaluations has been exhausted.
• `-1` The algorithm has been terminated by a user output function.

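A minimal sketch of calling `fminbnd` and checking the exit flag (the objective and interval here are illustrative; the minimum of `cos` on `[3, 5]` is at `x = pi`):

```octave
% Minimize cos(x) on the bounded interval [3, 5].
opts = optimset ("TolX", 1e-8, "MaxIter", 200);
[x, fval, info] = fminbnd (@cos, 3, 5, opts);
if (info == 1)
  printf ("converged: x = %g, f(x) = %g\n", x, fval);
endif
```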
Notes: The search for a minimum is restricted to the interval bounded by `a` and `b`. If you only have an initial point to begin searching from, you will need to use an unconstrained minimization algorithm such as `fminunc` or `fminsearch`. `fminbnd` internally uses a golden section search strategy.

Function File: `fminunc (fcn, x0)`
Function File: `fminunc (fcn, x0, options)`
Function File: `[x, fval, info, output, grad, hess] = fminunc (fcn, ...)`

Solve an unconstrained optimization problem defined by the function `fcn`.
`fcn` should accept a vector (array) defining the unknown variables, and return the objective function value, optionally with gradient. `fminunc` attempts to determine a vector `x` such that `fcn (x)` is a local minimum.
`x0` determines a starting guess. The shape of `x0` is preserved in all calls to `fcn`, but otherwise is treated as a column vector.
`options` is a structure specifying additional options. Currently, `fminunc` recognizes these options: `"FunValCheck"`, `"OutputFcn"`, `"TolX"`, `"TolFun"`, `"MaxIter"`, `"MaxFunEvals"`, `"GradObj"`, `"FinDiffType"`, `"TypicalX"`, `"AutoScaling"`.
If `"GradObj"` is `"on"`, it specifies that `fcn`, when called with two output arguments, also returns the gradient (the vector of partial first derivatives) at the requested point. `TolX` specifies the termination tolerance for the unknown variables `x`, while `TolFun` is a tolerance for the objective function value `fval`. The default is 1e-7 for both options.
For a description of the other options, see `optimset`.
On return, `x` is the location of the minimum and `fval` contains the value of the objective function at `x`.

`info` may be one of the following values:

• `1` Converged to a solution point. Relative gradient error is less than specified by `TolFun`.
• `2` Last relative step size was less than `TolX`.
• `3` Last relative change in function value was less than `TolFun`.
• `0` Iteration limit exceeded—either maximum number of algorithm iterations `MaxIter` or maximum number of function evaluations `MaxFunEvals`.
• `-1` Algorithm terminated by `OutputFcn`.
• `-3` The trust region radius became excessively small.

Optionally, `fminunc` can return a structure with convergence statistics (`output`), the gradient (`grad`) at the solution `x`, and the approximate Hessian (`hess`) at the solution `x`.
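A hedged sketch of `fminunc` with an analytic gradient supplied via `"GradObj"` (the Rosenbrock function and starting point below are a standard illustration, not from the original text; its minimum is at `[1; 1]`):

```octave
% Rosenbrock function returning value and gradient as two outputs.
function [f, g] = rosen (x)
  f = 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
  g = [-400*x(1)*(x(2) - x(1)^2) - 2*(1 - x(1));
        200*(x(2) - x(1)^2)];
endfunction

opts = optimset ("GradObj", "on", "TolFun", 1e-10);
[x, fval, info] = fminunc (@rosen, [-1.2; 1], opts);
```

When `"GradObj"` is off, `fminunc` estimates the gradient by finite differences, which costs extra function evaluations and is less accurate.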
Application Notes: If the objective function is a single nonlinear equation of one variable then using `fminbnd` is usually a better choice.
The algorithm used by `fminunc` is a gradient search which depends on the objective function being differentiable. If the function has discontinuities it may be better to use a derivative-free algorithm such as `fminsearch`.

Function File: `x = fminsearch (fun, x0)`
Function File: `x = fminsearch (fun, x0, options)`
Function File: `[x, fval] = fminsearch (...)`

Find a value of `x` which minimizes the function `fun`.
The search begins at the point `x0` and iterates using the Nelder & Mead Simplex algorithm (a derivative-free method). This algorithm is better-suited to functions which have discontinuities or for which a gradient-based search such as `fminunc` fails.
Options for the search are provided in the parameter `options`, created using the function `optimset`. Currently, `fminsearch` accepts the options: `"TolX"`, `"MaxFunEvals"`, `"MaxIter"`, `"Display"`. For a description of these options, see `optimset`.
On exit, the function returns `x`, the minimum point, and `fval`, the function value thereof.
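A minimal `fminsearch` sketch on a non-differentiable objective, where a gradient-based method could stall (the function below is illustrative; its minimum of 0 is at `(2, -1)`, where the gradient is undefined):

```octave
% Nelder-Mead simplex search needs no derivatives.
fun = @(x) abs (x(1) - 2) + abs (x(2) + 1);
opts = optimset ("TolX", 1e-6, "MaxFunEvals", 1000);
[x, fval] = fminsearch (fun, [0, 0], opts);
printf ("minimum %g at (%g, %g)\n", fval, x(1), x(2));
```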