
Consider the problem of finding a set of values [*x*_{1}, *x*_{2}] that solves

minimize *f*(*x*) = *e*^{*x*_{1}} (4*x*_{1}^{2} + 2*x*_{2}^{2} + 4*x*_{1}*x*_{2} + 2*x*_{2} + 1).

To solve this two-dimensional problem, write a file that returns
the function value. Then, invoke the unconstrained minimization routine `fminunc`.

This code ships with the toolbox. To view it, enter `type objfun`:

```matlab
function f = objfun(x)
f = exp(x(1)) * (4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);
```

```matlab
x0 = [-1,1];    % Starting guess
options = optimoptions(@fminunc,'Algorithm','quasi-newton');
[x,fval,exitflag,output] = fminunc(@objfun,x0,options);
```

This produces the following output:

```
Local minimum found.

Optimization completed because the size of the gradient is less than
the default value of the function tolerance.
```

View the results:

```matlab
x,fval,exitflag,output

x =
    0.5000   -1.0000
fval =
   3.6609e-15
exitflag =
     1
output =
       iterations: 8
        funcCount: 66
         stepsize: 1
    firstorderopt: 1.2284e-007
        algorithm: 'medium-scale: Quasi-Newton line search'
          message: 'Local minimum found. Optimization completed because the size of the gradie...'
```
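As an aside (not part of the toolbox example), the same minimization can be cross-checked outside MATLAB with SciPy's BFGS quasi-Newton implementation, which converges to the same point:

```python
import numpy as np
from scipy.optimize import minimize

def objfun(x):
    # Same objective as the MATLAB objfun above
    return np.exp(x[0]) * (4*x[0]**2 + 2*x[1]**2 + 4*x[0]*x[1] + 2*x[1] + 1)

# BFGS is SciPy's quasi-Newton method, analogous to the 'quasi-newton'
# algorithm used by fminunc in this example
res = minimize(objfun, x0=[-1, 1], method='BFGS')
print(res.x)   # approximately [0.5, -1.0]
print(res.fun) # approximately 0
```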

The `exitflag` output indicates whether the algorithm
converged. `exitflag = 1` means a local minimum was
found. The meanings of the `exitflag` values are given in the function reference pages.

The `output` structure gives more details about
the optimization. For `fminunc`,
it includes the number of iterations in `iterations`,
the number of function evaluations in `funcCount`,
the final step-size in `stepsize`, a measure of first-order
optimality (which in this unconstrained case is the infinity norm
of the gradient at the solution) in `firstorderopt`,
the type of algorithm used in `algorithm`, and the
exit message (the reason the algorithm stopped).
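Because the problem is unconstrained, `firstorderopt` is the infinity norm of the gradient at the solution. As a quick hand check (a sketch, not toolbox code), the analytic gradient of the objective vanishes at *x* = [0.5, -1]:

```python
import math

def grad(x1, x2):
    # Analytic gradient of f(x) = exp(x1)*(4*x1^2 + 2*x2^2 + 4*x1*x2 + 2*x2 + 1)
    inner = 4*x1**2 + 2*x2**2 + 4*x1*x2 + 2*x2 + 1
    g1 = math.exp(x1) * (inner + 8*x1 + 4*x2)  # product rule in x1
    g2 = math.exp(x1) * (4*x2 + 4*x1 + 2)
    return g1, g2

g = grad(0.5, -1.0)
print(max(abs(v) for v in g))  # infinity norm of the gradient: 0.0
```

The small reported values of `fval` (3.6609e-15) and `firstorderopt` (1.2284e-007) are consistent with this exact zero.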

Pass the variable `options` to `fminunc` to change characteristics of the
optimization algorithm, as in

```matlab
x = fminunc(@objfun,x0,options);
```

`options` contains values for termination tolerances
and algorithm choices. Create `options` using the `optimoptions` function:

```matlab
options = optimoptions(@fminunc,'Algorithm','quasi-newton');
```

You can also create options by exporting from the Optimization app.

In this example, the `quasi-newton` algorithm is used.
Other options control the amount of command-line display
during the optimization iterations, the tolerances for the termination
criteria, whether a user-supplied gradient or Jacobian is used,
and the maximum number of iterations or function evaluations. See `optimoptions`, the individual optimization
functions, and the Optimization Options Reference for
more options and information.
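For example, to display progress at each iteration and tighten the stopping tolerance (option names as in current releases of `optimoptions`; check the reference page for your release):

```matlab
options = optimoptions(@fminunc,'Algorithm','quasi-newton', ...
    'Display','iter', ...             % show progress at each iteration
    'OptimalityTolerance',1e-8);      % tighten the first-order optimality tolerance
```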
