Specifying Known Derivatives


FreeFlyer's optimization interfaces provide a variety of ways to customize the optimization process to improve speed and the chances of convergence. Users have the option to supply any known derivatives of the constraints and objective function, which can drastically improve performance by relieving the Optimizer object of the burden of computing those derivatives numerically.

 

 

Specifying the Jacobian and Gradient


If desired, the user can specify any known derivatives of the problem constraints or objective function by populating the Jacobian Matrix and Gradient Array properties of the Optimizer object. Supplying the Optimizer object with known derivatives can drastically improve runtime for complex problems. If not specified by the user, FreeFlyer will fill the Jacobian and Gradient with a constant (-999) so that any derivatives that the user provides can be automatically detected. All derivatives that are not provided by the user will be calculated numerically via finite differencing.

 

The Optimizer.Jacobian and Optimizer.Gradient properties should be updated within the evaluation loop on every nominal case evaluation, which can be identified through the Optimizer.OptimizationPhase property. A simple example is presented below: the problem is to find a solution to the equation "x + y^2 = 12" such that "(x + y)^2" is minimized and y is between 0 and 5. There are a number of solutions to this problem that satisfy the constraint, but the optimal solution is x = -4 and y = 4. In this case, the analytic derivatives are easy to calculate, so populating the Jacobian and Gradient properties is simple. For more complex problems that have complicated derivatives, it may be sensible to perform the Jacobian and Gradient calculations within a Procedure that is called inside of the evaluation loop.

 

Variable x;

Variable y;

Optimizer opt;

 

// Define Problem

opt.AddStateVariable("x", 0);

opt.AddStateVariable("y", 1, 0, 5);

 

opt.AddConstraint("constraintExpression");

opt.Constraints[0].SetEqualityBounds(12);

 

// Load Engine

opt.LoadEngine();

 

// Evaluation Loop

While (opt.HasNotConverged());

 

      opt.UpdateStateVariables();

 

      x = opt.GetStateVariableValue("x");

      y = opt.GetStateVariableValue("y");

 

      opt.SetConstraintValue("constraintExpression", x + y^2);

 

      // Supply derivatives on each nominal evaluation

      If (opt.OptimizationPhase == 1);

 

            opt.Jacobian[0, 0] = 1;      // Derivative of constraintExpression wrt x

            opt.Jacobian[0, 1] = 2*y;    // Derivative of constraintExpression wrt y

 

            opt.Gradient[0] = 2*(x + y); // Derivative of objective function wrt x

            opt.Gradient[1] = 2*(x + y); // Derivative of objective function wrt y

 

      End;

 

      opt.Minimize((x + y)^2);

 

End;

 

Report opt.GetBestStateVariableValues();
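Before supplying analytic derivatives like the ones above, it can be worthwhile to sanity-check them against a finite-difference estimate. The following standalone Python sketch (not FreeFlyer script) cross-checks the derivatives used in this example at the optimal solution x = -4, y = 4:

```python
# Cross-check of the analytic derivatives used above:
# constraint c(x, y) = x + y^2, objective f(x, y) = (x + y)^2.
def c(x, y):
    return x + y**2

def f(x, y):
    return (x + y)**2

def forward_diff(fn, x, y, h=1e-6):
    """Forward-difference partials with respect to x and y."""
    return ((fn(x + h, y) - fn(x, y)) / h,
            (fn(x, y + h) - fn(x, y)) / h)

x, y = -4.0, 4.0                      # the optimal solution from the text
dc_dx, dc_dy = forward_diff(c, x, y)  # analytic: 1 and 2*y = 8
df_dx, df_dy = forward_diff(f, x, y)  # analytic: 2*(x + y) = 0 for both

print(dc_dx, dc_dy)  # approximately 1.0, 8.0
print(df_dx, df_dy)  # approximately 0.0, 0.0
print(x + y**2)      # 12.0 -> the equality constraint is satisfied
```

If the finite-difference estimates and the analytic expressions disagree, the Jacobian or Gradient assignment likely has a sign or indexing error.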

 

The Jacobian and Gradient can also be configured through the Optimizer.SetJacobianValue() and Optimizer.SetGradientValue() methods, which offer a convenient approach to keeping track of which element is being assigned by using the state variable and constraint labels, as shown below:

 

opt.SetJacobianValue("constraintExpression", "x", 1);

opt.SetJacobianValue("constraintExpression", "y", 2*y);

 

opt.SetGradientValue("x", 2*(x + y));

opt.SetGradientValue("y", 2*(x + y));

 

 

Derivative Tuning Parameters


There are a number of additional options directly on the Optimizer object that can be configured to change how FreeFlyer handles derivative calculation.

 

FiniteDifferenceMethod

FreeFlyer's Optimizer uses a finite differencing implementation to compute numerical derivatives during the optimization process. The user can adjust the Optimizer.FiniteDifferenceMethod property to choose between a forward or central differencing method (forward differencing is used by default).

 

opt.FiniteDifferenceMethod = 0; // Forward Difference

opt.FiniteDifferenceMethod = 1; // Central Difference
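 

The trade-off between the two methods can be illustrated outside of FreeFlyer. Forward differencing needs one extra function evaluation per state variable but is only first-order accurate; central differencing needs two extra evaluations but is second-order accurate. A minimal Python sketch:

```python
# Forward difference: error is O(h). Central difference: error is O(h^2),
# at the cost of one additional function evaluation per variable.
import math

def forward(fn, x, h):
    return (fn(x + h) - fn(x)) / h

def central(fn, x, h):
    return (fn(x + h) - fn(x - h)) / (2 * h)

fn, x, h = math.sin, 1.0, 1e-4
exact = math.cos(1.0)

print(abs(forward(fn, x, h) - exact))  # on the order of 1e-5
print(abs(central(fn, x, h) - exact))  # on the order of 1e-9
```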

 

ValidateUserDerivatives

This boolean property indicates whether FreeFlyer should report an error if a user-provided derivative differs significantly from the finite-difference derivative evaluated during the sampling process. When set to true, any Gradient or Jacobian value provided by the user that differs by more than 10 percent from the value calculated through the finite differencing method will cause the process to report an error. This functionality is turned on by default.

 

opt.ValidateUserDerivatives = 1;
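 

The kind of check this performs can be sketched in Python. Note that `derivative_mismatch` is a hypothetical helper written for illustration; the exact comparison FreeFlyer uses internally is not specified here, and a simple relative tolerance is assumed:

```python
# Hypothetical illustration (not the FreeFlyer API) of a 10-percent
# derivative validation check: flag a user-supplied derivative that differs
# from the finite-difference estimate by more than the tolerance.
def derivative_mismatch(user_value, fd_value, tolerance=0.10):
    """Return True when the relative difference exceeds the tolerance."""
    if fd_value == 0.0:
        return user_value != 0.0
    return abs(user_value - fd_value) / abs(fd_value) > tolerance

print(derivative_mismatch(8.0, 8.000001))  # False: the values agree
print(derivative_mismatch(8.0, 4.0))       # True: off by 100 percent
```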

 

UseJacobianSparsity

The sparsity of a Jacobian matrix is determined by the number of elements that are equal to zero. This boolean property indicates whether the optimization engine should take advantage of the sparsity of the Jacobian to simplify the optimization algorithm, or always process the full dense Jacobian. This functionality is turned on by default, but has no effect when using NLopt, which always uses a dense Jacobian.

 

opt.UseJacobianSparsity = 1;
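 

As a plain-Python illustration (not FreeFlyer script) of what "sparsity" means here, the Jacobian below is structurally zero wherever a constraint does not depend on a state variable, and a sparse engine can skip differencing those entries:

```python
# Sparsity of a Jacobian: the fraction of entries that are exactly zero.
# The dependency pattern shown matches the sparsity example in this section.
jacobian = [
    [1.0, 2.0, 3.0, 0.0, 0.0],   # g1 depends on x1..x3 only
    [0.0, 1.0, 2.0, 1.0, 0.0],   # g2 depends on x2..x4 only
    [5.0, 0.0, 0.0, 0.0, 2.0],   # g3 depends on x1 and x5 only
]

total = sum(len(row) for row in jacobian)
zeros = sum(1 for row in jacobian for v in row if v == 0.0)
print(zeros, total)   # 7 of the 15 entries are structurally zero
print(zeros / total)  # sparsity fraction, approximately 0.467
```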

 

 

Specifying the Sparsity Structure


Users have the option to specify the sparsity structure of the derivatives. Supplying the optimizer with the derivative sparsity structure can greatly increase the efficiency of the optimizer's sampling phase for large optimization problems. In order to achieve this increase in efficiency, the following two steps must be completed:

1. Specify the sparsity for all derivative elements that are known to be zero/non-zero (Optimizer.JacobianSparsity and Optimizer.GradientSparsity)

2. Disable the validation of user-provided derivative values (Optimizer.ValidateUserDerivatives = 0)

 

If partial derivative sparsity information is provided for the problem, then the efficiency increase will be proportional to the number of full columns of the derivative sparsity that are specified (including the gradient).

 

// Configure variables and constraints

Variable C1 = 2 + 3*sqrt(2);

Variable C2 = -2 + 2*sqrt(2);

Variable C3 = 2;

 

Optimizer opt;

opt.AddStateVariableBlock(5, "x", 2.0);

opt.AddConstraintBlock(3, "g");

 

opt.Constraints[0].SetEqualityBounds(C1);

opt.Constraints[1].SetEqualityBounds(C2);

opt.Constraints[2].SetEqualityBounds(C3);

 

// Set sparsity to all zeroes

opt.JacobianSparsity = Matrix.Zero(opt.JacobianSparsity.RowCount, opt.JacobianSparsity.ColumnCount);

opt.GradientSparsity = Matrix.Zero(opt.GradientSparsity.Count, 1).ToArrayColumnMajor();

 

// Apply gradient structure

opt.SetGradientSparsityValue("x1", 1);

opt.SetGradientSparsityValue("x2", 1);

opt.SetGradientSparsityValue("x3", 1);

opt.SetGradientSparsityValue("x4", 1);

opt.SetGradientSparsityValue("x5", 1);

 

// Apply Jacobian structure

opt.SetJacobianSparsityValue("g1", "x1", 1);

opt.SetJacobianSparsityValue("g1", "x2", 1);

opt.SetJacobianSparsityValue("g1", "x3", 1);

 

opt.SetJacobianSparsityValue("g2", "x2", 1);

opt.SetJacobianSparsityValue("g2", "x3", 1);

opt.SetJacobianSparsityValue("g2", "x4", 1);

 

opt.SetJacobianSparsityValue("g3", "x1", 1);

opt.SetJacobianSparsityValue("g3", "x5", 1);

 

// Turn off user derivative value validation

opt.ValidateUserDerivatives = 0;

 

// Solve the problem

opt.LoadEngine();

 

Array x[5];

Variable objective;

 

While (opt.HasNotConverged());

 opt.UpdateStateVariables();

 

 x = opt.GetStateVariableValues();

 

 opt.SetConstraintValue("g1", x[0] + x[1]^2 + x[2]^3);

 opt.SetConstraintValue("g2", x[1] - x[2]^2 + x[3]);

 opt.SetConstraintValue("g3", x[0] * x[4]);

 

 objective = (x[0]-1)^2 + (x[0]-x[1])^2 + (x[1]-x[2])^2 + (x[2]-x[3])^4 + (x[3]-x[4])^4;

 

 opt.Minimize(objective);

 

 If (opt.OptimizationPhase == 1);

         Report opt.TotalEvaluationCount, opt.NominalEvaluationCount, opt.MaximumInfeasibility, opt.MaximumInfeasibilitySource, opt.ObjectiveFunctionValue;

 End;        

End;

 

Report opt.ReturnString;
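 

A declared sparsity pattern must match the constraint expressions exactly: a Jacobian entry should be marked nonzero exactly when the constraint actually depends on that state variable. The following Python sketch (outside FreeFlyer) verifies the pattern used in this example numerically:

```python
# Cross-check that the declared sparsity pattern matches the constraints
# g1 = x1 + x2^2 + x3^3, g2 = x2 - x3^2 + x4, g3 = x1 * x5
# (zero-indexed below: x[0] is x1, ..., x[4] is x5).
def constraints(x):
    return [x[0] + x[1]**2 + x[2]**3,   # g1
            x[1] - x[2]**2 + x[3],      # g2
            x[0] * x[4]]                # g3

def numeric_jacobian(x, h=1e-6):
    """Forward-difference Jacobian of the constraints at point x."""
    base = constraints(x)
    jac = []
    for i in range(len(base)):
        row = []
        for j in range(len(x)):
            xp = list(x)
            xp[j] += h
            row.append((constraints(xp)[i] - base[i]) / h)
        jac.append(row)
    return jac

expected_pattern = [
    [1, 1, 1, 0, 0],   # g1: depends on x1, x2, x3
    [0, 1, 1, 1, 0],   # g2: depends on x2, x3, x4
    [1, 0, 0, 0, 1],   # g3: depends on x1, x5
]

jac = numeric_jacobian([2.0] * 5)   # evaluate at the initial guess x = 2.0
pattern = [[1 if abs(v) > 1e-3 else 0 for v in row] for row in jac]
print(pattern == expected_pattern)  # True
```

Marking a genuinely nonzero entry as zero would cause the engine to treat that derivative as identically zero, which is why disabling ValidateUserDerivatives should only be done once the pattern has been checked.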

 

 

See Also


Optimizer Properties and Methods