SciPy - Unconstrained Optimization



Unconstrained Optimization in SciPy

Unconstrained optimization in SciPy refers to the process of finding the minimum or maximum of an objective function without any restrictions or constraints on the variables. It is typically performed using the scipy.optimize.minimize() function, which provides a wide range of algorithms suited to different types of optimization problems. Let's go through how this works in detail, along with its different aspects.

Syntax

Following is the syntax of the scipy.optimize.minimize() function, which is used to find the minimum of an objective function −

scipy.optimize.minimize(
   fun, 
   x0, 
   args=(), 
   method=None, 
   jac=None, 
   hess=None, 
   hessp=None, 
   bounds=None, 
   constraints=(), 
   tol=None, 
   callback=None, 
   options=None
)

Parameters

Here are the parameters of the scipy.optimize.minimize() function; a short example combining several of them follows the list −

  • fun: Objective function to minimize.
  • x0: Initial guess for the variables.
  • method: Optimization algorithm. There are different methods such as 'BFGS', 'Nelder-Mead', 'L-BFGS-B' and 'trust-constr'.
  • jac (optional): Gradient of the objective function.
  • hess (optional): Hessian of the objective function.
  • bounds: Variable bounds for constrained problems.
  • constraints: Equality or inequality constraints.
  • options: Solver-specific settings.
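
As a minimal sketch, assuming a simple two-variable quadratic as the objective, the following shows how the method, jac, tol and options parameters can be combined in a single call −

from scipy.optimize import minimize
import numpy as np

# Illustrative objective function assumed for this sketch: f(x, y) = (x - 1)^2 + (y + 2)^2
def objective(x):
    return (x[0] - 1)**2 + (x[1] + 2)**2

# Analytical gradient supplied through the jac parameter
def gradient(x):
    return np.array([2 * (x[0] - 1), 2 * (x[1] + 2)])

# Initial guess
x0 = [0.0, 0.0]

# Combine method, jac, tol and solver-specific options in one call
result = minimize(objective, x0, method='BFGS', jac=gradient,
                  tol=1e-8, options={'maxiter': 100, 'disp': True})

print("Optimal solution:", result.x)   # approximately [1, -2]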

Basic Minimization

Following is a basic minimization example in which we minimize the simple quadratic function f(x) = x² + x + 2 −

from scipy.optimize import minimize

# Objective function
def objective(x):
    return x**2 + x + 2

# Initial guess
x0 = [0]

# Minimize the function
result = minimize(objective, x0)

# Display results
print("Optimal solution:", result.x)
print("Function value at optimum:", result.fun)

Here is the output of the basic minimization using the scipy.optimize.minimize() function −

Optimal solution: [-0.50000001]
Function value at optimum: 1.75
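
Note that minimize() itself only minimizes; a maximum can be found by minimizing the negated objective. Below is a minimal sketch of this approach, assuming the illustrative function f(x) = -(x - 1)² + 3, whose maximum lies at x = 1 −

from scipy.optimize import minimize

# Illustrative function whose maximum we want: f(x) = -(x - 1)^2 + 3
def f(x):
    return -(x[0] - 1)**2 + 3

# Minimize the negated function to locate the maximum of f
result = minimize(lambda x: -f(x), x0=[0.0])

print("Location of maximum:", result.x)   # approximately [1.]
print("Maximum value:", -result.fun)      # approximately 3.0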

Minimization with Variable Bounds

When variables have specific bounds, we can define them using the bounds parameter of the scipy.optimize.minimize() function. This is useful for constrained problems. The following example shows how to minimize a function with variable bounds −

from scipy.optimize import minimize

# Objective function
def objective(x):
    return x[0]**2 + x[1]**2

# Bounds on the variables
bounds = [(0, 1), (-1, 1)]  # x in [0, 1], y in [-1, 1]

# Initial guess
x0 = [0.5, 0]  # A point within the bounds

# Minimization
result = minimize(objective, x0, method='L-BFGS-B', bounds=bounds)

# Output
print("Optimal solution:", result.x)
print("Function value at optimum:", result.fun)

Here is the output of the scipy.optimize.minimize() function used with the bounds parameter −

Optimal solution: [ 0.00000000e+00 -1.11022301e-08]
Function value at optimum: 1.2325951233541654e-16
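
Besides x and fun, the OptimizeResult object returned by minimize() also carries diagnostic fields such as success, message and nit. The following is a small sketch, reusing the same bounded problem, that prints a few of them −

from scipy.optimize import minimize

# Same bounded problem as above
def objective(x):
    return x[0]**2 + x[1]**2

result = minimize(objective, [0.5, 0], method='L-BFGS-B',
                  bounds=[(0, 1), (-1, 1)])

# Diagnostic fields of the returned OptimizeResult object
print("Converged:", result.success)   # True when the solver reports success
print("Message:", result.message)     # human-readable termination message
print("Iterations:", result.nit)      # number of iterations performed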

Optimization Methods in the minimize() Function

The scipy.optimize.minimize() function supports a variety of optimization methods, each tailored to specific types of problems such as unconstrained or constrained optimization, gradient-based or gradient-free approaches and problems with bounds. A short example using one of the gradient-based methods follows this list.

  • Nelder-Mead: Simplex algorithm that minimizes based only on function values. Gradient-free; needs no gradient or Hessian. Use case: non-smooth functions, small problems.
  • Powell: Directional search algorithm optimizing along chosen directions. Gradient-free; needs no gradient or Hessian. Use case: non-smooth functions, high dimensions.
  • CG: Conjugate Gradient method minimizing a quadratic approximation of the function. Gradient-based; needs the gradient but no Hessian. Use case: smooth functions, large-scale problems.
  • BFGS: Quasi-Newton method approximating the inverse Hessian. Gradient-based; needs the gradient, Hessian is approximated internally. Use case: smooth functions, efficient general-purpose optimization.
  • Newton-CG: Newton's method with conjugate gradient steps to improve efficiency. Gradient-based; needs the gradient, Hessian is optional. Use case: large-scale problems with Hessian-vector products.
  • trust-ncg: Trust-region Newton-Conjugate Gradient method. Trust-region, Newton-type; needs the gradient, Hessian is approximated. Use case: large-scale problems.
  • trust-krylov: Trust-region method using a Krylov subspace approximation of the Hessian. Trust-region, Krylov-based; needs the gradient, Hessian is approximated. Use case: large-scale problems.
  • trust-exact: Trust-region method leveraging exact Hessians for precise solutions. Trust-region, Newton-type; needs the gradient and an exact Hessian. Use case: small-scale problems with exact Hessians.
  • dogleg: Trust-region method that uses a dogleg step for solving sub-problems. Trust-region, Newton-type; needs the gradient and an exact Hessian. Use case: medium-scale problems with exact Hessians.
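
As a rough illustration of how the gradient-based methods above are used, the following sketch runs the Newton-CG method with an explicit gradient and Hessian on a simple quadratic chosen for this example −

import numpy as np
from scipy.optimize import minimize

# Illustrative quadratic objective: f(x, y) = 2x^2 + y^2 + xy
def objective(x):
    return 2 * x[0]**2 + x[1]**2 + x[0] * x[1]

# Analytical gradient required by Newton-CG
def gradient(x):
    return np.array([4 * x[0] + x[1], 2 * x[1] + x[0]])

# Analytical Hessian (constant for a quadratic objective)
def hessian(x):
    return np.array([[4.0, 1.0], [1.0, 2.0]])

x0 = [1.0, 1.0]

result = minimize(objective, x0, method='Newton-CG', jac=gradient, hess=hessian)

print("Optimal solution:", result.x)   # approximately [0, 0]
print("Function value at optimum:", result.fun)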