SciPy - Optimize
SciPy's optimize module is a collection of tools for solving mathematical optimization problems. It helps minimize or maximize functions, find function roots, and fit models to data. This makes it useful for tasks like data analysis, engineering, and scientific research.
The scipy.optimize package provides several commonly used optimization algorithms. This module contains the following aspects −
- Unconstrained and constrained minimization of multivariate scalar functions using the minimize() function, which supports algorithms such as BFGS, the Nelder-Mead simplex, Newton Conjugate Gradient, COBYLA and SLSQP.
- Global (brute-force) optimization routines, for example basinhopping(), brute() and differential_evolution().
- Least-squares minimization, e.g. least_squares() and leastsq(), and curve fitting with curve_fit().
- Scalar univariate function minimizers, e.g. minimize_scalar(), and root finders such as newton().
- Multivariate equation system solvers using the root() function, with methods including the Hybrid Powell and Levenberg-Marquardt methods, and large-scale methods such as Newton-Krylov.
Unconstrained and Constrained Minimization
The minimize() function provides a common interface to unconstrained and constrained minimization algorithms for multivariate scalar functions in scipy.optimize. To demonstrate the minimization function, consider the problem of minimizing the Rosenbrock function of N variables −
f(x) = sum_{i=1}^{N-1} [100(x_{i+1} - x_i^2)^2 + (1 - x_i)^2]
We can use the scipy.optimize.minimize() function to minimize the above Rosenbrock function.
The minimize() function takes the following arguments:
fun: The objective function to be minimized; in our case, the Rosenbrock function.
x0: An array-like structure that represents the initial guess for the variables.
method: The optimization algorithm to use. Supported algorithms include −
- Unconstrained: Nelder-Mead, BFGS (a quasi-Newton method), CG (conjugate gradient).
- Constrained: L-BFGS-B, TNC, trust-constr.
By default, the method is BFGS.
Let us minimize the Rosenbrock function using the Nelder-Mead simplex algorithm (method = 'Nelder-Mead')
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    return sum(100.0 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosenbrock, x0, method='nelder-mead')
print(res.x)
The above program will generate the following output.
[0.99910115 0.99820923 0.99646346 0.99297555 0.98600385]
The minimum value of this function is 0, which is achieved when every x_i equals 1.
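For comparison, here is a minimal sketch of the same minimization using the default BFGS method, plus a bound-constrained variant via L-BFGS-B; the helpers rosen() and rosen_der() ship with scipy.optimize and implement the Rosenbrock function and its gradient. The bounds below are chosen purely for illustration −
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])

# Default BFGS; supplying the analytic gradient (jac) speeds convergence
res = minimize(rosen, x0, method='BFGS', jac=rosen_der)
print(res.x)

# Bound-constrained variant: keep every variable in [0, 1.5] using L-BFGS-B
res_b = minimize(rosen, x0, method='L-BFGS-B', jac=rosen_der,
                 bounds=[(0, 1.5)] * 5)
print(res_b.x)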
Global (Brute-Force) Optimization Routines
The basinhopping algorithm is a stochastic global optimization technique that combines random perturbations with local minimizations, helping it escape local minima while searching for the global minimum of complex multimodal functions.
Let us minimize a simple quadratic function using the basinhopping algorithm −
import numpy as np
from scipy.optimize import basinhopping

def quadratic(x):
    return (x - 3)**2 + 1

# Initial guess
x0 = np.array([0])
res = basinhopping(quadratic, x0)

# Print the result
print("global minimum:", res.x)
The above program will generate the following output.
global minimum: [2.99999999]
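Another global routine worth knowing is differential_evolution(), which searches a user-supplied bounded region instead of starting from a single initial guess. A minimal sketch on the same objective −
from scipy.optimize import differential_evolution

# Same objective as above, restricted to the interval [-10, 10]
res = differential_evolution(lambda x: (x[0] - 3)**2 + 1, bounds=[(-10, 10)])
print(res.x, res.fun)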
Least Squares Minimization
least_squares() solves a nonlinear least-squares problem, optionally with bounds on the variables. Given the residuals f(x) (an m-dimensional real function of n real variables) and the loss function rho(s) (a scalar function), least_squares() finds a local minimum of the cost function F(x) = 0.5 * sum(rho(f_i(x)^2)). Let us consider the following example.
In this example, we find a minimum of the Rosenbrock function without bounds on the independent variables.
import numpy as np
from scipy.optimize import least_squares

def fun_rosenbrock(x):
    return np.array([10 * (x[1] - x[0]**2), (1 - x[0])])

x0 = np.array([2, 2])
res = least_squares(fun_rosenbrock, x0)
print(res)
Notice that we only provide the vector of residuals. The algorithm constructs the cost function as a sum of squares of the residuals, which gives the Rosenbrock function. The exact minimum is at x = [1.0, 1.0].
Following is the output of the above code −
message: `gtol` termination condition is satisfied.
success: True
status: 1
fun: [ 4.441e-15 1.110e-16]
x: [ 1.000e+00 1.000e+00]
cost: 9.866924291084687e-30
jac: [[-2.000e+01 1.000e+01]
      [-1.000e+00 0.000e+00]]
grad: [-8.893e-14 4.441e-14]
optimality: 8.892886493421953e-14
active_mask: [ 0.000e+00 0.000e+00]
nfev: 3
njev: 3
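Since least_squares() also accepts bounds, here is a minimal sketch of the same problem with a box constraint on the second variable (the bound 1.5 is chosen here purely for illustration) −
import numpy as np
from scipy.optimize import least_squares

def fun_rosenbrock(x):
    return np.array([10 * (x[1] - x[0]**2), (1 - x[0])])

# Constrain x[1] to [1.5, inf); x[0] remains free
res_bounded = least_squares(fun_rosenbrock, np.array([2, 2]),
                            bounds=([-np.inf, 1.5], np.inf))
print(res_bounded.x)  # the solution lands on the boundary, near [1.22, 1.5]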
Root Finding
Let us understand how root finding helps in SciPy.
Scalar Functions
If one has a single-variable equation, there are four bracketing root-finding algorithms that can be tried: brentq, brenth, ridder and bisect. Each of these requires the endpoints of an interval in which a root is expected (because the function changes sign there). In general, brentq is the best choice, but the other methods may be useful in certain circumstances or for academic purposes.
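As a minimal sketch, consider f(x) = cos(x) - x, which changes sign on the interval [0, 2], so brentq can bracket its root −
import numpy as np
from scipy.optimize import brentq

def f(x):
    return np.cos(x) - x  # f(0) = 1 > 0 and f(2) < 0, so a root lies in [0, 2]

root = brentq(f, 0, 2)
print(root)  # approximately 0.7390851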
Fixed-point solving
A problem closely related to finding the zeros of a function is the problem of finding a fixed point of a function. A fixed point of a function g is a point at which evaluating the function returns the point itself: g(x) = x. Clearly, the fixed point of g is the root of f(x) = g(x) - x. Equivalently, the root of f is the fixed point of g(x) = f(x) + x. The routine fixed_point provides a simple iterative method using Aitken's sequence acceleration to estimate the fixed point of g, given a starting point.
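For example, g(x) = cos(x) has a fixed point where cos(x) = x; a minimal sketch −
import numpy as np
from scipy.optimize import fixed_point

# Iterate toward the point where cos(x) = x, starting from 0.5
fp = fixed_point(np.cos, 0.5)
print(fp)  # approximately 0.7390851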
Sets of equations
Finding a root of a set of non-linear equations can be achieved using the root() function. Several methods are available, amongst which hybr (the default) and lm, which respectively use the hybrid method of Powell and the Levenberg-Marquardt method from MINPACK.
The following example considers the single-variable transcendental equation −
2x + 2cos(x) = 0
whose root can be found as follows −
import numpy as np
from scipy.optimize import root

def func(x):
    return 2 * x + 2 * np.cos(x)

sol = root(func, 0.3)
print(sol)
The above program will generate the following output.
message: The solution converged.
success: True
status: 1
fun: [ 2.220e-16]
x: [-7.391e-01]
nfev: 10
fjac: [[-1.000e+00]]
r: [-3.347e+00]
qtf: [-2.777e-12]
Multivariate Equation System Solvers
The root() function in scipy.optimize solves multivariate equations. Here, we solve a system of nonlinear equations using the Hybrid Powell method.
Consider the following equations −
x^2 + y^2 - 4 = 0
x * y - 1 = 0
Following is an example −
import numpy as np
from scipy.optimize import root

def equations(vars):
    x, y = vars
    return [x**2 + y**2 - 4, x * y - 1]

# Initial guess
x0 = [1.5, 1.5]
res = root(equations, x0, method='hybr')
print(res.x)
The above program will generate the following output.
[1.93185165 0.51763809]
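As a sketch, the same system can also be solved with the Levenberg-Marquardt method mentioned earlier by passing method='lm'; the result should agree with the hybrid Powell solution to within solver tolerance −
import numpy as np
from scipy.optimize import root

def equations(vars):
    x, y = vars
    return [x**2 + y**2 - 4, x * y - 1]

# Levenberg-Marquardt instead of the default hybrid Powell method
res_lm = root(equations, [1.5, 1.5], method='lm')
print(res_lm.x)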