Random search
Random search (RS) is a family of numerical optimization methods that do not require the gradient of the problem to be optimized, and RS can hence be used on functions that are not continuous or differentiable. Such optimization methods are also known as direct-search, derivative-free, or black-box methods.
The name "random search" is attributed to Rastrigin, who made an early presentation of RS along with basic mathematical analysis. RS works by iteratively moving to better positions in the search-space, which are sampled from a hypersphere surrounding the current position.
Algorithm
Let f: ℝⁿ → ℝ be the fitness or cost function which must be minimized. Let x ∈ ℝⁿ designate a position or candidate solution in the search-space. The basic RS algorithm can then be described as:
- Initialize x with a random position in the search-space.
- Until a termination criterion is met (e.g. number of iterations performed, or adequate fitness reached), repeat the following:
  - Sample a new position y from the hypersphere of a given radius surrounding the current position x (see e.g. Marsaglia's technique for sampling a hypersphere).
  - If f(y) < f(x), then move to the new position by setting x = y.
- Now x holds the best-found position.
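The steps above can be sketched in Python as follows. The search-space bounds, parameter names, and the sphere test function are illustrative assumptions, not part of the original formulation; the neighbour is drawn on the hypersphere surface via Marsaglia's Gaussian-normalization technique.

```python
import math
import random

def random_search(f, dim, radius, iterations, low=-5.0, high=5.0):
    """Basic fixed-step random search. The bounds (low, high) are an
    illustrative choice for initialization only."""
    # Initialize x with a random position in the search-space.
    x = [random.uniform(low, high) for _ in range(dim)]
    fx = f(x)
    for _ in range(iterations):
        # Sample y on the hypersphere of the given radius around x:
        # draw a standard Gaussian vector and rescale it to length `radius`.
        g = [random.gauss(0.0, 1.0) for _ in range(dim)]
        norm = math.sqrt(sum(v * v for v in g)) or 1.0
        y = [xi + radius * gi / norm for xi, gi in zip(x, g)]
        fy = f(y)
        if fy < fx:  # move only when the new position is better
            x, fx = y, fy
    return x, fx

# Usage: minimize the sphere function sum(x_i^2) in 3 dimensions.
random.seed(0)
best_x, best_f = random_search(lambda v: sum(t * t for t in v),
                               dim=3, radius=0.5, iterations=2000)
```

Because the step size is fixed, progress stalls once the distance to the optimum falls below roughly half the sampling radius; the variants below address this.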
Variants
A number of RS variants have been introduced in the literature:
- Fixed Step Size Random Search (FSSRS) is Rastrigin's basic algorithm which samples from a hypersphere of fixed radius.
- Optimum Step Size Random Search (OSSRS) by Schumer and Steiglitz is primarily a theoretical study on how to optimally adjust the radius of the hypersphere so as to allow for speedy convergence to the optimum. An actual implementation of the OSSRS needs to approximate this optimal radius by repeated sampling and is therefore expensive to execute.
- Adaptive Step Size Random Search (ASSRS) by Schumer and Steiglitz attempts to heuristically adapt the hypersphere's radius. The algorithm, however, is somewhat complicated.
- Optimized Relative Step Size Random Search (ORSSRS) by Schrack and Choit approximates the optimal step size by a simple exponential decrease. Again, however, the formula for computing the decrease-factor is somewhat complicated.
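A minimal sketch of the exponentially decreasing step size idea behind ORSSRS: here the constant `decay` is an arbitrary illustrative choice standing in for Schrack and Choit's decrease-factor formula, which is more involved than shown.

```python
import math
import random

def random_search_decaying(f, x0, radius0, decay, iterations):
    """Random search with an exponentially shrinking sampling radius.
    `decay` is a hypothetical free parameter, not the formula from
    Schrack and Choit."""
    x, fx = list(x0), f(x0)
    radius = radius0
    for _ in range(iterations):
        # Sample on the current hypersphere around x (Marsaglia's technique).
        g = [random.gauss(0.0, 1.0) for _ in x]
        norm = math.sqrt(sum(v * v for v in g)) or 1.0
        y = [xi + radius * gi / norm for xi, gi in zip(x, g)]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
        radius *= decay  # exponential decrease of the step size
    return x, fx

# Usage: the shrinking radius lets late iterations fine-tune the solution.
random.seed(1)
xd, fd = random_search_decaying(lambda v: sum(t * t for t in v),
                                x0=[3.0, 3.0], radius0=1.0,
                                decay=0.999, iterations=3000)
```

Shrinking too quickly risks freezing the search far from the optimum, so the decay rate trades off early exploration against late refinement.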
See also
- Random optimization is a closely related family of optimization methods which sample from a normal distribution instead of a hypersphere.
- Luus–Jaakola is a closely related optimization method using a uniform distribution in its sampling and a simple formula for exponentially decreasing the sampling range.
- Pattern search takes steps along the axes of the search-space using exponentially decreasing step sizes.