Graduated optimization
Graduated optimization is a global optimization technique that attempts to solve a difficult optimization problem by initially solving a greatly simplified problem, and progressively transforming that problem (while optimizing) until it is equivalent to the difficult optimization problem.
Technique description
Graduated optimization is an improvement to hill climbing that enables a hill climber to avoid settling into local optima. It breaks a difficult optimization problem into a sequence of optimization problems, such that the first problem in the sequence is convex (or nearly convex), the solution to each problem gives a good starting point to the next problem in the sequence, and the last problem in the sequence is the difficult optimization problem that it ultimately seeks to solve. Graduated optimization often gives better results than simple hill climbing, and under certain conditions it is guaranteed to find an optimal solution to the final problem in the sequence. These conditions are:
- The first optimization problem in the sequence can be solved given the initial starting point.
- The locally convex region around the global optimum of each problem in the sequence includes the point that corresponds to the global optimum of the previous problem in the sequence.
It can be shown inductively that if these conditions are met, then a hill climber will arrive at the global optimum for the difficult problem. Unfortunately, it can be difficult to find a sequence of optimization problems that meet these conditions. Often, graduated optimization yields good results even when the sequence of problems cannot be proven to strictly meet all of these conditions.
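As a concrete illustration, the following sketch (assuming Python with NumPy; the objective function, the smoothing schedule, and the step sizes are illustrative choices, not part of any standard method) minimizes a wiggly one-dimensional function by hill climbing on a sequence of Gaussian-smoothed versions of it, from heavily smoothed down to not smoothed at all:

```python
import numpy as np

def f(x):
    # Difficult objective: a parabola overlaid with many local minima.
    return x**2 + 10.0 * np.sin(5.0 * x)

def smoothed_f(x, sigma, samples=2000):
    # Gaussian smoothing: approximates E_z[f(x + sigma*z)], z ~ N(0, 1).
    # A large sigma washes out the wiggles, leaving a nearly convex
    # problem; sigma = 0 recovers the original objective.
    if sigma == 0.0:
        return f(x)
    z = np.random.default_rng(0).standard_normal(samples)  # fixed noise keeps the estimate deterministic
    return float(np.mean(f(x + sigma * z)))

def hill_climb(g, x0, step=0.002, iters=1000, eps=1e-4):
    # Crude hill climber: gradient descent with finite differences.
    x = x0
    for _ in range(iters):
        grad = (g(x + eps) - g(x - eps)) / (2.0 * eps)
        x -= step * grad
    return x

x = 8.0                                  # deliberately poor starting point
for sigma in [5.0, 2.0, 1.0, 0.5, 0.0]:  # easy -> hard sequence of problems
    x = hill_climb(lambda t, s=sigma: smoothed_f(t, s), x)
    print(f"sigma={sigma}: x={x:+.4f}  f(x)={f(x):+.4f}")
```

Each stage starts from the previous stage's answer, so by the time the unsmoothed problem is reached, the hill climber should already sit inside the basin of the global minimum (near x ≈ -0.31 for this objective) rather than in one of the many local minima around the starting point.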
Some examples
Graduated optimization is commonly used in image processing for locating objects within a larger image. This problem can be made more convex by blurring the images. Thus, objects can be found by first searching the most-blurred image, then starting at that point and searching within a less-blurred image, and continuing in this manner until the object is located with precision in the original sharp image. (Representing an image at varying levels of blurring is called pyramid representation, though pyramid representation is also used for purposes other than finding objects with graduated optimization.)
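A minimal sketch of this coarse-to-fine search, assuming Python with NumPy and SciPy (scipy.ndimage.gaussian_filter provides the blurring); the synthetic image, the sum-of-squared-differences matcher, and the blur/search-radius schedule are illustrative stand-ins rather than a production pipeline:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_search(image, template, start, radius):
    # Exhaustive sum-of-squared-differences search in a window
    # around the current position estimate; lower score is better.
    th, tw = template.shape
    best, best_pos = np.inf, start
    r0, r1 = max(0, start[0] - radius), min(image.shape[0] - th, start[0] + radius)
    c0, c1 = max(0, start[1] - radius), min(image.shape[1] - tw, start[1] + radius)
    for r in range(r0, r1 + 1):
        for c in range(c0, c1 + 1):
            score = np.sum((image[r:r+th, c:c+tw] - template) ** 2)
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos

rng = np.random.default_rng(0)
image = 0.1 * rng.random((200, 200))
image[137:153, 42:58] += 1.0             # the "object": a bright square
template = image[137:153, 42:58].copy()  # what we are asked to locate

# Search the most-blurred image first with a wide window, then refine
# on progressively sharper images with narrower windows.
pos = (100, 100)                         # arbitrary initial guess
for sigma, radius in [(8, 100), (4, 16), (2, 6), (0, 3)]:
    blurred = gaussian_filter(image, sigma)
    target = gaussian_filter(template, sigma)
    pos = local_search(blurred, target, pos, radius)
print("estimated:", pos, "true:", (137, 42))
```

The coarsest level is searched with a wide window because the heavily blurred match is smooth but imprecise; each sharper level then only needs a small window around the previous estimate.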
Graduated optimization can be used in manifold learning. The Manifold Sculpting algorithm, for example, uses graduated optimization to seek a manifold embedding for non-linear dimensionality reduction. It gradually scales variance out of the extra dimensions within a data set while optimizing in the remaining dimensions. It therefore begins in a globally optimal state with a trivial problem, and as it scales variance out of the extra dimensions, it gradually transforms the problem into that of projecting data into a small number of dimensions.
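A loose sketch of this idea in Python with NumPy (not the published Manifold Sculpting algorithm; the neighbor count, scaling factor, and gradient step below are assumptions made purely for illustration): the extra dimensions are scaled down slightly each iteration, while the kept dimensions are nudged to preserve each point's original neighbor distances:

```python
import numpy as np

def sculpt(X, keep=1, n_neighbors=6, scale=0.98, iters=400, step=0.02):
    n = X.shape[0]
    # Pairwise distances in the original space, and each point's neighbors.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    nbrs = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]
    # Rotate onto principal axes so the "extra" dimensions start out
    # carrying the least variance.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Y = Xc @ Vt.T
    for _ in range(iters):
        Y[:, keep:] *= scale                  # scale variance out of extra dims
        for i in range(n):                    # restore neighbor distances
            diff = Y[i] - Y[nbrs[i]]          # (n_neighbors, dim)
            dist = np.linalg.norm(diff, axis=1) + 1e-12
            err = dist - D[i, nbrs[i]]        # distance violations
            grad = (err / dist) @ diff        # gradient of sum(err**2)/2 wrt Y[i]
            Y[i, :keep] -= step * grad[:keep] # adjust the kept dims only
    return Y[:, :keep]

# Unroll a semicircle in the plane into one dimension: neighboring
# points should end up roughly an arc length apart on the line.
t = np.linspace(0.0, np.pi, 50)
X = np.column_stack([np.cos(t), np.sin(t)])
emb = sculpt(X)
print(np.diff(emb[:, 0])[:5])  # ideally roughly constant spacing
```

Because the first few iterations barely change anything, the optimizer starts at (or very near) a global optimum of a trivial problem, mirroring the description above.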
It has also been used to calculate conditions for fractionation with tumors, for object tracking in computer vision, and for other purposes.
Related optimization techniques
Simulated annealing is closely related to graduated optimization. Instead of smoothing the function over which it is optimizing, simulated annealing randomly perturbs the current solution by a decaying amount, which may have a similar effect.
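For contrast with the smoothing-based sketch above, here is a minimal simulated-annealing-style loop on the same kind of wiggly objective (assuming Python; the temperature schedule and perturbation scale are illustrative):

```python
import math
import random

def f(x):
    # Same shape of objective as in the graduated optimization sketch.
    return x * x + 10.0 * math.sin(5.0 * x)

random.seed(0)
x, fx = 8.0, f(8.0)
temp = 10.0
for _ in range(20000):
    temp *= 0.9995                                   # decaying temperature
    cand = x + random.gauss(0.0, 1.0) * temp / 10.0  # decaying random perturbation
    fc = f(cand)
    # Always accept improvements; accept worse moves with prob e^(-delta/T).
    if fc < fx or random.random() < math.exp(-(fc - fx) / temp):
        x, fx = cand, fc
print(f"x={x:+.4f}  f(x)={fx:+.4f}")
```

Early on, large perturbations and a permissive acceptance rule let the search hop between basins; as the temperature decays it behaves more and more like a pure hill climber, loosely analogous to reducing the smoothing in graduated optimization.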