Cheung–Marks theorem
In information theory, the Cheung–Marks theorem, named after K. F. Cheung and Robert J. Marks II, specifies conditions where restoration of a signal by the sampling theorem can become ill-posed. It offers conditions whereby "reconstruction error with unbounded variance [results] when a bounded variance noise is added to the samples."
Background
In the sampling theorem, the uncertainty of the interpolation as measured by noise variance is the same as the uncertainty of the sample data when the noise is i.i.d. In his classic 1948 paper founding information theory, Claude Shannon offered a generalization of the sampling theorem in which a bandlimited signal need not be specified by uniformly spaced samples of its values alone: other sample sets of the same overall density, such as values of the signal and of its derivative taken at half the Nyquist rate, also determine the signal uniquely.
Although true in the absence of noise, many of the expansions proposed by Shannon become ill-posed: an arbitrarily small amount of noise on the data renders restoration unstable. Such sampling expansions are not useful in practice, since sampling noise, such as quantization noise, rules out stable interpolation and therefore any practical use.
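A minimal sketch of this variance bookkeeping for the classical case, assuming samples taken at the Nyquist interval T are corrupted by zero-mean i.i.d. noise \xi_n of variance \sigma^2: the cardinal-series reconstruction from the noisy samples is

\hat{x}(t) = \sum_{n=-\infty}^{\infty} \bigl[ x(nT) + \xi_n \bigr] \, \operatorname{sinc}\!\left( \frac{t}{T} - n \right), \qquad \operatorname{sinc}(u) = \frac{\sin \pi u}{\pi u},

and the interpolation noise has variance

\operatorname{Var}\bigl[ \hat{x}(t) - x(t) \bigr] = \sigma^2 \sum_{n=-\infty}^{\infty} \operatorname{sinc}^2\!\left( \frac{t}{T} - n \right) = \sigma^2 \quad \text{for every } t,

because the squared shifted sinc functions sum to one at every instant. The reconstruction is therefore exactly as uncertain as the data, which is the well-posed baseline against which the generalized expansions are judged.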
Example
Shannon's suggestion of simultaneous sampling of the signal and its derivative at half the Nyquist rate results in well-behaved interpolation. The Cheung–Marks theorem shows, counter-intuitively, that interlacing the signal and derivative samples makes the restoration problem ill-posed. The theorem also shows that sensitivity increases with derivative order.
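One common form of the well-behaved simultaneous expansion, stated here as a sketch for a signal f bandlimited to B hertz and sampled, together with its derivative, at interval T = 1/B (half the Nyquist rate), is

f(t) = \sum_{n=-\infty}^{\infty} \Bigl[ f(nT) + (t - nT) \, f'(nT) \Bigr] \operatorname{sinc}^2\!\left( \frac{t - nT}{T} \right).

The \operatorname{sinc}^2 kernel here is square-integrable, so bounded-variance sample noise yields bounded-variance interpolation noise; the ill-posedness of the interlaced scheme corresponds, per the theorem below, to interpolation functions whose squared magnitude does not have finite area.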
The theorem
Generally, the Cheung–Marks theorem shows that the sampling theorem becomes ill-posed when the area (integral) of the squared magnitude of the interpolation function over all time is not finite.
"While the generalized sampling concept is relatively straightforward, the reconstruction is not always feasible because of potential instabilities."