Implicit data structure
In computer science, an implicit data structure is a data structure that uses very little memory beyond the data elements themselves, i.e. very little structural information is stored besides the data. These are storage schemes that retain no pointers and represent a file of n k-key records as a simple n-by-k array, allowing faster retrieval. In an implicit data structure, the only structural information kept is what is needed to let the array grow and shrink with n; no extra information is required. The structure is called "implicit" because most of the structure of the elements is expressed implicitly by their order. A term used interchangeably is space-efficient. The definition of "very little" is vague and can mean anywhere from O(1) to O(log n) extra space. An implicit data structure encodes the data efficiently, so that it does not need to be decoded to be used: everything is accessed in place, by reading bits at various positions in the data. To achieve optimal coding, bits are used instead of bytes. Implicit data structures are frequently also succinct data structures.
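As a simple illustration, the binary heap is a classic implicit data structure: parent/child relationships are not stored as pointers but follow from the positions of the elements in a plain array, so only O(1) extra space is used. The sketch below (in Python, chosen here only for brevity) keeps a min-heap entirely in a list, where the children of the element at index i sit at indices 2i + 1 and 2i + 2.

```python
# A minimal sketch of an implicit data structure: a binary min-heap
# stored in a plain list. No pointers are kept; the heap shape is
# implicit in the order of the elements.

def heap_push(heap, item):
    """Insert item, restoring heap order by sifting it up."""
    heap.append(item)
    i = len(heap) - 1
    while i > 0:
        parent = (i - 1) // 2          # parent index is implicit
        if heap[parent] <= heap[i]:
            break
        heap[i], heap[parent] = heap[parent], heap[i]
        i = parent

def heap_pop(heap):
    """Remove and return the minimum, sifting the last element down."""
    heap[0], heap[-1] = heap[-1], heap[0]
    smallest = heap.pop()
    i = 0
    while True:
        left, right = 2 * i + 1, 2 * i + 2   # child indices are implicit
        child = i
        if left < len(heap) and heap[left] < heap[child]:
            child = left
        if right < len(heap) and heap[right] < heap[child]:
            child = right
        if child == i:
            return smallest
        heap[i], heap[child] = heap[child], heap[i]
        i = child

h = []
for x in [5, 1, 4, 2, 3]:
    heap_push(h, x)
print([heap_pop(h) for _ in range(len(h))])  # [1, 2, 3, 4, 5]
```

Because the structure lives in the element order alone, the whole heap occupies one contiguous array, which is also what gives it the locality benefits discussed below.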
Although one may argue that disk space is no longer a problem and we need not concern ourselves with improving space utilization, the resource that implicit data structures are designed to conserve is main memory. Hard disks, and other high-capacity I/O devices, are orders of magnitude slower than main memory, so the larger the fraction of a task that fits in main-memory buffers, the less it depends on slow I/O devices. Hence, if a larger chunk of an implicit data structure fits in main memory, the operations performed on it can be faster even if their asymptotic running time is not as good as that of a space-oblivious counterpart. Furthermore, since the CPU cache is usually much smaller than main memory, implicit data structures can improve cache efficiency and thus running speed, especially if the method used improves locality. Keys can be scanned very efficiently, and downloading indexed data on mobile devices becomes easier.
Implicit data structures for weighted elements
For the representation of elements with different weights, several data structures are required. Each structure uses one extra location besides those required for the values of the elements. The first structure supports a worst-case search time expressed in terms of the rank of the weight of the searched element with respect to the multiset of weights. If the elements are drawn from a uniform distribution, a variation of this structure achieves this search time on average. The same result is obtained for data structures in which the intervals between consecutive values have access probabilities.
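As a concrete, if naive, illustration of a search time bounded by the rank of an element's weight, one can store the elements in a single array ordered by decreasing weight and scan linearly: an element whose weight has rank r among the multiset of weights is then found after at most r comparisons, and the only structural information is the order itself. This is just a sketch of the idea, not the specific structure referenced above.

```python
# Naive sketch: elements kept in one array, ordered by decreasing
# weight, with no structural information beyond the order itself.
# A linear scan finds an element whose weight has rank r (1-based,
# heaviest first) after at most r comparisons.

def build(items):
    """items: list of (key, weight) pairs; returns the keys sorted
    by decreasing weight -- the order *is* the structure."""
    return [key for key, _ in sorted(items, key=lambda kw: -kw[1])]

def search(arr, key):
    """Return the index of key, scanning from the heavy end."""
    for i, k in enumerate(arr):
        if k == key:
            return i
    return -1

arr = build([("a", 10), ("b", 50), ("c", 30)])
print(arr)               # ['b', 'c', 'a']
print(search(arr, "c"))  # 1 -- weight rank 2, found after 2 comparisons
```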
Further reading
- See the publications of Hervé Brönnimann, J. Ian Munro, and Greg Frederickson.