Bit slip
In digital transmission, bit slip is the loss of a bit or bits, caused by clock drift – variations in the respective clock rates of the transmitting and receiving devices.
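The mechanism can be illustrated with a short simulation (a sketch written for this article, not taken from any implementation): a receiver whose clock runs about 1% fast advances slightly too far per tick, so it eventually steps past a transmitted bit entirely.

    # Minimal sketch, assuming a receiver clock that runs 1% fast; the
    # function name and drift figure are illustrative choices.
    def receive(tx_bits, drift=1.01):
        """Sample one bit per receiver tick; drift > 1 models a fast
        receiver clock, which eventually skips a transmitted bit."""
        rx_bits = []
        t = 0.0                      # receiver time, in transmitter bit periods
        while int(t) < len(tx_bits):
            rx_bits.append(tx_bits[int(t)])
            t += drift               # each tick advances a little too far
        return rx_bits

    tx = [i % 2 for i in range(200)]   # alternating 0101... test pattern
    rx = receive(tx)
    print(len(tx), len(rx))            # 200 199: one bit lost to slip

A drift factor below 1 models the opposite case: a slow receiver samples some bit twice, so a spurious bit is inserted rather than lost.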
One cause of bit slip is overflow of a receive buffer, which occurs when the transmitter's clock rate exceeds that of the receiver. One or more bits are then dropped for lack of storage capacity.
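A bounded first-in-first-out buffer makes the overflow case concrete. In this sketch (the rates, capacity, and names are assumptions for the demo), the transmitter deposits one bit per tick while the receiver, on a slightly slower clock, reads on all but one tick in every 101; the surplus eventually fills the buffer and further bits are dropped.

    from collections import deque

    CAPACITY = 16            # assumed receive-buffer size
    buf, dropped = deque(), 0

    for i in range(10_000):
        # Transmitter side: one bit per transmitter tick.
        if len(buf) < CAPACITY:
            buf.append(i % 2)
        else:
            dropped += 1     # overflow: the bit is lost (bit slip)
        # Receiver side: slower clock, so it skips one read per 101 ticks.
        if i % 101 != 0 and buf:
            buf.popleft()

    print(dropped)           # > 0 once the buffer has filled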
One way to maintain timing between transmitting and receiving devices is to employ an asynchronous protocol such as start-stop. Alternatively, bit slip can be prevented by using a self-clocking signal (such as a signal modulated using OQPSK) or a line code such as Manchester encoding.
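To see why a line code such as Manchester encoding prevents slip, note that it places a transition in the middle of every bit period, so the receiver can recover the transmitter's clock from the signal itself. Below is a minimal encoder sketch using the IEEE 802.3 convention (a 1 is sent as a low-to-high half-bit pair); the function is illustrative, not from any particular library.

    def manchester_encode(bits):
        """IEEE 802.3 convention: 0 -> high then low, 1 -> low then high.
        Every bit yields a mid-bit transition the receiver can lock onto."""
        out = []
        for b in bits:
            out += [0, 1] if b else [1, 0]
        return out

    print(manchester_encode([1, 1, 1, 1]))
    # [0, 1, 0, 1, 0, 1, 0, 1]: even an all-ones input produces a
    # transition every bit period, so the clocks cannot drift apart unnoticed.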
Another cause is "losing count", as on a hard drive: if a hard drive encounters a long string of 0s without any 1s (or a long string of 1s without any 0s), it may lose track of the framing between fields and suffer bit slip. Long runs without a transition are therefore prevented by techniques such as run-length limited (RLL) codes.
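RLL codes map data blocks to codewords chosen so that runs of identical symbols stay within fixed bounds. As a simpler illustration of the same run-limiting idea, the sketch below uses bit stuffing: after a chosen number of identical bits (five here, an arbitrary limit for the demo), the opposite bit is inserted to force a transition; a matching decoder would remove the stuffed bits.

    MAX_RUN = 5  # assumed run limit for this demo

    def stuff(bits):
        """Insert the opposite bit after MAX_RUN identical bits, so the
        channel never carries a run long enough to cause loss of count."""
        out, prev, run = [], None, 0
        for b in bits:
            out.append(b)
            run = run + 1 if b == prev else 1
            prev = b
            if run == MAX_RUN:
                out.append(1 - b)      # forced transition
                prev, run = 1 - b, 1
        return out

    print(stuff([0] * 12))
    # [0,0,0,0,0,1,0,0,0,0,0,1,0,0]: no run exceeds five bits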