Synchronous vs. Asynchronous
Synchronous data transfer: sender and receiver use the same clock signal
- supports a high data transfer rate
- needs a clock signal line between the sender and the receiver
- requires a master/slave configuration
Asynchronous data transfer: sender provides a synchronization signal to the receiver before starting the transfer of each message
- does not need a clock signal line between the sender and the receiver
- has a slower data transfer rate
There are many serial data transfer protocols, and they can be grouped into two types: synchronous and asynchronous. In synchronous data transfer, both the sender and the receiver sample the data according to the same clock, so a dedicated line for the clock signal is required. The master (typically the sender) provides this clock signal to all the receivers.
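The shared-clock idea can be illustrated with a small simulation. This is a minimal sketch, not a real device driver: the function names and the list-of-line-states representation are assumptions made for illustration. The master drives both a clock line and a data line, and the receiver samples the data line on each rising clock edge, so no speed agreement is needed in advance.

```python
def master_send(byte):
    """Yield (clock, data) line states for one byte, MSB first."""
    for i in range(7, -1, -1):
        bit = (byte >> i) & 1
        yield (0, bit)  # set the data line while the clock is low
        yield (1, bit)  # rising edge: the receiver samples here

def receiver(line_states):
    """Sample the data line on every rising clock edge."""
    bits, prev_clock = [], 0
    for clock, data in line_states:
        if prev_clock == 0 and clock == 1:  # detect rising edge
            bits.append(data)
        prev_clock = clock
    value = 0
    for bit in bits:
        value = (value << 1) | bit  # reassemble MSB first
    return value
```

Because the receiver reacts to clock edges rather than timing itself, the master can even vary its clock rate mid-transfer without causing errors, which is why synchronous schemes can run fast.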
For asynchronous data transfer, there is no common clock signal between the sender and the receivers. The sender and the receiver must therefore first agree on a data transfer speed, which usually does not change after the transfer starts. Each side then sets up its own internal circuit to sample the data at that agreed rate. However, just as some watches run faster than others, computer clocks also differ in accuracy. Although the difference is very small, it accumulates quickly and would eventually cause errors in the data transfer. This problem is solved by adding synchronization bits at the front, middle, or end of the data. Since this synchronization is done periodically, the receiver can correct the accumulated clock error before it grows large enough to corrupt a bit. The synchronization information may be added to every byte of data or to every frame of data. Sending these extra synchronization bits can account for up to 50% of the transmitted bits, which lowers the effective data transfer rate.
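The per-byte synchronization described above can be sketched with a UART-style 8N1 frame: one start bit, eight data bits sent LSB first, and one stop bit. The frame layout is standard for UARTs, but the function names here are illustrative assumptions. The falling edge of each start bit is what lets the receiver restart its sampling clock, so drift can only accumulate over one frame, not the whole transfer.

```python
START, STOP = 0, 1  # the line idles high; a 0 marks the start of a frame

def frame(byte):
    """Build the 10-bit 8N1 frame for one data byte."""
    bits = [START]
    bits += [(byte >> i) & 1 for i in range(8)]  # data bits, LSB first
    bits.append(STOP)
    return bits

def deframe(bits):
    """Recover the data byte from a well-formed 8N1 frame."""
    if bits[0] != START or bits[9] != STOP:
        raise ValueError("framing error")
    value = 0
    for i in range(8):
        value |= bits[1 + i] << i  # reassemble LSB first
    return value
```

In this example the overhead is 2 synchronization bits per 10 transmitted (20%); self-clocking encodings such as Manchester, which embed a transition in every bit, are where the overhead approaches the 50% figure mentioned above.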