
In computer networks, we keep trying to increase the speed at which data is transmitted. Since data on the wire is ultimately just electrical signals, how are these signals converted into bits so quickly? Is this conversion done by an ADC/DAC? We can't control the computation speed of an ADC, so how can the electrical signals be translated into bits so quickly? Next, is this ADC integrated into the computer's chipset? Does that mean every peripheral has an ADC, for example, does a NIC have one? Is the information carried on a LAN cable such as CAT 5 or CAT 6 analog in nature?

1 Answer


You clock bits in by detecting a rising edge on a signal wired to one of the pins of your chip. The rise lasts for a certain period of time, but only a fraction of a millisecond. There's a bit of tolerance, so sender and receiver don't have to be exactly synchronised. The chip then transfers the bit into a buffer in very low-level code. When it has a byte, slightly higher-level code transfers the byte to another buffer, and the next level up is user level: we have a stream of input bytes.
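
As a rough illustration of that low-level step, here is a minimal C sketch. `wait_for_rising_edge` and `read_pin` are hypothetical stand-ins for whatever the chip or driver actually exposes, not a real NIC API:

```c
#include <stdint.h>

extern void wait_for_rising_edge(void); /* hypothetical: blocks until the clock line rises */
extern int  read_pin(void);             /* hypothetical: samples the data line, returns 0 or 1 */

/* Assemble one byte, most significant bit first. */
static uint8_t clock_in_byte(void)
{
    uint8_t byte = 0;
    for (int i = 0; i < 8; i++) {
        wait_for_rising_edge();         /* the bit is valid for a short window after the edge */
        byte = (uint8_t)((byte << 1) | (read_pin() & 1));
    }
    return byte;                        /* higher-level code copies this into its own buffer */
}
```

Real hardware does this in dedicated logic rather than in a software loop, but the shape of the operation is the same: sample on a clock edge, shift into a register, hand a complete byte up to the next layer.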

Whilst the wire is of course analogue, that is not analogue-to-digital conversion. Analogue-to-digital conversion is where we measure the signal, quantise it, and then create a binary representation in place-value notation.
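
To make the contrast concrete, here is a minimal sketch of what an ADC effectively does: measure a voltage and map it to a binary code. The 3.3 V reference and the sampled value are assumptions for illustration only:

```c
#include <stdio.h>
#include <stdint.h>

/* Quantise a measured voltage to an 8-bit code (256 levels)
 * in binary place-value notation. v_ref is the full-scale reference. */
static uint8_t quantise(double volts, double v_ref)
{
    if (volts < 0.0)   volts = 0.0;
    if (volts > v_ref) volts = v_ref;
    return (uint8_t)(volts / v_ref * 255.0 + 0.5);
}

int main(void)
{
    double sample = 1.62;                    /* measured voltage, assumed */
    uint8_t code  = quantise(sample, 3.3);   /* 3.3 V reference, assumed */
    printf("%.2f V -> code %u (0x%02X)\n", sample, code, code);
    return 0;
}
```

Clocking in a digital line only has to decide "high or low" at each edge; an ADC has to assign the signal to one of many levels, which is a different and slower job.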

Malcolm McLean