I understand that most FEC algorithms, such as Reed-Solomon coding, were designed specifically to fix bit flips in data streams. Also, if you know the position where an erasure or insertion has occurred, RS can fix those streams too. My question is: how do people practically fix very noisy data streams where bit/byte drops may occur at unknown positions? Are there specific algorithms for this, such as a modified RS code?
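To make the erasure case concrete, here is roughly what I mean, sketched with the third-party Python `reedsolo` package. This is just an illustration of the concept, not code from our system; any RS library with erasure support would look similar.

```python
from reedsolo import RSCodec

rsc = RSCodec(10)                       # 10 parity symbols per codeword
message = b"multi-kilobyte response chunk"
codeword = rsc.encode(message)

# Corrupt two bytes, but pretend we know where they are (erasures).
damaged = bytearray(codeword)
damaged[0] = 0x00
damaged[5] = 0x00

# With the positions supplied, all of the parity can be spent on filling in
# the erased symbols (up to 10 erasures vs. 5 errors at unknown positions).
recovered = rsc.decode(damaged, erase_pos=[0, 5])
print(recovered)    # recent reedsolo versions return (msg, msg+ecc, errata_pos)
```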

We have a packetized data stream over a very long (thousands of feet) multi-drop RS-485 network. We send a request to a node and it responds with a multi-kilobyte reply. Data can be randomly dropped or inserted due to capacitive effects we're seeing on the line, caused by impedance mismatches, the long cable run, and the tristate behavior of the node transceivers. We were supposed to place strong pull-up/pull-down resistors along the entire cable length; this was an oversight. RS-485 networks can be extremely complicated at long cable lengths. We're wondering whether we can somehow fix this effect in software using some error correction algorithm rather than having to respin the hardware (which would be extremely expensive and affect the schedule).
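To show the kind of software-only workaround I have in mind: chop each response into small frames with a sync byte, sequence number, length and CRC, so a dropped or inserted byte only spoils one frame and the receiver can resynchronize on the next sync byte and re-request (or erasure-decode) just that chunk. The frame layout, sync value and field sizes below are made up for illustration, not something we ship.

```python
import zlib

SYNC = 0xA5
HEADER = 4                      # sync (1) + seq (1) + length (2)

def build_frame(seq: int, payload: bytes) -> bytes:
    header = bytes([SYNC, seq & 0xFF]) + len(payload).to_bytes(2, "big")
    crc = zlib.crc32(header + payload).to_bytes(4, "big")
    return header + payload + crc

def scan_frames(buf: bytes):
    """Yield (seq, payload) for every frame whose CRC checks out, sliding
    forward to the next SYNC byte whenever bytes were dropped or inserted."""
    i = 0
    while i + HEADER + 4 <= len(buf):
        if buf[i] != SYNC:
            i += 1
            continue
        length = int.from_bytes(buf[i + 2:i + 4], "big")
        end = i + HEADER + length + 4
        if end > len(buf):
            i += 1                          # bogus length or truncated frame
            continue
        body, crc = buf[i:end - 4], buf[end - 4:end]
        if zlib.crc32(body).to_bytes(4, "big") == crc:
            yield buf[i + 1], body[HEADER:]  # (seq, payload)
            i = end
        else:
            i += 1                          # corrupted frame: resynchronize
    # Gaps in the yielded sequence numbers identify the chunks to re-request,
    # or to treat as erasures if parity frames are also transmitted.
```

The idea would be that the master walks the received buffer with scan_frames() and asks the node to resend only the missing sequence numbers, rather than trying to correct a raw byte stream whose alignment has shifted.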

demarr
  • Software always coming to the rescue of Hardware screwups... :-D – guga Jul 09 '18 at 10:10
  • Usually some type of unique "sync" pattern is used to identify the start of each packet of data. This can be used to identify which packets are bad (too short or too long). Using "sync" patterns is common for magnetic media (disk, tape), but I don't know which communication protocols use them. – rcgldr Jul 09 '18 at 21:44

0 Answers