Above is a graph showing the BER (bit error rate) at different Eb/No values for BPSK over an AWGN channel. The pink curve shows the BER of the uncoded system (no channel encoder or decoder), while the black curve shows the BER of the same system using a Hamming(7,4) code for channel encoding. What I can't explain is why the two curves intersect and cross over at 6 dB.
-
Pretty sure this belongs on another Stack Exchange site. Anyway, my thought would be that your data density is different in the two situations. With a Hamming code you transmit data plus overhead, as opposed to just data. At the same transmission rate, you therefore end up with more channel errors per data bit with the overhead than without. Past some point, the encoded version accumulates enough errors that it produces more errors per data bit in any given time frame. – Goblinlord Jan 24 '16 at 08:30
-
About your last sentence — do you mind elaborating? – wolong91 Jan 24 '16 at 15:22
1 Answer
I started writing this in a comment and it started getting long. Either this is correct or it isn't; it makes sense to me, but you may have to do more research beyond this.
Note: I am aware BER is normally measured over seconds, but for our purposes we will look at something smaller.
My first assumption (based on your graph) is that the BER is measured on the decoded data, not on the raw channel signal. Suppose the channel corrupts 1 bit in every 7. The uncoded system then delivers 1 data error per 7 bits, while the Hamming(7,4) system corrects that single error in each 7-bit block and delivers 0 data errors.
Initial:
- Unencoded: 1 error every 7 bits received
- Hamming(7,4): 0 errors every 4 bits (if corrected)
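The second bullet can be checked with a short sketch. Nothing below comes from the question's simulation; it is just a textbook systematic Hamming(7,4) with parity bits assumed at positions 1, 2, and 4:

```python
# Minimal systematic Hamming(7,4) sketch (parity bits at positions 1, 2, 4).
def encode(d):
    """d = [d1, d2, d3, d4] -> 7-bit codeword [p1, p2, d1, p4, d2, d3, d4]."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers codeword positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4   # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def decode(c):
    """Correct at most one flipped bit, then return the 4 data bits."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4   # 1-indexed error position, 0 = clean
    c = c[:]
    if syndrome:
        c[syndrome - 1] ^= 1
    return [c[2], c[4], c[5], c[6]]

# One channel error per 7-bit block is always repaired: 0 data errors.
for i in range(16):
    data = [(i >> k) & 1 for k in range(4)]
    cw = encode(data)
    assert all(decode(cw[:p] + [cw[p] ^ 1] + cw[p + 1:]) == data
               for p in range(7))
print("all 16 x 7 single-bit errors corrected")
```

So at a channel error rate of 1 in 7, the coded link really does deliver clean data where the uncoded link delivers 1 error in 7.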
Now let's increase the noise, thereby increasing the error rate of the entire signal.
Highly increased BER:
- Unencoded: 3.5 errors in 7 bits (50%, averaged over multiple sequences)
- Hamming(7,4): 2 errors in 4 bits (50%)
Somewhere during that increase in BER the curves must cross over, as you are seeing on your graph. Beyond the crossover I would expect the Hamming side to be worse, because there is less data per error (lower actual data density). I am sure you could calculate this mathematically; unfortunately, it would take me more time to look into that than I care to spend, as it just intuitively makes sense to me.
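For what it's worth, the crossover can be estimated with the standard hard-decision textbook model: the BPSK Q-function BER, the 10·log10(7/4) rate penalty on per-channel-bit Eb/No, and the usual approximation that blocks with two or more channel errors decode wrongly. This is a sketch of that model, not the questioner's simulation, and the exact crossover it predicts depends on those assumptions, so it need not land at exactly 6 dB:

```python
import math

def Q(x):
    """Gaussian tail probability."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def uncoded_ber(ebno_db):
    g = 10 ** (ebno_db / 10)
    return Q(math.sqrt(2 * g))

def hamming74_ber(ebno_db):
    # Each channel bit carries 4/7 of an information bit, so the
    # per-channel-bit SNR drops by 10*log10(7/4) ~ 2.4 dB.
    g = 10 ** (ebno_db / 10) * (4 / 7)
    p = Q(math.sqrt(2 * g))
    # Hard-decision approximation for a t=1 code: blocks with 2+
    # channel errors decode incorrectly.
    n = 7
    return sum(j * math.comb(n, j) * p**j * (1 - p)**(n - j)
               for j in range(2, n + 1)) / n

for db in range(0, 9):
    print(f"{db} dB  uncoded {uncoded_ber(db):.3e}  coded {hamming74_ber(db):.3e}")
```

Scanning the printed values shows the coded curve above the uncoded one at low Eb/No and below it at high Eb/No; under these assumptions the crossing sits in the low single-digit dB range, and the exact position in a real simulation shifts with the decoder and channel modeling.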