
I am new to audio programming, but I am wondering about the formula for bit rate.

According to Wikipedia (https://en.wikipedia.org/wiki/Bit_rate#Audio),

bit rate = sample rate × bit depth × channels

and

  • sample rate is the number of samples (or snapshots taken) per second obtained by a digital audio device.
  • bit depth is the number of bits of information in each sample.

So why does bit rate = sample rate × bit depth × channels?

From my perspective, if bit depth = 2 bits and sample rate = 3 Hz, then I can transfer 6 bits of data in 1 second.

For example:

Sample data = 00 //at 1/3 second.  
Sample data = 01 //at 2/3 second.  
Sample data = 10 //at 3/3 second. 

So I transfer 000110 in 1 second. Is that logic correct?
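The arithmetic above can be checked with a quick sketch (using the hypothetical values from the example: a single mono channel at 3 Hz with 2-bit samples):

```python
# Bits transferred per second for a mono stream,
# using the example values from the question above.
sample_rate = 3   # samples per second (Hz)
bit_depth = 2     # bits per sample
channels = 1      # mono

bit_rate = sample_rate * bit_depth * channels
print(bit_rate)   # 6 bits per second
```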

crazyeyes

2 Answers


Bit rate is the expected number of bits per interval (e.g. per second).

Audio is sampled at a rate measured in hertz, where 1 hertz = 1 sample per second. So to get the full data that represents 1 second of audio, you calculate how many bits need to be sent (or, for media players, they check the bit rate in the file format's header so they can read and play back correctly).

Why are channels involved (isn't sample rate × bit depth enough)?

In digital audio, samples are sent for each "ear" (the left/right channels). A stereo sound always carries twice as many samples as the same sound in mono. Usually there is a "flag" that specifies whether the sound is stereo or mono.

Logic example (ignoring bit depth, and assuming 1 bit per sample):

Speech saying "Hello" is recorded at 200 samples/sec, but played at a rate of 100 samples/sec. What happens?

  • If the stereo flag is set, each ear gets 100 samples per second (the correct total of 200 samples is played).
  • If mono, the speech will sound half-speed: only 100 of the 200 recorded samples are played in the first second, so you get half of "hello" in one second and the other half in the next (i.e. slowed speech).
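The two bullets above can be sketched as a playback-duration calculation (numbers taken from the example: 200 recorded samples, and a player consuming 100 samples per second per channel):

```python
# How long 200 recorded samples take to play back, depending on
# whether the player treats the stream as stereo or mono.
recorded_samples = 200
samples_per_sec_per_channel = 100

# Stereo: the player consumes 100 samples/sec for each of 2 channels.
stereo_duration = recorded_samples / (samples_per_sec_per_channel * 2)  # 1.0 s (correct)

# Mono: only 100 samples consumed per second, so playback takes twice as long.
mono_duration = recorded_samples / (samples_per_sec_per_channel * 1)    # 2.0 s (half-speed)
```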

Taking the above example, you will run into these slow/double-speed audio adventures in your "new to audio programming" experience. The fix is to set either the channel count or the bit rate correctly. Good luck.

VC.One

The 'sample rate' is the rate at which each channel is sampled.

So 'sample rate X bit depth' will give you the bit rate for a single channel.

You then need to multiply that by the number of channels to get the total bit rate flowing through the system.

For example, the CD standard has a sample rate of 44,100 samples per second and a bit depth of 16, giving a bit rate of 705,600 bits per second per channel and a total bit rate of 1,411,200 bits per second for stereo.
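The CD numbers above follow directly from the formula; a minimal sketch:

```python
# CD audio: 44100 samples/sec, 16 bits/sample, 2 channels (stereo).
sample_rate = 44_100   # samples per second, per channel
bit_depth = 16         # bits per sample
channels = 2           # stereo

per_channel_bit_rate = sample_rate * bit_depth   # 705600 bits/sec
total_bit_rate = per_channel_bit_rate * channels # 1411200 bits/sec

print(per_channel_bit_rate, total_bit_rate)
```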

greg-449