Bit depth calculation doesn't seem to add up

I was looking to verify my understanding of audio file representations, so I decided to take an audio file at random and try to deduce its bit depth. According to MediaInfo, the file in question has a sample rate of 44.1 kHz, 1 channel, and a bit rate of 160 kb/s:

[MediaInfo screenshot of the file's audio properties]

I had always thought that (sample rate) * (bit depth) * (# of channels) = (bit rate), which makes sense.

However, when I plug in my "known" values and solve for bit depth, I get a non-integer answer, which makes no sense.
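To make the mismatch concrete, here is a small sketch of that calculation in Python, using the figures MediaInfo reports above:

```python
# Solving the assumed formula  bit_rate = sample_rate * bit_depth * channels  for bit_depth
bit_rate = 160_000     # 160 kb/s, as reported by MediaInfo
sample_rate = 44_100   # 44.1 kHz
channels = 1

bit_depth = bit_rate / (sample_rate * channels)
print(bit_depth)  # ~3.63 -- not an integer, and not a plausible bit depth
```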

How can this weird behavior be explained? Is my equation wrong? Is there some quirk I'm not taking into consideration?

auradun

Posted 2020-04-01T22:13:22.497

Reputation: 3

Answers

It's an MP3 file, so you can't deduce the bit depth from that bit rate; about the only safe bet is that it will most likely be decoded to 16-bit PCM. MP3 is a lossy compression format: "lossy" means the encoder discards data to hit a target bit rate, so the bit rate no longer equals sample rate × bit depth × channels. Try the same exercise with a WAV file and you will have better luck.
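For a rough sense of the scale involved, here is a sketch of the arithmetic assuming the source was 16-bit mono PCM at 44.1 kHz (an assumption, since the MP3 itself doesn't record a bit depth):

```python
# Uncompressed PCM bit rate for 16-bit, mono, 44.1 kHz audio
pcm_rate = 44_100 * 16 * 1   # = 705,600 b/s, i.e. 705.6 kb/s
mp3_rate = 160_000           # the file's reported 160 kb/s

print(pcm_rate / mp3_rate)   # ~4.4, i.e. the encoder squeezed the data to roughly a quarter of its size
```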

Also, with a WAV file the bit depth is stored right in the header; the file just tells you outright.
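For example, a quick sketch using Python's standard wave module (the file path is hypothetical; point it at any PCM WAV file):

```python
import wave

path = "example.wav"  # hypothetical path; replace with an actual PCM WAV file

with wave.open(path, "rb") as w:
    sample_rate = w.getframerate()    # samples per second
    channels = w.getnchannels()       # 1 = mono, 2 = stereo
    bit_depth = w.getsampwidth() * 8  # sample width is stored in bytes

print(f"{sample_rate} Hz, {channels} channel(s), {bit_depth}-bit")
print(f"PCM bit rate: {sample_rate * channels * bit_depth / 1000:.1f} kb/s")
```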

Mark

Posted 2020-04-01T22:13:22.497

Reputation: 7 535