Why do bitrates increase in powers of 10?


Bitrates supported by switches, routers and network cards are often 10 Mbit/s, 100 Mbit/s or 1 Gbit/s. Newer developments promise up to 10 Gbit/s. I do not see why powers of 10 should carry any particular meaning here, unlike the powers of 2 that govern memory sizes.

Why do bitrates increase in that pattern?

Raimund Krämer

Posted 2016-10-25T14:17:29.813

Reputation: 108

Did any answer help you? If so, you should accept the answer so that the question doesn't keep popping up forever, looking for an answer. Alternatively, you could provide and accept your own answer. – Ron Maupin – 2017-08-15T04:52:36.353



You are mistaken; bitrates don't always increase by a factor of 10.

Memory storage increases by powers of two because it is addressed with binary (base 2) addresses.

Increasing Ethernet speeds by a power of 10 has been a convenient goal, but there is now 40 Gb Ethernet and 100 Gb Ethernet. Also, Wi-Fi speeds have been nothing like a uniform tenfold increase: 802.11 was 2 Mb/s, 802.11a was 54 Mb/s, 802.11b was 11 Mb/s, 802.11g was 54 Mb/s, 802.11n was 150 Mb/s per stream, etc.
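The Wi-Fi figures above make the point numerically. A quick sketch (using the per-stream rates as quoted, in chronological order) shows the successive speed-ups are nowhere near a constant 10×:

```python
# Per-stream data rates (Mb/s) of successive Wi-Fi standards,
# taken from the figures quoted above, in chronological order.
wifi_rates = [
    ("802.11", 2),
    ("802.11b", 11),
    ("802.11a/g", 54),
    ("802.11n", 150),  # per spatial stream
]

rates = [r for _, r in wifi_rates]
# Ratio of each generation's rate to the previous one.
ratios = [round(b / a, 2) for a, b in zip(rates, rates[1:])]
print(ratios)  # 5.5x, ~4.9x, ~2.8x -- not a constant factor of 10
```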

Ron Maupin

Posted 2016-10-25T14:17:29.813

Reputation: 60 371


This may look that way, but in reality there is more to it. For instance, have a look at the SFP page on Wikipedia. There you will see speeds of 1 Gb/s, 2.5 Gb/s, 10 Gb/s and 25 Gb/s listed, and the next steps will be 40 Gb/s, 100 Gb/s and even 400 Gb/s.

Jaap Keuter

Posted 2016-10-25T14:17:29.813

Reputation: 523