KCK wrote:Physicists don't care. But ask a computer scientist what 100 Mb/s and 10 Gb/s mean for Ethernet!
That's because an electrical engineer designed Ethernet, not a computer scientist.
KCK wrote:Actually it seems that 1 M = 10^6 throughout all ANSI and IEEE standards. Some companies (e.g., IBM) specify what 1 MB means in their documentation for RAM sizes, HDD capacities, and network transfers.
No surprise there, since the IEEE standardized the kibi/mebi prefixes in the first place. Moreover, most standards don't talk about file sizes or memory sizes, so the issue never comes up. They talk about things like MHz, which has always been accepted to mean 1,000,000 Hz. And I would hope that any standard is quite explicit in its definition and usage of terms whenever there is any chance of confusion.
On the other hand, so far I have failed to locate even one standard which employs 1 MB = 2^20 B. If you do, please let me know!
I'm not aware of too many standards documents that refer to MBytes at all, regardless of usage, and I'm not going to search through documents looking for an example, but if I ever come across one I will post it.
However, that's not really my point. It may be that there are standards documents that treat 1 MByte as 1,000,000 Bytes, but in common usage, that isn't the case. There are hundreds of millions of desktops out there, and programs running on them display file sizes using the convention that 1 MByte = 1,048,576 Bytes. Of course, those same computers typically transfer data over IDE buses whose data transfer rates are defined using 1 MByte = 1,000,000 Bytes. So, what can I say but reiterate that it is confusing? I agree that it would be best if everything were expressed unambiguously.
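To make the ambiguity concrete, here is a small sketch (my own illustration, not from the thread, with hypothetical names) showing how the very same byte count reads differently under the two conventions being argued about:

```python
# Illustration: one byte count, two "megabyte" conventions.
# Hypothetical helper names; the constants are the two definitions
# discussed above.

DECIMAL_MB = 10**6   # 1 MB = 1,000,000 B (SI; used for transfer rates)
BINARY_MB = 2**20    # 1 "MB" = 1,048,576 B (common in OS file-size displays)

def size_in_mb(n_bytes: int, binary: bool = False) -> float:
    """Express n_bytes in megabytes under either convention."""
    return n_bytes / (BINARY_MB if binary else DECIMAL_MB)

n = 100 * 10**6  # a nominal "100 MB" file, as a spec sheet would count it
print(f"decimal: {size_in_mb(n):.2f} MB")                 # 100.00
print(f"binary:  {size_in_mb(n, binary=True):.2f} MiB")   # 95.37
```

So a file sold as "100 MB" shows up as roughly 95.37 "MB" in a file manager that divides by 2^20, which is exactly the confusion described above.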