I am guessing there are a lot of Americans in here. I am not trying to be an a**hole to Americans, but I really do feel you guys play fast and loose with standards, units of measure, and pretty much anything else that could be logical and unambiguous. Not to say Americans suck, but in this regard the general population seems to use a pretty messed-up system.
1. I do realize that using pounds, feet, etc. seems natural to you, but do you not see the merit (superiority?) of the SI system? That is, if you were to make an objective choice, which system of units would you choose: SI or Imperial (pounds, feet, etc.)?
2. Now to the standards of notation. We all agree on how to measure time, but we don't seem to agree on how to present the information. AFAIK dates are written in the opposite order in lots of places (here in Norway it is normally written dd-mm-yyyy). Americans like to jumble it up a bit, with mm-dd-yyyy. That is, Americans put neither the most significant nor the least significant value first. There is a well-defined standard for writing dates: yyyy-mm-dd, with time as hh:mm:ss (hh on a 24-hour clock, never AM/PM). This is ISO 8601. Is this system not logical? Both date and time are written with the most significant value first, and there are no letters making the time ambiguous with AM/PM. Is this not reasonable? (See the first sketch after this list.)
3. Then come the annoying binary prefixes. Kilo has never, ever been defined as 1024, mega has never been defined as 1,048,576, and so on. 1 kB is, and has always been, 1000 bytes. I have not found any standard from a major agency stating otherwise (though I may be wrong). Anyhow, nowadays both IEEE and IEC (60027-2) have defined binary prefixes, so that 1 KiB = 1024 bytes, 1 MiB = 1,048,576 bytes, and so on. Do you use this standard? If not, why? (See the second sketch below.)
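To make point 2 concrete, here is a minimal Python sketch (my own illustration; only the format itself comes from the standard) of how ISO 8601 puts the most significant field first:

```python
from datetime import datetime, timezone

# ISO 8601: most significant field first, 24-hour clock, no AM/PM.
now = datetime.now(timezone.utc)
print(now.strftime("%Y-%m-%d"))           # date, e.g. 2024-05-17
print(now.strftime("%H:%M:%S"))           # time, e.g. 14:03:09
print(now.isoformat(timespec="seconds"))  # full timestamp with UTC offset

# A nice side effect: ISO 8601 dates sort chronologically as plain text.
dates = ["2024-01-05", "2023-12-31", "2024-01-04"]
print(sorted(dates))  # ['2023-12-31', '2024-01-04', '2024-01-05']
```

Neither mm-dd-yyyy nor dd-mm-yyyy has that property; sorting them as text scrambles the chronology.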
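And to make point 3 concrete, a quick sketch of the decimal versus binary prefixes (the constants are just the definitions from SI and IEC 60027-2; the variable names are mine):

```python
# SI (decimal) prefixes: powers of 1000.
kB = 10**3   # 1 kB = 1000 bytes
MB = 10**6   # 1 MB = 1,000,000 bytes

# IEC 60027-2 (binary) prefixes: powers of 1024.
KiB = 2**10  # 1 KiB = 1024 bytes
MiB = 2**20  # 1 MiB = 1,048,576 bytes

# The discrepancy grows with every prefix step:
print(f"KiB/kB = {KiB / kB:.6f}")                      # 1.024000
print(f"MiB/MB = {MiB / MB:.6f}")                      # 1.048576
print(f"A '1 TB' drive in TiB: {10**12 / 2**40:.4f}")  # 0.9095
```

This mismatch is exactly why a drive sold as "1 TB" shows up as roughly 931 "GB" in an OS that counts in binary units while keeping the decimal labels.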
I just needed to air this. I am very curious how many people actually use these standard notations, and how many even know that defined standards exist. Please comment, unless I have offended you or something (not my intent).