
Do you follow the standards for time, date notation, etc.?

I've adopted Prefix_YYYY_MM_DD.ext as my file naming convention for one simple reason: file names in that format sort correctly. I've also gone to military (24-hour) time maybe 50% of the time, anywhere I'm at all worried about ambiguity.
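The reason that convention sorts correctly is that zero-padded year-month-day order makes lexicographic order match chronological order. A minimal sketch (the filenames below are made-up examples, not from the thread):

```python
# Zero-padded YYYY_MM_DD filenames sort chronologically under a
# plain string sort, because the most significant field comes first.
iso_names = [
    "report_2003_11_05.txt",
    "report_2003_02_17.txt",
    "report_2002_12_31.txt",
]
assert sorted(iso_names) == [
    "report_2002_12_31.txt",
    "report_2003_02_17.txt",
    "report_2003_11_05.txt",
]

# Compare MM_DD_YYYY: the string sort groups by month and scrambles years.
us_names = [
    "report_11_05_2003.txt",
    "report_02_17_2003.txt",
    "report_12_31_2002.txt",
]
print(sorted(us_names))  # 02_17_2003 sorts before 12_31_2002
```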

As far as SI units go, in my job they make sense for many things and no sense whatsoever for others. I use maps a lot in my work and use UTMs (meter-based) for coordinates, but I use miles and acres for distance and area measurements simply because most of the U.S. is laid out on the Township and Range, Section system, which is based on miles and acres. The 640-acre, one-mile-square section is so entrenched in our system of property division that it isn't ever going to go away.
 
Originally posted by: Savij
You're completely wrong on number 3. You don't know where those figures came from and you shouldn't talk about things you don't know. 1 KB has ALWAYS been 2^10 bytes. There is no "now days" about it. Deal with it.
Can you find any standard from any major standardization organization defining k = 1024? I for one find it absolutely incredible that an engineering field has created such an ambiguous notation as this. It is obvious that there must be standards for these things, and computer engineers have collectively f***ed up on this one. Seems like engineers need scientists to hold their hands or something.

AFAIK 1 kB has always meant 1000 bytes in telecommunications. And if 1 kB has never been formally defined as 1024 bytes, then it should never have been used that way by an engineer. This is the realm of science and engineering, and anarchy of standards is not something that should be tolerated (and no good engineer should contribute to it, especially these days when 1 kB = 1024 bytes is just wrong according to IEEE, IEC, SI, NIST, etc.)
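The distinction the standards bodies settled on is decimal SI prefixes (k = 1000) versus the IEC binary prefixes (Ki = 1024, per IEC 60027-2, also adopted by NIST). A small sketch of why the ambiguity matters in practice:

```python
# Decimal (SI) vs binary (IEC) interpretations of the same byte count.
# Per IEC 60027-2 / NIST: 1 kB = 1000 bytes, 1 KiB = 1024 bytes.

def to_kb(n_bytes: int) -> float:
    """Bytes -> kilobytes (SI decimal: 1 kB = 1000 B)."""
    return n_bytes / 1000

def to_kib(n_bytes: int) -> float:
    """Bytes -> kibibytes (IEC binary: 1 KiB = 1024 B)."""
    return n_bytes / 1024

# A figure quoted as "64 kB" is ambiguous: 65536 bytes in one
# reading, 64000 in the other -- a 2.4% disagreement.
print(to_kb(65536))   # 65.536
print(to_kib(65536))  # 64.0
```

This gap grows with each prefix step (about 4.9% at mega, 7.4% at giga), which is why disk vendors and operating systems historically reported different sizes for the same drive.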
 