I was just wondering about this, as I recently came across a 'feature' that MS removed in Windows 7 - but which had previously been present in Vista and W2k8.
Not really knowing what it did, I investigated - and after finding dozens of conflicting claims on various overclocking and enthusiast forums, I eventually found the answer in an MS TechNet article. It was a bug, introduced in Windows 3.11 but fixed in W95 - where the correct behavior caused such chaos with badly written software that the bug had to be reinstated (thankfully, as an option).
This is the mysterious 'enable advanced performance' option on the 'Policies' tab for hard drives in Windows Device Manager (underneath the option to enable write caching).
Just in case anyone is wondering what this option is all about, here's a brief description. W3.11 brought a revolutionary performance boost thanks to its 32-bit file handling code (yeah, right!). However, an unfortunate bug meant that the command to 'flush' disk caches (i.e. guarantee that all data waiting in disk caches has been safely written to disk) didn't work. The app would be told that the flush was complete when, in fact, it had never even been initiated. As a result, a lot of developers were blown away by the sheer speed of W3.11's file handling. Lots of business software, where data integrity was critical, started using a ridiculous number of 'flush' commands to ensure that its data was safe.
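For anyone wondering what a 'flush' actually looks like from an application's point of view, here's a minimal sketch in Win32 C, using FlushFileBuffers (the modern equivalent of the call in question - the file name and record below are just made up for illustration). The point is that with 'advanced performance' enabled, the flush call can return success even though the data may still only be sitting in a volatile cache.

```c
/* Minimal sketch of the pattern described above: write a critical
 * record, then flush the OS write cache for that file before
 * reporting the transaction as safe. File name and record contents
 * are hypothetical. Error handling is trimmed for brevity. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    const char record[] = "payroll record 42\r\n";   /* hypothetical data */
    DWORD written = 0;

    HANDLE h = CreateFileA("ledger.dat", GENERIC_WRITE, 0, NULL,
                           OPEN_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    SetFilePointer(h, 0, NULL, FILE_END);             /* append */
    WriteFile(h, record, (DWORD)(sizeof(record) - 1), &written, NULL);

    /* The call in question: ask the OS to push its buffers for this
     * file down to the disk. With the broken/'advanced performance'
     * behavior, this can report success without the data actually
     * having reached stable storage. */
    if (!FlushFileBuffers(h)) {
        fprintf(stderr, "FlushFileBuffers failed: %lu\n", GetLastError());
    }

    CloseHandle(h);
    return 0;
}
```

Software that cares about integrity does this after every transaction, which is exactly why the honest (slow) behavior made so many W3.11-era apps crawl.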
W95 fixed this bug. But there was a problem - lots of critical business software (payroll, accounts, stock management, databases, etc.) had been designed around W3.11's broken flush handling, and the excessive number of disk flushes made those machines so slow as to be unusable. MS started getting a lot of bad press about how badly W95 performed. The result was an update providing an option to 'enable advanced performance' - but in fact it just disabled the flush function, so that it acted like W3.11.
So, incredibly, we've had quite a major OS bug (i.e. one that can cause data loss) deliberately preserved for 14 years - simply because there was so much bad-mouthing of MS when they first fixed it!
The real question is: should MS have preserved this bug for so long? There are thousands of web pages discussing the benefits of this 'advanced performance', but because of its obscure name, virtually none of them explains what 'advanced performance' actually does.
And if 14 years is too long, what sort of time period would be more appropriate?
Cliffs:
Windows 3.11 has a broken file buffer 'flush' command
W95 fixes this serious data corruption bug
Badly written software depends on bug for acceptable performance
Bug restored to W95
Bug still present in Vista and Win 2k8 server
Bug removed in Windows 7
