Hmm... interesting question. I don't pretend to know the technical details here; perhaps someone could elaborate for me?
My impression was that a monitor has an upper limit on its pixel clock (in pixels per second) that you don't want to exceed. You can estimate the required clock rate by simply multiplying the resolution by the refresh rate. So if your monitor can do 1600x1200@75Hz, the screen has 1600x1200 = 1,920,000 pixels, refreshed 75 times per second, for a total of 1,920,000 x 75 = 144,000,000 pixels per second.

But 1152 x 864 x 100 = 99,532,800 pixels per second, well under the pixel/sec rate at maximum resolution. That would seem to imply that 1152x864@100Hz is fine for the monitor. Now, presumably, the manufacturer has a good reason to recommend 85Hz for that resolution. Is there another reason to run at a lower pixel/sec rate, or is the manufacturer just hoping people will settle for 85Hz, putting less stress on the hardware (?) and causing fewer returns?
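For what it's worth, here's that naive arithmetic as a quick Python sketch (pixel_rate is just an illustrative helper name; note that real pixel clocks also cover the blanking intervals between lines and frames, which this visible-pixel estimate ignores):

    # Naive pixel-rate check for two display modes, per the reasoning above.
    # Caveat: actual pixel clocks include horizontal/vertical blanking time,
    # so real clock rates run somewhat higher than this estimate.

    def pixel_rate(width, height, refresh_hz):
        """Approximate visible pixels drawn per second for a given mode."""
        return width * height * refresh_hz

    max_mode = pixel_rate(1600, 1200, 75)    # 144,000,000 px/s
    proposed = pixel_rate(1152, 864, 100)    #  99,532,800 px/s

    print(f"1600x1200@75Hz: {max_mode:,} px/s")
    print(f"1152x864@100Hz: {proposed:,} px/s")
    print("Proposed mode under the naive limit?", proposed <= max_mode)

Running that confirms the numbers above, so by this reasoning the 100Hz mode should be comfortably within spec.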