This is what I remember about the 1280 x 1024 standard.
It all started with the Ramdac. The Ramdac is the chip, often embedded in the graphics chip, that has driven the resolution standards: it converts the parallel digital pixel data from the memory chips into the serial analog video signal.
Note: Ever notice that all resolutions are divisible by 8? Since computers and memory chips use hexadecimal (base 16) logic, the resolutions must be divisible by 8. Also, most CRTs have a 4:3 aspect ratio, and that was a factor when the resolutions were calculated.
History:
Ramdacs are very difficult chips to design, and as video speeds increased over the years the complexity also increased. The chipmakers designed the Ramdacs to meet certain resolutions. One of the first standards I remember was 135MHz. OK, I'm dating myself. As chip technology improved, frequencies progressed to 170MHz, 220MHz, 250MHz, and today we can find Ramdacs that run from 320MHz up to 360MHz. These frequencies go by many names: pixel clock, dot clock, video rate, and video bandwidth.
The chipmakers realized they could mass-produce 135MHz chips economically and reliably with good yields. Then the marketing guys got involved. They asked: what is the maximum resolution a 135MHz Ramdac can produce at a reasonable refresh rate? The answer was 1280 x 1024 at 75Hz, great in those days. The next step was 170MHz. To get 85Hz at 1280 x 1024 you need a 157.5MHz pixel clock, so 170MHz fit the bill nicely. All of these resolution and refresh rate standards can be traced back to the designers of the Ramdacs on the video controller boards. How do I know? I used to design video cards.
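The arithmetic behind those numbers is simple: the pixel clock has to cover the blanking intervals as well as the visible pixels, so it works out to total pixels per frame times the refresh rate. Here is a rough sketch in Python; the total-pixel figures are an assumption on my part, taken from the published VESA DMT blanking totals for 1280 x 1024, so treat the exact values as illustrative.

    # Sketch of the pixel-clock arithmetic described above.
    # Blanking totals assumed from the published VESA DMT timings for 1280 x 1024.

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        """Pixel clock = total pixels per frame (active + blanking) x refresh rate."""
        return h_total * v_total * refresh_hz / 1e6

    # 1280 x 1024 @ 75 Hz: 1688 x 1066 total pixels including blanking
    print(pixel_clock_mhz(1688, 1066, 75))   # ~135.0 MHz -> fits a 135MHz Ramdac
    # 1280 x 1024 @ 85 Hz: 1728 x 1072 total pixels including blanking
    print(pixel_clock_mhz(1728, 1072, 85))   # ~157.5 MHz -> needs a 170MHz Ramdac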
You can also contact the VESA committee and ask them:
http://www.VESA.org
GoLeafs