I once had a then "state of the art" Voodoo2, and I know the Intel Pentium II 333 paired with it could never have produced the same image quality at the same performance on its own.
However, if I'm not mistaken, everything was once done in software, PCs were still orders of magnitude ahead of consoles, and at least one of the first 3D accelerators was derided as a "decelerator".
I mean, 256-color VGA graphics looked a lot better than a Super NES in every way, and the Voodoo2 was, as 3dfx would put it later on, "so powerful it's really kind of ridiculous". That held up: Forsaken on a Voodoo2 (8MB) would run at well over 100 fps at 640x480 and still look worlds better than the N64 version. And Doom was just as revolutionary in 1993 as Quake III, which required a hardware accelerator, was in 1999.
I don't necessarily think the CPU makers dropped the ball, but how the hell did we come to need hardware acceleration? Was it that performance in excess of 100 fps was necessary, or that the games would have been a slide show on even the highest-end CPUs, or was it something else?
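Just to put some very rough numbers on the "slide show" question, here's a back-of-envelope sketch in C. Every figure in it (target frame rate, overdraw factor, operations per pixel) is my own assumption for illustration, not anything measured:

```c
#include <stdio.h>

/* Rough back-of-envelope: how many per-pixel operations a software
 * renderer would have to do per second at "Voodoo2-class" settings.
 * All numbers here are illustrative assumptions, not measured data. */
int main(void)
{
    double width = 640.0, height = 480.0;
    double fps = 60.0;           /* target frame rate                     */
    double overdraw = 2.5;       /* assumed average overdraw per frame    */
    double ops_per_pixel = 30.0; /* assumed: z-test, perspective divide,
                                    bilinear texture fetch, lighting,
                                    blend, write-back                     */

    double pixels_per_sec = width * height * fps * overdraw;
    double ops_per_sec    = pixels_per_sec * ops_per_pixel;

    printf("pixels/sec: %.0f\n", pixels_per_sec); /* ~46 million  */
    printf("ops/sec:    %.0f\n", ops_per_sec);    /* ~1.4 billion */
    return 0;
}
```

With those guesses you land around 1.4 billion per-pixel operations a second, while a Pentium II 333 retires on the order of a few hundred million instructions a second in practice. That's why perspective-correct, filtered 640x480 rendering was out of reach in software while a Voodoo2 did it in dedicated silicon.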
Do you think Intel will ever get so far ahead in CPU development that we'll one day return to doing everything in software? In other words, could CPU development disproportionately outpace GPU development until GPUs simply die off? Intel definitely has a process advantage, so I'm not sure that's impossible. I also believe some programmers see no real advantage to fixed-function hardware over general-purpose computation. After all, there's no longer any hardware audio acceleration, and EAX 5.0-quality processing uses a negligible amount of CPU time if I'm not mistaken. It's just that there hasn't been much will for games to have revolutionary audio (lossless music, better effects, and more of them) for quite some time, which is why we don't see it.
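For what it's worth, here's why per-sample audio work is basically free on a modern CPU. This is a toy sketch of the kind of inner loop an environmental effect boils down to (mixing plus a filter per sample); it's purely illustrative and has nothing to do with Creative's actual EAX implementation:

```c
#include <stddef.h>

/* Toy software effects pass: mix n_voices input streams and run a
 * one-pole low-pass filter over the result, sample by sample.
 * cutoff is a smoothing coefficient in (0, 1].
 * Purely illustrative; not any real EAX algorithm. */
void mix_and_filter(const float *const *voices, size_t n_voices,
                    float *out, size_t n_samples, float cutoff)
{
    float state = 0.0f;
    for (size_t i = 0; i < n_samples; ++i) {
        float s = 0.0f;
        for (size_t v = 0; v < n_voices; ++v)
            s += voices[v][i];             /* mix all voices       */
        state += cutoff * (s - state);     /* one-pole low-pass    */
        out[i] = state;
    }
}
```

At 48 kHz stereo that's roughly 96,000 samples a second, so even a few hundred operations per sample only amounts to tens of millions of operations, a rounding error next to the billions a current core executes every second.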
I know this is a silly thread, but I think it's not a bad discussion.