
And we thought NV30 needed a lot of power....

robg1701

Senior member
http://tech-report.com/onearticle.x/4889

:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q
:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q
:Q:Q:Q:Q:Q:Q:Q😀😀😀😀😀:Q:Q:Q:Q:Q:Q:Q
:Q:Q:Q:Q:Q😀😀😀😀😀😀😀😀😀:Q:Q:Q:Q:Q
:Q:Q:Q:Q😀😀😀😀😀😀😀😀😀😀😀:Q:Q:Q:Q
:Q:Q:Q😀😀😀:Q😀😀😀😀😀:Q😀😀😀:Q:Q:Q
:Q:Q:Q😀😀:Q🙁:Q😀😀😀:Q🙁:Q😀😀:Q:Q:Q
:Q:Q😀😀😀😀:Q😀😀😀😀😀:Q😀😀😀😀:Q:Q
:Q:Q😀😀😀😀😀😀😀😀😀😀😀😀😀😀😀:Q:Q
:Q:Q😀😀😀😀😀😀😀😀😀😀😀😀😀😀😀:Q:Q
:Q:Q😀😀😀😀😀😀:Q:Q:Q😀😀😀😀😀😀:Q:Q
:Q:Q:Q😀😀😀😀:Q:Q:Q:Q:Q😀😀😀😀:Q:Q:Q
:Q:Q:Q😀😀😀😀:Q:Q:Q:Q:Q😀😀😀😀:Q:Q:Q
:Q:Q:Q:Q😀😀😀:Q:Q:Q:Q:Q😀😀😀:Q:Q:Q:Q
:Q:Q:Q:Q:Q😀😀😀:Q:Q:Q😀😀😀:Q:Q:Q:Q:Q
:Q:Q:Q:Q:Q:Q:Q😀😀😀😀😀:Q:Q:Q:Q:Q:Q:Q
:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q
:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q:Q
 
WTF do we bother developing standards for? If everyone is going to ignore them, why waste time and money developing them in the first place? IMHO graphics chip/card manufacturers should be forced to develop within the 50W that AGP Pro50 slots provide (or better yet, within the 23-27W that standard AGP slots provide).
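The slot budgets mentioned above are just volts times amps summed over the connector's supply rails. A minimal sketch of that arithmetic, using illustrative rail currents rather than figures taken from the actual AGP spec:

```python
# Back-of-envelope: a slot's power budget is the sum of V * A across
# its supply rails. The rail currents below are illustrative
# assumptions, NOT quoted from the AGP specification.
rails = {
    "3.3V": (3.3, 6.0),   # (volts, assumed max amps)
    "5V":   (5.0, 2.0),
    "12V":  (12.0, 1.0),
}

budget_w = sum(volts * amps for volts, amps in rails.values())
print(f"slot budget: {budget_w:.1f} W")  # 41.8 W with these assumed currents
```

Whether a card stays "in spec" then reduces to whether its draw on each rail stays under that rail's current limit, not just whether the total looks reasonable.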
nVidia is reeeeaaly going down
ATI is just as bad.

Thorin
 
WTF do we bother developing standards for?
Technology advancements can't be limited to standards set at a given point in time. People want increased performance, and this is the price. It's the same reason we have 30 different CPU interfaces and hundreds of different chipsets: enabling new technology requires new technology. Of course, some technology advances slower than others and manages to stay within set specifications and standards, but surely you don't think the AGP spec is perfect, do you? What's that old saying? Technology breeds invention, or something to that effect?

Chiz

 
Lonyo, yes i did, but that was a looooooooong time ago and i saved it for future use 😛


I should note, I'm not trying to rip into nvidia for this; it's inevitable that they'll need that kind of power with the leaps vid cards are making these days, it's just rather sooner than I thought... I actually half expect NV35 to use LESS power, what with it only being rumoured to have 5 million more transistors than NV30, and I'd hope they redo their layout a bit... ATi seem to be able to keep the R350 toned down enough... so I hope this is NV40, or preferably beyond, kinda talk.
 
but surely you don't think the AGP spec is perfect do you?
Definitely not. But I do think there's a point when companies 'bend' the rules too far. And yes, it is us consumers who are pushing them, but we should also be forcing them to be efficient just as much as we force them to be innovative (if you get what I mean).
What's that old saying? Technology breeds invention or something to that effect?
Yes and the invention should be kept within the bounds of the technology (IMHO).

Thorin
 
People shouldn't be super surprised by the rising power requirements of GPUs. GPUs will soon need more power than CPUs because GPUs can attain much higher parallelism while suffering very little from stalls due to long pipes. They also have more compact instruction sets, which tend to have higher percentage utilization. GPUs will soon be the central cooling issue in PCs - not CPUs.
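The parallelism argument above falls out of the first-order CMOS dynamic-power formula, P ≈ α·C·V²·f: a GPU keeps a much larger fraction of its transistors switching every cycle (higher activity α), so even at a much lower clock its total power can exceed a CPU's. A rough sketch with made-up illustrative numbers (none of these figures describe real NV30/R300-era parts):

```python
# First-order CMOS dynamic power: P = activity * C * Vdd^2 * f.
# All parameter values below are illustrative assumptions only.
def dynamic_power(switched_cap_nf, vdd, freq_mhz, activity):
    """Watts for a given total switched capacitance (nF), supply
    voltage (V), clock (MHz), and average switching activity (0..1)."""
    return activity * switched_cap_nf * 1e-9 * vdd**2 * freq_mhz * 1e6

# A CPU clocks high but keeps few units busy; a GPU clocks low but
# keeps most of its datapath switching in parallel.
cpu_w = dynamic_power(switched_cap_nf=40, vdd=1.5, freq_mhz=3000, activity=0.1)
gpu_w = dynamic_power(switched_cap_nf=60, vdd=1.5, freq_mhz=500, activity=0.5)
print(f"CPU ~{cpu_w:.0f} W, GPU ~{gpu_w:.0f} W")  # GPU higher despite 6x lower clock
```

The point of the sketch is only the shape of the trade-off: activity and switched capacitance multiply in, so wide parallel hardware can out-draw a faster but mostly-idle chip.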
 
Originally posted by: thorin
but surely you don't think the AGP spec is perfect do you?
Definitely not. But I do think there's a point when companies 'bend' the rules too far. And yes, it is us consumers who are pushing them, but we should also be forcing them to be efficient just as much as we force them to be innovative (if you get what I mean).
What's that old saying? Technology breeds invention or something to that effect?
Yes and the invention should be kept within the bounds of the technology (IMHO).

Thorin

Yah, I agree with you; the speed at which standards change is a big-time concern. It's doubly frustrating b/c there seems to be so much marginal improvement out there that requires complete hardware overhauls (look at Via lately for a good example), which makes it increasingly difficult to plan out logical upgrade paths w/out shelling out wads of cash.

What frustrates me more is that standards lag behind in some areas but are too forward-looking in others. For instance, the R300 and NV30 have already shown that future GPUs will need more power and cooling consideration than the AGP 3.0 spec allows for. Yet there's a host of features on today's boards that are enablers for the future: ATA133 and now SATA are out there, yet there isn't a device that can take full advantage of them.

Anyways, I seriously doubt that the 125W will be for the NV35. Xbit's tour has shown server stacks running compilers for NV50 already, so that's a possibility.

Chiz

Edit: Oh yah, I remembered the cliche, it's "necessity breeds innovation", but it makes sense in the above context as well.
 