GDDR1 vs. GDDR3

imported_ST

Senior member
Oct 10, 2004
733
0
0
I was about to purchase an additional 6800-based video card for my gf's new computer (a 3.2E Prescott) when I noticed 6800 GT derivatives with 128MB of GDDR1 memory, clocked at 350MHz core / 700MHz memory, proliferating around. Compared to my 6800 Ultra with 256MB of GDDR3 running at a stock 425/1.1GHz, you'd think they would be immensely slower. Well, think again! I ran some benchmarks on my own board, underclocking it to match (although I can't account for the 128MB deficiency), and here are my results:

Baseline: 6800 Ultra (425/1.1GHz)
Aquamark3 - 65496
Doom3 (1280x1024 default) - 85.8fps
3DM05 - 5179

GT128: 6800 GT (350/700MHz)
Aquamark3 - 59244
Doom3 (1280x1024 default) - 73.3fps
3DM05 - 3866

GT128+: 6800 GT (425/700MHz)
Aquamark3 - 63019
Doom3 (1280x1024 default) - 79.2fps
3DM05 - 4355

What we see is that in the most synthetic of benchmarks (3DMark05), the performance delta between the Ultra and the stock "defunct" 6800 GT is huge (up to 25%), but it shrinks as you move toward more "real world" testing scenarios (AM3: 9.5% / Doom3: 14.5%). And if you raise the core clock to match the 6800 Ultra, the GDDR1 memory doesn't hinder you THAT much: 16% in 3DMark05 / 7.7% in Doom3 / 3.8% in AM3. So in essence, although moving from GDDR1 to GDDR3 should theoretically net a big performance gain, as is evident in the more synthetic situations, the more real-world tests don't show it as meaningful yet, IMHO. I believe these "detuned" 6800 GTs could be a steal, provided the price is right of course, since they don't give up much to a "real" stock GT once the core clocks are comparable.....
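For anyone who wants to check the math, here's a quick Python sketch that derives those deltas straight from the scores in the table above (nothing is assumed beyond my own numbers):

```python
# Benchmark results from the table above: (Aquamark3, Doom3 fps, 3DMark05)
results = {
    "Ultra (425/1.1GHz)":  (65496, 85.8, 5179),
    "GT128 (350/700MHz)":  (59244, 73.3, 3866),
    "GT128+ (425/700MHz)": (63019, 79.2, 4355),
}

baseline = results["Ultra (425/1.1GHz)"]
tests = ("Aquamark3", "Doom3", "3DMark05")

for card, scores in results.items():
    if card.startswith("Ultra"):
        continue  # the Ultra is the reference point
    # Percentage deficit relative to the Ultra baseline for each test
    deltas = [100 * (1 - s / b) for s, b in zip(scores, baseline)]
    summary = ", ".join(f"{t}: {d:.1f}%" for t, d in zip(tests, deltas))
    print(f"{card} trails the Ultra by -> {summary}")
```

Running that prints roughly 9.5% / 14.6% / 25.4% for the stock GT128 and 3.8% / 7.7% / 15.9% for the core-matched GT128+, which is where the figures above come from.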

Note: I realize this is a small sample of tests, but I believe Doom3 is indicative of the "next" generation of game engines, and AQ3 is something of an old de facto benchmark.....as always, YMMV...
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
There will be a much bigger difference at high resolutions, where fillrate demands are very high.
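Some back-of-the-envelope math shows why: peak memory bandwidth scales directly with the effective memory clock. A minimal sketch, assuming both cards use the 6800 series' usual 256-bit memory bus (a particular 128MB GDDR1 board could ship narrower, so treat that as an assumption):

```python
# Rough peak-bandwidth comparison for the two memory configs.
# Assumption: 256-bit bus on both boards (standard for 6800 GT/Ultra;
# not verified for every 128MB GDDR1 variant).
BUS_BITS = 256

def bandwidth_gbs(effective_mhz, bus_bits=BUS_BITS):
    """Peak bandwidth in GB/s: bytes per transfer x effective transfer rate."""
    return (bus_bits / 8) * effective_mhz / 1000

gddr3 = bandwidth_gbs(1100)  # 6800 Ultra: GDDR3 at 1.1GHz effective
gddr1 = bandwidth_gbs(700)   # GDDR1 GT variant: 700MHz effective

print(f"GDDR3 @ 1.1GHz: {gddr3:.1f} GB/s")   # ~35.2 GB/s
print(f"GDDR1 @ 700MHz: {gddr1:.1f} GB/s")   # ~22.4 GB/s
print(f"GDDR1 deficit: {100 * (1 - gddr1 / gddr3):.0f}%")  # ~36%
```

A ~36% deficit in peak bandwidth is why the gap widens as resolution and AA push more pixels through memory.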
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: virtualgames0
There will be a much bigger difference at high resolutions, where fillrate demands are very high.

True (as 3DM05 shows), but how many of us play at 16x12 with AA/AF? Even on my Ultra, I stop at 12x10 with 4xAA at most... which again raises the original question: does it make a difference in the real world?

 

df96817

Member
Aug 31, 2004
183
0
0
My belief is that "you should never go backwards in technology when upgrading." Of course, that's only if you can afford it. But if you can, I'd say go for it.