Interesting X-Bit Results


OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
OK, I have sort of a general question. It seems like manufacturers keep raising and raising the clock/memory speeds, but eventually they are going to hit a wall. What else can they do to increase the power of these GPUs other than simply using faster memory and adding more pipelines?
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Sudheer Anne
OK, I have sort of a general question. It seems like manufacturers keep raising and raising the clock/memory speeds, but eventually they are going to hit a wall. What else can they do to increase the power of these GPUs other than simply using faster memory and adding more pipelines?
They can do what ATI did with their thread dispatcher and new memory bus, which is to increase efficiency (keep more of the GPU busy more of the time, rather than have some parts wait to retrieve info from other parts). Besides that, it's either more bits (ALUs/TMUs/ROPs/etc.) or higher clocks.

Originally posted by: CPlusPlusGeek
Bigger memory bus (IE 512-bit)
Actually, I think all cards with 256-bit DDR external memory buses operate with two 256-bit SDR (= 512-bit) buses internally, as it's cheaper in terms of transistors. That's what I read in threads about the R520's bus, anyway.

Unless you meant they can move to a wider external bus to increase speed, in which case excuse my lecture. But I've also read that a 512-bit DDR external bus will just take too many traces and too much board space to be cost-effective in the near future. Then imagine that they'll have to double the GPU space to accommodate two 512-bit SDR buses, and we're talking a whole lotta space. It seems easier to just bump the speed.
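A rough sketch of the arithmetic behind that: peak bandwidth is just bus width in bytes times transfers per second, so a DDR external bus and two SDR internal buses of the same width move the same bytes per clock. The 750 MHz base clock below (1500 MHz effective) is only an illustrative figure, not something from the review.

```python
# Peak memory bandwidth = (bus width in bytes) x (clock) x (transfers per clock).
def bandwidth_gbps(bus_bits, clock_mhz, pumps):
    """Peak bandwidth in GB/s; pumps is 1 for SDR, 2 for DDR."""
    return bus_bits / 8 * clock_mhz * 1e6 * pumps / 1e9

# One 256-bit DDR external bus vs. two 256-bit SDR internal buses
# (the arrangement described above) -- same bytes per clock:
external = bandwidth_gbps(256, 750, pumps=2)      # one 256-bit DDR bus
internal = 2 * bandwidth_gbps(256, 750, pumps=1)  # two 256-bit SDR buses
assert external == internal == 48.0               # identical peak bandwidth
```

Which is why the internal arrangement is a transistor-count trade-off rather than a bandwidth one.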
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Gamingphreek
While the X1800XT leads 99% of the time with any sort of eye candy features on, it is interesting that the 7800's have much higher minimum framerates in many games for some reason.

Example?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
D3D games in which the 7800 GTX leads by >1 fps on minimum framerate:

1. Fear
2. Pariah
3. Warhammer


D3D games in which the X1800 XT leads by >1 fps on minimum framerate:

1. Project Snowblind
2. SC:CT
3. Colin McRae Rally
4. Perimeter


So the X1800XT leads in more D3D games on min. fps (and its leads tend to be a lot larger). If you count OpenGL games, then the numbers are about even. But since we all know ATi cards suck at OpenGL games, I discounted those from the list. There were more D3D games listed in the benchmarks, but they didn't have min. fps numbers (e.g. UT2004 and BF2). Also, it should be noted that the ATi card takes a much smaller hit when AA/AF are enabled in most D3D games. Overall the cards come out about even in the X-bit benchmarks.

However, what a lot of these reviewers leave out are the extras you get with each company's video card or drivers. For example, with the nVidia cards you get the option of modifying your monitor's frequency (very important for LCDs like the 2005FPW), where the ATi card only lets you choose from the driver-selected frequencies. Just to give you an example of this, take a look at this thread: http://www.hardforum.com/showthread.php?t=942537
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Yeah, interesting. I mean, it's kind of stupid to discount OpenGL games as though they somehow aren't relevant (Riddick, Doom, Quake 4).


But in D3D the XT does seem to do better on mins, so I'm not quite sure what X-bit is talking about. I have complained in the past that the GTX takes an absurd hit when AA is turned on; I'm talking nearly a 30% hit in BF2 going from 2xAA to 4xAA, which is BS because I didn't buy this card to not have AA.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Frackal
Yeah, interesting. I mean, it's kind of stupid to discount OpenGL games as though they somehow aren't relevant (Riddick, Doom, Quake 4).


But in D3D the XT does seem to do better on mins, so I'm not quite sure what X-bit is talking about. I have complained in the past that the GTX takes an absurd hit when AA is turned on; I'm talking nearly a 30% hit in BF2 going from 2xAA to 4xAA, which is BS because I didn't buy this card to not have AA.


Well, OpenGL games were discounted because ATi has no chance of winning any of them. So you can do the math and add them in yourself if needed. I don't think even the most fanATical ATi fan will argue that ATi has a chance against nVidia in OpenGL. Keep in mind they didn't do min. fps for some of the D3D games ATi has a strong lead in, either.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: 5150Joker
Originally posted by: Frackal
Yeah, interesting. I mean, it's kind of stupid to discount OpenGL games as though they somehow aren't relevant (Riddick, Doom, Quake 4).


But in D3D the XT does seem to do better on mins, so I'm not quite sure what X-bit is talking about. I have complained in the past that the GTX takes an absurd hit when AA is turned on; I'm talking nearly a 30% hit in BF2 going from 2xAA to 4xAA, which is BS because I didn't buy this card to not have AA.


Well, OpenGL games were discounted because ATi has no chance of winning any of them. So you can do the math and add them in yourself if needed. I don't think even the most fanATical ATi fan will argue that ATi has a chance against nVidia in OpenGL.


I've always wondered if that's because nVidia was formed by ex-SGI people.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Originally posted by: CPlusPlusGeek
ATI handles AA better due to it having 10 GB/s more memory bandwidth @ 1500 MHz DDR


It could be.

But bear in mind the ring bus was actually designed to work with GDDR4 as well, and the R5xx boards are compatible with that memory.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
He said "memory BANDWIDTH"



Now the interesting thing is that I checked the Anandtech review of the GTX @ 500/1350 and it STILL takes a huge ass hit from AA


~40%!! (82 fps down to 48!) in BF2 @ 2048x1536

Almost 50% in Doom 3! @ 1920x1440 (83 fps down to 45!)



In my experience as well the GTX takes an absurd performance hit from AA, probably my biggest complaint about it.
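For what it's worth, the percentage hit in those two cases works out like this (just the quoted fps numbers, nothing new):

```python
# Percent performance lost when AA is enabled, from before/after fps.
def aa_hit_percent(fps_no_aa, fps_aa):
    return (fps_no_aa - fps_aa) / fps_no_aa * 100

print(round(aa_hit_percent(82, 48), 1))  # BF2: 41.5 (the "~40%" figure)
print(round(aa_hit_percent(83, 45), 1))  # Doom 3: 45.8 (the "almost 50%")
```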
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Frackal
He said "memory BANDWIDTH"



Now the interesting thing is that I checked the Anandtech review of the GTX @ 500/1350 and it STILL takes a huge ass hit from AA


~40%!! (82 fps down to 48!) in BF2 @ 2048x1536

Almost 50% in Doom 3! @ 1920x1440 (83 fps down to 45!)



In my experience as well the GTX takes an absurd performance hit from AA, probably my biggest complaint about it.


Yeah, the big hit with AA is one of the things I noticed too. I was hoping nVidia would optimize this a bit in the 80xx series drivers, and they have slightly, but more work needs to be done. Anyway, the reason you don't see as big of a hit with these ATi cards is the abundance of memory bandwidth they have, in addition to the 512-bit ring bus architecture, which helps with efficiency.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
What I wonder is how a

GTX oc'ed w/ 1400mhz memory compares to

X1800XT stock

X1800XT OC'ed

 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Frackal
What I wonder is how a

GTX oc'ed w/ 1400mhz memory compares to

X1800XT stock

X1800XT OC'ed

Well, I've seen 1700 MHz on an OC'd X1800 XT, which is about 54.4 GB/s; 1400 MHz on a GTX would be 44.8 GB/s, so you'd still have a ~10 GB/s difference.
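Those figures follow directly from the 256-bit bus both cards use: bandwidth is 32 bytes per transfer times the effective (double-pumped) clock. A quick sanity check:

```python
# GB/s on a 256-bit bus, given the effective (DDR) memory clock in MHz.
def bw_256bit(effective_mhz):
    return 256 / 8 * effective_mhz * 1e6 / 1e9  # 32 bytes per transfer

print(bw_256bit(1700))  # OC'd X1800 XT memory: 54.4
print(bw_256bit(1400))  # OC'd 7800 GTX memory: 44.8
```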
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Hmmm, I suppose I wasn't thinking. I think my question would be better phrased had I said:

"The 7800 GTX's minimum framerates are much more comparable and competitive in contrast to its max framerates, where ATI sort of runs away (with AA)."

-Kevin