The Buyer's Guide: 6600GT - GTX280 / HD4870X2


guezz

Member
May 10, 2006
45
0
0
Originally posted by: Elfear
Originally posted by: guezz
Originally posted by: Elfear
Looks very good. I'll echo what others have said that the guide must have taken you a long time to complete.

The only thing I'd add in the SLI negative comments is the tearing that can occur with LCDs (maybe CRTs too??). Myself and many others have noted that unfortunate effect. One of the reasons I decided to go back to a single card for awhile.
So tearing with SLI can be worse than with a single card? Did using v-sync solve the problem (I know it only works in some games)?

Edit
Is triple buffering still ****** up?

Tearing with SLI was much worse than with a single card. It was really weird too, because the framerate would be great but I'd get very annoying tearing. V-sync would fix some of the problem, but when you get into a hairy game that can't maintain 60fps, then you get horrible performance. Triple buffering kinda worked in OGL games (if my memory serves me correctly) but I was out of luck in D3D games. The games where it occurred the worst were Source-based games, FEAR, and I think COD2.
Thanks for the reply, much appreciated.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Elfear
Originally posted by: guezz
Originally posted by: Elfear
Looks very good. I'll echo what others have said that the guide must have taken you a long time to complete.

The only thing I'd add in the SLI negative comments is the tearing that can occur with LCDs (maybe CRTs too??). Myself and many others have noted that unfortunate effect. One of the reasons I decided to go back to a single card for awhile.
So tearing with SLI can be worse than with a single card? Did using v-sync solve the problem (I know it only works in some games)?

Edit
Is triple buffering still ****** up?
Tearing with SLI was much worse than with a single card. It was really weird too, because the framerate would be great but I'd get very annoying tearing. V-sync would fix some of the problem, but when you get into a hairy game that can't maintain 60fps, then you get horrible performance. Triple buffering kinda worked in OGL games (if my memory serves me correctly) but I was out of luck in D3D games. The games where it occurred the worst were Source-based games, FEAR, and I think COD2.
Triple-buffering normally doesn't work with D3D games unless you use a third party tool like DirectX Tweaker. Are you sure you did this? If not, then SLI may not be the problem.
 

Elfear

Diamond Member
May 30, 2004
7,126
738
126
Originally posted by: nullpointerus

Triple-buffering normally doesn't work with D3D games unless you use a third party tool like DirectX Tweaker. Are you sure you did this? If not, then SLI may not be the problem.

I never tried DXTweaker, to tell you the truth. I perused NvNews for a while trying to find a solution to the tearing, and the lack of success people had getting DXTweaker to make triple buffering work with the games I played was disheartening. I did a lot of research in an effort to make my 7800GTs play nice, but I never got it resolved. If you do a search over at NvNews and here at AT, you can see it's definitely a limitation of SLI.
 

guezz

Member
May 10, 2006
45
0
0
The guide is now updated:
02 August 06: Added: Fill rate and memory bandwidth, and two negative things about SLI. Updated dongle-less Crossfire.

I will look at encoding/decoding later.
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Very informative, but how about a normalized ranking of the cards added at the end? Choose a couple of common resolutions, like 16x12 and 10x7 with some eye candy, and rank them.
 

Vallybally

Senior member
Oct 5, 2004
259
0
0
Guezz, thank you for that excellent post. Will you keep it updated for years to come, and where is the one main location where you think it will last the longest, the 3DMark forum?

Edited to add 2 observations and 2 questions.

Observation 1: I believe you should have the ATI X1900 XT/X as slightly better/faster than the equivalently marketed nVidia 7800/7900 GTXs, particularly in new shader-heavy games like Oblivion, where those market-leading 48 pixel shader units seem to come in handy. I mean, it doesn't quite make sense to me that you say the X1900 XT is slightly slower while the X1900 XTX is slightly faster than the 7900 GTX, considering its very minor overclock of 25/50. For sure, the 7950 GX2 is slightly ahead of both current ATI flagships (just keep in mind it's really 2 cards in 1 SLI configuration and may suffer from any SLI bugs out there, plus extra power/heat). Also, you may want to take image quality into consideration, where most consumers/reviewers seem to favor ATI.

Observation 2: Woot, according to that interesting PSU calculator, my system with a new X1900 XT should only need a PSU rated at 429W! Mine's an Antec TrueBlue 480W, so I should be OK in theory (I was worried about this and did not want to waste more money). I hope it takes into account that most PSUs are only about 80% efficient...

Question 1: In all your research, do you believe that the general trend in overall performance can generally be attributed to the few variables listed underneath the cards? (pixel shader units, TMUs, ROPs, vertex shader units, core/mem speed). In other words, if there is a very strong correlation, I wonder if some simple formula or calculator can be devised to allow us to just plug in the values for the card we are considering and get a raw simple number that shows the relative speed/performance of the card.

Question 2: What does "* Games can be depended on profiles for optimal performance. This is the reason for still profile support despite all games being supported." mean?
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: Vallybally
Guezz, thank you for that excellent post. Will you keep it updated for years to come, and where is the one main location where you think it will last the longest, the 3DMark forum?
Thanks!

I will try, and yes, use the guide at the Futuremark forum; I think that's where it will last the longest.


Observation 1: I believe you should have the ATI X1900 XT/X as slightly better/faster than the equivalently marketed nVidia 7800/7900 GTXs, particularly in new shader-heavy games like Oblivion, where those market-leading 48 pixel shader units seem to come in handy. I mean, it doesn't quite make sense to me that you say the X1900 XT is slightly slower while the X1900 XTX is slightly faster than the 7900 GTX, considering its very minor overclock of 25/50. For sure, the 7950 GX2 is slightly ahead of both current ATI flagships (just keep in mind it's really 2 cards in 1 SLI configuration and may suffer from any SLI bugs out there, plus extra power/heat). Also, you may want to take image quality into consideration, where most consumers/reviewers seem to favor ATI.
Those 25 MHz and 100 MHz result in an improved fill rate of 0.4 GPixel/s and 3.2 GB/s more memory bandwidth, so it is a bit faster. It's true that in reviews the difference between the XT and XTX isn't large. When I rank cards it's to show the overall performance (D3D, OpenGL, AA, resolution, etc.). A 7900 GTX can beat an X1900 XT in various D3D games, and its overall OpenGL performance is better.
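As a rough sketch of that arithmetic (assuming the commonly quoted R580 figures of 16 ROPs, a 256-bit bus, 625 vs. 650 MHz core and 1450 vs. 1550 MHz effective memory; these numbers are an illustration, not taken from the guide itself):

    # X1900 XTX vs. X1900 XT delta, using the commonly quoted specs
    rops, bus_bits = 16, 256
    core_delta_mhz = 650 - 625        # 25 MHz higher core clock
    mem_delta_mhz = 1550 - 1450       # 100 MHz higher effective memory clock

    fillrate_gain_gpixels = rops * core_delta_mhz / 1000            # 0.4 GPixel/s
    bandwidth_gain_gbs = (bus_bits / 8) * mem_delta_mhz / 1000      # 3.2 GB/s
    print(fillrate_gain_gpixels, bandwidth_gain_gbs)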

I have somewhat taken "pixel shader intensive" games into consideration with this quote: "When more pixel shader intensive games are released an increase in performance is expected."

IQ
Have you read the "Technologies" section?

A 7950 GX2 isn't just slightly faster than an X1900 XTX.

Observation 2: Woot, according to that interesting PSU calculator, my system with a new X1900 XT should only need a PSU rated at 429W! Mine's an Antec TrueBlue 480W, so I should be OK in theory (I was worried about this and did not want to waste more money). I hope it takes into account that most PSUs are only about 80% efficient...
To be honest I don't know.

Question 1: In all your research, do you believe that the general trend in overall performance can generally be attributed to the few variables listed underneath the cards? (pixel shader units, TMUs, ROPs, vertex shader units, core/mem speed). In other words, if there is a very strong correlation, I wonder if some simple formula or calculator can be devised to allow us to just plug in the values for the card we are considering and get a raw simple number that shows the relative speed/performance of the card.
If I must give you a few formulas, it would be these:
Pixel fill rate (pixel output): number of ROPs * core frequency
Texture fill rate: number of TMUs * core frequency
Memory bandwidth: (bus width in bits * effective memory frequency) / 8
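A minimal sketch of those three formulas in code (the 7900 GTX numbers below are only the commonly quoted specs, used for illustration):

    # The three rule-of-thumb formulas above, in Python
    def pixel_fillrate_gpixels(rops, core_mhz):
        return rops * core_mhz / 1000

    def texture_fillrate_gtexels(tmus, core_mhz):
        return tmus * core_mhz / 1000

    def memory_bandwidth_gbs(bus_width_bits, effective_mem_mhz):
        return (bus_width_bits / 8) * effective_mem_mhz / 1000

    # Example: 7900 GTX (16 ROPs, 24 TMUs, 650 MHz core, 256-bit bus, 1600 MHz effective memory)
    print(pixel_fillrate_gpixels(16, 650))       # ~10.4 GPixel/s
    print(texture_fillrate_gtexels(24, 650))     # ~15.6 GTexel/s
    print(memory_bandwidth_gbs(256, 1600))       # ~51.2 GB/s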

It should be noted that you can't blindly trust these numbers (theoretical maximums) to give the correct real-life performance. An excellent example of this is the X1900 XTX vs. the X850 XT PE.

The X1900 XTX likes pixel-shading-intensive games, while the 7900 GTX likes texture (fill-rate) intensive games. It should be noted that I think pixel shading will play a larger role in the future.

nVidia is more efficient clock-for-clock; architectural differences can result in different performance.

Question 2: What does "* Games can be depended on profiles for optimal performance. This is the reason for still profile support despite all games being supported." mean?
With the 8x-series drivers you can use global settings (AFR, AFR2 or SFR) to force one SLI rendering mode for all games. If you choose SFR and a game's performance scales better with AFR, a driver profile is then used. Also, some games may have rendering issues if you use the "incorrect" rendering mode.
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: Zebo
Very informative, but how about a normalized ranking of the cards added at the end? Choose a couple of common resolutions, like 16x12 and 10x7 with some eye candy, and rank them.
No, I don't think so, sorry.

The guide is now updated:
06 August 06: Added: Softmodding of: X800 GT, X1800 GTO and X1900 GT. Fixed: 6800 XT softmodding clarification and better R4X0 SM3 (no support) clarification.
 

guezz

Member
May 10, 2006
45
0
0
Big update!

18 September 06: Added: X1650 Pro, X1900 XT 256MB, X1950 XTX, 7300 GT, 7900 GS and 7950 GT. TAAA for NV4X. Corrected: Terrible typo 7900 GTX vs. 7900 GT (30%, not 15%!) and updated X1900 GT performance.
 

guezz

Member
May 10, 2006
45
0
0
16 October 06: Added: 7900 GTO, 7600 GT AGP, 7300 GT GDDR3, new revision of X1900 GT and X1300 XT. Native HDCP cards and (D)-DLDVI included. New sections added under Media.
 

guezz

Member
May 10, 2006
45
0
0
Big Update!

21 November 06: Added: 8800GTS, 8800GTX and X1950 XT 256MB. Technology section for G8X. Update of Crossfire.
24 November 06: Added: X1650 XT and a Disclaimer. New sections under G8X, updated HDCP and corrected 8800 GTX performance.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Um...

Positive things about Crossfire

*SuperAA supports up to 14xAA and 32xAF

32xAF? I don't think such a level of anisotropic filtering exists.

Maybe you could add in SLI/Crossfire performance ratings.

Also, it's Transparency Anti-Aliasing; there is no word "adaptive". The 6 series, hence the NV4X series, can use this feature as well.

For the G8X, you should put both CSAA and the 'perfect AF' under the heading of Lumenex technology/engine.

# NVIDIA Lumenex Technology (G80)

* Full FP32 floating point support throughout the entire pipeline
* FP32 floating point frame buffer support
* Up to 8x, gamma adjusted, native multisampling FSAA with jittered or rotated grids
* Up to 16x coverage sample antialiasing
* Transparent multisampling and supersampling
* Lossless color, texture, Z and stencil data compression
* Fast Z clear
* Up to 16x anisotropic filtering

Could also add in the quantum physics technology for the G80. This hasn't been used yet since HavokFX hasn't really been launched.

If you've added technologies such as 3Dc, maybe you should add in other stuff like nVidia's UltraShadow II technology, for example. This is one of the reasons nVidia performs much better than ATI cards when lots of stencil shadows are used.

Also, you could add stuff like the manufacturing process these cards are built on, the code names (e.g. NV40), and maybe things that are important to a consumer, e.g. heat/noise, by taking into account people's personal experience with the cards.

Other than this, good stuff~~~ :thumbsup:
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: Cookie Monster
32xAF? I don't think such a level of anisotropic filtering exists.
You're correct, and it was corrected months ago, in the former Norwegian edition ... Thank you for noticing it.

Maybe you could add in SLI/Crossfire performance ratings.
The suggestion is interesting and has crossed my mind before. I will reconsider the issue, although it should be noted that this would require a lot of work.

Also, it's Transparency Anti-Aliasing; there is no word "adaptive". The 6 series, hence the NV4X series, can use this feature as well.
I will look at it; also, re-read the end of the section about NV4X ...

For the G8X, you should put both CSAA and the 'perfect AF' under the heading of Lumenex technology/engine.
I don't think this is important, since the guide clearly presents these as G8X-only technologies; that is IMO enough.

Could also add in the quantum physics technology for the G80. This hasn't been used yet since HavokFX hasn't really been launched.
Interesting. If so, ATI's equivalent should be mentioned as well.

If you've added technologies such as 3Dc, maybe you should add in other stuff like nVidia's UltraShadow II technology, for example. This is one of the reasons nVidia performs much better than ATI cards when lots of stencil shadows are used.
It's noted. Can you mention unbiased sites where this feature is discussed (not nVidia hype crap)?

Also, you could add stuff like the manufacturing process these cards are built on, the code names (e.g. NV40), and maybe things that are important to a consumer, e.g. heat/noise, by taking into account people's personal experience with the cards.
I will think about adding process and detailed code names.

Noise is a touchy issue since it's hard to accurately quantify such a thing. I had thought about it earlier but was advised against it by a physics teacher (a Beyond3D writer). Also, personal experience of sound is highly subjective, so comparisons are impossible. Heat has too many unknown factors (case ventilation, room temperature, etc.). :confused:

Other than this, good stuff~~~ :thumbsup:
Thanks, and your criticisms aren't all that severe.

Thanks a lot for your reply. :beer:
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
In the Video Decoding section:

Delivers the most superior IQ on the marked.

I suppose you meant market?

Read through this, and maybe you could write some more stuff for the section, e.g. PureVideo HD.

The PureVideo HD software from NVIDIA is built into their NVIDIA Forceware drivers and is mostly transparent to the end user.

I'm sure you don't need the PureVideo decoder anymore, or am I wrong? Someone correct me please.

http://www.hothardware.com//viewarticle.aspx

So you can see that AVIVO is evenly matched with PureVideo as of today.
 

WobbleWobble

Diamond Member
Jun 29, 2001
4,867
1
0
Great guide!

Side note - your Pure Video link is broken.

- G7X
* Very good decoding performance, also in the new formats: VC-1 and H.264.
* Purevideo costs: $20-50
* The complete list of supported formats
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: Cookie Monster
In the Video Decoding section:

Delivers the most superior IQ on the marked.

I suppose you meant market?

Read through this, and maybe you could write some more stuff for the section, e.g. PureVideo HD.

The PureVideo HD software from NVIDIA is built into their NVIDIA Forceware drivers and is mostly transparent to the end user.

I'm sure you don't need the PureVideo decoder anymore, or am I wrong? Someone correct me please.

http://www.hothardware.com//viewarticle.aspx

So you can see that AVIVO is evenly matched with PureVideo as of today.
I will look into it. Could you fix the link?

Originally posted by: WobbleWobble
Great guide!

Side note - your Pure Video link is broken.

- G7X
* Very good decoding performance, also in the new formats: VC-1 and H.264.
* Purevideo costs: $20-50
* The complete list of supported formats
*Fixed*
 

guezz

Member
May 10, 2006
45
0
0
18 February 07: Added: 8800 GTS 320MB, X1950 GT and provided more information about G80 performance. Update: video decoding (e.g. added G8X).