The Buyer's Guide: 6600GT - GTX280 / HD4870X2

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
It's pretty awesome as it is; the only thing I would change is X1800 XL > 7800 GT in about every game once AA is applied (OK, minus the OpenGL ones)
 

LW07

Golden Member
Feb 16, 2006
1,537
2
81
It's very well done, but an X1900XT can usually match a 7900GTX in performance, and it's usually faster than the 512MB 7800GTX if you're not running an OpenGL game; in OpenGL it sometimes runs a bit slower than even a 256MB 7800GTX.
 

Ike0069

Diamond Member
Apr 28, 2003
4,276
2
76
After some thought, I think you could add one more direct comparison to the review.

Put in one of the factory-OC'd 7900GTs. Since so many 7900GT owners either OC their stock card, buy one factory OC'd, or OC their already factory-OC'd 7900GT, it actually makes more sense to me to compare one at, say, at least 500/1500 (as opposed to the 450/1320 stock speeds).

Now I realize almost all video cards can be OC'd somewhat, but most 7900GTs can also be clocked higher than 500/1500. That is definitely one of the draws of this card.

For example, I would like to see how an EVGA 7900GT KO (520/1540) compares to the X1800XT and X1900XT. Is it really worth paying $400 for an X1900XT when I can get a 7900GT KO for $270 ($240 AR)? Does the extra 256MB of RAM on the X1900XT/7900GTX make much of a difference in today's games?
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
I have a question. Sorry to hijack the thread.

You said:
- R4X0
* Supports encoding and decoding of major formats like MPEG-1/2/4, Real Media, DivX and WMV9. It's very important to note that only VIVO cards can encode content from external devices (video camera, etc).
* Supports techniques which improve IQ through the use of deinterlacing and shaders.

Does the X800 series support MPEG-4 encoding? If so, with which programs?
Would it be possible to capture TV (through a PCI TV tuner) and have the encoding burden shared by the processor and the GPU (my X800GT)?
 

guezz

Member
May 10, 2006
45
0
0
I'm sorry for the very late response.

Thanks for the kind words! :)

Originally posted by: ShadowOfMyself
It's pretty awesome as it is; the only thing I would change is X1800 XL > 7800 GT in about every game once AA is applied (OK, minus the OpenGL ones)
It reflects the overall performance level, although it's kinda mentioned:
Better performance (AA / AF)
ATI is normally less penalized in performance when AA and AF are being used. This can turn a tie into an overall victory if AA and AF are enabled.
"In about every D3D game" is a bit of an exaggeration on your part.
Originally posted by: LW07
It's very well done, but an X1900XT can usually match a 7900GTX in performance, and it's usually faster than the 512MB 7800GTX if you're not running an OpenGL game; in OpenGL it sometimes runs a bit slower than even a 256MB 7800GTX.
I feel the overall performance (D3D/OpenGL/pure speed/AA and AF/resolution scaling) is reflected very well. Yes, sometimes the X1900 XT matches or exceeds a 7900 GTX, especially when using high resolutions with AA and AF in D3D games. The guide doesn't in any way give the impression that the performance gap between the two is significant.

Originally posted by: Ike0069
After some thought, I think you could add one more direct comparison to the review.

Put in one of the factory-OC'd 7900GTs. Since so many 7900GT owners either OC their stock card, buy one factory OC'd, or OC their already factory-OC'd 7900GT, it actually makes more sense to me to compare one at, say, at least 500/1500 (as opposed to the 450/1320 stock speeds).

Now I realize almost all video cards can be OC'd somewhat, but most 7900GTs can also be clocked higher than 500/1500. That is definitely one of the draws of this card.

For example, I would like to see how an EVGA 7900GT KO (520/1540) compares to the X1800XT and X1900XT. Is it really worth paying $400 for an X1900XT when I can get a 7900GT KO for $270 ($240 AR)? Does the extra 256MB of RAM on the X1900XT/7900GTX make much of a difference in today's games?
I will never deviate from nVidia's reference frequencies; otherwise the guide would turn into chaos.
You talk about the 7900 GT, but there are other cards (7600 GT, etc.) this also applies to.

256 vs. 512 MB is a valid point which I kind of want to address in a general video card guide. I'm still undecided whether such a guide will be created, though. Do you think this subject is better suited here (by creating a "buying advice" section or similar)?

Originally posted by: tanishalfelven
I have a question. Sorry to hijack the thread.

You said:
- R4X0
* Supports encoding and decoding of major formats like MPEG-1/2/4, Real Media, DivX and WMV9. It's very important to note that only VIVO cards can encode content from external devices (video camera, etc).
* Supports techniques which improve IQ through the use of deinterlacing and shaders.

Does the X800 series support MPEG-4 encoding? If so, with which programs?
Would it be possible to capture TV (through a PCI TV tuner) and have the encoding burden shared by the processor and the GPU (my X800GT)?
Decoding/encoding support and actual usage are my Achilles' heel in the guide because of my pretty limited knowledge of these particular subjects.

ATI's webpage says these formats are supported for decoding and encoding, although I don't know whether they mean hardware-based for all of them. Relevant information is, IMO, hard to find, but I will try to improve the guide by researching more.

I will revise the guide if it turns out to be incorrect.
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: Frackal
ATI has digital vibrance too under the saturation setting in "Avivo Color"
Colour saturation
The feature is very similar to nVidia's Digital Vibrance: "It makes colours more vibrant. This is especially an advantage for those who have a cheap monitor with bland colours. This is mainly a love/hate feature." ATI's approach is hardware based while nVidia does it via software. Picture
;)
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Great, I love it. After a brief glance, though, about the digital vibrance/color saturation: ATI's is hardware-based while NVIDIA's is software-based? I'm not sure about that. What I do know is that NVIDIA's controls the whole DVI port, so I would call that more hardware than software (overlay video is also affected).

To my knowledge, there are no encoders that currently use the GPU (except AVIVO?). Several programs can use the pixel shaders whenever they wish to speed up certain operations for filters (for example, 3D noise removal in AviSynth). MPEG-2 decode acceleration is very common, and H.264 works with a couple of decoders on the 7xxx series. The main point is that these features have to be specifically engaged; it's not like simply putting a faster CPU in your PC.
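
None of this is automatic, which is the point: the player has to detect driver support and have the acceleration option enabled, otherwise the GPU is ignored. A minimal sketch of that decision in Python, with entirely made-up function and setting names (this is not any real player's or driver's API):

# Hypothetical sketch of opt-in video decode acceleration. The names here
# (driver_supports, decode) are illustrative only and do not belong to any
# real API; the idea is that the application chooses the path, not the GPU.

ACCELERATED_CODECS = {"MPEG-2", "H.264"}  # codecs assumed to be accelerated by the driver

def driver_supports(codec):
    """Stand-in for asking the display driver whether a codec is accelerated."""
    return codec in ACCELERATED_CODECS

def decode(codec, hw_accel_enabled):
    """Pick a decode path the way a player with an on/off acceleration option might."""
    if hw_accel_enabled and driver_supports(codec):
        return "GPU decode path (low CPU usage)"
    return "CPU decode path (GPU idle)"

print(decode("H.264", hw_accel_enabled=True))   # GPU decode path
print(decode("H.264", hw_accel_enabled=False))  # CPU decode path
print(decode("DivX", hw_accel_enabled=True))    # CPU decode path (no driver support assumed)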
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: xtknight
Great, I love it. After a brief glance, though, about the digital vibrance/color saturation: ATI's is hardware-based while NVIDIA's is software-based? I'm not sure about that. What I do know is that NVIDIA's controls the whole DVI port, so I would call that more hardware than software (overlay video is also affected).

To my knowledge, there are no encoders that currently use the GPU (except AVIVO?). Several programs can use the pixel shaders whenever they wish to speed up certain operations for filters (for example, 3D noise removal in AviSynth). MPEG-2 decode acceleration is very common, and H.264 works with a couple of decoders on the 7xxx series. The main point is that these features have to be specifically engaged; it's not like simply putting a faster CPU in your PC.
NVIDIA confirmed that DVC was all software and there's no special hardware that the control takes advantage of to somehow make your images look brighter. Since it's all software it makes sense that DVC could be applied to any of NVIDIA's video cards, but this feature seems to be more marketing hype rather than a truly useful feature.
Source (I don't know if the newer DVC 3.0 is actually hardware based though).

As previously stated, I'm humble about what I've written about encoding. Can you provide information to confirm actual hardware encoding acceleration?

 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
All I can confirm in terms of acceleration is decoding. Here are benchmarks for H.264 / 7800GT / Nero ShowTime running the Da Vinci Code 1080i teaser. Acceleration can be turned on/off in the Nero ShowTime player.

HW accel: http://xtknight.atothosting.com/benchmarks/h264hw.png
HW accel, lower core speed: http://xtknight.atothosting.com/benchmarks/h264hwlc.png
HW accel, lower RAM speed: http://xtknight.atothosting.com/benchmarks/h264hwlm.png
No HW accel: http://xtknight.atothosting.com/benchmarks/h264nohw.png

The rest of the system is listed in 'Main Rig' below. If you decide to link/use these in your thread, go right ahead.
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: xtknight
All I can confirm in terms of acceleration is decoding. Here are benchmarks for H.264 / 7800GT / Nero ShowTime running the Da Vinci Code 1080i teaser. Acceleration can be turned on/off in the Nero ShowTime player.

HW accel: http://xtknight.atothosting.com/benchmarks/h264hw.png
HW accel, lower core speed: http://xtknight.atothosting.com/benchmarks/h264hwlc.png
HW accel, lower RAM speed: http://xtknight.atothosting.com/benchmarks/h264hwlm.png
No HW accel: http://xtknight.atothosting.com/benchmarks/h264nohw.png

The rest of the system is listed in 'Main Rig' below. If you decide to link/use these in your thread, go right ahead.
Thanks for the graphs, although this isn't anything new to me. Hardware-accelerated encoding is what I'm more interested in learning about.

I might include them (credited, of course) since they show the benefit of hardware-accelerated decoding so clearly.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Looks very good. I'll echo what others have said: the guide must have taken you a long time to complete.

The only thing I'd add to the SLI negative comments is the tearing that can occur with LCDs (maybe CRTs too?). Many others and I have noted that unfortunate effect. It's one of the reasons I decided to go back to a single card for a while.
 

guezz

Member
May 10, 2006
45
0
0
Originally posted by: Elfear
Looks very good. I'll echo what others have said: the guide must have taken you a long time to complete.

The only thing I'd add to the SLI negative comments is the tearing that can occur with LCDs (maybe CRTs too?). Many others and I have noted that unfortunate effect. It's one of the reasons I decided to go back to a single card for a while.
So tearing with SLI can be worse than with a single card? Did using v-sync solve the problem (it works in some games, no)?

Edit:
Is triple buffering still ****** up?
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: guezz
Originally posted by: Elfear
Looks very good. I'll echo what others have said: the guide must have taken you a long time to complete.

The only thing I'd add to the SLI negative comments is the tearing that can occur with LCDs (maybe CRTs too?). Many others and I have noted that unfortunate effect. It's one of the reasons I decided to go back to a single card for a while.
So tearing with SLI can be worse than with a single card? Did using v-sync solve the problem (it works in some games, no)?

Edit:
Is triple buffering still ****** up?

Tearing with SLI was much worse than with a single card. It was really weird too, because the framerate would be great but I'd still get very annoying tearing. V-sync would fix some of the problem, but when you get into a hairy game that can't maintain 60fps, you get horrible performance. Triple buffering kinda worked in OGL games (if my memory serves me correctly), but I was out of luck in D3D games. The games where it occurred the worst were Source-based games, Fear, and I think COD2.
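
Why v-sync turns a sub-60fps game into "horrible performance" is easy to see with a toy model: with plain double-buffered v-sync a finished frame has to wait for the next 60 Hz refresh, so frame times round up to multiples of 1/60 s, while triple buffering gives the GPU a spare buffer to keep rendering into. A minimal sketch in Python (the per-frame render times are made up, not measurements from these games):

import math

# Toy model of a 60 Hz display. Per-frame render times are made-up examples.
REFRESH = 1 / 60  # seconds per refresh

def displayed_fps(render_time, triple_buffered):
    if triple_buffered:
        # A spare back buffer lets the GPU keep rendering instead of stalling
        # at the flip, so the displayed rate tracks the render rate
        # (capped at the refresh rate).
        return min(1 / render_time, 1 / REFRESH)
    # Double-buffered v-sync: a finished frame waits for the next refresh,
    # so the effective frame time rounds up to a whole number of intervals.
    intervals = math.ceil(render_time / REFRESH)
    return 1 / (intervals * REFRESH)

for ms in (15, 17, 25, 34):
    t = ms / 1000.0
    print("%2d ms/frame -> v-sync only: %5.1f fps, triple buffered: %5.1f fps"
          % (ms, displayed_fps(t, False), displayed_fps(t, True)))

At a steady 17 ms per frame (just under 60fps rendered), double-buffered v-sync halves the displayed rate to 30fps, which matches the drop described above; the model ignores frame-time variation and any input-lag differences.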