***7950GX2 Thread*** (w/ Review Collection)


thilanliyan

Lifer
Jun 21, 2005
12,064
2,277
126
Originally posted by: Bobthelost
Originally posted by: Genx87
At the end of the day people always take performance over the pretty gimmicks. That 7950GX2 getting about 50% more frames in Oblivion at 1600x1200 than an X1900XTX will sway many people to pay 100 bucks more for it.

Over the next few months the price will drop and push the X1900XTX further down.

You got a source for that? I looked at five of the reviews on the first page and none of them benched with Oblivion. (Where ATI has a firm lead, from all I've heard; odd that it's not being used to bench with :disgust:)

If we say that the 7950GX2 performs around as well as 7900GT SLI (give or take a bit), then looking at this test http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4 I can't see how on earth that sort of improvement is happening, since the X1900XTX leads by 8%. A slight win for the 7950GX2 perhaps, but 50%!?

They bench Oblivion in the TechReport review. However, they have a 7900GTX beating the X1900XTX too, which goes against the reviews I've seen, so take it with a pinch of salt. Maybe they reused the numbers for the other cards from previous reviews they did.
 

Bobthelost

Diamond Member
Dec 1, 2005
4,360
0
0
Originally posted by: thilan29
Originally posted by: Bobthelost
Originally posted by: Genx87
At the end of the day people always take performance over the pretty gimmicks. That 7950GX2 getting about 50% more frames in Oblivion at 1600x1200 than an X1900XTX will sway many people to pay 100 bucks more for it.

Over the next few months the price will drop and push the X1900XTX further down.

You got a source for that? I looked at five of the reviews on the first page and none of them benched with Oblivion. (Where ATI has a firm lead, from all I've heard; odd that it's not being used to bench with :disgust:)

If we say that the 7950GX2 performs around as well as 7900GT SLI (give or take a bit), then looking at this test http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4 I can't see how on earth that sort of improvement is happening, since the X1900XTX leads by 8%. A slight win for the 7950GX2 perhaps, but 50%!?

They bench Oblivion in the TechReport review. However, they have a 7900GTX beating the X1900XTX too, which goes against the reviews I've seen, so take it with a pinch of salt. Maybe they reused the numbers for the other cards from previous reviews they did.

Thanks, I could have sworn I'd checked that one :eek:

Those numbers do look rather iffy.
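The back-and-forth over percentages is easy to pin down with a little arithmetic; a quick sketch (the fps numbers here are made up, purely illustrative, not from any review) shows how an "8% lead" in one scene and a "35% lead" in another are both just ratios against a baseline:

```python
def lead_pct(a_fps, b_fps):
    """Percentage by which card A leads card B (B is the baseline)."""
    return (a_fps / b_fps - 1.0) * 100.0

# Hypothetical scene where the X1900XTX edges out 7900GT SLI:
print(round(lead_pct(54.0, 50.0), 1))  # 8.0 -> "leads by 8%"

# Hypothetical scene where the GX2 pulls well ahead of the X1900XTX:
print(round(lead_pct(54.0, 40.0), 1))  # 35.0 -> "leads by 35%"
```

The point being that the quoted lead depends entirely on which card is the baseline and which scene was benched, which is why review numbers vary so much.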
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Genx87
Originally posted by: Bobthelost
Originally posted by: Genx87
At the end of the day people always take performance over the pretty gimmicks. That 7950GX2 getting about 50% more frames in Oblivion at 1600x1200 than an X1900XTX will sway many people to pay 100 bucks more for it.

Over the next few months the price will drop and push the X1900XTX further down.

You got a source for that? I looked at five of the reviews on the first page and none of them benched with Oblivion. (Where ATI has a firm lead, from all I've heard; odd that it's not being used to bench with :disgust:)

If we say that the 7950GX2 performs around as well as 7900GT SLI (give or take a bit), then looking at this test http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4 I can't see how on earth that sort of improvement is happening, since the X1900XTX leads by 8%. A slight win for the 7950GX2 perhaps, but 50%!?

So it was 35%, sue me :D

http://www.techreport.com/reviews/2006q2/geforce-7950-gx2/index.x?pg=5

In Oblivion you don't need that much speed. It looks fine at 30 fps, and that is something the ATI card can do with better image quality, IMO. Those tests don't say where in the game they benched, and Oblivion fps changes a lot outdoors, with trees and such. Not only that, but the ATI card runs at high quality stock, whereas NV runs at quality and takes a noticeable performance hit when set to high quality.

There comes a time when more speed doesn't change the gaming experience, whereas better IQ does. This is the reason we went to AA, AF, and higher resolutions in the first place. Yeah, you could probably get 100+ fps if you ran at 1024x768 without AA or AF, but why would you want to do that when you can run at 60 fps at 1600x1200 with 4xAA and 16xAF along with HDR?
 

imported_obsidian

Senior member
May 4, 2004
438
0
0
Originally posted by: redbox
Originally posted by: Genx87
Originally posted by: Bobthelost
Originally posted by: Genx87
At the end of the day people always take performance over the pretty gimmicks. That 7950GX2 getting about 50% more frames in Oblivion at 1600x1200 than an X1900XTX will sway many people to pay 100 bucks more for it.

Over the next few months the price will drop and push the X1900XTX further down.

You got a source for that? I looked at five of the reviews on the first page and none of them benched with Oblivion. (Where ATI has a firm lead, from all I've heard; odd that it's not being used to bench with :disgust:)

If we say that the 7950GX2 performs around as well as 7900GT SLI (give or take a bit), then looking at this test http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4 I can't see how on earth that sort of improvement is happening, since the X1900XTX leads by 8%. A slight win for the 7950GX2 perhaps, but 50%!?

So it was 35%, sue me :D

http://www.techreport.com/reviews/2006q2/geforce-7950-gx2/index.x?pg=5

In Oblivion you don't need that much speed. It looks fine at 30 fps, and that is something the ATI card can do with better image quality, IMO. Those tests don't say where in the game they benched, and Oblivion fps changes a lot outdoors, with trees and such. Not only that, but the ATI card runs at high quality stock, whereas NV runs at quality and takes a noticeable performance hit when set to high quality.

There comes a time when more speed doesn't change the gaming experience, whereas better IQ does. This is the reason we went to AA, AF, and higher resolutions in the first place. Yeah, you could probably get 100+ fps if you ran at 1024x768 without AA or AF, but why would you want to do that when you can run at 60 fps at 1600x1200 with 4xAA and 16xAF along with HDR?
Well, first off, you can't compare "high quality" vs. "quality" settings across drivers. Second, image quality and speed go hand in hand; your statement makes absolutely no sense. In the open areas of Oblivion at higher settings, ALL cards will fall below 30 fps, so yes, this extra speed is getting used even now, and most likely even more in future games.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: obsidian
Originally posted by: redbox
Originally posted by: Genx87
Originally posted by: Bobthelost
Originally posted by: Genx87
At the end of the day people always take performance over the pretty gimmicks. That 7950GX2 getting about 50% more frames in Oblivion at 1600x1200 than an X1900XTX will sway many people to pay 100 bucks more for it.

Over the next few months the price will drop and push the X1900XTX further down.

You got a source for that? I looked at five of the reviews on the first page and none of them benched with Oblivion. (Where ATI has a firm lead, from all I've heard; odd that it's not being used to bench with :disgust:)

If we say that the 7950GX2 performs around as well as 7900GT SLI (give or take a bit), then looking at this test http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4 I can't see how on earth that sort of improvement is happening, since the X1900XTX leads by 8%. A slight win for the 7950GX2 perhaps, but 50%!?

So it was 35%, sue me :D

http://www.techreport.com/reviews/2006q2/geforce-7950-gx2/index.x?pg=5

In Oblivion you don't need that much speed. It looks fine at 30 fps, and that is something the ATI card can do with better image quality, IMO. Those tests don't say where in the game they benched, and Oblivion fps changes a lot outdoors, with trees and such. Not only that, but the ATI card runs at high quality stock, whereas NV runs at quality and takes a noticeable performance hit when set to high quality.

There comes a time when more speed doesn't change the gaming experience, whereas better IQ does. This is the reason we went to AA, AF, and higher resolutions in the first place. Yeah, you could probably get 100+ fps if you ran at 1024x768 without AA or AF, but why would you want to do that when you can run at 60 fps at 1600x1200 with 4xAA and 16xAF along with HDR?
Well, first off, you can't compare "high quality" vs. "quality" settings across drivers. Second, image quality and speed go hand in hand; your statement makes absolutely no sense. In the open areas of Oblivion at higher settings, ALL cards will fall below 30 fps, so yes, this extra speed is getting used even now, and most likely even more in future games.

You can compare "high quality" vs. "quality" settings based on IQ and the difference in fps. The point I was trying to make was that Oblivion doesn't need high frame rates to shine; a lot of people are used to FPS games where you do need high fps. Sure, the extra speed is being used, because Oblivion is a very hard game to run, but the way ATI and NV use that speed is different. On the ATI card you can enable settings that make the game look a ton better; NV just makes it run faster. And in a game where running faster doesn't make it look much better, I would rather spend my money where it counts: on IQ.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Very nice to see this kind of progress with video cards. This card is pretty cool: about the same performance as two 7900GTs, a little more money, more RAM, and in a single-card config, despite the SLI in the drivers. I'm all for it. Now, if NV would only put a little effort into IQ..

Originally posted by: Genx87
Originally posted by: redbox
Originally posted by: Jephph
1. SLI but no Crossfire benches on the Anandtech review? Hmmm, seems like someone's getting some dough from this situation.

2. Again in the Anandtech review, why at the end did they compare the 7950GX2 to an X1900XT? They're nowhere near the same price.

because that is the card everyone is waiting for NV to beat. Look how long ago it came out, and NV is still trying to beat it. NV needs to learn that they can't just make more horsepower; they need to do something different with that power. At the end of the day this just does the same thing my 7800GT that I bought nine months ago does... only faster. The ATI X1900XT, however, has better AF IMO and can do HDR+AA, plus it's very good with Oblivion. So I am very interested to see what NV can do against an X1900XT, price aside.

At the end of the day people always take performance over the pretty gimmicks. That 7950GX2 getting about 50% more frames in Oblivion at 1600x1200 than an X1900XTX will sway many people to pay 100 bucks more for it.

Over the next few months the price will drop and push the X1900XTX further down.

While I agree that most people take pure performance over IQ, HQ AF is not a "gimmick". In the benchmark you linked, I can see the subpar AF right off. After having used HQ AF for so long now, "regular" AF stands out easily. The X1900XT mentioned, and compared to the GX2, is at least $220 cheaper at Newegg right now.

That's without AA, and who plays without AA on a $600+ card? These sites need to run two different types of Oblivion benches: with HDR, and then again with AA, since NV can't do both at once. Some prefer AA over HDR; in fact, I'd say most do. More numbers is better for customers.

And redbox makes some very good points.

All in all, a pretty interesting card. For people who do not have an SLI board, this is better than two 7900GTs in SLI, if you can get it working correctly.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Not including the X1900XTX but including the 7800GTX 512 in the review is indeed strange. But it seems the reviewer's focus is clearly on 7900GT SLI vs. 7950GX2 throughout the review. It might be that the reviewer was considering that practically everyone buys the "OC" version of the 7900 series. All in all, the big picture hasn't changed. Still, why the 7800GTX 512 was tested is curious. The X1900GT, too. What price/performance bracket does the X1900GT fit in? Is it faster than an X1800XT?


 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Originally posted by: Ackmed
Some prefer AA over HDR. In fact, I'd say most do.

No. I'd just say no. Of course HDR+AA is most ideal, and with the X1900 series there is no reason to give up either of them. But most people prefer AA over HDR? That's an extremely bold statement. "Most people" <-- this should be used very, very cautiously, since an argument will probably run circles around that "most people".

Many reviewers seem to prefer HDR to AA at 1600x1200 and beyond. Large LCDs are cheaper than ever now, and I expect "most" AAA titles from the second half of 2006 on will have HDR. Everybody does HDR; if not in the original game, then via patches or expansions. I'd venture to guess that at 1024x768 and below, AA would be more beneficial. I'm playing games at 1920x1200, and my choice is definitely HDR over AA.

I've recently replayed a few levels of Splinter Cell: Chaos Theory without HDR, and everything looked like a painting on paper. Flat. Flat. Flat. I'll dig up some screenshots.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Reviews that make sense to me are ones that include the 7900GTX, 7900GTX SLI, 7950 GX2, 7900GT SLI, X1900XTX, and X1900XTX CrossFire.

Why? This card is aimed at the high end, so a review should cover all the high-end options for people like me.

A big downer is that there's no Quad SLI yet. Also, raw speed is moot in some games like Quake 4 and HL2, since it'll be over 60 fps average at insane resolutions anyway. That's when you need to start using SLI AA modes, and it seems ATI has the upper hand in this department.

I'm always suspicious of reviews that post atypical results, like the TechReport review that has a 7900GTX beating an X1900XTX in Oblivion (BTW, that pic doesn't seem to actually have 16xAF enabled...). Makes you wonder if they enabled every single optimization in the ForceWare drivers.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Apparently NV insisted to reviewers all around that this is a "single" card, and there is some truth to it. This card will run on motherboards with only one PCI-E x16 slot, whether that's an Intel chipset, an ATI chipset, or an NV chipset. Indeed, I thought it was funny that the "supported motherboards" list includes the A8R32-MVP. At least AT promised a more in-depth review in the near future, so we'll see. This review seems like nothing more than graphs and numbers.

P.S. I dug and found my old post with screenshots. HDR VS.

No HDR 1
HDR 1

No HDR 2
HDR 2

No HDR 3
HDR 3

Of course everybody has different tastes. Choices are good. ;)
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Genx87
Originally posted by: Frackal
I wonder if they are releasing this now because G80 won't be out for awhile

That is what I am thinking. The G80 must be a late '06 or early '07 part now. I can't imagine them releasing this in June and then releasing a G80 in the next 3-4 months.

Remember the 7800GTX 512, though?
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Noise from guru3d

Ok, that is really quiet.

It's also amazing that this card consumes less power than the X1900XTX and is still up to 50% faster in some games (mostly OpenGL titles, F.E.A.R., etc.).

This card guarantees the user can run 4x TR SSAA all the time, or more (16xAA on almost all old games), with HQ turned on. Nice.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Interesting results there Cookie Monster.

I do like the power consumption numbers and such, but what about overclocking? I loved how the 7800GT had a good amount of overclocking headroom, but with the build of this card I can't see any cooler out there enabling good overclocks.

Also, I don't know why people keep comparing it to a regular X1900XTX; it's not the same kind of card. It has two GPUs, for crying out loud, so it should beat an X1900XTX.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: josh6079
Interesting results there Cookie Monster.

I do like the power consumption numbers and such, but what about overclocking? I loved how the 7800GT had a good amount of overclocking headroom, but with the build of this card I can't see any cooler out there enabling good overclocks.

Also, I don't know why people keep comparing it to a regular X1900XTX; it's not the same kind of card. It has two GPUs, for crying out loud, so it should beat an X1900XTX.

They keep comparing it to that card because it is at the top of the stack, along with the 7900GTX. It is only natural to compare the two.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: redbox
Originally posted by: josh6079
Interesting results there Cookie Monster.

I do like the power consumption numbers and such, but what about overclocking? I loved how the 7800GT had a good amount of overclocking headroom, but with the build of this card I can't see any cooler out there enabling good overclocks.

Also, I don't know why people keep comparing it to a regular X1900XTX; it's not the same kind of card. It has two GPUs, for crying out loud, so it should beat an X1900XTX.

They keep comparing it to that card because it is at the top of the stack, along with the 7900GTX. It is only natural to compare the two.

So far I've seen them compare it to 7900GT SLI, but not to X1900XT CrossFire. The only ATI comparison I've seen against it so far is a single X1900XT.
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: josh6079
Originally posted by: redbox
Originally posted by: josh6079
Interesting results there Cookie Monster.

I do like the power consumption numbers and such, but what about overclocking? I loved how the 7800GT had a good amount of overclocking headroom, but with the build of this card I can't see any cooler out there enabling good overclocks.

Also, I don't know why people keep comparing it to a regular X1900XTX; it's not the same kind of card. It has two GPUs, for crying out loud, so it should beat an X1900XTX.

They keep comparing it to that card because it is at the top of the stack, along with the 7900GTX. It is only natural to compare the two.

So far I've seen them compare it to 7900GT SLI, but not to X1900XT CrossFire. The only ATI comparison I've seen against it so far is a single X1900XT.

They did 7900GT SLI for reference. Because that is the closest setup to it, they wanted to see how the technology in the 7950GX2 held up against regular SLI. It makes sense to me.
 

skooma

Senior member
Apr 13, 2006
635
28
91
Originally posted by: Cookie Monster
Noise from guru3d

Ok, that is really quiet.

It's also amazing that this card consumes less power than the X1900XTX and is still up to 50% faster in some games (mostly OpenGL titles, F.E.A.R., etc.).

This card guarantees the user can run 4x TR SSAA all the time, or more (16xAA on almost all old games), with HQ turned on. Nice.
Thanks man, I missed that.

Now I need to find a list of games that can run HDR+AA at the same time :p

 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: lopri
Originally posted by: Ackmed
Some prefer AA over HDR. In fact, I'd say most do.

No. I'd just say no. Of course HDR+AA is most ideal, and with the X1900 series there is no reason to give up either of them. But most people prefer AA over HDR? That's an extremely bold statement. "Most people" <-- this should be used very, very cautiously, since an argument will probably run circles around that "most people".

Many reviewers seem to prefer HDR to AA at 1600x1200 and beyond. Large LCDs are cheaper than ever now, and I expect "most" AAA titles from the second half of 2006 on will have HDR. Everybody does HDR; if not in the original game, then via patches or expansions. I'd venture to guess that at 1024x768 and below, AA would be more beneficial. I'm playing games at 1920x1200, and my choice is definitely HDR over AA.

I've recently replayed a few levels of Splinter Cell: Chaos Theory without HDR, and everything looked like a painting on paper. Flat. Flat. Flat. I'll dig up some screenshots.


There have been polls asking exactly that question, and AA won by a landslide.

Not everyone likes HDR, and most people haven't even seen it. The only game I even use it in is Oblivion; I dislike it in Far Cry, SS2, and every other game I've tried it in. Given the choice, I would easily prefer AA.

The fact remains, to reach their audience better they should have separate AA and HDR benchmarks. Some like HDR, some like AA, and NV can't do them both. More numbers is better for the customers.

Originally posted by: Cookie Monster
Noise from guru3d

Ok, that is really quiet.

Two other reviews I've read said it was louder than the 7900GTX, yet Guru3D shows it 1-2 dB quieter. So annoying that everyone isn't on the same page. :eek:
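For what it's worth, a 1-2 dB disagreement between reviews is small in absolute terms, since decibels are logarithmic; a tiny sketch (the conversion formula is standard, the dB values are just the ones quoted in the thread) shows what those gaps mean as sound power ratios:

```python
def db_to_power_ratio(delta_db):
    """Convert a difference in decibels to a ratio of sound power.

    Every 10 dB is a 10x change in sound power.
    """
    return 10 ** (delta_db / 10.0)

print(round(db_to_power_ratio(2.0), 2))   # a 2 dB gap: ~1.58x the sound power
print(round(db_to_power_ratio(6.0), 2))   # 44 dB vs 50 dB: ~3.98x
print(round(db_to_power_ratio(10.0), 1))  # 10 dB: 10.0x
```

So the review-to-review spread of 44-50 dB is a much bigger real difference than the 1-2 dB Guru3D discrepancy.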
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
I'd take a 7950GX2 over an X1900 (AA+HDR) any day. The 7950GX2 is much faster, quieter, and runs cooler.

It's more expensive, but the performance per buck stays reasonable.

to Ackmed:

Doesn't really matter if it's louder or quieter than a GTX; ALL REVIEWS AGREE IT'S QUIETER THAN AN X1900.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: beggerking
I'd take a 7950GX2 over an X1900 (AA+HDR) any day. The 7950GX2 is much faster, quieter, and runs cooler.

It's more expensive, but the performance per buck stays reasonable.

You must have forgotten to mention HQ AF, etc.

Yes, the GX2 is almost always faster, in almost every game. But as you mentioned, it costs more: the cheapest retail X1900XT is $370 at Newegg, while the cheapest GX2 is $600. That's $230 between the two, which is a pretty big difference. But that's everyone's choice.

Originally posted by: beggerking
to Ackmed:

Doesn't really matter if it's louder or quieter than a GTX; ALL REVIEWS AGREE IT'S QUIETER THAN AN X1900.

It matters for people wanting the quietest PC. Try thinking before flaming, and try some manners instead of acting 12 and yelling all the time. Guru3D is the only site I've read that says the GX2 is quieter than the GTX. All the others say it's between the GTX and the X1900s, or between 44 dB and 50 dB. I didn't say the GX2 was louder, just that I would like to see the same thing from all reviewers.
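The price gap in that post is easy to turn into a perf-per-dollar comparison; a quick sketch (prices are the ones quoted above, but the fps figures are hypothetical, purely illustrative) shows why the cheaper card can win on value even while losing on raw speed:

```python
# Prices from the post above; fps figures are made up for illustration.
cards = {
    "X1900XT": {"price": 370, "fps": 48.0},
    "7950GX2": {"price": 600, "fps": 62.0},
}

for name, c in cards.items():
    # Frames per second bought per $100 spent.
    value = c["fps"] / c["price"] * 100
    print(name, round(value, 2), "fps per $100")
# X1900XT: 12.97 fps per $100; 7950GX2: 10.33 fps per $100
```

With these (made-up) numbers the GX2 is about 29% faster but costs 62% more, so the X1900XT delivers more frames per dollar, which is the trade-off the thread keeps circling.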

 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Originally posted by: Ackmed
Originally posted by: lopri
Originally posted by: Ackmed
Some prefer AA over HDR. In fact, I'd say most do.

No. I'd just say no. Of course HDR+AA is most ideal, and with the X1900 series there is no reason to give up either of them. But most people prefer AA over HDR? That's an extremely bold statement. "Most people" <-- this should be used very, very cautiously, since an argument will probably run circles around that "most people".

Many reviewers seem to prefer HDR to AA at 1600x1200 and beyond. Large LCDs are cheaper than ever now, and I expect "most" AAA titles from the second half of 2006 on will have HDR. Everybody does HDR; if not in the original game, then via patches or expansions. I'd venture to guess that at 1024x768 and below, AA would be more beneficial. I'm playing games at 1920x1200, and my choice is definitely HDR over AA.

I've recently replayed a few levels of Splinter Cell: Chaos Theory without HDR, and everything looked like a painting on paper. Flat. Flat. Flat. I'll dig up some screenshots.


There have been polls asking exactly that question, and AA won by a landslide.

Not everyone likes HDR, and most people haven't even seen it. The only game I even use it in is Oblivion; I dislike it in Far Cry, SS2, and every other game I've tried it in. Given the choice, I would easily prefer AA.

The fact remains, to reach their audience better they should have separate AA and HDR benchmarks. Some like HDR, some like AA, and NV can't do them both. More numbers is better for the customers.

Again, "Most people"!! Arggggggggh!!!!!

1. Most people prefer AA over HDR.
2. Most people haven't even seen HDR.

Who are those "most people"? :confused: (Well, I guess this could be a possible explanation of the poll results, at least ;) )

I agree on the review methodology, though. It's been so annoying to me that they mix AA / no AA / HDR / no HDR in the charts with no apparent logic or explanation.