NVIDIA 9800GTX+ Review Thread

Page 12 - AnandTech Forums

MarcVenice

Moderator Emeritus
Apr 2, 2007
5,664
0
0
You're arguing with Chizow, it's useless, he won't budge, ever, no matter how damning the evidence you give him. The HD4850 might not be impressive from a performance-crown point of view, but it was never meant to be ATI's top performer, let alone beat Nvidia's top performer. The HD4850 is VERY impressive if you look at the performance you get for $200: the same and often better performance than the 9800GTX, which was selling for $300 not more than a week ago. It FORCED Nvidia to drop prices considerably on a video card that's more expensive to produce than the HD4850. Knowing the HD4850 sometimes even comes close to Nvidia's $400 video card, the GTX260, is even more impressive.

We will have to wait and see how impressive the HD4870 will actually be, and what it will do to prices of the GTX280 and GTX260. But only in a few months can we really determine a winner, when Nvidia comes out with 55nm GTX260s and GTX280s. But then again, ATI will have their HD4870X2 out by then. Exciting times; when financial results get released we will know for sure how well ATI is faring against Nvidia. The 8800GT launch was called impressive, but the HD4850 launch beats it fair and square.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ChronoReverse
From Anandtech. Ignoring Crysis since you said it's a mess.

CoD4
4850: 66FPS
88GT: 52FPS
26%

ET: QW
4850: 75FPS
88GT: 48FPS
56%

AC
4850: 46FPS
88GT: 37FPS
24%

Oblivion
4850: 35FPS
88GT: 31FPS
13%

The Witcher
4850: 34FPS
88GT: 29FPS
17%

Bioshock
4850: 87FPS
88GT: 57FPS
52%
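The percentage deltas in that list are easy to sanity-check. A quick sketch (the FPS figures are the ones quoted above; a couple of the thread's percentages differ by a point, presumably because they were computed from unrounded FPS):

```python
# HD4850 vs. 8800GT FPS figures as quoted from the AnandTech review.
benchmarks = {
    "CoD4":        (66, 52),
    "ET: QW":      (75, 48),
    "AC":          (46, 37),
    "Oblivion":    (35, 31),
    "The Witcher": (34, 29),
    "Bioshock":    (87, 57),
}

for game, (hd4850, gt88) in benchmarks.items():
    delta = (hd4850 - gt88) / gt88 * 100
    print(f"{game:12s} 4850: {hd4850} FPS  88GT: {gt88} FPS  -> +{delta:.0f}%")
```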
And in those titles you might see a difference in Bioshock and ET: QW which have always favored ATI parts, but certainly not any more than the difference in price at $120-140 vs. $170-200. The rest of the titles show clearly that % difference isn't really a good indication by itself. With only 5-7 FPS difference it'd be difficult to tell the difference side by side. But don't take my word for it, there's enough in this thread and others to verify that. :)

Add in better AA, DX10.1 (which allows better AA), and a price that forces the previous equivalent card to drop $100, and I'd say the 4850 is a very important card.
AA is better for sure, but it's still borderline playable at the resolutions where it has a significant advantage over G80/G92 parts. The 4870 takes that advantage and adds to it, clearly making it a viable resolution and a significant upgrade. The price drop on the 9800GTX wasn't a big surprise actually; they were already selling for $220-250 AR before the HD4850 was launched. I'd say the 4850 was only important insofar as it made ATI competitive again in the mid-range, but otherwise it wasn't anything special. The 4870, on the other hand, clearly distances itself from the mid-range and is on the verge of high-end for not much more.

As for the GTS, again, it's the pressure that has yielded the price drops to the point they are today (supposed price anyway since Newegg still lists them for $200).
No, those prices have been there for literally months. They were $200 or less even before the 9800GTX was released. You think ATI's 3850 and 3870 pressured them into that? I don't think so.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
ChronoReverse... are those "88GT" figures for an 8800GT? why are you comparing it to a 4850? shouldn't you compare the 4850 to the similarly priced 9800GTX?
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: MarcVenice
You're arguing with Chizow, it's useless, he won't budge, ever, no matter how damning the evidence you give him.
Really? I've already cleaned up numerous instances of revisionist history regarding pricing, even if you don't agree with the rest of my commentary. :)

The 8800GT launch was called impressive, but the HD4850 launch beats it fair and square.
How can you even compare the 8800GT to the 4850? Honestly, look at what you are comparing. The 8800GT brought high-end performance, which at least doubled previous-generation (pre-G80) performance at half the price. The 4850 does what? Gives you 20-25% more performance than the 8800GT for 150% of the price? I mean seriously... perspective, folks....
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
So basically you're discounting it because it's ATI. I mean, why be excited about any of the Nvidia cards when none of them provide much benefit by that criterion?

Even now, at Newegg, the 8800GTS prices are above $200. A quick look at Zipzoomfly confirms that.
And you're claiming that the prices have been $200 or less for MONTHS BEFORE the 4850?

I don't see any 8800GT 512MB for less than $150. 4850 is indeed $200. Considering the performance delta, the price is right in line and that doesn't even account for the other benefits yet.

And did I say anything about the 3850? I said the 4850 pressured the 9800GTX to be $200 because it is so good.


How can you even compare the 8800GT to the 4850? Honestly, look at what you are comparing. The 8800GT brought high-end performance, which at least doubled previous-generation (pre-G80) performance at half the price. The 4850 does what? Gives you 20-25% more performance than the 8800GT for 150% of the price? I mean seriously... perspective, folks....
We're getting 25% more performance, plus bonuses, for 33% more money. It's not perfect, but you do get higher absolute performance at close to the same price/performance value. Plus you get some extra stuff.


Originally posted by: taltamir
ChronoReverse... are those "88GT" figures for an 8800GT? why are you comparing it to a 4850? shouldn't you compare the 4850 to the similarly priced 9800GTX?
Because only the 8800GT has enough performance/price to beat out the 4850. The 8800GTS is way overpriced right now and the 9800GTX and GTX+ are equal or slightly below the 4850.

Even then, the 8800GT isn't much better and you get, in absolute terms, much higher performance for a number of games (especially UE3 stuff).
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ChronoReverse
So basically you're discounting it because it's ATI. I mean, why be excited about any of the Nvidia cards when none of them provide much benefit?
Uh no. I discounted ATI because, up until this release, they hadn't been competitive for 2 product cycles. I can quite honestly say that if the 4870 had been released sooner I'd have one.

Even now, at Newegg, the 8800GTS prices are above $200. A quick look at Zipzoomfly confirms that.
And you're claiming that the prices have been $200 or less for MONTHS BEFORE the 4850?
You're not looking hard enough. I counted at least 3 others at $170 or so. And yes that OC'd MSI with The Witcher has been ~$200 for months, even before the 9800GTX launched since it was compared directly to it numerous times on these forums for prospective buyers. This is why it helps to read the forums frequently, and not just around big launches. ;)

I don't see any 8800GT 512MB for less than $150. 4850 is indeed $200. Considering the performance delta, the price is right in line and that doesn't even account for the other benefits yet.
Asus 8800GT. We can do this all day really. And I'm not even going to point out the ones from different sites that are significantly cheaper.

And did I say anything about the 3850? I said the 4850 pressured the 9800GTX to be $200 because it is so good.
It certainly changed MSRP, but the market had already taken care of real pricing long before that. Hell I have month old MicroCenter flyers showing $240-250 AR....

Oh I see. You just can't do math. 150% of $150 is not $200.

We're getting 25% performance plus bonuses for 33% more.
Well I guess it just depends what numbers you're willing to believe. ;) I think I've already demonstrated your sense of pricing needs adjustment.
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
If you're talking MIR there are 4850's at $170 and BB even sold some 4850's at $150.

Come on, let's be practical here.


It certainly changed MSRP, but the market had already taken care of real pricing long before that. Hell I have month old MicroCenter flyers showing $240-250 AR....
So it's over $200 now. What happened to "below $200 for months"?


I think I've already demonstrated your sense of pricing needs adjustment.
Right... So $140 8800GT versus the $170 4850. Oh wow, the difference is less than 25% now and not even 33% anymore.
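For anyone following the price math in this exchange, the ratios work out as follows (a quick sketch; the street prices are the ones quoted in the thread, so treat them as the posters' figures, not verified data):

```python
# Price premiums being argued here: 8800GT vs. HD4850.
# MSRP-ish figures quoted earlier in the thread:
gt8800_price, hd4850_price = 150, 200
premium = (hd4850_price - gt8800_price) / gt8800_price * 100
print(f"$150 88GT vs $200 4850: +{premium:.0f}%")  # ~33% more money

# The after-rebate figures from this post: $140 8800GT vs $170 4850.
premium_ar = (170 - 140) / 140 * 100
print(f"$140 88GT vs $170 4850: +{premium_ar:.1f}%")  # under 25%
```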
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ChronoReverse
If you're talking MIR there are 4850's at $170 and BB even sold some 4850's at $150.

Come on, let's be practical here.
Of course we're talking MIR, that's the bottom-line pricing. You won't be able to get $150 price mistakes any time soon and maybe not even $30 MIRs on the 4850s but I think you'll still find G92 scales in price/performance well relative to 4850, and long before there was any "pressure" to do so.

So it's over $200 now. What happened to "below $200 for months"?
I never once said the 9800GTX was under $200 for months, I was referring to the G92 GTS which has been sub-$200 for months and offered performance within 2-5% of the 9800GTX.

Right... So $140 8800GT versus the $170 4850. Oh wow, the difference is less than 25% now and not even 33% anymore.
The linked example you said didn't exist was actually $130; I could link even cheaper ones, but I think I've gotten my point across. :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Chizow/ChronoReverse: If you guys think you aren't getting anywhere with each other, it's because you're not. Call it quits already and move on.
 

pmv

Lifer
May 30, 2008
15,142
10,033
136
Speaking as someone who only bought an 8800GT a month ago, these both sound like excellent cards. Seems like AMD is at least still in the game. I seriously hope they can recover from their current financial plight, however. Lord, what a disaster it would be if Intel and Nvidia were left unchallenged. Though I suppose it's conceivable they'd end up competing directly with each other (but in that case, would Nvidia stand a chance?).

I also can't help dreaming of how great it would be if the OS market were as competitive as the graphics card market.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Well, I posted this earlier but I think it got covered in the pissing contest here. Anyway, I'd like some input if you all don't mind.

How come no one is looking at SLI 9800GTX+ yet? Afraid of encroaching on the GTX 280 for $200 less?

Regarding the whole PhysX/CUDA issue, is the GTX 280 expected to have simply more computing power by virtue of its 240 stream processors (versus the 9800GTX's 128), or is there more to it than that? And how does that compare to the 800 stream processors on the AMD cards?
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
In absolute terms (read not too useful measurements), the GTX280 is just short of 1TFLOPS in single precision compared to the 1TFLOPS and 1.2TFLOPS of the 4850 and 4870 respectively.

For double precision, the GTX280 gets around 90GFLOPS while the 4850 and 4870 get 200GFLOPS and 240GFLOPS respectively.
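Those figures line up with the usual peak-throughput arithmetic. A sketch using the commonly cited unit counts and clocks (the specific clocks are my assumption, not from this thread):

```python
# Back-of-the-envelope peak GFLOPS for GTX280 vs. HD4850/HD4870.
def gflops(shaders, clock_mhz, flops_per_clock):
    return shaders * clock_mhz * flops_per_clock / 1000.0

gtx280_sp = gflops(240, 1296, 3)  # MAD + MUL per SP per clock
hd4850_sp = gflops(800, 625, 2)   # MAD per ALU lane per clock
hd4870_sp = gflops(800, 750, 2)

# RV770 runs double precision at 1/5 the single-precision rate.
hd4850_dp = hd4850_sp / 5
hd4870_dp = hd4870_sp / 5

print(f"GTX280 SP: {gtx280_sp:.0f} GFLOPS")  # just short of 1 TFLOPS
print(f"HD4850 SP: {hd4850_sp:.0f} GFLOPS")
print(f"HD4870 SP: {hd4870_sp:.0f} GFLOPS")
print(f"HD4850 DP: {hd4850_dp:.0f} GFLOPS")
print(f"HD4870 DP: {hd4870_dp:.0f} GFLOPS")
```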
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: ChronoReverse
In absolute terms (read not too useful measurements), the GTX280 is just short of 1TFLOPS in single precision compared to the 1TFLOPS and 1.2TFLOPS of the 4850 and 4870 respectively.

For double precision, the GTX280 gets around 90GFLOPS while the 4850 and 4870 get 200GFLOPS and 240GFLOPS respectively.

Do these numbers assume all 800 shaders of the RV770 are identical?
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
Worst-case scenario is 1/5 of that, of course. But I'd hope they'd get at least 37.5% efficiency (so that the 4870 would match the GTX280).
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: ChronoReverse
Worst-case scenario is 1/5 of that, of course. But I'd hope they'd get at least 37.5% efficiency (so that the 4870 would match the GTX280).

I was kind of hoping they might do a rework of their shaders. Seems like such a humongous waste of transistors. But, if this is what they can do for now, I guess they could have done a lot worse. 4xxx series looks to be very nicely done.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: chizow
Originally posted by: bryanW1995
re the 4850 vs 9800gtx, wtf are you talking about? they showed that across all games/resolutions tested, the 4850 was 4% faster without AA and 20% faster with AA. a 9% core and shader bump isn't going to fix that.

That's the point, though: previous benches had the 9800 losing worse to the 4850, particularly with AA. The 9800GTX+ should actually mimic the results found in the various previews floating around: faster with no AA, usually faster with 4xAA, slower with 8xAA.

how are they going to improve that much with only a 9% core and shader oc and NO memory oc? :confused:
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: keysplayr2003
Originally posted by: ChronoReverse
Worst-case scenario is 1/5 of that, of course. But I'd hope they'd get at least 37.5% efficiency (so that the 4870 would match the GTX280).

I was kind of hoping they might do a rework of their shaders. Seems like such a humongous waste of transistors. But, if this is what they can do for now, I guess they could have done a lot worse. 4xxx series looks to be very nicely done.

There is a tradeoff with every design, and the serial-scalar approach used by NV is not without its disadvantages either. On one side, you can claim that 240 shaders beating 800 must mean they are more efficient, but those numbers alone are taken out of context without considering other factors such as transistor count, die size, cost, flexibility, etc. And when you consider all these other factors, it becomes evident that had ATI followed the same transistor budget and produced a huge monolithic GPU the size of G200, they most likely would have had a faster product.

IMO, one of the stumbling blocks of Nvidia's architecture is that it required DP calculation to be implemented in separate ALUs, and that's one reason why the G200 is no match for the RV770 in that area. In addition, let's not forget that the G200 lacks DX10.1 features, and while the usefulness of that feature set is questionable, supporting it will nonetheless require additional transistors on top of the 1.4 billion already present on the G200. So that raises the question: how long can Nvidia continue to push the die-size envelope before it becomes impractical?
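The scalar-vs-VLIW tradeoff being debated here can be made concrete with a toy utilization model. The peak figures are the commonly cited ones; the utilization sweep is purely illustrative:

```python
# Toy model: RV770's 800 ALU lanes are grouped into 160 five-wide VLIW
# units, so its peak assumes all 5 slots are filled every clock. A
# scalar design keeps each SP busy independently, at a transistor cost.
rv770_peak = 1200.0  # HD4870 single-precision GFLOPS (800 * 750 MHz * 2)
gtx280_peak = 933.0  # GTX280 peak, for comparison

for slots_filled in range(1, 6):
    effective = rv770_peak * slots_filled / 5
    note = " (clears GTX280 peak)" if effective > gtx280_peak else ""
    print(f"{slots_filled}/5 slots filled: {effective:.0f} GFLOPS{note}")
```

The worst case (one slot filled per unit) is the 1/5 figure ChronoReverse mentioned earlier; on paper the 4870 only clears the GTX280's peak once roughly four of five slots stay busy.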
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: keysplayr2003
Originally posted by: ChronoReverse
Worst-case scenario is 1/5 of that, of course. But I'd hope they'd get at least 37.5% efficiency (so that the 4870 would match the GTX280).

I was kind of hoping they might do a rework of their shaders. Seems like such a humongous waste of transistors. But, if this is what they can do for now, I guess they could have done a lot worse. 4xxx series looks to be very nicely done.

It might seem like that, but efficiency on R600-based GPUs can be very good in certain circumstances, and the stream processors in R600/R700 are extremely efficient per unit of die size/transistor count. AMD has packed an extra 480 SPs + 24 TMUs + revamped ROPs into a die of ~260mm^2, with only ~290M more transistors than RV670.

So even if the shaders are inefficient (which they only are under certain circumstances) in how they perform per number of SPs, they are still incredibly efficient in terms of the amount of die space/transistors used to achieve a certain performance level. nVidia's SPs appear to take up much more space, as seen in the huge transistor count of GT200 compared to G80/G92.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I was kind of hoping they might do a rework of their shaders.
According to the preliminary reviews some big changes were made to the shader core, but we'll have to wait for English reviews to verify them.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: MarcVenice
You're arguing with Chizow, it's useless, he won't budge, ever, no matter how damning the evidence you give him. The HD4850 might not be impressive from a performance-crown point of view, but it was never meant to be ATI's top performer, let alone beat Nvidia's top performer. The HD4850 is VERY impressive if you look at the performance you get for $200: the same and often better performance than the 9800GTX, which was selling for $300 not more than a week ago. It FORCED Nvidia to drop prices considerably on a video card that's more expensive to produce than the HD4850. Knowing the HD4850 sometimes even comes close to Nvidia's $400 video card, the GTX260, is even more impressive.

We will have to wait and see how impressive the HD4870 will actually be, and what it will do to prices of the GTX280 and GTX260. But only in a few months can we really determine a winner, when Nvidia comes out with 55nm GTX260s and GTX280s. But then again, ATI will have their HD4870X2 out by then. Exciting times; when financial results get released we will know for sure how well ATI is faring against Nvidia. The 8800GT launch was called impressive, but the HD4850 launch beats it fair and square.

I'm stoked about 4850 as much as the next gpu fan, but I think that it's an overstatement to say that it beats the 8800gt launch fair and square. 8800gt, iirc, was called "the only card that matters" by anand during his review. it completely obliterated everything else on the market other than 8800gtx. It beat amd's incredibly successful 3870 (compared to the 2900xt that it replaced/rendered useless) by a good 15-20%, and nothing else even came close. It was close enough to 8800gtx that people were selling their 8800gtx's to get 2 8800gt's in xfire, or just one 8800gt and a boatload of cash. It was just plain stupid.

Amd has kicked some serious butt here, don't get me wrong, but the nvidia of june 2008 isn't asleep at the wheel like amd was from nov 06 to nov 07. nvidia is going to lose this round, but they're going down swingin' at least and we will all benefit because of it.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: keysplayr2003
Chizow/ChronoReverse: If you guys think you aren't getting anywhere with each other, it's because you're not. Call it quits already and move on.

dude, come on! This was almost as good as that humongous thread that chizow and Azn had going a few months ago. I remember writing that Azn got "CHIZOWED" or something like that, he had a 300 line post in that thread!

by the way, chronoreverse, you're very lucky that the ref jumped in and called it a draw. I might not always agree with chizow, but you'd better be on your "A" game if you wanna have a discussion with him! :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Extelleron
Originally posted by: keysplayr2003
Originally posted by: ChronoReverse
Worst-case scenario is 1/5 of that, of course. But I'd hope they'd get at least 37.5% efficiency (so that the 4870 would match the GTX280).

I was kind of hoping they might do a rework of their shaders. Seems like such a humongous waste of transistors. But, if this is what they can do for now, I guess they could have done a lot worse. 4xxx series looks to be very nicely done.

It might seem like that, but efficiency on R600-based GPUs can be very good in certain circumstances, and the stream processors in R600/R700 are extremely efficient per unit of die size/transistor count. AMD has packed an extra 480 SPs + 24 TMUs + revamped ROPs into a die of ~260mm^2, with only ~290M more transistors than RV670.

So even if the shaders are inefficient (which they only are under certain circumstances) in how they perform per number of SPs, they are still incredibly efficient in terms of the amount of die space/transistors used to achieve a certain performance level. nVidia's SPs appear to take up much more space, as seen in the huge transistor count of GT200 compared to G80/G92.

No. They are extremely inefficient in most circumstances, if not all. That is why they needed so many of them. 800 shaders should be a juggernaut core. Un----touch----able. Hell, even half that (400) would be unstoppable. But that is not the case here. AMD did the best they could in a given amount of time, and with the time they had, their only option was to place an ocean of these same shaders onto a core. Die size and transistor count are only relevant from a cost perspective. I'm talking pure performance here, Extelleron. I just want to make that clear. Performance per shader. And when all is said and done, it doesn't really matter as long as they get the job done.
Yes, NV shaders are different from AMD/ATI's. Vastly different. It takes 1600 AMD/ATI shaders (2x 4850s) to stay with a 240-shader GTX280. Vastly different.

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: munky
Originally posted by: keysplayr2003
Originally posted by: ChronoReverse
Worst-case scenario is 1/5 of that, of course. But I'd hope they'd get at least 37.5% efficiency (so that the 4870 would match the GTX280).

I was kind of hoping they might do a rework of their shaders. Seems like such a humongous waste of transistors. But, if this is what they can do for now, I guess they could have done a lot worse. 4xxx series looks to be very nicely done.

There is a tradeoff with every design, and the serial-scalar approach used by NV is not without its disadvantages either. On one side, you can claim that 240 shaders beating 800 must mean they are more efficient, but those numbers alone are taken out of context without considering other factors such as transistor count, die size, cost, flexibility, etc. And when you consider all these other factors, it becomes evident that had ATI followed the same transistor budget and produced a huge monolithic GPU the size of G200, they most likely would have had a faster product.

IMO, one of the stumbling blocks of Nvidia's architecture is that it required DP calculation to be implemented in separate ALUs, and that's one reason why the G200 is no match for the RV770 in that area. In addition, let's not forget that the G200 lacks DX10.1 features, and while the usefulness of that feature set is questionable, supporting it will nonetheless require additional transistors on top of the 1.4 billion already present on the G200. So that raises the question: how long can Nvidia continue to push the die-size envelope before it becomes impractical?

This isn't about a die size that's out of control, it's about AMD simply having a better architecture this time around. Their die is significantly less than half the size of Nvidia's, and it offers, what, 80-90% of the performance of GT200? This is crazy, guys. I think what happened here is that AMD has been in panic mode for 18 months, and those ATI freaks pulled some freaky GPU mojo out of their keisters. If they are able to get the sandwich card to work as well as the 3870X2 did, they're going to destroy the GTX 280. If any of the rumored optimizations were actually realized (a distinct possibility, though it's difficult to determine the likelihood of any one particular optimization at this point), they will put too much distance between themselves and GT200 for Nvidia to remain profitable on that card.

How's this scenario: say the 4870X2 is on average 25% faster than the GTX 280. Nvidia isn't going to get another 25% out of the GTX 280 with a 55nm shrink and Ultra-type cherry picking. They will still be able to tout the "single GPU" horn, but 25% is a BIG difference, especially at the high end. $499 for the 4870X2 vs... $399 for the GTX 280? This could get real ugly real fast. Fortunately, Nvidia enjoyed record profits for the past 18 months, and they'll learn from this mistake in a big way, imho. Also, as Ben mentioned yesterday, they're going to take some market share from Intel with Tesla, recouping a good % of their R&D and manufacturing costs.