NVIDIA 9800GTX+ Review Thread


Kuzi

Senior member
Sep 16, 2007
572
0
0
Originally posted by: BenSkywalker
the higher performance card can make the scene look smoother if the scene is bordering on 23-25 fps, but at 50-70 fps, it's a scientific fact that the human eye can't distinguish!

Not sure where you went to school for science, but you should get a lawyer and sue; you certainly didn't get your money's worth. Get ahold of fpstest and a remotely decent monitor (one that can push 150Hz or higher) and see for yourself. Even people with extremely poor vision have no problem whatsoever spotting the difference between 50 and 70 fps. People with good vision can spot variances well into the 100s.

...

:thumbsup:
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: BenSkywalker
the higher performance card can make the scene look smoother if the scene is bordering on 23-25 fps, but at 50-70 fps, it's a scientific fact that the human eye can't distinguish!

Not sure where you went to school for science, but you should get a lawyer and sue; you certainly didn't get your money's worth. Get ahold of fpstest and a remotely decent monitor (one that can push 150Hz or higher) and see for yourself. Even people with extremely poor vision have no problem whatsoever spotting the difference between 50 and 70 fps. People with good vision can spot variances well into the 100s.

JPB - You bolded "let's make up some numbers, I put in brackets" or something like that. Those numbers were pulled from the most die-hard ATi fan site that I know of. 8800GTs in SLI is what I was comparing to the 4850; in terms of money versus performance that combination utterly smokes the current pricing on the 4850.

I noticed that you failed to address the "$300 xfire" possibility of 4850. how does that compare to a $260 sli solution that also requires a much-less-common sli motherboard and the hassle of $40+ mir's, which, by the way, are usually limited to one per household.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: bryanW1995
Originally posted by: BenSkywalker
the higher performance card can make the scene look smoother if the scene is bordering on 23-25 fps, but at 50-70 fps, it's a scientific fact that the human eye can't distinguish!

Not sure where you went to school for science, but you should get a lawyer and sue; you certainly didn't get your money's worth. Get ahold of fpstest and a remotely decent monitor (one that can push 150Hz or higher) and see for yourself. Even people with extremely poor vision have no problem whatsoever spotting the difference between 50 and 70 fps. People with good vision can spot variances well into the 100s.

JPB - You bolded "let's make up some numbers, I put in brackets" or something like that. Those numbers were pulled from the most die-hard ATi fan site that I know of. 8800GTs in SLI is what I was comparing to the 4850; in terms of money versus performance that combination utterly smokes the current pricing on the 4850.

I noticed that you failed to address the "$300 xfire" possibility of 4850. how does that compare to a $260 sli solution that also requires a much-less-common sli motherboard and the hassle of $40+ mir's, which, by the way, are usually limited to one per household.

Is it still a possibility? I don't think that BB price mistake will be matched for some time. If you're going to use best-price-ever quotes as your test samples, you'd have to go with mixed-vendor 8800GTs going for $110-120 AR, so $220-230 in SLI compared to $300+tax. Still, 8800GT will provide as good or better price/performance no matter how you slice it. A point I made early on that seems obvious, but I guess not. ;)

Board limitations are a valid point that I agree with and have mentioned as well, but in reality CF isn't as effortless as you'd think. Only certain boards in the sea of Intel chipsets support 16x/16x CF (X38 and X48, I believe). Other solutions are 8x/8x or, even worse, 16x/4x.

Anyways, I'm glad there was some user feedback on the 4850 over the weekend. Pretty much sums up my points earlier, that if you already owned a G80/G92 class GPU the 4850 isn't much of an upgrade at all. It'd only really make sense if you did have a CF board and wanted to double-up and go CF.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
only a 30% increase in performance is almost NEVER worth it. I wouldn't go from 8800gt to gtx 280 for that matter unless you're really suffering in your favorite games. The whole point of having high end video cards is to improve your gaming experience, not just to jump on the best deal that comes around. The issue that I have with many current posters is that they are still arguing about WHAT the best deal is.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: bryanW1995
only a 30% increase in performance is almost NEVER worth it. I wouldn't go from 8800gt to gtx 280 for that matter unless you're really suffering in your favorite games. The whole point of having high end video cards is to improve your gaming experience, not just to jump on the best deal that comes around. The issue that I have with many current posters is that they are still arguing about WHAT the best deal is.

Well sure, it depends on resolution. I did a comparison of my own before I bought GTX 280 ("Performance Differences @ 1920") and there are huge differences. 8800GTX is still faster than 8800GT, so those differences are even greater. In terms of actual gameplay it's night and day; my old average/medians are now my rare lows, with FPS mostly capped at 60. This is all with higher game settings and/or 4xAA enabled.

I think Ben's point and mine as well was that people and reviewers wanted to make the 4850 seem as if it introduced earth-shattering levels of performance, especially with CF when in reality there have been similar options for similar price/performance for months.
 

Hunt3rj2

Member
Jun 23, 2008
84
0
0
Going off of AnandTech's 4850 review and the many others on the internet, and going off of what I've been reading for the 9800 GTX+, I can say these things about the graphics cards.

The ATI 48xx series seems to be ATI's trump card against Nvidia, and all the previous glitches of the R600 have been fixed. However, the fan should be made into a dual-slot one that exhausts the air out the back of the case. I do believe the Accelero S2 HS/F is the aftermarket cooler replacement for the ATI cards, but it too blows the heat into the case. I personally have a case with two 120 mm fans and one 80 mm fan to circulate air and therefore have no problem dealing with this, but some more cramped cases will definitely need a better cooling solution.

Also, the 4850 is shown as having better AA performance scaling compared to the 9800 GTX+, and I think anyone who knows they can turn the AA up to the max without taking a big performance drop will do so, so the 9800 loses a rather big point there. The 4850 will also have its two bigger brothers, the 4870 and the 4870X2. I have heard rumors of the 4870X2 having 1GB of shared memory, which is a very big thing, because this would reduce or eliminate the microstuttering some people experience in texture-heavy games like Age of Conan. The 4870 will supposedly be coming out in both 512MB and 1GB versions, but I am not too sure about that. Also, although the 4870 1GB in crossfire has more total memory compared to the 4870X2, the shared memory would make the X2 better due to the GPUs being able to communicate over a direct link (supposedly?) instead of a crossfire bridge. I am a bit fuzzy on the details, and some of us are, but once ATI gets the rest of their 4xxx cards out the door the dust and smoke will clear.

The 9800 GTX+ is your standard 9800 GTX, overclocked some and with a plus drawn on the back. It's a proven design and more familiar to me, as I have pretty much only used Nvidia for gaming PCs; my first and only gaming PC was built in the time of the 7 series, when Nvidia was stomping on ATI's arse. I can't really say much about this graphics card, because we all know about it and there are no outstanding features other than very good bang for your buck. However, ATI shares that, and the two are about even, if anything with ATI being better at this point due to its better AA scaling. The one area where ATI does lose is fillrate, although it shines overall; the exact fillrate numbers are something I need to go back and look through the reviews to confirm (yes, I have crap memory). So I do believe the ATI does beat Nvidia's response to the 4850.


Moving on, Asus has announced a dual-slot 1GB model that blows the air out of the case, which should make those people happy. Anyway, the moral of the story for this wall of text is that I believe ATI is winning, although it might not be by much. ATI does need to hurry up and get the rest of the models out the door though.

EDIT: Actually the moral of the story for this wall of text is that I can haz cheezburger?

EDIT2: Also, unless you have something more than a 60 Hz monitor, you won't see the difference between 70 and 200 FPS, though you might feel the controls to be more responsive.
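For reference, here is a minimal sketch (in Python) of the frame-time arithmetic behind this recurring refresh-rate argument. The frame rates and the 60 Hz figure are just the numbers quoted in the thread; nothing below is measured data.

def frame_time_ms(fps):
    # Milliseconds spent on one frame at a given frame rate.
    return 1000.0 / fps

for fps in (50, 60, 70, 200):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.2f} ms per frame")

# A 60 Hz monitor refreshes every ~16.7 ms, so frames delivered faster than
# that cannot all be shown as distinct images; the extra frames mostly just
# reduce the age of the frame you do see (the "more responsive controls"
# effect mentioned above).
print(f"60 Hz refresh interval: {frame_time_ms(60):.2f} ms")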
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
I noticed that you failed to address the "$300 xfire" possibility of 4850.

Link where I can buy it and I will give that absolute full merit. One B&M having a pricing error in the past doesn't make that a viable option now; you can't buy it for that price now. I am focusing on possible upgrades available in the market, and 4850 in crossfire is considerably more expensive than $300 at this moment.

how does that compare to a $260 sli solution that also requires a much-less-common sli motherboard and the hassle of $40+ mir's, which, by the way, are usually limited to one per household.

We could talk about PSU requirements for any of these setups too; of course you need to have the proper supporting hardware. For the people that have SLI mobos, you must include the cost of a new mobo in order for them to run CF as well - that goes both ways.

I think Ben's point and mine as well was that people and reviewers wanted to make the 4850 seem as if it introduced earth-shattering levels of performance, especially with CF when in reality there have been similar options for similar price/performance for months.

Exactly. 9600GT SLI setup versus 9800GTX - 9600GT SLI is running about $240 atm. The 4850, despite the claims by some of the hardcore loyalists, doesn't introduce a new price/performance standard by any means whatsoever. What it does is make ATi competitive with nVidia in the low-mid and high-mid segments of the market, where they were previously getting butchered. That isn't a bad thing by any means, but people should keep in perspective what we are really talking about.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: BenSkywalker
Exactly. 9600GT SLI setup versus 9800GTX - 9600GT SLI is running about $240 atm. The 4850, despite the claims by some of the hardcore loyalists, doesn't introduce a new price/performance standard by any means whatsoever. What it does is make ATi competitive with nVidia in the low-mid and high-mid segments of the market, where they were previously getting butchered. That isn't a bad thing by any means, but people should keep in perspective what we are really talking about.

Nicely said.

It's just that people hail the HD4850 as the best thing since sliced bread. I'm glad they are competitive again, which is good for us, but like Ben said, people need to keep it in perspective.

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: BenSkywalker
I noticed that you failed to address the "$300 xfire" possibility of 4850.

Link where I can buy it and I will give that absolute full merit. One B&M having a pricing error in the past doesn't make that a viable option now; you can't buy it for that price now. I am focusing on possible upgrades available in the market, and 4850 in crossfire is considerably more expensive than $300 at this moment.

how does that compare to a $260 sli solution that also requires a much-less-common sli motherboard and the hassle of $40+ mir's, which, by the way, are usually limited to one per household.

We could talk about PSU requirements for any of these setups too; of course you need to have the proper supporting hardware. For the people that have SLI mobos, you must include the cost of a new mobo in order for them to run CF as well - that goes both ways.

I think Ben's point and mine as well was that people and reviewers wanted to make the 4850 seem as if it introduced earth-shattering levels of performance, especially with CF when in reality there have been similar options for similar price/performance for months.

Exactly. 9600GT SLI setup versus 9800GTX - 9600GT SLI is running about $240 atm. The 4850, despite the claims by some of the hardcore loyalists, doesn't introduce a new price/performance standard by any means whatsoever. What it does is make ATi competitive with nVidia in the low-mid and high-mid segments of the market, where they were previously getting butchered. That isn't a bad thing by any means, but people should keep in perspective what we are really talking about.

if amd was "being competitive again" then nvidia wouldn't have had to paper launch a potential future product the day 4850 was released. AMD is kicking the crap out of nvidia's last gen technology with their next gen technology. I have zero doubt that nvidia could have implemented this strategy more effectively than amd, but they instead chose to chase the high end. In the past that has ALWAYS been the correct strategy, so I don't blame them for going that route this time. Unfortunately for nvidia, gtx 280 is very hot/expensive and doesn't give performance commensurate with their targeted price point. 4850, otoh, is cool/low power (and only "hot" b/c of the crappy single slot fan) and in fact gives a LOT more performance per watt. On top of that, I don't think that I'll get too many arguments when I say that yields are 4850 with its mature 55nm process are probably a LOT higher than g92b and gt200, giving amd even more ability to control the $150-$250 market to their advantage.

I think that nvidia is going to be fine. They'll drop prices to be competitive on a price/performance basis, but their margins are going to suffer a lot until they can bring gt200 down to 55nm and the midrange segment.
 

rstove02

Senior member
Apr 19, 2004
508
0
71
I for one am pretty interested in this card as a cost-effective replacement for the 8800GTS in my system; I just hope it does not turn into a paper release or run into supply problems.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
if amd was "being competitive again" then nvidia wouldn't have had to paper launch a potential future product the day 4850 was released.

Announce, you mean? ATi announced the 4850, 4870 and 4870x2 quite a while ago; where can I buy the 4870x2?

AMD is kicking the crap out of nvidia's last gen technology with their next gen technology.

Please provide links to back up your claims here. I did for mine. I showed the numbers, and they aren't close to matching up with what you are saying, not in the vague realm.

Unfortunately for nvidia, gtx 280 is very hot/expensive and doesn't give performance commensurate with their targeted price point.

GTX280 had far more to do with Intel than AMD. AMD has exited the high end graphics market as of this point. The GTX's real competition is Xeon clusters; from that perspective it is dirt cheap and insanely fast.

4850, otoh, is cool/low power (and only "hot" b/c of the crappy single slot fan) and in fact gives a LOT more performance per watt. On top of that, I don't think that I'll get too many arguments when I say that yields on the 4850, with its mature 55nm process, are probably a LOT higher than on g92b and gt200, giving amd even more ability to control the $150-$250 market to their advantage.

Couple of things here: can you explain how the 4850 is built on the 'mature' 55nm process while the G92b is built on the immature 55nm process?

The 4850 isn't very low power and it certainly isn't close to cool, but let's assume that you are correct in your claims on that end- how exactly is that relevant in the market it is aimed at?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Announce, you mean? ATi announced the 4850, 4870 and 4870x2 quite a while ago; where can I buy the 4870x2?

are you serious? anand claimed that he got an email at 1:35 IN THE FREAKIN' MORNING FROM NVIDIA ON THE DAY THE NDA LIFTED ON 4850. Do you think he was lying? You say "announce", I say "paper launch", they're both the same thing in this case. Nvidia and their fanboys are freaking out about their lack of competitiveness in the midrange, so much so that they dropped the msrp $100 on a gpu that was the highest performing single gpu 2 weeks ago and is now solidly midrange. I have nothing at all against nvidia and have in fact been pushing the 8800gt on anybody who would listen for months now, but 4850 has already done as much for ati as 8800gt did for nvidia. On top of that, 4850 is a NEW architecture instead of a 3rd retread of g80 with a 2nd dieshrink. g80 was great but now both camps have architectures that are just going to leave it further and further behind over time.

Please provide links to back up your claims here. I did for mine. I showed the numbers, and they aren't close to matching up with what you are saying, not in the vague realm.

again, are you serious? did you read the AT 4850 release article? tech report? computer base? etc etc etc. If you don't know how to go back to page 1 of the 4xxx series thread, please pm me and I'll send you a link. Besides, how much more proof than the infamous 1:35 email do you need? Would nvidia have "announced" a 55nm 9800gtx+ refresh at that time of the morning and overnighted 9800gtx+ cards to all the major sites if 4850 was ~ 8800gt??? I'm going to use "reason" and maybe some "common sense" here and say no.

GTX280 had far more to do with Intel than AMD. AMD has exited the high end graphics market as of this point. The GTX's real competition is Xeon clusters; from that perspective it is dirt cheap and insanely fast.

gtx 280 had far more to do with amd than intel. when nvidia first started design work on gt 200, who had the fastest card? intel? no? right, ati. As I mentioned previously, in every other instance of which I'm aware, the fastest card has produced a fantastic "halo effect" for the other cards in that company's lineup. Nvidia made the wrong move for the right reasons, basically. If they could have gotten about 20% more performance out of gtx 280 then we wouldn't be complaining about it, or if they would have priced it at, say, $500 at launch we'd be ok. Their problem is that their development costs and production costs are through the roof, and amd's development and production costs are very low.

I do believe that going forward cuda is going to be a great entrance for nvidia into the server and, especially, supercomputer market, but intel hasn't been asleep at the wheel. gtx 280 is primarily a gpu that can also do great work in other areas, larrabee will probably be better at high-end computing than gt 200 but not as good at graphics. This is definitely a good step for nvidia as they try to muscle into that market, however.

Couple of things here: can you explain how the 4850 is built on the 'mature' 55nm process while the G92b is built on the immature 55nm process?

sure, amd started building on 55nm nearly a year ago and has built a LOT of gpus on that process. nvidia started on 65nm at the same time and is only now starting to try their hand at 55nm. So far, I know of exactly 12 gpus that they've built on it that actually work (sample cards sent to reviewers at 1:35 am one day last week). Frankly, I think that the only reason for them to even mess with the learning curve on 55nm is to switch gt 200 to it ASAP. IMHO they need to instead focus all of their energy on getting gt 200 to 45nm by the end of the year, maybe with a midrange refresh that's 256 bit/ gddr5 like 4870.


 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
I think Ben's point and mine as well was that people and reviewers wanted to make the 4850 seem as if it introduced earth-shattering levels of performance,
As far as 8xMSAA goes it's looking like this is indeed the case. According to ComputerBase, for example, an 8800 Ultra is only scoring about 75% (IIRC) of the performance of a 4850 when using 8xAA.

Announce, you mean? ATi announced the 4850, 4870 and 4870x2 quite a while ago; where can I buy the 4870x2?
I don't think they've announced anything except for the 4850. The rest of it is leaked, given the NDA doesn't expire until at least the 25th of June.
 

rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: bryanW1995
As I mentioned previously, in every other instance of which I'm aware, the fastest card has produced a fantastic "halo effect" for the other cards in that company's lineup.

(will try to keep this short as I have few postings and thus little cred in this newsgroup)

There is another psychological effect which explains what goes on in the video card market much better than the "halo effect".

The "fundamental attribution error" or "correspondence bias" works alot better. This is where people judge very quickly on meeting a new person whether a persons is instrinsically "good" or "bad" and ignore the situation/surroundings in which they met. In this case the person is actually a video card company but the effect is the same.

Look at all the publicity given to the very top of the video card market relative to the publicity given to the vast bulk of the actual market. It's like 90% of the publicity from less than 10% of the market, leaving 10% of the publicity for the remaining 90%. Pretty much everyone is going to make a judgement based on what is happening in the top 10%, as that is all they will see in the initial decision-making period.

How is this different from the "halo effect"? Well, it isn't actually the high end card that matters; it is what is seen in the first few minutes of encountering video cards that makes the decision. This initial judgment sticks like glue; it appears hardwired deep inside the brain.

Job interviews are notorious for this occurring; psychologists have measured that most interviewers subconsciously decide within 10 seconds or so of the interview starting. The remaining half hour of the interview is merely for the interviewer to find some way to justify their initial snap judgement.

The video card market appears to work the same.

My preferred explanation for this is the caveman one, where they had to make snap decisions on meeting strangers - "friend or foe" - as roughly 30% of cavemen ended up murdered by other humans.

So funny that this primitive mental mechanism is used to sell millions and millions of cards.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
@Bryan:

An RV770 is a second refresh of R600, just like G92b is a second refresh of G80. Architecturally I suspect nothing much has changed in the RV770 compared to RV670 to even deem it a new architecture (the very foundations staying the same), except a rather huge increase in functional units (+250% in ALUs AND TMUs). After reading up on how shader AA works, I can see why it takes such a low hit going from 4xAA to 8xAA. (All fingers point to the 800 ALUs within RV770's shader core.)

OK, so how did this email get so infamous? nVIDIA had two choices: don't respond to the HD4850 at all and sit idle while your competition starts retaking some of its market share, or respond to it NOW by aggressively pricing the current cards (factoring in production and other relevant costs) while introducing another card placed at $229 to meet the threat. 9800GTXs have dropped to $199. The GTX+ will probably come in volume in July as stated (whereas some nVIDIA partners are already shipping GTX+ versions using 65nm cores). I see absolutely nothing wrong with this. There's no reason for nVIDIA to sit idle and do nothing about this. This is competition. And I think it's quite safe to say an HD4850 is around 9800GTX level, give or take 5~10% depending on titles.

Ben is right. nVIDIA is primarily focusing on GPGPU, and the G200 is the stepping stone for nVIDIA to start targeting intel in several market segments. You could already tell that they'd acknowledged AMD/ATi giving up the high end race, and that nVIDIA's main focus is now intel, as soon as Jen-Hsun said the words "opening a can of whoopass" on intel. Why do you think AMD has chosen to go with their current design cycle (i.e. mid range GPU first, multi GPU for high end and so on)? You've answered it in your own post.

Amd's development and production costs are very low.

Whereas nVIDIA can afford to spend more money on R&D. They have the resources compared to AMD, a company that's in a financial crisis. More importantly they have the money, and that's not surprising either since they've enjoyed record-breaking quarters for a very long time now. CUDA isn't just for the server or supercomputer market either. It's also targeting the desktop, where CPUs will soon be outclassed by the onslaught of GPGPUs at multiple apps, such as encoding. Software apps are taking advantage of this as well. Hell, they are even making a SuperPI based on CUDA.

On a side note, neither IHV manufactures their own GPUs; TSMC does (they simply place an order). As for nVIDIA, they've had the 55nm G92 for some time now, seeing as it is already available (in A2 silicon, I believe) and ready to be manufactured in volume for a launch around July. These transitions take months at the least.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Cookie Monster
An RV770 is a second refresh of R600, just like G92b is a second refresh of G80. Architecturally I suspect nothing much has changed in the RV770 compared to RV670 to even deem it a new architecture (the very foundations staying the same), except a rather huge increase in functional units (+250% in ALUs AND TMUs).

I don't see how that statement can be valid, unless you also count the g200 as a refresh of the g80, neither of which is true. Fundamentally the rv770 is much more than an r6xx with an increased unit count. Based on the available info I can gather, there have been numerous changes to the individual units, such as:
-reorganized TU's no longer having separate point-sampling fetch units
-memory controller is no longer a ring-bus
-the ROP's do 4 z-samples per clock, double that of r6xx
-reorganized cache hierarchy and distribution
-TU array possibly distributed and scheduled differently among the shader SIMD clusters
-numerous other changes of which I don't know the details

The point is we're not looking at something like the r300 -> r420 transition. The fact alone that they've managed to cram so many units into such a surprisingly small die should tell you that this isn't just an r600 with more of the same.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Let's keep the RV770 chatter in the right thread, guys.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I know that tsmc's 55nm production is better today than it was when amd first started using it, but tsmc isn't as familiar with the g92b arch as they are with rv670/rv770. you can't have it both ways, if rv770 is just a refresh of rv670 then tsmc will definitely be more efficient in making it, have better yields, etc. Also, remember when amd said that they were on first silicon with rv670? That process was a winner from the word go.

btw, I just realized why nvidia's folding team is called team whoopass. :laugh: Nvidia's marketing dept makes it hard to like them, but their ceo makes it hard NOT to like them!
 

JPB

Diamond Member
Jul 4, 2005
4,064
89
91
XFX 9800GTX Black Edition beats 9800GTX+

3 months before Nvidia

Geforce 9800GTX+ will launch at 738MHz GPU, 1836MHz Shaders and 2200MHz memory whereas XFX and EVGA already have the card with higher clocks than these.

This raises a lot of questions as Geforce 9800 GTX 760M Black Edition runs at 760MHz core, 1900MHz Shaders and 2280MHz memory.

If you look at the specs, XFX is already faster than Nvidia's soon-to-be-launched 55nm version of the 9800GTX+ card. Overall, we believe that the performance difference won't be that significant, but XFX 9800 GTX is available today and costs around €240 on the German etail market.

The black edition looks like a winner in this re-launch game and EVGA's e-GeForce 9800 GTX SSC with 770MHz core clock surely wins against new 9800 GTX+.

Geforce 9800GTX+ won't be available for another three weeks and XFX 9800 GTX Black edition or EVGA 9800GTX Super Super Clock are available today, at a very attractive price. We tested one XFX black card back in April, and we were pleased with it. You can read about it here.

This changes a few views, doesn't it?
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You say "announce", I say "paper launch", they're both the same thing in this case.

Paper launch would be a bit different; if they wanted to push a paper launch, all of the review sites would have had their boards well in advance. What they did was spring the announcement to coincide with the retail release of ATi's parts; that way their marketing dollars have been committed to a $199 price point.

so much so that they dropped the msrp $100 on a gpu that was the highest performing single gpu 2 weeks ago and is now solidly midrange

It was made midrange by nV's parts before the 4850 hit.

Would nvidia have "announced" a 55nm 9800gtx+ refresh at that time of the morning and overnighted 9800gtx+ cards to all the major sites if 4850 was ~ 8800gt??? I'm going to use "reason" and maybe some "common sense" here and say no.

The 4850 costs 50% more than the 8800GT - the 8800GT in SLI is 30% more than the 4850.

8800GT<4850<<8800GT SLI

It's the price/performance ratio that I was talking about, and in that aspect there is no doubt whatsoever that the 4850 is very far removed from making any sort of large impact.
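For what it's worth, here is a minimal sketch (in Python) of the price/performance arithmetic being argued in this exchange. The dollar figures are the ones quoted in the thread (roughly $130 for an 8800GT, $199 for the 4850, ~$260 for 8800GT SLI); the relative performance numbers are placeholder assumptions for illustration only, not benchmark results.

# Prices roughly as quoted in the thread; "perf" values are made-up
# placeholders normalized to a single 8800GT, NOT benchmark data.
cards = {
    "8800GT":     {"price": 130, "perf": 1.00},
    "HD 4850":    {"price": 199, "perf": 1.15},  # assumed
    "8800GT SLI": {"price": 260, "perf": 1.70},  # assumed
}

for name, c in cards.items():
    print(f"{name:11s} ${c['price']:>3}  "
          f"${c['price'] / c['perf']:.0f} per unit of performance")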

gtx 280 had far more to do with amd than intel.

Look up information on Tesla. The GTX280 utilized in that aspect is obliterating Xeon clusters - they added multiple DP functional units to the 280, and there is no doubt that GPGPU was a much larger relative focus this generation than besting ATi's top part.

Their problem is that their development costs and production costs are through the roof, and amd's development and production costs are very low.

Nvidia will profit significantly more from the GTX2xx line of GPUs than ATi will from the 48xx series. To appreciate that, you must realize nV would likely still profit significantly from it even if they didn't sell a single graphics card. You aren't grasping properly what this part is aimed at. The fact that it can render graphics is a checkbox feature.

I do believe that going forward cuda is going to be a great entrance for nvidia into the server and, especially, supercomputer market, but intel hasn't been asleep at the wheel. gtx 280 is primarily a gpu that can also do great work in other areas, larrabee will probably be better at high-end computing than gt 200 but not as good at graphics. This is definitely a good step for nvidia as they try to muscle into that market, however.

Look up Tesla; Intel is getting creamed right now. Not talking four generations down the line or anything like that - right now Intel is getting absolutely crushed. For DP operations, where Tesla is at its relatively weakest, it is roughly an order of magnitude faster than Xeon; in SP it is more than a couple of orders of magnitude. In graphics terms (but GP code), Intel is pushing 1 FPS and nV is pushing 100 FPS in certain cases. GTX280 was very, very clearly a shot at Intel, not ATi.

sure, amd started building on 55nm nearly a year ago and has built a LOT of gpus on that process. nvidia started on 65nm at the same time and is only now starting to try their hand at 55nm.

TSMC makes all the chips for ATi and nVidia. They are even made in the same plant. Can you explain how the majorly overhauled new 4850 is much easier to produce than the G92b, a die shrink of a chip that's already been made by the million?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: bryanW1995
I know that tsmc's 55nm production is better today than it was when amd first started using it, but tsmc isn't as familiar with the g92b arch as they are with rv670/rv770. you can't have it both ways, if rv770 is just a refresh of rv670 then tsmc will definitely be more efficient in making it, have better yields, etc. Also, remember when amd said that they were on first silicon with rv670? That process was a winner from the word go.

btw, I just realized why nvidia's folding team is called team whoopass. :laugh: Nvidia's marketing dept makes it hard to like them, but their ceo makes it hard NOT to like them!

Bryan, isn't going from 65nm to 55nm just an optical shrink? Like 90nm to 80nm? Meaning no reworking of the transistors is necessary? In turn, meaning that TSMC is more than familiar with G92b, because it is the same core as G92, which TSMC has been fabbing for Nvidia for quite a while now, and they are just shrinking it?

TSMC 65 to 55nm linear shrink.

"TSMC?s 55nm process technology is a 90% linear-shrink process from 65nm including I/O and analog circuits. The process delivers reportedly significant die cost savings from 65nm, while offering the same speed and 10 to 20% lower power consumption. Because the 55nm process is a direct shrink, chip designers can leverage existing libraries and port their 65nm designs with minimal risk and effort. The 55nm logic family includes general purpose (GP) and consumer (GC) platforms. Initial production of the 55GP begins this quarter, followed later in the year by 55GC."

So, about your comment I bolded above: did you just throw a guess out there on that one? Because G92b can't be anything other than a knee-jerk reaction? I have to admit, when you guys get your minds set on something, all kinds of cool stuff comes out. :)

G92b has been a topic for discussion for a while now. The timing of its release makes sense. They had it ready to go, and were just waiting for the proper time to announce it.
It's always good to have an answer to your competitors' products, both in performance and price point. TSMC was probably waiting for Nvidia to give the word to ramp up production, hence the July availability reports. Why make something if you don't need it? While the current 9800GTX is fairly close in performance to the 4850, the 9800GTX+ should be a bit quicker. The price point of the 4850 is $199.00, but if they implement a better cooler on it, that price may rise a few bucks. All in all, these cards will line up well against each other.
I'd like to see a quick fix for the fan speeds on the 4850, similar to the 8800GT fan speed fix. Either that, or a better cooler.

Sorry for the long postage.

keys
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: BenSkywalker
Paper launch would be a bit different; if they wanted to push a paper launch, all of the review sites would have had their boards well in advance. What they did was spring the announcement to coincide with the retail release of ATi's parts; that way their marketing dollars have been committed to a $199 price point.
The only thing worse than a paper launch is a last-minute scramble paper launch, and that's exactly what this is. You don't send out emails to review sites in the middle of the night and call it an "announcement." Let's not forget that several review sites did get the press edition of the 9800gtx+, and they did publish the benchmarks, but the product is not available for sale anywhere. "Announcement" is just a sugar-coated term for this paper launch.

It was made midrange by nV's parts before the 4850 hit.
Too bad the price didn't reflect its midrange status until after the 4850 launch

The 4850 costs 50% more than the 8800GT - the 8800GT in SLI is 30% more than the 4850.

8800GT<4850<<8800GT SLI

It's the price/performance ratio that I was talking about, and in that aspect there is no doubt whatsoever that the 4850 is very far removed from making any sort of large impact.
You forgot to mention that SLI requires a craptastic NV mobo, doesn't always work, and has other issues such as broken vsync, no multi-monitor support, micro-stuttering, etc. If SLI is the magic answer when NV can't compete with a single gpu, then where does that leave the gtx280 which costs 2x as much as 8800gt SLI and is still slower?

Look up information on Tesla. The GTX280 utilized in that aspect is obliterating Xeon clusters - they added multiple DP functional units to the 280, and there is no doubt that GPGPU was a much larger relative focus this generation than besting ATi's top part.
Too bad the gtx280 still gets creamed by the 4850 when it comes to DP computation.