Which was more disappointing? Parhelia or GeForce FX??


kgraeme

Diamond Member
Sep 5, 2000
FX.

Nobody really expected the Parhelia to be great. People expected NV30 to rock the house. The Parhelia just proved people right, while the FX was the major letdown.
 

merlocka

Platinum Member
Nov 24, 1999
Wow.

I am quite surprised that so many people are so disappointed with the GeForce FX.

1) It's equal to a product which had the most dramatic performance improvement in a generation since Voodoo2 SLI vs Voodoo Graphics. I'm impressed that nVidia was even able to match the R300.

2) Although it's seemingly underperforming relative to its paper specs on the core, it's obviously inferior to the R300 in memory bandwidth. Did anyone really expect NV30 to outperform the R300 in high-res AF+AA? If so, your expectations were artificially high. In non-bandwidth-limited situations, it's often outperforming the R300, and I believe the drivers have some headroom.

3) I'm still in shock that so many people are taking the cooling seriously. The jokes and photochops are hysterical, but do that many people really think that the majority of AIB makers will want to sell cards which are 70 dB? We know they have the expertise to create performance cooling which is not as loud as a jet engine (a la Abit OTES). I find the whole ordeal to be silly.



 

SamuraiSludge

Member
Jun 13, 2001
At least the Parhelia is extremely useful for some people (those who want to run 3 monitors off 1 card and get great image quality, not game on it).

That's for sure.

My father uses the Parhelia purely for professional-class 2D imaging work on two CRTs and through composite TV-out. Now, I admit I was pretty skeptical when he told me he bought the Parhelia. But after seeing and using the Parhelia in my dad's PC, I'm pretty impressed with it. The drivers are quite stable, and the 2D imaging features it offers are slick. Sure, the 3D rendering is less than mediocre and the cost is steep, but for what my father does with it (2D and only 2D) and from what I've seen of it first-hand, the Parhelia was definitely not as big a disappointment IMHO as everyone made it out to be.

So, as of right now, I'd have to say that the GeForce FX is a bigger disappointment than the Parhelia.
 

Rankor

Golden Member
Jul 10, 2000
To tell you the truth, I wasn't expecting much from either.

The GFFX was only ever going to be marginally better than, or merely equivalent to, the 3D performance of the 9700 Pro.

The P512 wasn't going to win any kudos in the 3D performance category against what was currently available at the time.

I can't wait for the prices on these cards to drop, though. I look forward to upgrading my GF4 Ti 4400 to either a GFFX or a 9700 Pro for 3D performance, and my G450 to a P512 for its 2D IQ.
 

Nemesis77

Diamond Member
Jun 21, 2001
Originally posted by: merlocka
Wow.

I am quite surprised that so many people are so disappointed with the GeForce FX.

1) It's equal to a product which had the most dramatic performance improvement in a generation since Voodoo2 SLI vs Voodoo Graphics. I'm impressed that nVidia was even able to match the R300.

2) Although it's seemingly underperforming relative to its paper specs on the core, it's obviously inferior to the R300 in memory bandwidth. Did anyone really expect NV30 to outperform the R300 in high-res AF+AA? If so, your expectations were artificially high. In non-bandwidth-limited situations, it's often outperforming the R300, and I believe the drivers have some headroom.

Regarding these two points, the FX is a disappointment because:

1. In order to be competitive with the 9700, it needs to be overclocked right out of the factory. It needs a huge and noisy cooling solution, and even then, it does NOT crush the 9700.

2. Can you say HYPE? NV kept hyping the card so much it was ridiculous! They claimed it was "our greatest contribution to this industry!" They talked about how it was the best thing since sliced bread. What did we get instead? A card that is loud as hell and does not offer substantially new features or performance compared to its main rival. In fact, its main rival offers something the FX does not: more or less silent operation!

3) I'm still in shock that so many people are taking the cooling seriously. The jokes and photochops are hysterical, but do that many people really think that the majority of AIB makers will want to sell cards which are 70 dB? We know they have the expertise to create performance cooling which is not as loud as a jet engine (a la Abit OTES). I find the whole ordeal to be silly.

I certainly hope that you are correct. But the fact is that RIGHT NOW, FX is so loud that it is REALLY turning potential buyers elsewhere! It's just too loud, period.
 

First

Lifer
Jun 3, 2002
But the fact is that RIGHT NOW, FX is so loud that it is REALLY turning potential buyers elsewhere! It's just too loud, period.

No, you just haven't heard an actual (i.e. shipping) GeForceFX card before. :)
 

Nemesis77

Diamond Member
Jun 21, 2001
Originally posted by: Ilmater
First off, I'm not just picking on you. There are a lot of people that, like you, are saying this same thing. However, let's look at what was hyped by each company.

nVidia: Same nVidia rhetoric - State of the art graphics, DX9+ compatibility. nVidia never said that they would blow ATI out of the water, and I challenge everyone to prove otherwise. It IS the fastest card out currently. Period. Though not by much.

Nope. The hype was more or less like "This is the greatest card there has ever been! It's our greatest offer to this industry we have EVER made! It rules!" The reality of the situation was a bit different, to say the least. Yes, it has decent performance. But in order to have that performance, it needs a ridiculous cooling solution that is loud as hell. The 9700 does NOT need anything like that to be competitive. To be honest, the FX is only competitive when it's overclocked, which also happens to be its default state.

Matrox: said it would revolutionize gaming and be extremely fast, neither of which it did.

Surround gaming IS pretty damn cool; there are people who absolutely love it. And it was pretty fast. I have seen benchmarks where it's competitive with the 9700. Though the performance is not the best there is, it still performs well.

It hyped its bump-mapping and triple-head gaming, and nobody cared.

There are people who care about surround gaming, and there are even more people who care about triple-head.

By the time you turn on all the features that Matrox hyped, you're getting 20fps.

The benchmarks I have seen suggest that the card is fast enough to run games in surround-mode using FAA.

Yes, that Matrox card is useful to some people, but only a very select few. At least the gaming market will open up to nVidia's card.

Really? What does the FX offer that the 9700 does not? Besides a loud-as-hell cooling solution ;)? In some cases it gives you marginally better performance. But in order to have that, you need to suffer the noise the FX makes. I for one am NOT prepared to do that!

No gamer is going to buy the Parhelia. And just to make it clear, the Parhelia WAS intended for the gaming market. It's clearly the bigger flop.

The Parhelia was intended for A LOT more markets than just the gaming market! Gaming was just one of the things you could do with it. It was meant for video editing, Photoshop... things like that. And if you wanted to play some Quake 3 after photoshopping, you could use the Parhelia for that too.
 

Nemesis77

Diamond Member
Jun 21, 2001
Originally posted by: Evan Lieb
But the fact is that RIGHT NOW, FX is so loud that it is REALLY turning potential buyers elsewhere! It's just too loud, period.

No, you just haven't heard an actual (i.e. shipping) GeForceFX card before. :)

Assuming the 9700 is about as loud as a GF2 GTS, then the FX is too loud. True, I haven't heard a shipping FX yet, and I hope that it's quieter. But _right now_ the FX is too loud. If NV can cut down the noise, then I might consider it. But right now, I wouldn't consider it.
 

merlocka

Platinum Member
Nov 24, 1999
Originally posted by: Nemesis77
Originally posted by: Evan Lieb
But the fact is that RIGHT NOW, FX is so loud that it is REALLY turning potential buyers elsewhere! It's just too loud, period.

No, you just haven't heard an actual (i.e. shipping) GeForceFX card before. :)

Assuming the 9700 is about as loud as a GF2 GTS, then the FX is too loud. True, I haven't heard a shipping FX yet, and I hope that it's quieter. But _right now_ the FX is too loud. If NV can cut down the noise, then I might consider it. But right now, I wouldn't consider it.

LOL. You miss the point. You haven't seen or heard a "production quality" GeForceFX, so you don't know if its cooling system produces 7, 70, or 170 dBA of noise. In the past most AIBs have followed nVidia's reference PCB designs very closely, but many (if not all) have strayed from nVidia's reference cooling design.
 

Nemesis77

Diamond Member
Jun 21, 2001
Originally posted by: merlocka
LOL. You miss the point. You haven't seen or heard a "production quality" GeForceFX, so you don't know if its cooling system produces 7, 70, or 170 dBA of noise.

I will put my entire post here, with highlights where needed.

Assuming the 9700 is about as loud as a GF2 GTS, then the FX is too loud. True, I haven't heard a shipping FX yet, and I hope that it's quieter. But _right now_ the FX is too loud. If NV can cut down the noise, then I might consider it. But right now, I wouldn't consider it.

I was talking about how RIGHT NOW FX is way too loud. My post is pretty clear on that. And unless they drop the noise, I will not consider it.
 

Dulanic

Diamond Member
Oct 27, 2000
Originally posted by: Nemesis77
Originally posted by: merlocka
LOL. You miss the point. You haven't seen or heard a "production quality" GeForceFX, so you don't know if its cooling system produces 7, 70, or 170 dBA of noise.

I will put my entire post here, with highlights where needed.

Assuming the 9700 is about as loud as a GF2 GTS, then the FX is too loud. True, I haven't heard a shipping FX yet, and I hope that it's quieter. But _right now_ the FX is too loud. If NV can cut down the noise, then I might consider it. But right now, I wouldn't consider it.

I was talking about how RIGHT NOW FX is way too loud. My post is pretty clear on that. And unless they drop the noise, I will not consider it.

I think you're missing Evan's message there... Evan is pretty good at giving hints but not actually saying things :)

Anyways, what does the cooling solution matter on a card that is never going to be sold to the public... Nvidia doesn't sell final products. You keep saying RIGHT NOW it's too loud... RIGHT NOW you can't buy one... NEVER will you be able to buy the card that got reviewed at retail.... Yes, the card Nvidia made is too loud... but man... people are WAY overreacting. I couldn't care less how loud a card I CAN'T BUY will sound.... I care how the final products are. When a retail card hits and it's that loud... then complain.

I'm not defending Nvidia because I'm some zealot... if I were to buy a card right now it would likely be a 9700 Pro because its AF and FSAA (possibly, this is being looked into) look better. But I'm also not gonna go crazy over a cooling solution that won't even be on retail cards (except from lazy manufacturers). Maybe with some new drivers I may end up considering the FX... I'm waiting for final reviews of retail cards. I don't even doubt there will be quieter retail cards, so that's one of the last things on my mind.
 

Nemesis77

Diamond Member
Jun 21, 2001
Originally posted by: Dulanic
I think you're missing Evan's message there... Evan is pretty good at giving hints but not actually saying things :)

No, I'm not. He's suggesting that the shipping FX will be considerably quieter. I acknowledged that and said that I sure hope it will be.

You keep saying RIGHT NOW it's too loud... RIGHT NOW you can't buy one

I say RIGHT NOW because it's the only thing I have right now. Meaning: there have been no reviews of retail FXs, so I have to base my opinions on what's out there. And RIGHT NOW it's too loud, period.

Yes, the card Nvidia made is too loud... but man... people are WAY overreacting. I couldn't care less how loud a card I CAN'T BUY will sound

That is assuming that the final board will be considerably quieter. If it is, I could buy it (if I were upgrading). If it isn't, tough luck. I'm basing my opinions on the facts that I have right now: the tested FX cards were insanely loud. I can't go around assuming that the noise won't matter when the product ships, since we don't yet know for sure what kind of cooling solution the card will have then. The only thing I have heard of is the 7 dB cooling on one of the FXs. If it's true and viable, then it's great. But we haven't seen that product yet, just the insanely loud versions.

But I'm also not gonna go crazy over a cooling solution that won't even be on retail cards (except from lazy manufacturers).

Hopefully the retail cards do fix the cooling. If they don't, tough! But we have no reviews of those products, so we can't really comment on them. Right now they're vaporware. All we have are the boards that were tested at AnandTech and elsewhere. I hope the final product is CONSIDERABLY quieter, but until I see it with my own eyes, I have to base my opinions on the reviews that we have seen.
 

chizow

Diamond Member
Jun 26, 2001
There's no point in arguing; they're grasping at straws here. I've noticed the focus has shifted to pricing, with people thinking it is going to cost $700 US based on the Euro conversion rate. I guess they think pricing is global and have no idea what economies of scale are.
:rolleyes:
There's also the issue of $399 MSRP being too expensive, but the R9700pro still carries the same MSRP.

Regarding the noise non-issue, maybe this will help clear up any "doubts":

Originally posted by: Evan Lieb
Why is everyone so surprised? Anand and the rest of the Internet reviewed a non-shipping card from NVIDIA; i.e. not a shipping retail product.

Since I'll be doing individual video card reviews soon, I called up MSI, PNY and Leadtek to see what their FX schedule would be. All of them said FX availability next month, though not in massive quantities...however what was interesting was that they all purposefully claimed that their cards would be much quieter than the competition, especially the FX cards used in reviews over the Internet.

Though 7db is a little hard to believe, I'm sure most FX cards aren't going to be any louder than 9700 Pro cards.



 

Pariah

Elite Member
Apr 16, 2000
Regarding the noise non-issue, maybe this will help clear up any "doubts":

Originally posted by: Evan Lieb
Why is everyone so surprised? Anand and the rest of the Internet reviewed a non-shipping card from NVIDIA; i.e. not a shipping retail product.

Since I'll be doing individual video card reviews soon, I called up MSI, PNY and Leadtek to see what their FX schedule would be. All of them said FX availability next month, though not in massive quantities...however what was interesting was that they all purposefully claimed that their cards would be much quieter than the competition, especially the FX cards used in reviews over the Internet.

Though 7db is a little hard to believe, I'm sure most FX cards aren't going to be any louder than 9700 Pro cards.

Sorry, but I would absolutely not take his word as gospel. A couple of insider tips he's given in the last week on this subject:

Yup, looks like early summer 2003 if it's .15-micron or sometime after that if it's .13-micron. However, if it's just a core clock speed increase (as we suspected), then I can't see it defeating the GeForceFX, though it'll be in the ballpark probably.

Considering the R300 is already "in the ballpark," the R350 would literally have to be slower than the R300 to be considered slower than the FX. Anyone want to predict that the R350 will be slower than the R300? Who here thinks the R350, even with just a clock boost (which hasn't been confirmed), won't be able to beat the FX in the majority of benchmarks now?

The Inq gets it dead wrong here. I have great respect for Magee, but this piece is just FUD. Wait for the reviews (coming very soon), and then you can see whether you believe NVIDIA is going the way of 3DFX.

OK, I've seen the reviews, and I'm thinking 3dfx more than ever. How did the reviews disprove anything? The FX reminds me of the V5 5500 release more than any other card I can think of: huge, hot, late, and only on par performance-wise with the competition when it was expected to be so much more. NVidia is not going out of business anytime soon, but if the FX release doesn't bring back memories of the V5 release, then you haven't been following the industry.
 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Pariah
Sorry, but I would absolutely not take his word as gospel.

But instead, you're willing to take the previews of a pre-release reference board as "gospel"? nVidia has said throughout that OEMs had free rein over cooling design, yet despite reports to the contrary in a press release by a well-known AIB maker AND from someone who works in the industry, you still choose to base your opinion on a reference board? That makes sense.
:rolleyes:


Considering the R300 is already "in the ballpark," the R350 would literally have to be slower than the R300 to be considered slower than the FX. Anyone want to predict that the R350 will be slower than the R300? Who here thinks the R350, even with just a clock boost (which hasn't been confirmed), won't be able to beat the FX in the majority of benchmarks now?

I'm not sure what you're expecting out of R350, but the latest roadmap does indicate it will be nothing more than a clock/memory frequency boost. No die revisions and no enhancements to the architecture were mentioned. The first .13 micron chip is expected to be the RV350. If you want to see how a clock/memory speed increase will improve performance, you need look no further than Tyan's Tachyon, which provides about a 17% boost over a stock Radeon. How well the R350 will scale or overclock is anyone's guess, but considering it will still be on a .15 micron process, headroom may be limited.

OK, I've seen the reviews, and I'm thinking 3dfx more than ever. How did the reviews disprove anything? The FX reminds me of the V5 5500 release more than any other card I can think of: huge, hot, late, and only on par performance-wise with the competition when it was expected to be so much more. NVidia is not going out of business anytime soon, but if the FX release doesn't bring back memories of the V5 release, then you haven't been following the industry.

No, you've seen the PREviews. How do the initial previews of the 9700pro and 9500pro stack up to their release performance? Quite a performance delta, IIRC. That piece in the Inquirer was FUD, and much of that rhetoric has spilled over here. Yes, it's huge, hot, and late, but that's a by-product of innovation made possible by being the industry leader. Considering nVidia was late to market and essentially missed a product cycle, they must be tickled pink to see they maintained their industry market share. It's easy to blame nVidia for the late launch of the FX, but if you had been following the industry you would remember that NV30's specs were finalized last year. Complications in shifting to a .13 micron process at TSMC held up the FX's production, but I guess acknowledging that depends on what side of the industry you follow.

Chiz
 

Pariah

Elite Member
Apr 16, 2000
But instead, you're willing to take the previews of a pre-release reference board as "gospel"? nVidia has said throughout that OEMs had free rein over cooling design, yet despite reports to the contrary in a press release by a well-known AIB maker AND from someone who works in the industry, you still choose to base your opinion on a reference board?

I believe very few prerelease boasts from companies, as they are inaccurate far more often than they are accurate. Nvidia didn't choose the horrible cooling solution they are using to impress anyone or for their own amusement, but because that's what they felt was necessary to run their hardware properly. Sure, OEMs are free to use any cooling solution they want to; if they want to put a dinky chipset passive heatsink on the card they can, but it isn't going to run at all. We'll see what other companies come up with, but to expect major improvements is wishful thinking unless Nvidia has really lost its edge.

I'm not sure what you're expecting out of R350, but the latest roadmap does indicate it will be nothing more than a clock/memory frequency boost.

I'm expecting nothing from the R350 except that it will undeniably return the performance crown to ATi. Not by a large margin, but across the board. Evan seems to think that the R350 will only achieve ballpark performance relative to the FX, which is kind of silly when you can see the R300 is already beating the FX in a number of benchmarks. It isn't going to take much of a clock boost to push it past the FX in the majority of benchmarks. ATi has had 6 months since the R300 release; I would hope they have achieved more than a 10 MHz speed bump of the GPU by now.

How well the R350 will scale or overclock is anyone's guess, but considering it will still be on a .15 micron process, headroom may be limited.

The R300 has proved to be a pretty decent overclocker, so there is definitely headroom left. Mine hits 380 MHz on standard cooling, and it may go higher than that; I just haven't tried.

No, you've seen the PREviews. How do the initial previews of the 9700pro and 9500pro stack up to their release performance?

Please post links to back up this claim. There isn't going to be any huge performance increase from the tested boards to the shipping boards in 3 weeks. These claims are made with every product release, no matter the company, and they never turn out to be true. Maybe 6 months from now, but not in 3 weeks.


 

chizow

Diamond Member
Jun 26, 2001
Originally posted by: Pariah
I believe very few prerelease boasts from companies, as they are inaccurate far more often than they are accurate. Nvidia didn't choose the horrible cooling solution they are using to impress anyone or for their own amusement, but because that's what they felt was necessary to run their hardware properly. Sure, OEMs are free to use any cooling solution they want to; if they want to put a dinky chipset passive heatsink on the card they can, but it isn't going to run at all. We'll see what other companies come up with, but to expect major improvements is wishful thinking unless Nvidia has really lost its edge.

Now you're just waffling. Is it a prerelease or has it been released? How can you question the truthfulness of their claims when every single assertion you've made about the FX is based upon assumptions? You think a company would risk their reputation and not be able to deliver? I'm not sure what you do for a living, but in the real world, people and firms are held accountable for their assertions and "boasts." Of course, no one is held accountable in this safe haven called the internet, where the only stakes are a troll or a flame. You should really look into what's going on over at Zalman; they've been producing quiet passive coolers for some time that hardly resemble a "dinky chipset fan".

Evan seems to think that the R350 will only achieve ballpark performance relative to the FX, which is kind of silly when you can see the R300 is already beating the FX in a number of benchmarks.

He said that prior to the benchmarks being released, based on his knowledge at the time. Maybe he knew something we don't. Maybe he still does. :shrug:

It isn't going to take much of a clock boost to push it past the FX in the majority of benchmarks. ATi has had 6 months since the R300 release; I would hope they have achieved more than a 10 MHz speed bump of the GPU by now.
The R300 has proved to be a pretty decent overclocker, so there is definitely headroom left. Mine hits 380 MHz on standard cooling, and it may go higher than that; I just haven't tried.

The fact that a 9700pro is currently a decent overclocker is actually a bad sign for the R350, IMO. ATI could very well have been depending on that extra headroom for an R300 refresh. Fault tolerances are built into every core design, but there's going to be a point where scaling hits a wall. Without a core revision or a die shrink, it's anyone's guess what they'll be able to squeeze out of the .15 process. There have been rumors of lower power consumption as well, but we've seen that lower power does not guarantee higher clock speeds (early T-bred A's), just better heat dissipation.

And again, there have been reports of driver problems leading to flawed AA and AF results with the beta drivers, making it more difficult to judge the FX's performance or potential. Many reviews noted that the FX ran significantly faster with its aggressive AA and AF settings than the 9700pro, but because the quality appeared to be worse, they tested with the quality settings. Reports from nVidia cite driver issues and frame buffer screen capture techniques. Although the impact is questionable, it does introduce a difficult paradigm involving subjective testing methods based on preference. When it's all said and done, maybe the aggressive and quality settings of the FX sit on either side of the 9700pro's performance, leaving the final judgement to the end user to decide between more frames per second or image quality.

Please post links to back up this claim. There isn't going to be any huge performance increase from the tested boards to the shipping boards in 3 weeks. These claims are made with every product release, no matter the company, and they never turn out to be true. Maybe 6 months from now, but not in 3 weeks.

Not sure if this is a case of selective memory or genuine oversight. The Radeon 9500pro was released just over 2 months ago, and previewed 3 months ago.

9500pro preview dated 10/24

9500pro review dated 11/27

You should be able to quickly see there is in fact a significant increase in performance due to final silicon and drivers; enough to leapfrog an entire GPU family. nVidia may very well be able to perform a similar feat, considering the quality of their shipping drivers tends to improve, as cited by most review sites. As for having to wait 6 months, I highly doubt it, considering they release WHQL drivers almost monthly. There are often 1 or 2 beta releases between WHQL versions, so there should be improvement with each driver update and game patch (that optimizes for the FX).
 

Pariah

Elite Member
Apr 16, 2000
You think a company would risk their reputation and not be able to deliver? I'm not sure what you do for a living, but in the real world, people and firms are held accountable for their assertions and "boasts."

Not sure what ideal world you live in, but this happens practically every day: companies don't achieve what they claimed they were going to do.

You should really look into what's going on over at Zalman; they've been producing quiet passive coolers for some time that hardly resemble a "dinky chipset fan".

That one went right over your head; it was complete sarcasm, but you missed it.

Fault tolerances are built into every core design, but there's going to be a point where scaling hits a wall. Without a core revision or a die shrink, it's anyone's guess what they'll be able to squeeze out of the .15 process.

The difference between the XP 2200+ T-bred rev A and the XP 2800+ T-bred rev B is proof that miracles can happen without a die shrink. Regardless, look how often Nvidia has "refreshed" their cards without major revisions and increased performance quite well.

Not sure if this is a case of selective memory or genuine oversight. The Radeon 9500pro was released just over 2 months ago, and previewed 3 months ago.

Is this the best you could come up with? A hack job with a 9500Pro GPU mounted on a 9700Pro PCB with half its memory disabled? The GeforceFX GPU was not mounted on a TI4600 PCB using only 64MB RAM. How about a comparison where the GPU is actually on the right PCB using the same amount of memory?
 

chizow

Diamond Member
Jun 26, 2001
Not sure what ideal world you live in, but this happens practically every day: companies don't achieve what they claimed they were going to do.

And there are repercussions, because they are held accountable. It's one thing not to meet expectations, but to purposefully deceive is another. It seems that you feel Gainward is attempting the latter. I live in reality, and in reality companies answer to shareholders and regulatory bodies. When companies lie or knowingly deceive, these entities take action. The key to any business is managing expectations; a press release of a product they could not produce would negatively impact their ability to meet expectations.

That one went right over your head; it was complete sarcasm, but you missed it.

Original quote: Sure, OEMs are free to use any cooling solution they want to; if they want to put a dinky chipset passive heatsink on the card they can, but it isn't going to run at all.

Really??? Sarcasm??? Wasn't this the topic that launched your little inquisition and FUD-finding expedition? You still sound pretty convinced that Gainward and a respected AT editor are lying.
:rolleyes:


The difference between the XP 2200+ T-bred rev A and the XP 2800+ T-bred rev B is proof that miracles can happen without a die shrink. Regardless, look how often Nvidia has "refreshed" their cards without major revisions and increased performance quite well.

I said a die shrink or a major core revision. It only took AMD 8 months and a core revision to begin producing their B-revision T-breds en masse in enough supply to fill out their product lines (good luck finding a 2800+ T-bred B, btw). If you read the 2400+ review you would see that the core revision was substantial, and expensive. Another major difference in AMD's situation is that they had just moved to a fab process in its infancy, the .13 process, whereas ATI has truly pushed the boundaries of the .15 process beyond anyone's expectations. How much further they'll be able to push it, we'll see. Nvidia is a pretty poor example; their refreshes have largely been centered around memory speed differences, as each generation uses the same stepping GPU.

Is this the best you could come up with? A hack job with a 9500Pro GPU mounted on a 9700Pro PCB with half its memory disabled? The GeforceFX GPU was not mounted on a TI4600 PCB using only 64MB RAM. How about a comparison where the GPU is actually on the right PCB using the same amount of memory?

I guess someone hasn't been following the industry so much himself. Isn't that all a 9500pro is? A hack job 9700pro? There's clearly no difference in the core itself, and the difference of 64MB is negligible as current games simply don't use the extra memory. But since you'll no doubt come up with some other excuse, I'll quote Anand:

The majority of the performance differences come from a new driver that is shipping with the card. The driver improves performance across the board on the 9500 Pro and also works on the 9700 Pro. Performance isn't really enhanced on the 9700 Pro however, in some cases it's faster in others it's slower.
and from his FX review:

All evidence points to the GeForce FX's drivers holding it back

Maybe you can second-guess Anand's conclusions as well???
:rolleyes:


Chiz
 

skace

Lifer
Jan 23, 2001
I'd say the Parhelia, only because the last thing I remember from these forums (last time I was really idling for video cards) was some member who would CONTINUALLY go into every thread about video cards and post about how his cousin's uncle's nephew's father's brother-in-law knew this guy who said the card was going to be insane and be like 12x faster than everything on the market. He would make fun of all the nVidia owners, etc., etc., saying how they just bought a paperweight and the joke's going to be on them. It was all very annoying to read. Anyone remember who that was? Because I've forgotten his name.

I missed the hype on the FX card, partly because I've been content with a GF4 and don't even see the need for a new card. The problem here is that people are rating the FX based on a sample board that isn't even final yet. EVERYONE here must know that the production boards won't be as loud as that; otherwise they're just fooling themselves.