Different Far Cry for nV40 owners?

Page 4

reever

Senior member
Oct 4, 2003
451
0
0
Originally posted by: Acanthus
Originally posted by: reever
Originally posted by: Safeway
Can you get an R&D quote from ATI?


Unless they specifically state how much they spent, there is no way of knowing from simple annual reports, as neither company makes only graphics cards

Judging from the technology and specs of the new card, I think it's safe to say they spent far, far less than NVIDIA.

Except they also needed to redo the entire chip manufacturing line for 0.13 low-k, which can cost a *lot* of money
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: reever
Originally posted by: Acanthus
Originally posted by: reever
Originally posted by: Safeway
Can you get an R&D quote from ATI?


Unless they specifically state how much they spent, there is no way of knowing from simple annual reports, as neither company makes only graphics cards

Judging from the technology and specs of the new card, I think it's safe to say they spent far, far less than NVIDIA.

Except they also needed to redo the entire chip manufacturing line for 0.13 low-k, which can cost a *lot* of money

They were already producing .13 low-k chips before the X800 series (R9600Pro/XT). I'm not saying they didn't spend a lot on R&D; I'm saying it's very likely that they didn't spend as much as NVIDIA, since the chip is more evolutionary than revolutionary.
 

Curley

Senior member
Oct 30, 1999
368
3
76
Conclusions


"Conclusions
The new Far Cry patch does indeed seem to be a decent showcase for Shader Model 3.0's potential. The GeForce 6800 cards gain up to about 10% in average frame rates when using the SM3.0 code path. No, the differences aren't going to convert the masses into NVIDIA fanboys overnight, but they do show NVIDIA wasn't kidding about Shader Model 3.0 offering some advantages. "

These cards are running neck and neck, just as they should be for the price. It is unfortunate for Nvidia that the major computer retailers have elected to choose ATI as their PCI Express preference. Why? Because Nvidia will probably take the lead in performance, but the cost is too high (power supply, cooling, etc.) in the OEM market for a 5-10% increase.

Being a bleeding-edge, hungry-for-performance enthusiast, I will probably end up with the Nvidia 6800 Ultra Overclocked, nitrous-oxide-powered card.

ATI has one thing going for them: Work Smarter, Not Harder. By requiring only one PCI slot, one molex connector, and a 350-watt power supply, they were obviously looking for a marketable solution that would be competitive with Nvidia's need to be No. 1 in performance.

As for the Statement in the TechReport article above, they've never had the opportunity to meet Rollo.
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: BFG10K
FYI Anand's results were wrong (no AA on nVidia cards). Newer results have been published.

Yes, people might want to read the article again with the corrected benchmarks.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: BFG10K
FYI Anand's results were wrong (no AA on nVidia cards). Newer results have been published.

These threads are hilarious. :D I suspect that the magic drivers that will make one company super king probably do not exist, but fan sites etc. will be quick to crown one anyway. The best new card is really whichever one you are lucky enough to own. One has more features and is quicker (?), but the other uses less power, and quiet versions should arrive soon without too much of a price penalty. Get what you prefer and don't cheer so loud.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Acanthus
Judging from the technology and specs of the new card, i think its safe to say they spent far far less than nvidia.
This seems like a safe assumption for the X800, but I think the reason is that ATi is sinking more time (and thus money) into the ~SM3 Xbox2 GPU, which should make its way into PCs next spring. It probably comes down to "limited" resources: nV seems to have spent a lot of theirs on NV4x, while ATi doesn't seem to have spent as much on R420.

Originally posted by: reever
Except they also needed to redo the entire chip manufacturing line for 0.13 low-k, which can cost a *lot* of money
I doubt moving to 130nm low-K cost ATi anywhere near hundreds of millions of dollars, especially since some of that cost was offset by their previous RV360 production.