
Anyone think ATI's triumph will be short-lived?

I say ATi gets this round as well. They're achieving the same performance as nV, if not better, while using a card that takes half as many power connectors and requires a smaller cooling solution.
 
Originally posted by: chsh1ca
R400, NV40. PFFT. Next gen Matrox is where it's at! 😛

I don't think you can reliably call this a "triumph", mostly because it seems that all the cards at the high end will be available in large quantities (unlike the FX5950U), and because the performance of both cards is pretty well equal. I'd say the scales tip towards NVidia in the features department, but I was surprised at the lack of reviews involving the MPEG-2 encoder that was supposed to be packaged up with the NV40s.

I'm with you all the way, bud! IMO, Matrox is waiting for ATI and nV to release their bigger cards, and then Matrox will step in, wahaha. Just an assumption though; time will tell.
 
I think ATI's engineers are more talented than Nvidia's. And I tend to think they have the better future in the graphics card market.

I think you think what you're told to think.

I say ATi gets this round as well. They're achieving the same performance as nV, if not better, while using a card that takes half as many power connectors and requires a smaller cooling solution.

*yawn* You do realise that extra molex on the Ultra is unnecessary and purely for OC stability? Let's look at the 6800GT, which is basically the same core downclocked... I personally say bring on the big two-molex beast and let me have a go at it. I hope a manufacturer decides to add an extra molex to the 6800GT so I can OC it to above Ultra levels at the GT's price. Not to mention, if you're shelling out for one of these enthusiast cards... you should have a good PSU anyway.
 
Last time around, ATI was more efficient with regard to clock rate. Somehow I get the feeling that the same people criticizing ATI for having high clock rates now were not criticizing nVidia for having higher clock rates last time.

Just a guess though 🙂 Personally, I don't understand why people become personally attached to corporations that honestly couldn't care less about them.
 
But ATI's card is running at 520 MHz... the theoretical limit for more than 150 million transistors on 0.13 micron is about 600 MHz... nVidia isn't too far behind, and they're only at 400 MHz...
Yeah, but at 400 MHz nVidia is cranking out far more heat and using far more power than ATi at 520 MHz. Clock speed and die size aren't the only factors in the equation.
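Just to illustrate that point (with made-up numbers, not measured specs for either card): to a first order, CMOS switching power scales as activity × capacitance × voltage² × frequency, so a chip that switches more capacitance (more transistors, bigger die) at a higher core voltage can out-draw a faster-clocked but smaller part. A rough sketch:

# First-order CMOS dynamic-power estimate: P ≈ activity * C * V^2 * f.
# All figures below are hypothetical, purely to show why a 400 MHz part can
# still run hotter than a 520 MHz part if it switches more capacitance at a
# higher core voltage.
def dynamic_power_watts(activity, switched_capacitance_f, core_voltage_v, clock_hz):
    return activity * switched_capacitance_f * core_voltage_v ** 2 * clock_hz

big_chip_400 = dynamic_power_watts(1.0, 65e-9, 1.4, 400e6)    # hypothetical larger chip at 400 MHz
small_chip_520 = dynamic_power_watts(1.0, 50e-9, 1.3, 520e6)  # hypothetical smaller chip at 520 MHz
print(f"400 MHz chip: {big_chip_400:.0f} W, 520 MHz chip: {small_chip_520:.0f} W")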
 
ATI won the round. Why? Because even if nVidia's NV40 does end up faster than the R420 with maturity and driver optimization, the fact of the matter is that it doesn't seem either company will crush the other with performance. ATI took the crown when their 9700 Pro completely crushed the Ti 4600; there was no question as to who was the new king of the hill. Since nVidia has yet to offer anything that makes ATI's product a hands-down loser, ATI will remain on top of the hill until they are truly knocked down, and you can't really share the top either...
 
Originally posted by: Rudee
I think ATI's engineers are more talented than Nvidia's. And I tend to think they have the better future in the graphics card market.

Didn't ATI buy out another company that had designed the 9700 Pro's core? And they've just been making modifications/improvements to it this whole time?

Not to detract from ATI's talent or anything... cause they're obviously doing something right... but I wouldn't credit ATI's engineers for something they didn't create, only perfected. I will give credit where credit is due... they did a wonderful job perfecting it... but you make it sound as if they were hanging a picture and slipped and hit their head on the toilet and had a vision of the 9700 Pro. 😉
 
Originally posted by: Jeff7181
Originally posted by: Rudee
I think ATI's engineers are more talented than Nvidia's. And I tend to think they have the better future in the graphics card market.

Didn't ATI buy out another company that had designed the 9700 Pro's core? And they've just been making modifications/improvements to it this whole time?

Not to detract from ATI's talent or anything... cause they're obviously doing something right... but I wouldn't credit ATI's engineers for something they didn't create, only perfected. I will give credit where credit is due... they did a wonderful job perfecting it... but you make it sound as if they were hanging a picture and slipped and hit their head on the toilet and had a vision of the 9700 Pro. 😉

ATI bought out ArtX, whose intellectual property included what eventually became the R300-series GPU (and now, with further modification, the R420). IIRC, they had been working on the Nintendo GameCube's graphics chip, and then ATI took *that* design and turned it into the R300. But unless someone who was or is an engineer at ATI comes over and tells us all exactly what happened, we'll never know exactly how much work ATI had to put into ArtX's hardware to turn it into the R300/R350/R360, and then to further refine it into the R420.

Results matter. Maybe ATI didn't design and build the R300 from scratch, but they certainly didn't just buy something from ArtX and throw it out the door with no support either. Everyone uses designs and ideas from outside sources in addition to their own in-house developments, and bases parts of new designs off of their own earlier ones. It's not like the GeForce6 was redesigned entirely from scratch either. Maybe the shaders (it's tough to say how extensive NVIDIA's modifications were), but the guts of its triangle processing are still largely based on the GeForceFX, which itself is based on the GF4 (and so on, and so on).

Also, keep in mind that just coming up with the design for the structure of the GPU is not necessarily the hardest part -- they would have still had to do quite a bit to create the R9XXX and RX800 product lines even if ArtX had handed them a fully designed 9700 Pro GPU (which does not seem likely). At the very least, ATI has pushed ArtX's design through a process shrink, modified its pixel and vertex shaders extensively, and scaled it from 4 to 16 pipelines in a variety of memory and clockspeed configurations. These are not trivial changes.
 
Thanks Matthias, I couldn't recall the name of the company they bought. I agree with you that ATI has done extensive development and didn't just take the GPU and slap their logo on it. If it sounded like that's what I was saying, that wasn't my intention... it was to point out that while ATI has been successful and no doubt has great engineers... it's not entirely accurate to credit them with the whole design. Similar to the way nVidia acquired knowledge from 3Dfx and may in fact be using some of their ideas even today in current GPUs.
 
Originally posted by: g3pro
Originally posted by: Rudee
I think ATI's engineers are more talented than Nvidia's. And I tend to think they have the better future in the graphics card market.

Oh MY!!! You really have no clue, do you? :roll: Perhaps you would like to meet some of them.

He thinks -- everyone is entitled to his or her own opinion. You don't have to thrash him for it. ATI has been a leader since August 2002 and still remains one after coming out of nowhere. This somewhat justifies his beliefs. Also since R420 is an extension of the R300 core, that means ATI engineers were able to come up with a core 2.5 years ago that is competitive today. Even after Nvidia redesigned their GPU (to NV40) they are still not the leaders today. So again his opinion is well justified to some extent.
 
Yeah, but at 400 MHz nVidia is cranking out far more heat and using far more power than ATi at 520 MHz. Clock speed and die size aren't the only factors in the equation.

Need a little proof on this. The few people who have tried to measure the power draw of these cards have them all within 15-20% of each other at load. That is hardly "far" more heat or "far" more power.
 
He thinks -- everyone is entitled to his or her own opinion. You don't have to thrash him for it. ATI has been a leader since August 2002 and still remains one after coming out of nowhere. This somewhat justifies his beliefs. Also since R420 is an extension of the R300 core, that means ATI engineers were able to come up with a core 2.5 years ago that is competitive today. Even after Nvidia redesigned their GPU (to NV40) they are still not the leaders today. So again his opinion is well justified to some extent.

Depends on how you define "leader".

If running a DX7-level game at 300 FPS is "leader", then I am not terribly impressed. What ATI has done this generation and last generation is create a great pixel pusher. Eventually they are going to have to come up with something a little more elegant, because eventually games will require more than just pixel-pushing power.

Sure, the R3.xx was better at shader operations than the NV3.xx, but that was because the NV3.x had half as many pipes. What I see now is an Nvidia card with a 120 MHz deficit keeping up with ATI's greatest. Eventually, like Intel, you will need a more efficient design.

Good marketing decision, btw, but I don't consider ATI the leader unless you consider how fast they can push a 4-year-old game like Quake III great. Nvidia this round broke even on the benchmarks but excelled at the feature level. Developers will be working on Nvidia's solution for SM3.0 for a year before ATI has anything to show for it. So they will be behind the 8-ball when it comes to developing games for their hardware. And if you don't think SM3.0 will catch on, you are crazy. Like people have said, SM3.0 is a developers' model. The last time I checked, developers make the games.
 
First off, ATI is currently winning. If you don't believe me, go look at the AnandTech benchmarks.

ATI =
Better benchmark results (so far)
Less power consumption
One-slot solution
Pro is available to buy

nVidia =
Worse benchmark results in most tests (so far)
More power consumption
Two-slot solution
Not available yet
More features than ATI, but none are used yet

ATI = *current* winner. nVidia = awesome but not as good as ATI based on pure performance.

And no, I'm not an ATI fanboy. I currently have a GeForce3 and I have never owned an ATI product; I just look at benchmarks.
 
Originally posted by: Genx87
He thinks -- everyone is entitled to his or her own opinion. You don't have to thrash him for it. ATI has been a leader since August 2002 and still remains one after coming out of nowhere. This somewhat justifies his beliefs. Also since R420 is an extension of the R300 core, that means ATI engineers were able to come up with a core 2.5 years ago that is competitive today. Even after Nvidia redesigned their GPU (to NV40) they are still not the leaders today. So again his opinion is well justified to some extent.

Depends on how you define "leader".

If running a DX7-level game at 300 FPS is "leader", then I am not terribly impressed. What ATI has done this generation and last generation is create a great pixel pusher. Eventually they are going to have to come up with something a little more elegant, because eventually games will require more than just pixel-pushing power.

Sure, the R3.xx was better at shader operations than the NV3.xx, but that was because the NV3.x had half as many pipes. What I see now is an Nvidia card with a 120 MHz deficit keeping up with ATI's greatest. Eventually, like Intel, you will need a more efficient design.

NV3X had more problems than just its pipeline limitations. In particular, shader code that issued instructions in a nonoptimal order could really drag its performance down (still a problem in NV40, BTW, though use of HLSLs has mitigated it). It also ran relatively slowly in FP32 mode -- and since it couldn't do FP24, anything that needed better than FP16 precision had to be run in FP32.

And I hesitate to call an IPC/clockspeed tradeoff a decisive advantage. AMD matches Intel's overall performance at 2/3 the clockspeed, but they don't overclock as much, and Intel still beats them at certain tasks (like video encoding) where raw bandwidth and long pipelines are advantageous. I guess we'll have a better idea which company's position is stronger when we see overclocking results on retail cards, and developers start to figure out the real limits, strengths, and weaknesses of these chipsets.
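The IPC-versus-clock point is really just arithmetic: if you treat throughput as roughly work-per-clock times clock speed, matching a rival at two-thirds of its clock implies about 1.5× the per-clock work. A toy sketch with placeholder numbers (not real CPU or GPU specs):

# Throughput ≈ IPC * clock. Matching performance at 2/3 the clock implies
# roughly 1.5x the per-clock efficiency; both designs here are hypothetical.
def throughput(ipc, clock_ghz):
    return ipc * clock_ghz

high_clock_design = throughput(ipc=1.0, clock_ghz=3.0)  # hypothetical "raw clock" route
high_ipc_design = throughput(ipc=1.5, clock_ghz=2.0)    # hypothetical "efficiency" route at 2/3 the clock
print(high_clock_design == high_ipc_design)  # True: same rough throughput by different routes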

Good marketing decision, btw, but I don't consider ATI the leader unless you consider how fast they can push a 4-year-old game like Quake III great.

What about how fast they are in a next-gen game like Far Cry, then?

And what has NVIDIA done this generation that is so much more 'elegant'?

Nvidia this round broke even on the benchmarks but excelled at the feature level. Developers will be working on Nvidia's solution for SM3.0 for a year before ATI has anything to show for it. So they will be behind the 8-ball when it comes to developing games for their hardware. And if you don't think SM3.0 will catch on, you are crazy. Like people have said, SM3.0 is a developers' model. The last time I checked, developers make the games.

Developers make the games, but they also don't like to invest heavily in new technologies that have next to no market penetration. I think it'll catch on within the next year (mostly because it's not too hard to integrate SM3.0 shaders once you're already using SM2.0, and most developers are headed that way), but it's not going to be truly widespread until there are $100 cards that can handle it. Do you think most game developers are going to ignore the entire last generation of SM2.0 hardware in favor of the relative handful of $300+ cards that NVIDIA will sell this year? Half of the super-high-end graphics market is not a very big market share, even when you're talking about just computer gamers.

And SM3.0 is not an NVIDIA-specific standard. If ATI's R500 (or maybe even R480) comes out and matches NVIDIA at SM3.0 performance, ATI would benefit just as much from any SM3.0 code that had already been produced.
 
Originally posted by: Genx87
Depends on how you define "leader".

If running a DX7-level game at 300 FPS is "leader", then I am not terribly impressed. What ATI has done this generation and last generation is create a great pixel pusher. Eventually they are going to have to come up with something a little more elegant, because eventually games will require more than just pixel-pushing power.
Actually, I'd say the reverse is true. Looking at various benchmarks, ATI has for far longer had the upper hand when it comes to features. They implemented SM2.0 at usable speeds (look at all the FarCry NV3x vs R3x0 benches), and offered various AA/AF modes with far less of a penalty than their NVidia counterpart -- all the while getting beat out at the older "pixel-pumping feature-less" DX7/8 games.

Sure, the R3.xx was better at shader operations than the NV3.xx, but that was because the NV3.x had half as many pipes. What I see now is an Nvidia card with a 120 MHz deficit keeping up with ATI's greatest. Eventually, like Intel, you will need a more efficient design.
The most confusing part of the benchmarks to me (which has been stated by others before) is the slim difference the 4 extra pipelines seem to make for the X800XTs. It should be roughly a 33% increase in speed over the X800Pro, given it has 33% more pipes (16 vs. 12); see the quick sketch after the list. The difference between the X800XT and the X800Pro from Anand's article on the X800XT/Pro/etc:
Aquamark: 4.9 FPS (8.3%)
FarCry: 26 FPS (22.9%), 21.5 FPS (24.6%), 16.8 FPS (25.3%), 12.9 FPS (25.2%)
Halo: 11.4 FPS (15.7%), 9.4 FPS (17.2%)
Homeworld 2: 14.4 FPS (18.4%), 18.1 FPS (26.1%)
F1 Challenge: 11.9 FPS (16.5%), 8.2 FPS (14.9%)
EVE:2G: 5 FPS (9.7%), 14.4 FPS (32.4%) -- Looks driver-related since the R9800XT outperforms the X800Pro on AA/AF
UT2004: 1.1 FPS (1.9%), 5.7 FPS (11%), 5.4 FPS (11%), 7.3 FPS (16.2%)
(And so on)
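As a quick sanity check on that (the only assumption being the pipe counts: 12 on the X800Pro, 16 on the X800XT), the theoretical gain from the extra pipes alone works out to about 33%, while the average of the deltas quoted above lands well short of it:

# 16 pipes vs. 12 pipes is a ~33% theoretical fill-rate advantage.
xt_pipes, pro_pipes = 16, 12
print(f"theoretical gain: +{(xt_pipes / pro_pipes - 1) * 100:.0f}%")

# Percentage gains copied from the list above (Anand's X800XT vs. X800Pro numbers).
observed = [8.3, 22.9, 24.6, 25.3, 25.2, 15.7, 17.2, 18.4, 26.1,
            16.5, 14.9, 9.7, 32.4, 1.9, 11.0, 11.0, 16.2]
print(f"average observed gain: +{sum(observed) / len(observed):.1f}%")  # well under the ~33% ceiling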

It really seems to me like perhaps more mature drivers will improve the performance of the cards for both companies.


 
Originally posted by: i82lazyboy
I think ATI's engineers are more talented than Nvidia's. And I tend to think they have the better future in the graphics card market.

I think you think what you're told to think.

I say ATi gets this round as well. They're achieving the same performance as nV, if not better, while using a card that takes half as many power connectors and requires a smaller cooling solution.

*yawn* You do realise that extra molex on the Ultra is unnecessary and purely for OC stability? Let's look at the 6800GT, which is basically the same core downclocked... I personally say bring on the big two-molex beast and let me have a go at it. I hope a manufacturer decides to add an extra molex to the 6800GT so I can OC it to above Ultra levels at the GT's price. Not to mention, if you're shelling out for one of these enthusiast cards... you should have a good PSU anyway.

BS, it's not just for OC stability. That was only said to please investors on the conference call. Several reviewers have confirmed that while running at stock speeds the 6800U either has corruption or just doesn't work when only one Molex is used.
 
Am I the only one who stops reading when people quote 4 other people and respond to each one with multiple paragraphs? 🙂
 
Originally posted by: BFG10K
Need a little proof on this.
Two molex connectors, a 480W minimum PSU requirement and a gigantic two-slot cooling solution is all the proof you need.

But they said that the two slot solution was only for the reference model. Retail GTs will be single slot.
 
Originally posted by: BFG10K
Need a little proof on this.
Two molex connectors, a 480W minimum PSU requirement and a gigantic two-slot cooling solution is all the proof you need.

480W is not the minimum anymore.
Link
The 480W "minimum" is now being dropped to 350W, and the two molexes ARE NOT needed to run the card... the second molex is there to give it extra juice to OC... and a two-slot solution... who gives a crap... like you use your first PCI slot anyway... it's funny how people b!tch about a two-slot solution... and most of them had a 9800 Pro with an Arctic Cooling Silencer on it...
 
Retail GTs will be single slot.
Who was talking about the GT?

the 480w "minimum" is now being dropped to 350w and the two molex's ARE NOT needed to run the card..
Yes, I saw that later.

who gives a crap...like you use your first PCI slot anyways..
Whether you use the slot is irrelevant to the fact that the card requires it for cooling. That's what this discussion was about: power requirements.
 
Originally posted by: BFG10K
Whether you use the slot is irrelevant to the fact that the card requires it for cooling. That's what this discussion was about: power requirements.

On reference boards it's a dual-slot cooler... we may see single-slot versions once everyone else gets hold of the things... holding reference designs against what may possibly hit shelves is not really showing much. Sure, it may run warm and be a bear to cool... but once a company gets it, they may design their own cooling which is one slot. So again, wait for the retail cards to show up and then let's see what happens.
I never base an opinion on reference designs... if I did I'd always be let down.
 
Originally posted by: otispunkmeyer
Sure this has already been through this forum and is now a million pages away gathering dust... but ATI's card is running at 520 MHz... the theoretical limit for more than 150 million transistors on 0.13 micron is about 600 MHz... nVidia isn't too far behind, and they're only at 400 MHz... if the cards were at equal speeds I think NV would win, and then ATI would claw back a bit with AA and AF.

I think when nV's manufacturing matures they'll have ATI out for the count.

Is the theoretical limit the same for a 150 million transistor chip as for a 200 million transistor chip?
I find it hard to believe ATi and nVidia will have the same ceiling.
Or that that is the actual ceiling (one or both cards will probably break it eventually under "normal conditions", as one has already done with exotic cooling).
 