ATI months ahead of NVIDIA with DirectX 11 GPU schedule?


Wreckage

Banned
Jul 1, 2005
Originally posted by: lopri
Well, I thought this was a tech forum. Not 'Consumer Reports'. And GDDR3 vs. GDDR5 directly relates to performance as well as power/heat. I am interested in 40nm because of its potential for better products, not because of better margins company X will make. (Though some seem to care about such things)

And yet the 4870 runs hotter, is louder, and uses more power than a GTX 260.

http://www.techreport.com/articles.x/15651/11

Certainly ATI could have benefited from using a new process, but they had to run the card at its limit in order to perform well.
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: Wreckage
Originally posted by: lopri
Well, I thought this was a tech forum. Not 'Consumer Reports'. And GDDR3 vs. GDDR5 directly relates to performance as well as power/heat. I am interested in 40nm because of its potential for better products, not because of better margins company X will make. (Though some seem to care about such things)

And yet the 4870 runs hotter, is louder, and uses more power than a GTX 260.

http://www.techreport.com/articles.x/15651/11

Certainly ATI could have benefited from using a new process, but they had to run the card at its limit in order to perform well.
And on the day of release it walked all over the GTX 260 for 2/3 the price... sounds like complete domination to me. And oh yeah, I'm sure those tests were accurate enough to discern the 0.6 dB sound difference between the cards :roll:

Anyway, if ATI completely dominates the competition again, especially with an early release, I'll be grabbing one. A single GPU/card for 2560x1600 = awesome :D.
 

Creig

Diamond Member
Oct 9, 1999
Originally posted by: Genx87
None of these process changes matters to the end user. It could be on a .30 micron process, have a die size of 2000mm², and run at 1MHz. If it plays my games at top speed, what do I care how it is done via manufacturing?

Whereas HDR, SLI, AF, etc. are technologies I can actually see and care about as a gamer.

Using GDDR5 instead of GDDR3 reduces the PCB complexity necessary to maintain large memory bandwidth and thus reduces production costs. And a card with a GPU made using a 40nm process is (generally) going to be cooler running and less expensive to produce than one made on a 55nm process. These cost savings can be passed down to the end user. And typically, each die shrink has resulted in GPUs that run faster than their previous generation counterpart.

Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

Just because it's 'behind the scenes' to most people doesn't mean they don't ultimately enjoy the benefits of the technology advancements that make new generation hardware possible to produce.
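
To put rough numbers on the bandwidth point, here is a quick back-of-the-envelope sketch in Python (the clocks and bus widths are illustrative figures typical of that generation, not the specs of any particular card):

# Peak memory bandwidth (GB/s) from effective transfer rate and bus width:
# MT/s * bits-per-transfer / 8 bits-per-byte / 1000 MB-per-GB
def bandwidth_gbs(effective_mts, bus_width_bits):
    return effective_mts * bus_width_bits / 8 / 1000

# GDDR3 (double data rate): ~1000 MHz clock gives ~2000 MT/s effective.
# Reaching ~112 GB/s takes a wide 448-bit bus, meaning a complex, costly PCB.
print(bandwidth_gbs(2000, 448))   # 112.0

# GDDR5 (quad data rate): ~900 MHz clock gives ~3600 MT/s effective.
# A plain 256-bit bus already delivers slightly more bandwidth.
print(bandwidth_gbs(3600, 256))   # 115.2

Same ballpark bandwidth with far fewer memory traces to route; that is where the PCB savings come from.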
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: Creig


Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

But they were not faster. They never held the single GPU crown. They sure as heck were not cooler. After a quick price drop from NVIDIA they were not even cheaper. Not to mention a lack of support for next gen gaming.

So again zero benefit to the end user.
 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: Creig

Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

Just because it's 'behind the scenes' to most people doesn't mean they don't ultimately enjoy the benefits of the technology advancements that make new generation hardware possible to produce.

Where are the faster and cooler GDDR5 cards? :confused:
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Creig
Originally posted by: Genx87
None of these process changes matters to the end user. It could be on a .30 micron process, have a die size of 2000mm², and run at 1MHz. If it plays my games at top speed, what do I care how it is done via manufacturing?

Whereas HDR, SLI, AF, etc. are technologies I can actually see and care about as a gamer.

Using GDDR5 instead of GDDR3 reduces the PCB complexity necessary to maintain large memory bandwidth and thus reduces production costs. And a card with a GPU made using a 40nm process is (generally) going to be cooler running and less expensive to produce than one made on a 55nm process. These cost savings can be passed down to the end user. And typically, each die shrink has resulted in GPUs that run faster than their previous generation counterpart.

Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

Just because it's 'behind the scenes' to most people doesn't mean they don't ultimately enjoy the benefits of the technology advancements that make new generation hardware possible to produce.

They don't see it and don't care provided the card performs as intended. So yes, I don't believe gamers really care if their card has GDDR5 or GDDR3, or is on a 40nm or 32nm process.

 

dguy6789

Diamond Member
Dec 9, 2002
Originally posted by: BenSkywalker
The Radeon X800XT, Radeon X800XTPE, Radeon X850XT, and Radeon X850XTPE all very soundly beat the 6800 Ultra.

Really?

If that's too difficult a decision to make, our data shows the 6800 GT to lead the X800 Pro in performance. Maybe the $10 is a factor to you and maybe it isn't, but in this close race, price should definitely be a determining factor.

We can also see the 6800UE leading the X800XTPE most of the time as well.

Link.

So when you say 'very soundly beat' that actually means flat out loses? Dialect difference perhaps?

The 7900GX2 was around twice as expensive as the x1900xt.

And the 4890 cost a lot more than the 9400GT, doesn't change the fact that the 7900GX2 was significantly faster.

I said the 6800 Ultra, not the 6800 Ultra Extreme, which does not exist and is not an official NVIDIA part.

http://www.anandtech.com/showdoc.aspx?i=2044&p=11
http://www.anandtech.com/showdoc.aspx?i=2044&p=12
http://www.anandtech.com/showdoc.aspx?i=2044&p=13
http://www.anandtech.com/showdoc.aspx?i=2044&p=14
http://www.anandtech.com/showdoc.aspx?i=2044&p=16
http://www.anandtech.com/showdoc.aspx?i=2044&p=17
http://www.hardocp.com/article...EwLCxoZW50aHVzaWFzdA==

There are far, far more reviews that favor the x800xt cards and there are also far more people on various forums who agree.

Also, if cost isn't a concern, then why not compare that 7900GX2 with X1900 crossfire?
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: SlowSpyder
Originally posted by: munky
Originally posted by: Wreckage
Originally posted by: Just learning
If this is true I think it will really help ATI.

I mean seriously when was the last time they were considered "the best" and "first to market"? Wasn't it in 2003 with the 9800 series cards?

Yeah, the R300 launched in 2002 and they have pretty much struggled ever since. After that NVIDIA launched the 6xxx series with SLI, SM3 and PureVideo. ATI pretty much just played catch-up.

Unbiased history would disagree with you.
X800XT was faster than any card from the 6-series
X1900XT was faster and more advanced than anything from the 7-series.

I would think that the X1900 card will hold up in games a lot better today than a 7-series card as well.

I remember my x1900xt ran Bioshock surprisingly well at 1920x1200.
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: Wreckage
Originally posted by: Creig


Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

But they were not faster. They never held the single GPU crown. They sure as heck were not cooler. After a quick price drop from NVIDIA they were not even cheaper. Not to mention a lack of support for next gen gaming.

So again zero benefit to the end user.
Are you talking about the 4800 series? Because they dominated the market after release; the GTX series couldn't even come close to touching them.

Originally posted by: Wreckage
But they were not faster. They never held the single GPU crown.
But they came close for less than half the price, NVIDIA got owned.

Originally posted by: Wreckage
They sure as heck were not cooler.
The 4870 had a TDP equal to that of the (slower) GTX 260, and much lower than that of the GTX 280, putting out much less heat.

Originally posted by: Wreckage
After a quick price drop from NVIDIA they were not even cheaper.
Actually it was several price drops, and by then NVIDIA had completely lost the round and a huge chunk of market share that took a while to recover.

Originally posted by: Wreckage
Not to mention a lack of support for next gen gaming.
What on Earth are you talking about?
 

MrK6

Diamond Member
Aug 9, 2004
Originally posted by: Genx87
Originally posted by: Creig
Originally posted by: Genx87
None of these process changes matters to the end user. It could be on a .30 micron process, have a die size of 2000mm², and run at 1MHz. If it plays my games at top speed, what do I care how it is done via manufacturing?

Whereas HDR, SLI, AF, etc. are technologies I can actually see and care about as a gamer.

Using GDDR5 instead of GDDR3 reduces the PCB complexity necessary to maintain large memory bandwidth and thus reduces production costs. And a card with a GPU made using a 40nm process is (generally) going to be cooler running and less expensive to produce than one made on a 55nm process. These cost savings can be passed down to the end user. And typically, each die shrink has resulted in GPUs that run faster than their previous generation counterpart.

Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

Just because it's 'behind the scenes' to most people doesn't mean they don't ultimately enjoy the benefits of the technology advancements that make new generation hardware possible to produce.

They don't see it and don't care provided the card performs as intended. So yes, I don't believe gamers really care if their card has GDDR5 or GDDR3, or is on a 40nm or 32nm process.
If you can show them the cost benefit of GDDR5 in dollars, that's something everyone can understand. I think you're both arguing the same side of the coin, just one more directly than the other.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: nitromullet
Let's hope that ATI can pull another win this year because it looks like AMD needs it. http://www.dailytech.com/AMD+P...+Loss/article14936.htm

We certainly don't want an NV/Intel duopoly, which is where we are headed if AMD doesn't get its @$%# together...

Yeah I read that today. I really hope AMD can pull it together, but it's not looking good. Especially after Intel posted rather positive results (relatively speaking).

I have had a lot of AMD CPUs over the years and would still consider one today, but there is no way the company can keep operating like they are.
 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: BenSkywalker
The Phantom Edition x800 was the only board that was competitive with the 6800 Ultra; overall probably a draw between those two parts. ATi's refresh (the X850) was needed for them to have a card that people could buy that was competitive.
Nothing is more of a phantom than the 6800UE you mentioned in the other post. But even a regular x800xt was overall faster than the 6800U.
Text:
"In the eye candy mode, i.e. when 4x full-screen anti-aliasing and 16x anisotropic filtering are enabled, the ASUS AX800 XT has much more impressive results, outperforming the GeForce 6800 Ultra across the majority of applications, save for Doom 3 and IL-2 Sturmovik."

The 7900GX2 smoked the x1900xt, it wasn't close.

You mean the 7900gx2 which was released half a year after the x1900xt? Which by that time competed with the single-gpu x1950xtx, and didn't always win either? Yeah, it sure smoked something, didn't it?
 

Creig

Diamond Member
Oct 9, 1999
Originally posted by: Genx87
They dont see it and dont care provided the card performs as intended. So yes I dont believe gamers really care if their card has GDDR5 or GDDR3 or is on a .40 or .32 process.

Ten years ago, the GeForce 256 was first released. It was built on a 220nm process, running 120 MHz on the GPU and with SDRAM at 166 MHz on a 128-bit memory bus.

Try constructing a modern video card using that technology.

It is only through advancements in technology that the current generation of CPUs/motherboards/video cards could be built at all. As I already said, most people may not know why their systems run so much faster than those from 10 years ago, but they certainly do care.
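
The same bandwidth arithmetic makes the point (the GeForce 256 figures are the ones above; the GDDR5 comparison uses the same illustrative numbers from earlier in the thread, not any specific card's specs):

# GeForce 256 (SDR version): 166 MHz SDRAM, one transfer per clock, 128-bit bus.
print(166 * 128 / 8 / 1000)     # ~2.66 GB/s

# A 2009-era GDDR5 setup (illustrative): 3600 MT/s effective on a 256-bit bus.
print(3600 * 256 / 8 / 1000)    # 115.2 GB/s, roughly 43x the memory bandwidth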
 

Genx87

Lifer
Apr 8, 2002
Originally posted by: Creig
Originally posted by: Genx87
They don't see it and don't care provided the card performs as intended. So yes, I don't believe gamers really care if their card has GDDR5 or GDDR3, or is on a 40nm or 32nm process.

Ten years ago, the GeForce 256 was first released. It was built on a 220nm process, running 120 MHz on the GPU and with SDRAM at 166 MHz on a 128-bit memory bus.

Try constructing a modern video card using that technology.

It is only through advancements in technology that the current generation of CPUs/motherboards/video cards could be built at all. As I already said, most people may not know why their systems run so much faster than those from 10 years ago, but they certainly do care.

Like I said, gamers wouldn't give a shit if it was built like a GeForce 256, if it performed.

So to reiterate, I don't find process shrinks and memory selection "groundbreaking".
 

Creig

Diamond Member
Oct 9, 1999
Originally posted by: Genx87
Like I said, gamers wouldn't give a shit if it was built like a GeForce 256, if it performed.

So to reiterate, I don't find process shrinks and memory selection "groundbreaking".

Why do you keep saying "if it performed"? In reality, it wouldn't perform the same, would it?



 

OCGuy

Lifer
Jul 12, 2000
Originally posted by: Creig
Originally posted by: Genx87
Like I said, gamers wouldn't give a shit if it was built like a GeForce 256, if it performed.

So to reiterate, I don't find process shrinks and memory selection "groundbreaking".

Why do you keep saying "if it performed"? In reality, it wouldn't perform the same, would it?

But the current GDDR3 cards perform as fast or faster than the current GDDR5 cards. You are trying to have it both ways.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: Creig
Originally posted by: Genx87
Like I said, gamers wouldn't give a shit if it was built like a GeForce 256, if it performed.

So to reiterate, I don't find process shrinks and memory selection "groundbreaking".

Why do you keep saying "if it performed"? In reality, it wouldn't perform the same, would it?

That's just it, though: the GDDR5 cards are not the fastest on the market. The consumers are not seeing any benefit.
 

cbn

Lifer
Mar 27, 2009
Originally posted by: Genx87
Originally posted by: nitromullet
Let's hope that ATI can pull another win this year because it looks like AMD needs it. http://www.dailytech.com/AMD+P...+Loss/article14936.htm

We certainly don't want an NV/Intel duopoly, which is where we are headed if AMD doesn't get its @$%# together...

Just sickens me to think how stupid the decision to buy ATI was for AMD.

Even though the economy is bad, ATI's lead with 40nm and DX11 is making me feel more optimistic about its chances (for profitability).

Getting to market first with the smaller process and high performance is something it hasn't done for a long time. I'm sure those "first adopter" sales make a lot more money than what they have been doing (the occasional refresh like the X1900 being an exception).

 

Munky

Diamond Member
Feb 5, 2005
Originally posted by: Wreckage

Nobody cares how the card is made. They care about how it plays games and what features it has (such as video encoding or physics).

Manufacturing process is the worry of the card maker, not the card buyer.

Besides, it's not like any of that helped ATI to make cooler cards or more power-efficient cards... those things would also benefit the end user.

They care about how it plays games... period. I certainly don't care if it supposedly "has" physics or video encoding... which really means someone somewhere threw together some software that somehow uses the GPU for something other than 3D rendering and called it a "feature".
 

thilanliyan

Lifer
Jun 21, 2005
Originally posted by: Genx87
Just sickens me to think how stupid the decision to buy ATI was for AMD.

It's AMD's CPU business that's crap. After Core 2 came out it just went downhill for the CPU business, though the ATI 2900 and 3000 series didn't help matters either.
 

Wreckage

Banned
Jul 1, 2005
Originally posted by: thilan29
Originally posted by: Genx87
Just sickens me to think how stupid the decision to buy ATI was for AMD.

It's AMD's CPU business that's crap. After Core 2 came out it just went downhill for the CPU business, though the ATI 2900 and 3000 series didn't help matters either.

They spent $5.6 billion on ATI and it has not made them a dime yet when you add up the last 2 years.

That $5.6 billion would have allowed AMD to maintain its own fabs and ride out this financial crisis. Now it may just be the death of them.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
Yet another ATI thread derailed by you know who :roll:

My thought about the whole thing is: what the hell has NVIDIA been doing lately? Since the 8800GTX came out two and a half years ago they have barely improved; they keep doing stupid refreshes and rebadges instead of working on a new card. Sounds like AMD resting on their Athlons until they got owned by Core 2.

In that same time, ATI had enough time to recover from a disaster called the HD2900, and now we have a stellar line of products that is arguably superior when it comes to price/performance. So really, what the hell is NVIDIA doing?

Oh, and don't bother replying with PhysX + CUDA crap to justify anything. I don't care about that and most of the forum doesn't either. What I want in a card is gaming performance, period.