ATI months ahead of NVIDIA with DirectX 11 GPU schedule?

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: ShadowOfMyself
Yet another ATI thread derailed by you know who :roll:

My thought about the whole thing is: what the hell has Nvidia been doing lately? Since the 8800 GTX came out two and a half years ago they have barely improved; they keep doing stupid refreshes and rebadges instead of working on a new card. Sounds like AMD resting on its Athlons until they got owned by Core 2.

In the same time, ATI had enough time to recover from a disaster called the HD 2900, and now we have a stellar line of products that is arguably superior when it comes to price/performance. So really, what the hell is Nvidia doing?

Oh, and don't bother replying with PhysX + CUDA crap to justify anything. I don't care about that and most of the forum doesn't either; what I want in a card is gaming performance, period

Having the top single- and multi-GPU cards and retaining a larger market share?


What is your point here?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: Creig
Originally posted by: Genx87
Like I said, gamers wouldn't give a shit if it was built like a GeForce 256, if it performed.

So, to reiterate, I don't find process shrinks and memory selection "groundbreaking".

Why do you keep saying "if it performed"? In reality, it wouldn't perform the same, would it?

That's just it, though: the GDDR5 cards are not the fastest on the market. The consumers are not seeing any benefit.

Sure the consumers see the benefit. Do you think we would all have benefited from a price war if the 4870 hadn't performed as well as the GTX 260? AMD was able to price their cards where they did because of the design decisions they made for the RV770. Did GDDR5 make the 4870 faster than a GTX 260 Core 216 with GDDR3? No, they are very close in performance. But the 4870 gave users that level of performance for under $300 versus the $450 Nvidia wanted for a GTX 260 (192 SP) part at launch. That IS a benefit.


Originally posted by: Genx87
Originally posted by: nitromullet
Let's hope that ATI can pull another win this year because it looks like AMD needs it. http://www.dailytech.com/AMD+P...+Loss/article14936.htm

We certainly don't want an NV/Intel duopoly, which is where we are headed if AMD doesn't get its @$%# together...

Just sickens me to think how stupid the decision to buy ATI was for AMD.

Looks like the graphics side was the only portion of AMD that turned a profit. The decision to buy ATI doesn't look like a terrible one to me... now, how much they overpaid, that's a different story.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: Wreckage
Originally posted by: ShadowOfMyself
what I want in a card is gaming performance, period

So either a GTX285 or a GTX295. Two fastest cards, period.

I hear what you are saying, but if ATI can keep die size down and total costs lower (relative to Nvidia) going forward, they won't necessarily need to have the fastest card. They just need to be first to market with a new generation of GPUs that at least significantly "one-ups" the previous generation of top-performing cards (from any manufacturer).

Admittedly that GT200 core is a true world beater from a raw performance standpoint, but how much does it cost to make, considering how large it is?

Being better is one thing, but let's say (for the sake of argument) the competition can make something 90% as effective for 60% of the cost; that is going to impact future manufacturing decisions. (It also impacts the ability to "force" price cuts on the competition.)
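To put rough numbers on that purely hypothetical 90%-effective / 60%-cost scenario (these figures are only the assumption stated above, not real product data):

# Hypothetical figures from the argument above, not real product data.
big_chip_performance = 1.00    # relative performance of the large, expensive chip
big_chip_cost        = 1.00    # relative manufacturing cost

small_chip_performance = 0.90  # "90% as effective"
small_chip_cost        = 0.60  # "for 60% of the cost"

big_value   = big_chip_performance / big_chip_cost      # 1.0 performance per cost unit
small_value = small_chip_performance / small_chip_cost  # 1.5 performance per cost unit

print(small_value / big_value)  # 1.5x the performance per manufacturing dollar

In other words, under that assumption the smaller chip delivers 50% more performance per manufacturing dollar, which is exactly the leverage used to force price cuts.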



 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Just learning


Admittedly that GT200 core is a true world beater from a raw performance standpoint, but how much does it cost to make, considering how large it is?

I don't care how much it costs to make. Do you? I got mine for around $200, and you can now buy an overclocked Core 216 for around $180. Besides, if it's financials you are worried about, NVIDIA has something like $2 billion in the bank while AMD could be out of business by the end of the year.

Originally posted by: SlowSpyder

Sure the consumers see the benefit. Do you think we would all have benefited from a price war if the 4870 hadn't performed as well as the GTX 260?

Do you think that if the 4870 was the top GPU they would have sold it for the same price? Nope.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage


Originally posted by: SlowSpyder

Sure the consumers see the benefit. Do you think we would all have benefited from a price war if the 4870 hadn't performed as well as the GTX 260?

Do you think that if the 4870 was the top GPU they would have sold it for the same price? Nope.

I'm sure they would have charged more than $299. The point was that GDDR5, as a design decision, very likely did provide a benefit to consumers. Whether you wanted an AMD/ATI card or an Nvidia card, pressure from the 48x0 cards forced Nvidia to lower prices, so you could get the card you wanted for less money. That is a benefit to the consumer.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: Genx87
Originally posted by: Creig
Originally posted by: Genx87
Why do people think it is groundbreaking to have GDDR5, 40nm, and 1GHz GPUs? That is a process/manufacturing decision.

I'd rather have something that is tangible as a consumer, like SLI, HDR, PureVideo, or non-adaptive AF.

How aren't any of those items groundbreaking? It may be a process/manufacturing decision, but it doesn't magically appear on video cards. There is still a lot of engineering and designing necessary to incorporate those technological advances into new cards. And the company that releases them first generally has an upper hand on its competition, be it in overall speed or cost to manufacture/street cost.

I'd say that makes them very tangible benefits for end consumers.

Because it is an evolutionary step every manufacturer goes through. We can pretty much map out the mask sizes for the next 20 years. Achieving each step isn't groundbreaking. It is a process change and something that the end user won't see.

But I admit perhaps I should have reworded the intention of my response from "thinking it is groundbreaking" to "who gives a flying eff". None of these process changes matters to the end user. It could be on a 0.30µm process, have a 2,000 mm² die, and run at 1MHz; if it plays my games at top speed, what do I care how it is done via manufacturing?

Whereas HDR, SLI, AF, etc. are technologies I can actually see and care about as a gamer.

The roadmap to smaller manufacturing processes is just that: signposts that say we're going to go here and here and here. One still needs to research the materials and do testing to see how it works out. And just because it is the same nm-size manufacturing process doesn't mean you take the same road to get there. Different materials can lead to different yield percentages, heat, and power consumption.

All of the above factors affect the performance and availability of CPUs and GPUs. Cost savings are usually passed down to consumers (except when there is a lack of competition, such as the ATI gaffes nVidia has benefited from over the last couple of years), and as consumers this is one of the ways we benefit. Those savings also help keep the companies that produce the products we buy in business.

Originally posted by: Wreckage
Nobody cares how the card is made. They care about how it plays games and what features it has (such as video encoding or physics).

Manufacturing process is the worry of the card maker, not the card buyer.

Besides, it's not like any of that helped ATI make cooler cards or more power-efficient cards... those things would also benefit the end user.

Originally posted by: Wreckage
But they were not faster. They never held the single-GPU crown. They sure as heck were not cooler. After a quick price drop from NVIDIA they were not even cheaper. Not to mention a lack of support for next-gen gaming.

So again zero benefit to the end user.

If no one cares about how a card is made, why the emphasis on how nVidia's GPU is great because they have the fastest single-GPU solution? ATI was beating nVidia on the price/performance ratio, and this forced nVidia to drop its prices.

Even after the price drop, ATI cards of roughly equal performance could quite often be found for less than the competing nVidia card, specials and occasional large rebates notwithstanding.

I'd say that consumers benefited a huge amount because ATI was cleaning up the charts from a price/performance standpoint, forcing nVidia to drop its prices to stay competitive. How is that zero benefit to the end user?

Originally posted by: OCguy
Originally posted by: Creig

Are you trying to say that gamers don't care about faster, cooler, cheaper video cards? Because that's the end result of process changes.

Just because it's 'behind the scenes' to most people doesn't mean they don't ultimately enjoy the benefits of the technology advancements that make new generation hardware possible to produce.

Where are the faster and cooler GDDR5 cards? :confused:

Imagine how hot the ATI cards would run without the process shrink.

As for GDDR5, it helped reduce the overall costs associated with ATI's card design. You can't say that, after all is said and done, the smaller ATI GPUs along with the simpler board design didn't help from a cost perspective. Sure, heat-wise they might be similar to nVidia cards and past ATI cards under full load, but the cost savings have benefited even nVidia buyers, since ATI forced nVidia to drop its prices to stay competitive.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: Wreckage


Do you think that if the 4870 was the top GPU they would have sold it for the same price? Nope.

If the 4870 had come out before GT200, ATI would have sold it for a lot more money.

 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Originally posted by: Wreckage
That $5.6 billion would have allowed AMD to maintain its own fabs and ride out this financial crisis. Now it may just be the death of them.

True, but what's done is done, and it is the CPU business that's dragging things down (the graphics division was profitable in at least one recent quarter, IIRC). Regardless of whether they would have had the extra cash, they need to change things; they would have ended up where they are now eventually if they kept going as they were... it just happened quicker due to several circumstances (buying ATI being one of them).
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: thilan29
Originally posted by: Wreckage
That $5.6 billion would have allowed AMD to maintain its own fabs and ride out this financial crisis. Now it may just be the death of them.

True, but what's done is done, and it is the CPU business that's dragging things down (the graphics division was profitable in at least one recent quarter, IIRC). Regardless of whether they would have had the extra cash, they need to change things; they would have ended up where they are now eventually if they kept going as they were... it just happened quicker due to several circumstances (buying ATI being one of them).

Well, either way, the economy has been $hitty to everyone. I look forward to the day when we can recommend people buy $1000 video cards because everyone is doing well enough not to consider that a big deal.

As for the original topic: until there is a hard launch of a DirectX 11 part from either company, we are basically talking about a mythological card that only exists in the minds of dreamers.
 

allies

Platinum Member
Jun 18, 2002
2,572
0
71
Originally posted by: Wreckage
Originally posted by: nitromullet
Let's hope that ATI can pull another win this year because it looks like AMD needs it. http://www.dailytech.com/AMD+P...+Loss/article14936.htm

We certainly don't want an NV/Intel duopoly, which is where we are headed if AMD doesn't get its @$%# together...

Yeah I read that today. I really hope AMD can pull it together, but it's not looking good. Especially after Intel posted rather positive results (relatively speaking).

I have had a lot of AMD CPUs over the years and would still consider one today, but there is no way the company can keep operating like they are.

Relatively speaking, AMD did well then too. Their losses were less than anticipated. Edit: I just thought about it and figured you were talking about Intel vs. AMD, not Intel vs. expected earnings. Sorry.

As for the topic: this could be similar to the 9700 Pro. Didn't it release prior to DX9?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
http://forums.anandtech.com/me...=2296969&enterthread=y

In light of GT300 (more than double the processing power via 512 shader processors, up from GT200's 240), what do you think ATI will be doing with the HD 58xx?

Will they be able to fit 2000 stream processors on their current die size? Or will they use something more conservative, like 1600 stream processors?

Maybe (in this economy) a 1600-stream-processor design for ATI is not a bad idea, provided they can release the new part before Nvidia.

Something just tells me the market for the ultra high-end GPU may be a little smaller than it was previously, and making GPUs with purposely deactivated cores (to fill in lower price points) may not work as well if the starting core is too expensive or too big.
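For a rough sense of the scaling those rumored counts imply (the 512 and 1600/2000 figures are the speculation above, RV770's 800 stream processors is the assumed baseline, and clocks, architecture changes, and memory bandwidth are ignored entirely):

# Back-of-envelope scaling of the rumored shader counts discussed above.
# Raw unit counts only; clock speed, architecture and bandwidth are ignored.
gt200_sps  = 240   # GT200 shader processors (from the post above)
gt300_sps  = 512   # rumored GT300 count
rv770_sps  = 800   # RV770 (HD 48xx) stream processors, assumed baseline
rv870_low  = 1600  # conservative HD 58xx guess
rv870_high = 2000  # aggressive HD 58xx guess

print("GT300 vs GT200:        %.2fx" % (gt300_sps / gt200_sps))   # ~2.13x
print("1600-SP part vs RV770: %.2fx" % (rv870_low / rv770_sps))   # 2.00x
print("2000-SP part vs RV770: %.2fx" % (rv870_high / rv770_sps))  # 2.50x

Either ATI guess is roughly in line with the rumored GT300 jump in raw unit count, which is why the choice probably comes down to die size and cost rather than headline throughput.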

 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
It's hard to say. Going from the 3870 to the 4870 was huge, and AMD won this round for the most part; their cards are very cost-effective. Going from G92 to GT200 was a disappointment, and you've got tons of kids with G80/G92 cards who will probably upgrade this time around. It's AMD's job to make sure GT300 isn't a $600+ card at launch. If anything, the market is growing, but people can't upgrade every season if the gains are going to be less than 20%.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Genx87
ATI's problem since 2002 hasn't been so much performance as delivery of their product. Wasn't it the X1800 that was 6 months behind because there was a bug in their design software? Then it showed up slow and hot, and they quickly followed up with the X1900. And the X800 XTX, or whatever the hell the highest-end card they had to compete with the 6800 was, was backordered for months. I think some people still have the card on backorder.

Originally posted by: Just learning
Originally posted by: Pantalaimon
Again, with GT200 the outcome was also obvious... Nvidia had very weak competition (till RV770) and up until that point could get away with the unopposed mark-ups.

NVIDIA had to drop the prices to almost half in less than a month on their GT200 models after the HD4850 and HD4870 came out.

That was because GT200 >> HD3870x2

Fact is, with the exception of the "mid-cycle" X1900 XTX, Nvidia has always been "first to market" with the new generation of cards and multi-GPU technologies.

For example 6800 beat X800 to market.

SLI beat Crossfire to market.

7800 beat X1800 to market.

8800 beat 2900XT to market (and was a whole lot better)

GT200 beat HD48xx to market and is better (although possibly more expensive to produce than HD48xx)

But now, for the first time in a long while, AMD/ATI has a major chance to grab some of the early-adopter market.


Yeah, ATI's problems the last few years seemed to revolve around their memory bus. With the X1800 cards they introduced the 512-bit ring bus, but they were also 6 months late because of problems with the X1800.

They recovered nicely with the X1900 series, but then had massive problems bringing out the 2900 XT (the first card with a true 512-bit memory bus).

"Brute force" architecture hasn't truly worked for ATI since the 9700 Pro; meanwhile, Nvidia has hit home run after home run (6800 series, 8800 series, GT200 series).

ATI seems to operate best being "lean and mean": the 38xx and 48xx series were successes because they were built to be performance-per-watt champs and cheap to make.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Wreckage

Yeah, the R300 launched in 2002 and they have pretty much struggled ever since. After that, NVIDIA launched the 6xxx series with SLI, SM3 and PureVideo. ATI pretty much just played catch-up.
Yeah, like HDR + AA, multi-GPU AA, and AAA in OpenGL. Oh wait, ATi had those first.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Nothing is more of a phantom than the 6800UE you mentioned in the other post. But even a regular x800xt was overall faster than the 6800U.
Text:
"In the eye candy mode, i.e. when 4x full-screen anti-aliasing and 16x anisotropic filtering are enabled, the ASUS AX800 XT has much more impressive results, outperforming the GeForce 6800 Ultra across the majority of applications, save for Doom 3 and IL-2 Sturmovik."

Hehe, that post is quite comical. You realize that the X800 Phantom Edition was clocked at 500MHz, right? Check out the clock rate on that engineering-sample review you linked. Limiting ourselves to actual released parts, nV did come out ahead that generation.

You mean the 7900 GX2, which was released half a year after the X1900 XT? Which by that time competed with a single-GPU X1950 XTX, and didn't always win either? Yeah, it sure smoked something, didn't it?

It overwhelmingly obliterated the X1950 XTX; the review you linked even shows that. Yes, in some benches the X1950 pulled ahead, just as the 192-core GTX 260 sometimes bested the 4870X2, but that doesn't mean the 4870X2 isn't accurately considered a MUCH faster board. As far as why the comparison was made in the first place, someone made the asinine assertion that none of the 7-series GeForce products were faster than the X19xx series, which very clearly wasn't the case.

I said the 6800 Ultra, not the 6800 Ultra Extreme, which does not exist and is not an official nvidia part.

Versus the PE, which didn't exist either? Sorry, if we are comparing imaginary cards then the green team's should be compared to the red team's. If we eliminate the imaginary cards, then nV wins the majority of benches, even in the review you linked.

Also, if cost isn't a concern, then why not compare that 7900GX2 with X1900 crossfire?

You could SLI 7900GX2s also.
 

mmnno

Senior member
Jan 24, 2008
381
0
0
Originally posted by: BenSkywalker


You could SLI 7900GX2s also.

So why not compare that with X1900 Crossfire? You do know that looked even worse on nVidia's side, right? Because of that original look at dual sammiches, I'm truly astonished today when I see tri- and quad-GPU setups even posting any positive percentage benefit over dual cards.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
Originally posted by: Wreckage

Yeah, the R300 launched in 2002 and they have pretty much struggled ever since. After that, NVIDIA launched the 6xxx series with SLI, SM3 and PureVideo. ATI pretty much just played catch-up.
Yeah, like HDR + AA, multi-GPU AA, and AAA in OpenGL. Oh wait, ATi had those first.

Which do you think was more significant? Honestly?

Those are very, very minor in comparison.

Also we are talking about the X1800 era here.

 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: BenSkywalker
Originally posted by: munky
I said the 6800 Ultra, not the 6800 Ultra Extreme, which does not exist and is not an official nvidia part.
Versus the PE, which didn't exist either? Sorry, if we are comparing imaginary cards then the green team's should be compared to the red team's. If we eliminate the imaginary cards, then nV wins the majority of benches, even in the review you linked.

The X800XT PE was definitely produced and sold to the public. In fact, my daughter's computer is running one right now.

 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: mmnno
Originally posted by: BenSkywalker


You could SLI 7900GX2s also.

So why not compare that with x1900 crossfire? You do know that looked even worse on nVidia's side right? Because of that original look at dual sammiches, I'm truly astonished today when I see tri- and quad- setups even posting any positive percentage benefit over dual-cards.

X1900 Crossfire was better than the 7950 GX2, but at that time ATI didn't have a dual-GPU card of their own.

Speaking of dual-GPU video cards, when do you think we will see Tri-SLI or Tri-Fire in a single PCI-E slot? :)

If this ever happens, hopefully someone figures out how to get all the memory contributing instead of this "mirroring" effect (or whatever it is called). This mirroring effect with the memory is just a big waste of chips.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
So why not compare that with x1900 crossfire?

OK, what is supposed to change my view? I guess if you drop the settings way down, then quad SLI didn't look so hot. Crank everything up, and quad SLI actually rather handily bested either normal SLI or Crossfire for that generation.

The X800XT PE was definitely produced and sold to the public. In fact, my daughter's computer is running one right now.

I suppose that is true; to be completely fair, ATi did sell some of the X800 XT PEs, based on various reports. But availability seemed comparable nonetheless.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: Creig
Originally posted by: BenSkywalker
Originally posted by: munky
I said the 6800 Ultra, not the 6800 Ultra Extreme, which does not exist and is not an official nvidia part.
Versus the PE, which didn't exist either? Sorry, if we are comparing imaginary cards then the green team's should be compared to the red team's. If we eliminate the imaginary cards, then nV wins the majority of benches, even in the review you linked.

The X800XT PE was definitely produced and sold to the public. In fact, my daughter's computer is running one right now.

While the X800 XT PEs were a little tough to get, they were nothing compared to the 6800 Ultra Extremes. I think those cards were only given away for contests and reviews.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Speaking of dual-GPU video cards, when do you think we will see Tri-SLI or Tri-Fire in a single PCI-E slot?

When chips are small enough and cool enough, with reasonable power requirements, would be my guess. ATi seems to be headed in that general direction.

If this ever happens, hopefully someone figures out how to get all the memory contributing instead of this "mirroring" effect (or whatever it is called).

That would be extremely difficult at best. Likely the best way to approach it would be to have three separate chips that do different things on a shared high-speed bus connected to a common memory pool. I say that would likely be the best approach inasmuch as it would be much easier to engineer than trying to get our current multi-chip approach to not require redundant memory. Not saying it isn't possible, and I'm sure both nV and ATi have people working on it, but if it does come down the pipe it will likely be extremely complex (and I would assume whoever gets it done first will patent the hell out of it, making it nigh impossible for the others to follow).
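A quick sketch of what the mirroring costs and what a shared pool would buy (the GPU count and per-GPU memory below are made-up numbers, purely for illustration):

# Hypothetical illustration of mirrored vs. shared memory on a multi-GPU board.
# The GPU count and per-GPU memory are made-up numbers, not a real product.
gpus              = 3     # e.g. a hypothetical tri-GPU single-slot card
memory_per_gpu_mb = 1024  # local memory attached to each GPU

# Mirrored (today's multi-GPU approach): every GPU keeps its own full copy
# of the textures and buffers, so usable capacity never exceeds one GPU's pool.
usable_mirrored_mb = memory_per_gpu_mb

# Shared pool (the common-bus idea described above): all chips address one
# pool, so usable capacity scales with the memory actually installed.
usable_shared_mb = gpus * memory_per_gpu_mb

print("memory chips paid for:", gpus * memory_per_gpu_mb, "MB")
print("usable when mirrored :", usable_mirrored_mb, "MB")
print("usable when shared   :", usable_shared_mb, "MB")

With three mirrored GPUs you pay for 3072 MB of chips but applications only ever get 1024 MB of unique data, which is the waste being described.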
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: BenSkywalker
Speaking of dual-GPU video cards, when do you think we will see Tri-SLI or Tri-Fire in a single PCI-E slot?

When chips are small enough and cool enough, with reasonable power requirements, would be my guess. ATi seems to be headed in that general direction.

I'm thinking this might even save on development costs in some scenarios.

The current trend for Nvidia appears to be designing a "mega core" video card and then either deactivating part of it or reusing the old cards for the midrange and lower range. But then, for the lower range, how easy will it be to shrink G92 to 40nm for this purpose?

At some point they will need to design something new for these lower-cost SKUs, and doing that with an entirely new design costs money.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: BenSkywalker
Hehe, that post is quite comical. You realize that the X800 Phantom Edition was clocked at 500MHz, right? Check out the clock rate on that engineering-sample review you linked. Limiting ourselves to actual released parts, nV did come out ahead that generation.
No, the "Phantom Edition" was clocked at 520MHz. I used to have a regular X800 XT, and there was nothing phantom about it, not to mention it easily ran at 520+MHz.

It overwhelmingly obliterated the X1950 XTX; the review you linked even shows that. Yes, in some benches the X1950 pulled ahead, just as the 192-core GTX 260 sometimes bested the 4870X2, but that doesn't mean the 4870X2 isn't accurately considered a MUCH faster board. As far as why the comparison was made in the first place, someone made the asinine assertion that none of the 7-series GeForce products were faster than the X19xx series, which very clearly wasn't the case.
LMAO, by that measurement the X800 XT also "overwhelmingly obliterated" the 6800 Ultra.