ATI months ahead of NVIDIA with DirectX 11 GPU schedule?

TC91

Golden Member
Jul 9, 2007
1,164
0
0
Originally posted by: chizow
No not really, even if performance is similar, an Nvidia card is superior in just about every way. I recently put together a nice long list, can't remember if it was for you or someone else. ;)

QFT times infinity.
 

thilanliyan

Lifer
Jun 21, 2005
12,060
2,273
126
Originally posted by: chizow
No not really, even if performance is similar, an Nvidia card is superior in just about every way. I recently put together a nice long list, can't remember if it was for you or someone else. ;)

Chizow, you're supposed to take that green stuff you drink in moderation!! :beer::p
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Wreckage

That does not answer the question. Do you think they would have sold the top card for the same price?
No, I don't. Like I said, they would've released a "GTX275" and "GTX260+" like nVidia did to remain competitive in that pricing space.

It worked better than AVIVO (which did not exist at the time).
Well okay, so technically AVIVO debuted with the R5xx, but ATi still had similar hardware accelerated media playback.

Also to this day AVIVO is still very CPU dependent.
I'm not sure what you're basing these claims off:

http://www.xbitlabs.com/articl...on-hd4650_7.html#sect0

It looks to me like ATi has the lowest CPU utilization in those tests overall, even the 4550, which is an extremely low-end part compared to the nVidia parts being tested.

That would be the X1900 era. I'm not sure how that could not be more clear.
Again, I'm not sure what you're basing your claims on. The benchmarks I linked to refuted you on several levels, so I'll link them again:

http://www.computerbase.de/art...rmancerating_qualitaet

The X1800XT crushes the 7800 GTX 256 MB (which it originally competed with) and matches the 7800 GTX 512 MB, whose availability was rare at best.

Now the previous generation:

http://www.computerbase.de/art...rmancerating_qualitaet

The X850 XT (non-PE edition, i.e. it was widely available, unlike the 7800 GTX 512 MB) is a lot faster than the 6800 Ultra; heck, even ATi's second-best card of that generation (X800XL) is a tad faster than the 6800 Ultra.

Like I said, the 2xxx/3xxx series is where ATi dropped the ball. R4xx and R5xx parts using later drivers generally outperform nVidia's competing parts of the time, in some cases significantly. Revisionist history will get you nowhere when the facts speak for themselves.

The X1800 was a steaming pile as well.
:roll:

So what does that make the 7800 GTX 256 MB which it directly competed against, and crushed?

Or how about the 7800 GTX 512 MB, which nVidia was forced to release and which generally wasn't available outside of review space?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: chizow

They already had a cut down version, the 4850 and later the 4830, so naturally those would've been more expensive as well and not the bargain they were seen at launch.
No, not if the 9800 GTX+ existed, which the 4850 competes against.

That's my point: even if you have the highest-performing part, you also release lower-performing parts that are price & performance competitive with the competition's offerings. If you don't, you lose potential sales in that market.

As a result of these historical trends, many consumers will simply wait for Nvidia's offering before making any purchasing decisions, ultimately resulting in losses for ATI (both figuratively and literally) in the long-run and overall big picture.
It's worth noting that, according to the latest financials, ATi yet again turned a profit. It's AMD's CPU/chipset division that is losing money, not ATi.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: error8
Originally posted by: chizow


4890 vs. GTX 275 reviews
Again, feel free to count em up, I'm sure you'll see the majority of sites saying the GTX 275 is the faster part. Do you think the 4890 would've dropped in price if the majority of reviews said it was the faster part? I don't think so.

Here, 12 reviews and conclusion gathered by Keysplayr: link to forum post

If you look through each and every review, you'll see that the 4890 has games and settings where it's considered faster, and so does the GTX 275. Some reviews put them equal. It's true that the GTX 275 won more "sites" than the 4890, but the score is still tight. Again I say, if it had won 12 out of 12, then it was truly the faster part; otherwise it's still a wash between the two.

It's true that these cards are, for the most part, equal in gaming. But if only one card has to get the nod based on the majority of those reviews, it's the 275. The 275 gets the gold, the 4890 gets the silver. In the Olympics, the difference between gold and silver can be 1/10th of a second.

That's gaming.

Then there are the rest of the cards' capabilities and features to consider if you are into folding, encoding/transcoding, medical research, astrophysics research simulations... You know what? Never mind, there are far too many non-gaming applications to list.
I'll link you to Nvidia's CUDA pages; tell me if you see anything remotely close to this on AMD/ATI's GPGPU computing pages.

GPGPU applications

Here is AMD's stream computing page for comparison.

AMD's stream computing apps.

Nvidia - 265 listed apps
AMD - 14 listed apps

Which one of those 14 AMD Stream apps couldn't be run faster on an Nvidia counterpart? There was an article on video transcoding/encoding with CUDA vs. Stream. Stream was faster, but had rendering errors/anomalies in the form of poorer IQ and square blocks (pixelation) in the resulting videos. I had a discussion about this with members here, and they agreed that Stream is still a work in progress. An afterthought is more like it. There's only so much you can do with a GPU that wasn't really designed for much more than gaming.

GT200. Ask yourselves: why all the extra transistors? Why the huge die size?
This is what G80 on up to GT200 (and soon GT300) was designed for. Not just gaming. ATI's offerings are primarily focused on gaming. Yes, they have GPGPU programmability, but they are nowhere near as powerful as the Nvidia counterparts.

You don't have to be just a gamer anymore to consider an Nvidia card. You can be a doctor, astrophysicist, engineer, geologist, research scientist, or a dedicated protein folder (check our very own distributed computing forum).

From a gaming-ONLY standpoint, the GTX 275 and the HD 4890 are super close. And that is primarily what this forum is concerned with. That's really great, but then there is the rest of the world. Nvidia's vision is far larger and broader than ATI's. This can't be argued as we sit here and look at the vast capabilities of, say, the GT200 series currently.

AMD came out with a very good gaming GPU at the 4xxx launch, and in doing so did many gamers a huge favor with competitive performance and pricing, creating a price war. Always good for gamers, and I commend them for that. Gaming. Everything else, though: the 4xxx series kind of falls to its knees in the GPGPU arena when compared to GT200.

It is what it is. So the folks in here can argue which has the better gaming GPU til doomsday. The 275/4890 is too close to call in most cases. That's done. Move on. You can choose not to consider the overall capabilities of the GPUs because it doesn't concern you. Why? Because it's not in your best interest when arguing the truly worldly question: "Which is the better GPGPU overall, including gaming?"
There is a much bigger world out there than first person shooters and MMORPGs. Nvidia saw this long ago.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Keysplayr

I'll link you to Nvidia's CUDA pages and tell me if you see anything remotely close to this on AMD/ATI's GPGPU computing pages.

Of course, Keysplayr, if one is considering CUDA, then there is no more argument, and the GTX 275 is the ONLY one to get of the two cards. If one is interested in CUDA, then ATI doesn't matter anymore.

If we look only from the gaming point of view, I and many others on this forum and around the world might choose the 4890 or any ATI card if it provides us better bang for the buck. For me CUDA is nothing, since I don't fold and I don't care at all about video encoding. I'm just someone who enjoys playing games occasionally on a limited budget, and having CUDA on my GPU means absolutely nothing. It's just a feature I would have that is never going to be used on my PC. So I wouldn't spend a dollar more for a 275 just because it has CUDA.

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: error8


If we look only from the gaming point of view,

From a gaming point of view, gamers will want better features like:

-GPU accelerated Physics
-Transparency AA
-Superior AF
-Ambient Occlusion

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: BFG10K

No, I don't. Like I said, they would've released a "GTX275" and "GTX260+" like nVidia did to remain competitive in that pricing space.
The point is, if ATI could have sold the 4870 for $600 they would have.

I'm not sure what you're basing these claims off:
Transcoding.
http://www.pcper.com/article.p...=647&type=expert&pid=3

Wreckage, do you forget that the 4870 is faster than the GTX 260 192, yet AMD charged less than the GTX 260 at launch? $150 less! The GTX 260 cost 50% more for a slower part.
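The arithmetic in that claim can be sanity-checked with a quick sketch, using the launch prices as quoted in this thread ($299 for the 4870, $449 implied for the GTX 260; these are the posters' figures, not verified MSRPs):

```python
# Launch prices as quoted in the thread (posters' figures, not verified MSRPs).
hd4870 = 299  # HD 4870 launch price, USD
gtx260 = 449  # GTX 260 launch price, USD

difference = gtx260 - hd4870                    # dollar gap at launch
premium_pct = 100 * (gtx260 - hd4870) / hd4870  # GTX 260 premium over the 4870

print(difference)          # 150
print(round(premium_pct))  # 50
```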

I would bet if the 4870 was faster than the GTX280 they would have done something similar. Charged a good deal less than the GTX280 price, but I'm sure more than the $299 they did launch it at.

You know that link you provided shows that AMD's solution is much faster than Nvidia's in that review. :confused:

"After spending a few days with both of these applications, I have to say that I impressed overall with what both NVIDIA and ATI have been able to do with GPU-based transcoding. The speed increases seen in both applications are truly astonishing with the obvious win going to AMD's Avivo Video Converter as it was able to beat out Badaboom handily."

Sure, the AMD solution uses more CPU, but it also gets more work done in a given amount of time. I would not say that's a feather in Nvidia's cap there.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: error8


If we look only from the gaming point of view,

From a gaming point of view, gamers will want better features like:

-GPU accelerated Physics
-Transparency AA
-Superior AF
-Ambient Occlusion

Taken from this XbitLabs review.

AMD 4890 highs.

-Wide range of supported FSAA modes;
-Best Edge-detect CFAA in the industry;
-DirectX 10.1 and Shader Model 4.1 support;
-Built-in 8-channel audio controller with HD support;

Both companies have some unique features that would be of interest from a pure gaming standpoint.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Just about everyone here would take a card that is 5-10% faster with no use other than gaming over a card that can help run random programs that may or may not ever even be used by the user.

It's only a matter of time until the 4800 cards can run just as many of the non-gaming applications as the Nvidia cards currently can, so it's a non-issue. For now, the AMD cards provide extremely good value overall, so I'm perfectly fine with waiting a bit for non-essential application support.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: SlowSpyder
Originally posted by: Wreckage
Originally posted by: error8


If we look only from the gaming point of view,

From a gaming point of view, gamers will want better features like:

-GPU accelerated Physics
-Transparency AA
-Superior AF
-Ambient Occlusion

Taken from this XbitLabs review.

AMD 4890 highs.

-Wide range of supported FSAA modes;
-Best Edge-detect CFAA in the industry;
-DirectX 10.1 and Shader Model 4.1 support;
-Built-in 8-channel audio controller with HD support;

Both companies have some unique features that would be of interest from a pure gaming standpoint.

Yes, but Nvidia's features are better. :roll:
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SlowSpyder

Wreckage, do you forget that the 4870 is faster than the GTX260 192,
And the Core 216 is faster than the 4870. The GTX 275, 280 and 285 are faster than anything ATI has... what's your point?


yet AMD charged less than the GTX260 at launch ($150 less at launch! The GTX260 cost 50% more for a slower part).
Because the GTX 260 was already out, and if AMD hadn't taken a huge loss selling the card that low, they may not have sold any at all. AMD is not a non-profit organization. You make them sound like some sort of video card charity.

I would bet if the 4870 was faster than the GTX280 they would have done something similar. Charged a good deal less than the GTX280 price, but I'm sure more than the $299 they did launch it at.
If they did then their board of directors would all be fired.

You know that link you provided shows that AMD's solution is much faster than Nvidia's in that review. :confused:
If you don't have a high-end CPU your results will be worse; if you run other applications your results will be worse. The whole point of using a GPU transcoder is to take the load off the CPU. AMD failed miserably at that. What hurts them even more is that AVIVO will run slower on AMD CPUs, as they don't compete with an i7.

Sure, the AMD solution uses more CPU, but it also gets more work done in a given amount of time. I would not say that's a feather in Nvidia's cap there.

I noticed you left this quote out...

"There is more to be said than raw benchmarks scores though as quality is just as important as speed for video to most people and in that area Badaboom seemed to win out, even when not taking the "garbage" seen in ATI's results into consideration."

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
The point was that AMD did charge less for a faster part. There's no way to know whether they would have done the same if the 4870 had been the fastest card on the block at launch.

You're right, it looks like AMD's transcoder is still a work in progress as far as the 'garbage' goes. It looks like they did their testing on a stock-clocked Q9650, so I don't know what you're getting at with the i7. I guess another way of looking at it would be that as you get faster CPUs, AMD's solution will get faster still, since it appears to load both the GPU and the CPU while Nvidia's will not.

Also, would you care to remind me of the price of these transcoders? I'll give you a hint: one costs $30, the other costs $0.00.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: SlowSpyder


Also, would you care to remind me of the price of these transcoders? I'll give you a hint: one costs $30, the other costs $0.00.

One produces good results without loading down the CPU, while the other produces garbage. I guess you get what you pay for.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Wreckage
Originally posted by: SlowSpyder


Also, would you care to remind me of the price of these transcoders? I'll give you a hint: one costs $30, the other costs $0.00.

One produces good results without loading down the CPU, while the other produces garbage. I guess you get what you pay for.

Yup, here's one of the things you get for your money. For $30: "This last benchmark takes the same Blu-ray 1080p trailer video we used in the first iPod benchmark but converts it to a 2.5 mbps Windows Media Video file. Because the NVIDIA Badaboom application is unable to transcode to anything other than H.264, it had to sit out this particular test."

Or for $0.00: "The Avivo application also supports MANY more output options than the Badaboom application as you can see here - DVD, WMV, MPEG-2 and even MPEG-1!"

To me it doesn't matter. I don't have an iPod, and if I did, I don't think I'd care whether my videos took 72, 23, or 12 seconds to transcode. I wouldn't care if I was using 38% or 78% of my CPU. I would care about $0 vs. $30. I would also care about the quality of the transcoded video. It looks like neither camp has a perfect solution for what I would want.

But what this is really about for you isn't who has the better transcoder, or what features Nvidia has vs. AMD, or who provides better bang for the buck, etc. This is just another thread based on something pro-AMD, in this case their releasing a DX11 card first (which I admit I doubt matters at all; I bet both companies will have DX11 cards out in time for DX11 games), that you have successfully derailed and turned into a 100+ reply Nvidia vs. AMD pissing contest. Congrats.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: error8
Here, 12 reviews and conclusion gathered by Keysplayr: link to forum post

If you look through each and every review, you'll see that the 4890 has games and settings where it's considered faster, and so does the GTX 275. Some reviews put them equal. It's true that the GTX 275 won more "sites" than the 4890, but the score is still tight. Again I say, if it had won 12 out of 12, then it was truly the faster part; otherwise it's still a wash between the two.
Again, I've never said they weren't close, I said the majority of sites came to the conclusion the 275 was the faster/better part, and as such, the price drops on the 4890 reflect that fact. You keep ducking my simple question though, do you think the 4890 would've dropped in price if the general perception was that it was the faster/better part?

Originally posted by: error8
Yes, chizow convinced us all, that Nvidia is by far superior, with its "higher quality build" and some other points that were either unimportant, or unproven. ;)
Yep, and I had no problem coming up with a list of areas where Nvidia products have been historically superior, before even considering PhysX. Others brought up Image Quality as well, which would certainly also be relevant in Nvidia's favor. ;)

ATI vs. Nvidia build quality etc.

1) Performance
2) Overclockability
3) Cooling, Temps, Power Draw
4) Warranty, overall build quality
5) Vendor Support and Resources (EVGA ftw)
6) Driver Support (particularly for new titles)
7) Game/software Bundle
8) 1st and 3rd party utilities and driver features
9) CUDA application acceleration (non-PhysX)
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
I have a GTX 280 and a HD 4890 side by side, and it'd be very, very hard to pick one over the other. I thought this thread was about DX11..
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
I agree with you except for CUDA. It isn't a feature that gamers look for; they are scarcely aware of it, and hopefully it'll be pushed aside by a universal standard like OpenCL anyway.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
No, not if the 9800 GTX+ existed, which the 4850 competes against.

That's my point: even if you have the highest performing part, you also release lower performing parts that are price & performance competitive with the competitions? offerings. If you don?t, you lose potential sales in that market.
I don't disagree with that; however, in the case of the 4870, I think it's plainly obvious ATI badly mispriced that part, due largely to their failed previous product launches. If Nvidia was selling their competing part (GTX 260) for $450, why not sell their part for $400 and still beat it in price/performance? Similarly, the 9800 GTX was selling for $299 at the time, so why not sell the 4850 for $250-300?

AMD fans are going to claim it's because AMD is some altruistic firm that actually cares about their interests. Meanwhile they've been bleeding money for what, 11 straight quarters, and have become the incredible shrinking firm after their acquisition of ATI. It's very easy to drop prices, but once you've done so, it's very difficult to later justify an increase in prices for the same relative or expected level of performance.

It's worth noting that, according to the latest financials, ATi yet again turned a profit. It's AMD's CPU/chipset division that is losing money, not ATi.
AMD's method of determining product division P&L is non-GAAP for a reason. They don't allocate any below-the-line deductions or expenses to their individual product divisions, even if those divisions are directly responsible for those charges or impairments. If you look at Nvidia's (or any other GAAP) financials, you'll see such expenses properly allocated. For example, when Nvidia took that $200M charge for their chip packaging, they properly allocated it half and half to their GPU and chipset divisions.
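As a sketch of the allocation being described, using the thread's numbers (a $200M shared charge split 50/50 between two divisions; the bookkeeping below is illustrative, not Nvidia's actual accounting):

```python
# Illustrative only: splitting a shared charge evenly across the divisions
# responsible for it, as the GAAP-style allocation described above would require.
charge = 200_000_000            # chip-packaging charge cited in the post, USD
divisions = ["GPU", "chipset"]  # divisions the charge is attributed to

allocated = {d: charge // len(divisions) for d in divisions}
print(allocated["GPU"])                   # 100000000
print(sum(allocated.values()) == charge)  # True (nothing left unallocated)
```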
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Originally posted by: chizow

I don't disagree with that; however, in the case of the 4870, I think it's plainly obvious ATI badly mispriced that part, due largely to their failed previous product launches. If Nvidia was selling their competing part (GTX 260) for $450, why not sell their part for $400 and still beat it in price/performance? Similarly, the 9800 GTX was selling for $299 at the time, so why not sell the 4850 for $250-300?

AMD fans are going to claim it's because AMD is some altruistic firm that actually cares about their interests. Meanwhile they've been bleeding money for what, 11 straight quarters, and have become the incredible shrinking firm after their acquisition of ATI. It's very easy to drop prices, but once you've done so, it's very difficult to later justify an increase in prices for the same relative or expected level of performance.

Market share? Lowering prices to increase market share is a tried and true business practice. Probably a good idea to do it now, when their GPUs probably cost less than the competition's to produce, so they have a little wiggle room on price.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: chizow
If Nvidia was selling their competing part for $450 (GTX 260), why not sell their part for $400 and still beat it in price/performance?

Because they could. Even at that price, the 4870 was a very profitable card, and it also won many more customers this way. I guess that at $400 they would still have made a profit, but selling it much cheaper attracted more buyers, and the profit was even larger.
ATi is not an altruistic firm by any means; they just used aggressive pricing to make more money.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: Elfear
Market share? Lowering prices to increase market share is a tried and true business practice. Probably a good idea to do it now when their GPUs probably cost less then the competition to produce so they have a little wiggle room as far as price goes.
But they didn't increase market share; check the latest Peddie figures.