
ATI months ahead of NVIDIA with DirectX 11 GPU schedule?

Page 9
Originally posted by: solofly
Originally posted by: evolucion8
Cherry-picking benchmarks is an ability of yours; keep going, nVidia Focus Member. Your credibility was never there, because there's no impartiality when you work for a company. But at least you have more objectivity than Wreckage's renegade posts; thumbs down for both of you for derailing the thread.

Well, there is NV in the title, which justifies his posts. Just remember, a pile of dog shit on the street has more value than what he has to say. I'll take Wreckage's fanboyism over a salesman any day...

Fortunately, you have your own mind, and you don't have to find any value at all in my posts. However, I think yours will hold even less if you continue with these kinds of hate-filled posts, dude. You have my pity.

 
You are the one who's biased; you seem to forget that the HD 4670 is as fast as the HD 3870, which was as fast as the 9600GT. Both the HD 4670 and HD 3870 have the same 320 stream processors; the main difference is that the HD 4670 has better anti-aliasing performance. Try better next time.

Hard to say which of your comments is the least accurate. Don't feel bad about having your comments blown out of the water; I have reality on my side, which makes the whole discussion loaded against you from the start.
 
Originally posted by: BenSkywalker
You are the one who's biased; you seem to forget that the HD 4670 is as fast as the HD 3870, which was as fast as the 9600GT. Both the HD 4670 and HD 3870 have the same 320 stream processors; the main difference is that the HD 4670 has better anti-aliasing performance. Try better next time.

Hard to say which of your comments is the least accurate. Don't feel bad about having your comments blown out of the water; I have reality on my side, which makes the whole discussion loaded against you from the start.

Only noobs use Tom's Hardware; please, try better next time. Once you turn anti-aliasing on, the HD 4670 gets dangerously close to and sometimes outperforms the HD 3870. What's the point of buying those cards if you aren't gonna turn on anti-aliasing? Of course, if anti-aliasing is off, the HD 3870 will be faster overall. The performance difference between the 9600GT and the HD 4670 is almost the same as the one that existed between the 8800GT and the HD 3870; this time the gap is closer because of the exceptional performance of the HD 4000 architecture when anti-aliasing is on, and the HD 4670 quite often matches the 9600GT's performance.

http://www.anandtech.com/video/showdoc.aspx?i=3405&p=10

http://www.techpowerup.com/rev...ercolor/HD_4670/6.html

http://www.pcgameshardware.com...D-4650/Reviews/?page=7

http://www.guru3d.com/article/...adeon-hd-4670-review/9

http://hothardware.com/Article...The-Mainstream/?page=5
 
The performance difference between the 9600GT and the HD 4670 is almost the same as the one that existed between the 8800GT and the HD 3870; this time the gap is closer because of the exceptional performance of the HD 4000 architecture when anti-aliasing is on, and the HD 4670 quite often matches the 9600GT's performance.

Did you glance at the benches you posted? They support my end of the discussion more than the ones I posted. Will keep it strictly using AA:

Comparing the 9600GT, 4670 and 3850: the 9600GT wins 41 out of the 45 benches on TechPowerUp, finishes second in one and third in three (in Quake 4 the nV cards all choked).

The 3850 wins 4 benches, finishes second 17 times and third 24 times.

The 4670 wins none. That would be one less than one. It finishes second 27 times and third 18 times.

What does Hot Hardware have to say?

9600GT wins every game bench at every setting.

4670 finishes second in all but one, third in that one.

3850 finishes last in all but one, second in that one.

Guru3D actually shows the 4670 winning 3 out of the 28 benches, although those were 3 of the 4 tests they ran without AA. The other 21 benches (for some reason the 8800GTS was used for the Mass Effect numbers), all of them with AA, went to the 9600GT. You may try to cling to those 3 non-AA victories, but the 9600GT even bested the 4850 in four of the benches they had; how often does the 4670 beat the 4850? 😉

Your benches demonstrate it better than mine did: the 4670 is not close to the 9600GT. I can't comment on the AT article, as I would assume they ran the benches, saw the 4670 getting destroyed, and decided against including the 4670's direct competitor in a review about it. If AT weren't extremely biased in protecting the 4670, their review would just have shown what everyone else's did: the 9600GT is without a doubt a superior offering.
 
Originally posted by: Wreckage

The point is, if ATI could have sold the 4870 for $600 they would have.
Yes that's quite true, which is why competition benefits all consumers regardless of which camp they're in. That's the point I'm making.

Transcoding.
Eh? The original PureVideo had transcoding, did it? I think not; this appears to be a goalpost shift on your part.

But yes, the original transcoding version of AVIVO, released in late 2008, was a beta, so it had issues. CPU usage was higher because it only offloaded the motion-estimation portion of the encoding pipeline. On top of this, some converted material had artifacts in some instances.

It's possible newer versions have improved things, but I haven't seen any newer transcoding tests.

From a gaming point of view, gamers will want better features like:
-Transparency AA
-Superior AF
I agree completely.

-Ambient Occlusion
ATi has ambient occlusion too; all nVidia does is allow it to be forced in unsupported games from the control panel. Doing so barely produces any IQ change, produces a large performance hit, and sometimes causes artifacting in games.

-GPU accelerated Physics
Since the last discussion, no new PhysX games have come out. We've still got Mirror's Edge and Cryostasis, the latter of which seems to have dropped off the radar even before it was on it.

With working demonstrations of hardware Havok on OpenCL and nVidia releasing OpenCL support, PhysX looks even more uncertain than before. If developers can get hardware-accelerated Havok physics on any vendor that implements OpenCL, why should they bother with the proprietary PhysX, which only works with nVidia?
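
To make the portability point concrete, here's a minimal host-side sketch (my own toy example, not from any Havok or PhysX code) that enumerates OpenCL platforms and their GPU devices. The exact same code picks up an ATi or an nVidia GPU without changes, which is the whole appeal over a vendor-locked API:

#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms); // any vendor's driver answers

    for (cl_uint i = 0; i < num_platforms; ++i) {
        char name[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME, sizeof(name), name, NULL);

        cl_device_id gpus[8];
        cl_uint num_gpus = 0;
        // CL_DEVICE_TYPE_GPU matches GPUs regardless of who made them
        clGetDeviceIDs(platforms[i], CL_DEVICE_TYPE_GPU, 8, gpus, &num_gpus);
        printf("%s: %u GPU device(s)\n", name, num_gpus);
    }
    return 0;
}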
 
Originally posted by: chizow

I don't disagree with that; however, in the case of the 4870, I think it's plainly obvious ATI badly mispriced that part, due largely to their failed previous product launches.
That's one theory; another is that they could.

Based on their marketing slides it was obvious their approach was a "bang for buck" one, and that they were specifically targeting mid and low end market segments because that's where they thought the most money was made.

It's quite possible the 4xxx manufacturing costs were low enough to allow such pricing at launch and still retain a profit.

AMD fans are going to claim it's because AMD is some altruistic firm that actually cares about their interests.
I don't believe ATI is altruistic at all. They're just like any company whose goal is to make as much profit as possible without breaking any laws.

AMD's method of determining product division P&L is non-GAAP for a reason. They don't allocate any below-the-line deductions or expenses to their individual product divisions, even if those divisions are directly responsible for those charges or impairments.
That may be true, but I still believe ATi's pricing allows that section of the business to remain profitable; otherwise it would've been corrected with newer parts. The 4770 is faster than the 4830, yet it has the same price point. Likewise the 4890 is faster than the 4870, yet it's already dropped in price.

AMD could raise the prices of ATi cards to try to prop up the CPU side, but that could hurt the company overall. The best solution is to get the CPU side of things profitable.
 
Originally posted by: Keysplayr

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
Your links are fine Keys, but they don't show the story of driver progression.

During the R4xx/R5xx days in particular, ATi did a lot of work on improving driver performance and you can see the results in later benchmarks.

Your benchmarks show the cards at launch, but mine show the cards when the next generation had arrived (i.e. the X1800XT figures come from the X1950XTX launch), allowing driver optimizations to be present.

That's why there's such a large performance advantage with ATi parts in my results.
 
Originally posted by: Wreckage
Originally posted by: Just learning
If this is true I think it will really help ATI.

I mean seriously when was the last time they were considered "the best" and "first to market"? Wasn't it in 2003 with the 9800 series cards?

Yeah the R300 launched in 2002 and they have pretty much struggled ever since. After that NVIDIA launched the 6xxx series with SLI, SM3 and Purevideo. ATI pretty much just played catch up.

R300 was first and faster.
R350 was faster still.
The X800 series was slightly late and slower (the XT model didn't come out until later).
X850 series was even later and faster, but lacked SM3.0 right as games started using it.
X1800 was late and way slower.
X1900 was way late and way faster.
HD2900 was late and slower.
HD3850 was on-time(?) and slower, but cheap.
HD4850/4870 gen was early, and better at any price point it existed at.

Except for the x800/x850 and x1800 series, I recall all of ati's cards at least being a better value than nvidia's though.

From what I remember of 2006, ATI had the advantage in games like Oblivion, but Nvidia was still better overall.

Just take a look at this graph with respect to Oblivion: http://www.anandtech.com/video...spx?i=2858&p=7. Then, if you read the conclusion of this article and look at the other games compared, I think it is fair to say Nvidia had the upper hand during this era.

I remember it being a similar situation to the R300/R350 gen. As newer games came out, ATI did comparatively better until Nvidia looked like crap.
From the last round of benchmarks I saw with the 7900 and X1950, the X1950 could have upwards of 50% more performance in the really shader-intensive games. But I think the X1950 was also a massively larger die, and it wasn't on the market long before the 8800 ate its lunch. The X1950 also made the HD2900 look even worse; the HD2900 wasn't much faster, and could even be slower with AA on. The X1800 got stomped by the 7800, though.

I think nvidia is making the far more interesting parts at the moment, though. CUDA is real, PhysX is real, stereoscopic vision is real, and that ambient occlusion thing is real. ATI may have the faster video cards, but nvidia is doing a good job with value-add, in the same way that AMD Overdrive and their media center app add value to the Phenom.

As a programmer/geek, CUDA appeals to me. It's got some interesting demos, and I'd love to see what I can do with it, if only its compiler were just a bit more advanced.
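
For anyone wondering what CUDA code actually looks like, here's a minimal kernel sketch (my own toy example, not one of nVidia's demos). Each GPU thread adds one pair of array elements; it's the "hello world" of the programming model:

// vector_add.cu -- compile with: nvcc vector_add.cu -o vector_add
#include <cstdio>
#include <vector>

// Each CUDA thread handles exactly one array element.
__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> ha(n, 1.0f), hb(n, 2.0f), hc(n);

    // Allocate device buffers and copy the inputs over.
    float *da, *db, *dc;
    cudaMalloc(&da, n * sizeof(float));
    cudaMalloc(&db, n * sizeof(float));
    cudaMalloc(&dc, n * sizeof(float));
    cudaMemcpy(da, ha.data(), n * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256, blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc.data(), dc, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("hc[0] = %f (expect 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    return 0;
}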

PhysX is cool, but still lacks an app that showcases it as anything special. Also, it runs like crap without a dual-video-card setup or a GT200-based card (on a single-card setup, a GTX260 gets 8x the performance of a 9800GTX+ in some PhysX benchmark I saw). Even with GT200, I have a feeling it won't be until GT300 that the single-card PhysX setup really shines. Too bad Vista won't let you use Nvidia cards with ATI. Nvidia should release a separate PhysX driver to allow for that.

Stereoscopic would be cool if I had the hardware for it. Tried red/blue glasses mode, and it was pretty disorienting and not as cool as having color in my games.

Ambient occlusion is cool in that nvidia is adding something else to make old games better. I just wish I could tell the difference. Looking forward to more stuff like this, though; ATI needs to bring back its custom shader support (ASCII and sepia modes were my favs). I'd call this a wash, though, since ATI appears to have better AA modes, and faster ones too.

Next gen of vid cards is likely to be far more favorable to nvidia than this one. I see nvidia winning the high end, and being competitive throughout, albeit perhaps a little late to the mid-range and low-end cards.
 
Originally posted by: BFG10K
Originally posted by: Keysplayr

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
Your links are fine Keys, but they don't show the story of driver progression.

During the R4xx/R5xx days in particular, ATi did a lot of work on improving driver performance and you can see the results in later benchmarks.

Your benchmarks show the cards at launch, but mine show the cards when the next generation had arrived (i.e. the X1800XT figures come from the X1950XTX launch), allowing driver optimizations to be present.

That's why there's such a large performance advantage with ATi parts in my results.

Thank you. I'm glad someone mentioned this. Driver performance really took off after a few months of the card being on the market. While the X1800XT was neck and neck with the 7800GTX 256MB at launch, after a few driver revisions it was about 20% faster on average.
 
Originally posted by: BFG10K
Originally posted by: Keysplayr

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
Your links are fine Keys, but they don't show the story of driver progression.

During the R4xx/R5xx days in particular, ATi did a lot of work on improving driver performance and you can see the results in later benchmarks.

Your benchmarks show the cards at launch, but mine show the cards when the next generation had arrived (i.e. the X1800XT figures come from the X1950XTX launch), allowing driver optimizations to be present.

That's why there's such a large performance advantage with ATi parts in my results.

Ok, but you must mean that ATI actually "caught up" to the 7800GTX via driver improvements? Don't you remember the fun everyone made of the X1800XT at launch?

funnypic 1

funnypic 2

 
Originally posted by: chizow

No not really, even if performance is similar, an Nvidia card is superior in just about every way. I recently put together a nice long list, can't remember if it was for you or someone else. 😉


Even gives you a better chance of testing out the warranty. :laugh:

I cannot figure out Nvidia. They should be in a better financial situation than ATI, yet nothing really new for quite some time. Guess this is why they are revisiting ancient history.

I am expecting a killer DX11 offering, as they should have plenty of resources to develop it. If they come to the party late with DX11, I hope they at least make it worth the wait. :thumbsup:
 
Originally posted by: Keysplayr
Originally posted by: BFG10K
Originally posted by: Keysplayr

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
Your links are fine Keys, but they don't show the story of driver progression.

During the R4xx/R5xx days in particular, ATi did a lot of work on improving driver performance and you can see the results in later benchmarks.

Your benchmarks show the cards at launch, but mine show the cards when the next generation had arrived (i.e. the X1800XT figures come from the X1950XTX launch), allowing driver optimizations to be present.

That's why there's such a large performance advantage with ATi parts in my results.

Ok, but you must mean that ATI actually "caught up" to the 7800GTX via driver improvements? Don't you remember the fun everyone made of the X1800XT at launch?

funnypic 1

funnypic 2

But that actually proves his point further

The problem with the X1800 was its inconsistent performance at launch. In some games it really delivered, but in others it was barely faster than the X850, which is obviously driver fail.

Now if you look at recent reviews, the card even surpasses the paper-launched GTX 512, which was roughly 20% faster than the original 7800 to begin with.

The downside of this, obviously, is that it makes Nvidia drivers look better, since you don't wait a year to buy a card just so it can have stable drivers; you buy it shortly after launch.

Although, for buyers like me, who can't afford to buy top-of-the-line stuff every generation, it's important for a card to last 3 years or so, and in that regard ATI is definitely superior. Maybe it's because of drivers, maybe because of forward-thinking designs, but the fact is that nowadays the X1950 does wipe the floor with the 7900 series.

(None of this is actually on topic but whatever, this thread was derailed long ago anyway)
 
If I remember correctly, another reason the 7800 had superior performance over the X1800 at launch was that most reviewers benched the 7800 with the Quality driver setting instead of High Quality. Most of the later reviews had High Quality enabled to match the image quality of ATI's card, since texture shimmering became a big issue; this, and the 7800/7900's older architecture, caused performance to tank in more recent games. Not to mention ATI's driver improvements.
 
Originally posted by: YEPP
If I remember correctly, another reason the 7800 had superior performance over the X1800 at launch was that most reviewers benched the 7800 with the Quality driver setting instead of High Quality.

Yeah I remember that... did that start with the 7 series or 6 series though?
 
Originally posted by: ShadowOfMyself
Originally posted by: Keysplayr
Originally posted by: BFG10K
Originally posted by: Keysplayr

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
Your links are fine Keys, but they don't show the story of driver progression.

During the R4xx/R5xx days in particular, ATi did a lot of work on improving driver performance and you can see the results in later benchmarks.

Your benchmarks show the cards at launch, but mine show the cards when the next generation had arrived (i.e. the X1800XT figures come from the X1950XTX launch), allowing driver optimizations to be present.

That's why there's such a large performance advantage with ATi parts in my results.

Ok, but you must mean that ATI actually "caught up" to the 7800GTX via driver improvements? Don't you remember the fun everyone made of the X1800XT at launch?

funnypic 1

funnypic 2

But that actually proves his point further

The problem with the X1800 was its inconsistent performance at launch. In some games it really delivered, but in others it was barely faster than the X850, which is obviously driver fail.

Now if you look at recent reviews, the card even surpasses the paper-launched GTX 512, which was roughly 20% faster than the original 7800 to begin with.

The downside of this, obviously, is that it makes Nvidia drivers look better, since you don't wait a year to buy a card just so it can have stable drivers; you buy it shortly after launch.

Although, for buyers like me, who can't afford to buy top-of-the-line stuff every generation, it's important for a card to last 3 years or so, and in that regard ATI is definitely superior. Maybe it's because of drivers, maybe because of forward-thinking designs, but the fact is that nowadays the X1950 does wipe the floor with the 7900 series.

(None of this is actually on topic but whatever, this thread was derailed long ago anyway)

I proved his point further? Ok. Whatever you want to get out of it.
 
Originally posted by: thilan29
Yeah I remember that... did that start with the 7 series or 6 series though?

I have no idea when it started, but some games in the 7900 era didn't like Nvidia's driver optimizations, and the shimmering was bad enough for the reviewers to switch to the High Quality setting.

http://www.legitreviews.com/article/322/4/

Please note that for all tests, excluding 3D Mark 2006, Nvidia image quality settings were set from "Quality" to "High Quality." ATI image quality settings were left at "High Quality" with "High Quality Anisotropic Filtering" enabled in Catalyst Control Panel.

http://www.firingsquad.com/har...rce_7900_gto/page3.asp

Again keep in mind that we're testing the NVIDIA cards with the image quality setting at "High Quality" mode rather than the driver default setting of "Quality". We've noted that the HQ setting significantly reduces the amount of texture shimmering in games such as Battlefield 2. This change does negatively impact NVIDIA's performance, but it's a tweak many NVIDIA users seem to be doing with their own cards so we're doing it too.



 
Originally posted by: Xellos2099
Basically, the new DirectX won't take off when it requires both a new OS and a new GPU.

In the Dave Baumann interview, he stated that DX11 is a superset of DX10 and DX10.1, and that many of the performance benefits, like multithreading, can be seen on DX10 and DX10.1 hardware.

"Additionally we should remember that DirectX 11 is a superset of DirectX 10 and DirectX 10.1, meaning that by including DirectX 10.1 developers are already paving the way to DirectX 11 features and compatibility.

By providing support for DirectX 10.1 we're helping developers be ready for DirectX 11 sooner than if they only limited current development to DirectX 10.

As for DirectX 11, naturally we're happy to see the DirectX API continue to evolve in the manner it has. My feelings are that it offers a sensible evolution of the feature-set capabilities, in line with the directions the IHVs are taking from a hardware perspective and where the ISV's want to go on the software side, whilst also addressing some of the points that were lacking in DirectX 10.

One such element that gets updated in DirectX 11 is that of Display Lists, a new driver model to more effectively multithread graphics workloads over multi-core CPUs, natively within the API. This is something that we know developers have been requesting. The advantage here is that although this is a DirectX 11 API feature, the functionality will move down to DirectX 10 hardware, so all DirectX 10 hardware users that update to the DirectX 11 runtime will get the benefits of this feature."
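
For reference, the "Display Lists" feature he describes became deferred contexts and command lists in the final D3D11 API. A minimal sketch of the pattern (my own illustration, not code from the interview) looks like this:

#include <d3d11.h>

void record_and_submit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    // A deferred context records commands on a worker thread
    // without touching the GPU.
    ID3D11DeviceContext* deferred = NULL;
    device->CreateDeferredContext(0, &deferred);

    // ... issue state changes and draw calls on `deferred` here ...

    // Bake the recorded work into a command list.
    ID3D11CommandList* cmd_list = NULL;
    deferred->FinishCommandList(FALSE, &cmd_list);

    // The render thread replays it on the immediate context.
    immediate->ExecuteCommandList(cmd_list, TRUE);

    cmd_list->Release();
    deferred->Release();
}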

If it requires a new GPU and OS, adoption will be even worse than with Vista, and that's ignoring the fact that Windows 7 is faster and that DX11 will also be available on Vista.

 
Originally posted by: Keysplayr
Originally posted by: BFG10K
Originally posted by: Keysplayr

Your links are nice and all, but why would you ignore mine? The random ones? Care to explain that?
Your links are fine Keys, but they don't show the story of driver progression.

During the R4xx/R5xx days in particular, ATi did a lot of work on improving driver performance and you can see the results in later benchmarks.

Your benchmarks show the cards at launch, but mine show the cards when the next generation had arrived (i.e. the X1800XT figures come from the X1950XTX launch), allowing driver optimizations to be present.

That's why there's such a large performance advantage with ATi parts in my results.

Ok, but you must mean that ATI actually "caught up" to the 7800GTX via driver improvements? Don't you remember the fun everyone made of the X1800XT at launch?

funnypic 1

funnypic 2


nice pics keys... hehehe

I will say for the Nvidia team, if GT300 is anything even close to how it looks, AMD is going to have a big hill to climb next time around, and that's coming from an ATI fan here...

I just hope we see strong single-GPU cards from both sides next time around - and I'm not sure it even matters if AMD is a couple months ahead with its DX11 part; sure, it matters as far as the posts and talk here go, but what counts in the end is the sale$, and few if any games will launch right off.

Will also add (since there's so much driver talk) that I have a current match-up where I'd guess it's all about drivers, and AMD is not winning - in FC2, my GTX 285 at 1920 res is beating my HD 4870X2 with ease.

So in this case, at launch the HD4870X2 beat the GTX 285; as time has passed, at least on my system, Nvidia's drivers (or something) improved enough in this one game to gain the lead.

Personally, again being an ATI fan, I've not really been a big fan of any of the 9.xx Cats - to me it seems AMD is starting to slip in the driver area again - but what do I know.

Also, Nvidia on Vista 64 sucked when Vista was at the point Win7 is today - now Nvidia has learned and has a good driver platform going; I cannot say the same for AMD with Win7 - even the install with the HD 4890 is rough...




 
The INQ are saying now that the GT300 is delayed till 2010. If that is true, then ATI can have the high-end market to itself for many months. And by the time the GT300 does get released, ATI could have a higher-clocked and/or X2 version of its R870 cards.

It will be interesting to see how things play out in the end, and to see how Larrabee fares against the new generation of GPUs from NV and ATI. My guess is even the fastest Larrabee will end up slower (for games at least), but it will be able to do many things a normal GPU can't.

 
My guess is that the "months ahead" portion will be the single-GPU cards, with an X2 version arriving when nVidia's offering shows up.
Happy days if you ask me (which you haven't, but if you had 😛); if you're desperate for a GPU upgrade at launch time, pick up the ATI when it shows up; else wait for possibly even better-performing parts or price drops.
 
Originally posted by: Kuzi
The INQ are saying now that the GT300 is delayed till 2010. If that is true, then ATI can have the high-end market to itself for many months. And by the time the GT300 does get released, ATI could have a higher-clocked and/or X2 version of its R870 cards.

It will be interesting to see how things play out in the end, and to see how Larrabee fares against the new generation of GPUs from NV and ATI. My guess is even the fastest Larrabee will end up slower (for games at least), but it will be able to do many things a normal GPU can't.

Man oh man that is a garbage "article".

He links to HIS OWN SHITTY PREDICTION about when certain cards are going to be released as proof that there is some sort of delay.

Do yourself a favor: don't waste brain cells reading that.
 
Originally posted by: YEPP
Originally posted by: thilan29
Yeah I remember that... did that start with the 7 series or 6 series though?

I have no idea when it started, but some games in the 7900 era didn't like Nvidia's driver optimizations, and the shimmering was bad enough for the reviewers to switch to the High Quality setting.

http://www.legitreviews.com/article/322/4/

Please note that for all tests, excluding 3D Mark 2006, Nvidia image quality settings were set from "Quality" to "High Quality." ATI image quality settings were left at "High Quality" with "High Quality Anisotropic Filtering" enabled in Catalyst Control Panel.

http://www.firingsquad.com/har...rce_7900_gto/page3.asp

Again keep in mind that we're testing the NVIDIA cards with the image quality setting at "High Quality" mode rather than the driver default setting of "Quality". We've noted that the HQ setting significantly reduces the amount of texture shimmering in games such as Battlefield 2. This change does negatively impact NVIDIA's performance, but it's a tweak many NVIDIA users seem to be doing with their own cards so we're doing it too.

ah yes! The old Quality vs HQ issue. Looking forward to Keys defending this one.
 
So, uh, who seriously left that mode at "Quality"? I never, ever had it below "High Quality".

How much of an FPS impact is it, anyway?
 