
AMD believes: DirectX 11 Radeons pleasantly fast

Originally posted by: apoppin


I know Dell is still carrying Nvidia - look at their highest end, with GTX 280M hybrid SLI:

http://phx.corporate-ir.net/ph...&ID=1294970&highlight=

Alienware, Dell's premier high-performance PC gaming brand, unveiled today the M17x, with three NVIDIA(R) GeForce(R) graphics processing units (GPUs) to provide an unrivaled visual computing experience. As a launch vehicle for the global expansion of the Alienware brand, Alienware equipped the M17x with a pair of NVIDIA GeForce GTX 280M enthusiast-class GPUs with NVIDIA SLI(R) Technology along with the GeForce 9400M GPU to create the world's most powerful 17-inch notebook.
. . .
Built with the gamer in mind, the M17x pushes technology and design innovation, establishing a precedent for combining best-in-class performance with striking design. Alienware M17x laptops deliver scorching gaming performance by combining the horsepower of two discrete GeForce GTX 280M GPUs with NVIDIA SLI technology, and have the option of switching to a quiet operation mode with NVIDIA HybridPower(TM) technology. In HybridPower operation, the powerful GTX 280M GPUs are powered down and graphics operation is transferred to the GeForce 9400M GPU to save battery life while still delivering great graphics.



:laugh: That laptop is gloriously ridiculous. :thumbsup:
 
Originally posted by: apoppin


thanks for that; but all his comments appear inaccurate -

It was just wishful thinking on his part, or more to the point.... FUD.

Dell, Apple and even Microsoft have made recent announcements using NVIDIA chips. Also according to the link I posted they have gained even more market share. (dispelling many conspiracy theories).

Either way if it's ATI's plan to simply duct tape DX11 onto the 4xxx series, I think it will just about be the death of them.
 
Originally posted by: WelshBloke
Originally posted by: apoppin


I know Dell is still carrying Nvidia - look at their highest end, with GTX 280M hybrid SLI:

http://phx.corporate-ir.net/ph...&ID=1294970&highlight=




:laugh: That laptop is gloriously ridiculous. :thumbsup:

ATI's top dog for laptops is the 4870X2 in the ASUS W90 (I have one), and it puts up a fierce fight against those GTX 280Ms in the AW M17. I've hit 14800 in 3DMark06 with the Q9000 @ 2.8 on stock drivers. ATI's mobility drivers suck, and ATI doesn't support the 4870X2 at all; ASUS had to custom-make theirs.
 
Originally posted by: Wreckage
Either way if it's ATI's plan to simply duct tape DX11 onto the 4xxx series, I think it will just about be the death of them.

How will adding DirectX 11 result in the death of ATI? I have little doubt that adding something - as long as it doesn't result in the loss of something else - is a good thing.

However, if you can convince me otherwise, I'll agree with you ^_^
 
Originally posted by: apoppin
So yes, Evolution's comments were misleading even though his exact comment was accurate (the uber-high-end lappy is ATI only, a certain XPS model)

thanks for that; but all his comments appear inaccurate - if you count the highest end {alienware by Dell} - the ultra end in Notebooks is owned by Nvidia in their NEW lineup:

I know Dell is still carrying Nvidia - look at their highest end, with GTX 280M hybrid SLI:

http://phx.corporate-ir.net/ph...&ID=1294970&highlight=


Ah... I wasn't counting Alienware stuff. Just Dell branded XPS systems.
 
Originally posted by: Henrah
Originally posted by: Wreckage
Either way if it's ATI's plan to simply duct tape DX11 onto the 4xxx series, I think it will just about be the death of them.

How will adding DirectX 11 result in the death of ATI? I have little doubt that adding something - as long as it doesn't result in the loss of something else - is a good thing.

However, if you can convince me otherwise, I'll agree with you ^_^

Well, traditionally you create a new architecture around a new version of DirectX. This has been a hugely successful method for both companies in the past (R300 and DX9, G80 and DX10).

It seems that NVIDIA is going to be using a new architecture while ATI is going to just shoehorn DX11 into its existing architecture. At least according to the latest rumors.

Although if my theory about AMD mostly dropping out of the video card market is true, these rumors fit in nicely with that theory.
 
Originally posted by: Wreckage
Originally posted by: Henrah
Originally posted by: Wreckage
Either way if it's ATI's plan to simply duct tape DX11 onto the 4xxx series, I think it will just about be the death of them.

How will adding DirectX 11 result in the death of ATI? I have little doubt that adding something - as long as it doesn't result in the loss of something else - is a good thing.

However, if you can convince me otherwise, I'll agree with you ^_^

Well, traditionally you create a new architecture around a new version of DirectX. This has been a hugely successful method for both companies in the past (R300 and DX9, G80 and DX10).

It seems that NVIDIA is going to be using a new architecture while ATI is going to just shoehorn DX11 into its existing architecture. At least according to the latest rumors.

Although if my theory about AMD mostly dropping out of the video card market is true, these rumors fit in nicely with that theory.

But they are just rumors. And if current ATI hardware is close enough to the DX11 specs, then it probably wouldn't take a massive rework to make it fully DX11 compliant, right?
I mean, weren't some people mentioning that DX10.1 is only a stone's throw away from DX11? While I don't believe this is true, if ATI can make their current hardware DX11 compliant, I say they should go for it. If it's a mistake, I'm sure we'll all find out soon enough, and likewise if it's a success. If it's a success, they'll probably save themselves a ton of R&D cash that they don't have to spend right now. And if it can get them through the next gen with passable hardware, then I think that's the only choice they have.

I know they are going to move away from their Vec5 shader design, but I don't know when. I mean, they really need to. But the real test of their next-gen hardware will be under Windows 7 and DirectX Compute performance. If they can hold their own there, they'll have it made until next gen, even at cut-rate pricing. At least they will have their sales.
 
Originally posted by: Wreckage
Well, traditionally you create a new architecture around a new version of DirectX. This has been a hugely successful method for both companies in the past (R300 and DX9, G80 and DX10).

It seems that NVIDIA is going to be using a new architecture while ATI is going to just shoehorn DX11 into its existing architecture. At least according to the latest rumors.

Although if my theory about AMD mostly dropping out of the video card market is true, these rumors fit in nicely with that theory.

I think I'll just wait for the benchmarks to come in before deciding a victor. Right now, AMD is producing some of the fastest, cheapest and highest rated video cards they've ever released. From what I've seen in the forums, far more people switched from Nvidia to AMD this generation than those who chose to go to Nvidia from AMD.

And "New architecture" does not automatically equal "winner". We saw that with the FX5800 and 2900XT. Only time will tell whether "shoehorning" DX11 onto an existing card or coming out with a totally new design was the correct approach to take. Nvidia's "bigger is better" outlook to GPU design certainly didn't pan out for them as well as they'd hoped it would. The GT200 was a totally new architecture, yet Nvidia had to drastically slash its prices in order to maintain sales against the much smaller, cheaper AMD 4800 series. The next generation of cards could just as easily repeat this scenario as not.
 
AMD's Vec5 shaders actually aren't bad. Many of them are fully programmable; AMD simply has not dedicated resources to exploiting GPGPU software yet. I'm pretty sure they have one big fully programmable shader surrounded by 3 or 4 smaller less functional shaders. In terms of die space, I'm pretty sure that this setup is actually more efficient than what NV is doing when it comes to gaming performance.

I would actually be surprised to see NV release a DX11 GPU this year. It looks like they are focusing their efforts on GPGPU computing, whereas AMD is concentrating on new DirectX features. This actually follows tradition for both companies: NV has typically gone off on tangents inventing new programming languages, whereas ATi has always stayed close to MS and helped develop DX.
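As an aside, the Vec5 packing tradeoff described above can be sketched with a toy model. This is purely illustrative (the function and numbers are made up, not AMD's actual scheduler): a 5-wide VLIW unit only hits full throughput when the compiler can pack five independent operations into each bundle, which graphics math often allows but scalar-heavy GPGPU code often doesn't.

```python
# Hypothetical sketch: why a 5-wide VLIW shader's efficiency depends on
# how many independent operations the compiler can pack per instruction.

def vliw_utilization(ops_per_bundle, width=5):
    """Fraction of ALU slots doing useful work, given the average
    number of independent ops packed into each VLIW bundle."""
    return ops_per_bundle / width

# A vec4 + scalar workload (e.g. color math) can fill all 5 slots:
print(vliw_utilization(5.0))   # -> 1.0, full utilization
# Mostly-scalar GPGPU code might average ~2 packable ops per bundle:
print(vliw_utilization(2.0))   # -> 0.4, most slots idle
```

Under this (simplified) model, the same die area delivers very different effective throughput depending on workload, which is consistent with the point above: the design can be area-efficient for gaming while underperforming on scalar compute.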
 
Originally posted by: SickBeast
NV typically has gone off on tangents inventing new programming languages whereas ATi always stayed close to MS and helped develop DX.

This must be why the 8800 launched first with DX10 support and why they are still the fastest DX card 😕
 
Originally posted by: Wreckage
Originally posted by: Henrah
Originally posted by: Wreckage
Either way if it's ATI's plan to simply duct tape DX11 onto the 4xxx series, I think it will just about be the death of them.

How will adding DirectX 11 result in the death of ATI? I have little doubt that adding something - as long as it doesn't result in the loss of something else - is a good thing.

However, if you can convince me otherwise, I'll agree with you ^_^

Well, traditionally you create a new architecture around a new version of DirectX. This has been a hugely successful method for both companies in the past (R300 and DX9, G80 and DX10).

It seems that NVIDIA is going to be using a new architecture while ATI is going to just shoehorn DX11 into its existing architecture. At least according to the latest rumors.

Although if my theory about AMD mostly dropping out of the video card market is true, these rumors fit in nicely with that theory.

And as others have pointed out, a new architecture doesn't mean success - the FX series and DX9, the Radeon 2900 and DX10. We really don't know for sure what either AMD or Nvidia has planned for DX11 hardware; I would wait to see what actually happens.

I don't know much about what new features DX11 will add over DX10/10.1, but I know I've seen it mentioned here and other places that DX11 builds off of what DX10.1 already has. So it may not be a giant leap for AMD to just add DX11 capability to similar hardware as they currently have. Again, new hardware architecture sometimes works out great, sometimes not so great.

Seeing as AMD appears to be rushing to be the first with DX11 hardware out the door I don't see them dropping out of the GPU market any time soon. That would seem like a lot of resources being poured into an area you're going to quit on.
 
Originally posted by: SlowSpyder

And as others have pointed out, a new architecture doesn't mean success - the FX series and DX9, the Radeon 2900 and DX10. We really don't know for sure what either AMD or Nvidia has planned for DX11 hardware; I would wait to see what actually happens.
One blip and suddenly that makes a rule??? Nope.
I don't know much about what new features DX11 will add over DX10/10.1, but I know I've seen it mentioned here and other places that DX11 builds off of what DX10.1 already has. So it may not be a giant leap for AMD to just add DX11 capability to similar hardware as they currently have. Again, new hardware architecture sometimes works out great, sometimes not so great.
Yes, but the 4xxx series was slower this round. It's not going to magically leapfrog an entirely new generation of cards.

Seeing as AMD appears to be rushing to be the first with DX11 hardware out the door I don't see them dropping out of the GPU market any time soon. That would seem like a lot of resources being poured into an area you're going to quit on.

What resources? They are using their old card.
 
Originally posted by: Creig
The GT200 was a totally new architecture

No, the GT200 is just the G80/92 architecture on a larger scale, with some extra Cuda features added. It isn't fundamentally different from its predecessors, just bigger and a tad more advanced.
 
Originally posted by: Wreckage
Originally posted by: SlowSpyder

And as others have pointed out, a new architecture doesn't mean success - the FX series and DX9, the Radeon 2900 and DX10. We really don't know for sure what either AMD or Nvidia has planned for DX11 hardware; I would wait to see what actually happens.
One blip and suddenly that makes a rule??? Nope.
I don't know much about what new features DX11 will add over DX10/10.1, but I know I've seen it mentioned here and other places that DX11 builds off of what DX10.1 already has. So it may not be a giant leap for AMD to just add DX11 capability to similar hardware as they currently have. Again, new hardware architecture sometimes works out great, sometimes not so great.
Yes, but the 4xxx series was slower this round. It's not going to magically leapfrog an entirely new generation of cards.

Seeing as AMD appears to be rushing to be the first with DX11 hardware out the door I don't see them dropping out of the GPU market any time soon. That would seem like a lot of resources being poured into an area you're going to quit on.

What resources? They are using their old card.

And nvidia's approach is completely new from the ground up?
 
Originally posted by: Wreckage
Originally posted by: apoppin


thanks for that; but all his comments appear inaccurate -

It was just wishful thinking on his part, or more to the point.... FUD.

Dell, Apple and even Microsoft have made recent announcements using NVIDIA chips. Also according to the link I posted they have gained even more market share. (dispelling many conspiracy theories).

Either way if it's ATI's plan to simply duct tape DX11 onto the 4xxx series, I think it will just about be the death of them.

Yeah, look at Nintendo who duct taped two Gamecubes together and failed miserably.

Anyone who is unbiased (such as Keysplayer, an nVidia Focus Group member) is saying wait and let's see how both performs. The market is littered with examples of what can be considered "inferior" products losing to "superior" products.

To just brand one side as the loser without giving valid or even semi-valid reasons shows bias. The old "they duct taped two products" or "they duct taped a new part onto the product" argument is just spreading FUD. To a degree, any and all new products today can be considered an old product with a new feature slapped on, or just a doubling-up of an existing product.
 
Originally posted by: Scali
No, the GT200 is just the G80/92 architecture on a larger scale, with some extra Cuda features added. It isn't fundamentally different from its predecessors, just bigger and a tad more advanced.

Yeah, if the current architecture is working fine, there's no need to build a new one, especially since DX11 shares more in common with DX10/DX10.1 than with any previous DirectX.

But at least something should change from the current architecture, so it will perform better and be something fresh - not more old rehashes like nVidia's G92 chip; for example, the GTX 280M, aka 8800GTS, aka 8800GT, aka . . . it's sickening.
 
Originally posted by: Wreckage
This is a thread about speculating on AMD's DX11 performance, and yet I'm not supposed to speculate about it.

But we already know your speculations, nvidia rocks and ati sucks without end... (you are entertaining tho) lol
 
Originally posted by: Wreckage
This is a thread about speculating on AMD's DX11 performance, and yet I'm not supposed to speculate about it.

Wreckage, I completely agree with you that if AMD just takes their current 4890 and gives it DX11 capability without any other changes, I don't see them succeeding. But I'm willing to guess that their architecture is going to evolve just like Nvidia's. I don't see AMD sitting around twiddling their thumbs, 'duct taping' DX11 onto the 4890 and calling it a day.
 
Originally posted by: SlowSpyder
Originally posted by: Wreckage
This is a thread about speculating on AMD's DX11 performance, and yet I'm not supposed to speculate about it.

Wreckage, I completely agree with you that if AMD just takes their current 4890 and gives it DX11 capability without any other changes, I don't see them succeeding. But I'm willing to guess that their architecture is going to evolve just like Nvidia's. I don't see AMD sitting around twiddling their thumbs, 'duct taping' DX11 onto the 4890 and calling it a day.


They are going to add shaders, boost clocks, etc. They have to do something to counter nV adding shaders and increasing memory bandwidth with GDDR5 and a wider interface.
 
Originally posted by: evolucion8
Originally posted by: Scali
No, the GT200 is just the G80/92 architecture on a larger scale, with some extra Cuda features added. It isn't fundamentally different from its predecessors, just bigger and a tad more advanced.

Yeah, if the current architecture is working fine, there's no need to build a new one, especially since DX11 shares more in common with DX10/DX10.1 than with any previous DirectX.

But at least something should change from the current architecture, so it will perform better and be something fresh - not more old rehashes like nVidia's G92 chip; for example, the GTX 280M, aka 8800GTS, aka 8800GT, aka . . . it's sickening.

If you get sick of naming conventions for computer hardware, you should probably step outside for a while.
 
Originally posted by: OCguy
Originally posted by: SlowSpyder
Originally posted by: Wreckage
This is a thread about speculating on AMD's DX11 performance, and yet I'm not supposed to speculate about it.

Wreckage, I completely agree with you that if AMD just takes their current 4890 and gives it DX11 capability without any other changes, I don't see them succeeding. But I'm willing to guess that their architecture is going to evolve just like Nvidia's. I don't see AMD sitting around twiddling their thumbs, 'duct taping' DX11 onto the 4890 and calling it a day.


They are going to add shaders, boost clocks, etc. They have to do something to counter nV adding shaders and increasing memory bandwidth with GDDR5 and a wider interface.

We can't say for sure what Nvidia is going to have for memory bandwidth. If the rumors are correct, it could very well be similar to the Radeon 2900: more bandwidth than it actually needs, which only marginally adds to performance. Having 300 GB/s of bandwidth is cool, but who knows, maybe that's well over what it'll ever actually need. Or it could be that it uses every last bit it can get. We just don't know, but we can't say for certain that it adds performance significantly. <shrug>

Rumors are that Nvidia is doubling their shader count. I wouldn't put any faith in the rumors we've heard about AMD's upcoming specs. The last rumors we heard about an upcoming architecture for AMD had the 4870 at 480 SPs.
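For what it's worth, the peak-bandwidth figures being thrown around follow from simple arithmetic: bus width times per-pin data rate. A hedged sketch (the bus widths and data rates below are hypothetical examples, not confirmed specs for any card):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.

def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a given bus width (bits)
    and per-pin data rate (Gbit/s per pin)."""
    return bus_width_bits * data_rate_gbps / 8

# e.g. a hypothetical 512-bit bus with 4.0 Gbps GDDR5:
print(bandwidth_gb_s(512, 4.0))  # -> 256.0 GB/s
# vs a 256-bit bus at the same data rate:
print(bandwidth_gb_s(256, 4.0))  # -> 128.0 GB/s
```

This is why "wider interface plus GDDR5" rumors imply such large peak numbers, and also why peak bandwidth alone says nothing about whether the chip can actually use it.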
 
Originally posted by: solofly
Originally posted by: Wreckage
This is a thread about speculating on AMD's DX11 performance, and yet I'm not supposed to speculate about it.

But we already know your speculations, nvidia rocks and ati sucks without end... (you are entertaining tho) lol

And yours would be the exact opposite. So what's the point?
 
Originally posted by: OCguy

They are going to add shaders, boost clocks, etc. They have to do something to counter nV adding shaders and increasing memory bandwidth with GDDR5 and a wider interface.

Don't forget MIMD. I have a feeling this will be the G80 launch all over again.
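A quick toy model of why MIMD rumors generate this kind of excitement (purely illustrative; the cost functions and cycle counts are made up, not any vendor's real scheduler): a SIMD group that hits a divergent branch has to execute both sides serially, while fully independent (MIMD-style) lanes each run only their own path.

```python
# Illustrative cost model for branch divergence: SIMD vs. MIMD lanes.

def simd_cycles(lane_takes_branch, then_cost, else_cost):
    """Cycles for one SIMD group: pay for every path any lane takes."""
    cost = 0
    if any(lane_takes_branch):       # at least one lane takes the branch
        cost += then_cost
    if not all(lane_takes_branch):   # at least one lane does not
        cost += else_cost
    return cost

def mimd_cycles(lane_takes_branch, then_cost, else_cost):
    """Cycles if every lane ran independently: slowest lane dominates."""
    return max(then_cost if taken else else_cost
               for taken in lane_takes_branch)

lanes = [True, False, True, True]    # a divergent branch
print(simd_cycles(lanes, 10, 10))    # -> 20: both paths executed serially
print(mimd_cycles(lanes, 10, 10))    # -> 10: each lane runs one path
```

Of course, real hardware adds scheduling and area costs that this sketch ignores, which is exactly why it's worth waiting for benchmarks rather than extrapolating from architecture rumors.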
 
Originally posted by: Keysplayr
Originally posted by: solofly
Originally posted by: Wreckage
This is a thread about speculating on AMD's DX11 performance, and yet I'm not supposed to speculate about it.

But we already know your speculations, nvidia rocks and ati sucks without end... (you are entertaining tho) lol

And yours would be the exact opposite. So what's the point?

How many ATI cards did Wreckage possess? And ask me how many NV cards I bought in the last 10 years or so... (chances are, probably more than you).
I have a right to speak my mind since I pay big bucks for it...

I don't open my mouth without a reason...
 