AMD believes: DirectX 11 Radeons pleasantly fast

Kakkoii

Senior member
Jun 5, 2009
379
0
0
http://www.pcgameshardware.com...-pleasantly-fast/News/

AMD's Evergreen family of DirectX 11-compatible graphics cards is supposed to launch in 2009 - the GPU manufacturer is quite sure that it will be able to offer DirectX 11 cards in time for the release of Microsoft's Windows 7 on October 22.

According to the head of AMD's Developer Relations department, Richard Huddy, AMD expects to surprise customers with the performance of the Evergreen cards. Mr. Huddy said:

"I would say we don't make money by delivering slow hardware. Our expectation is that we'll give you a really pleasant surprise this year when we ship our DX11 hardware."

He then ran several DirectX 11 demos that demonstrated the advantages of hardware tessellation and DirectX 11 compute shaders on appropriate DirectX 11-compatible graphics cards. As was already the case at Computex, AMD concealed the card's real performance - accordingly, the fps values visible on screen are not representative of the final products. The Evergreen card with its dual-slot cooling solution is, according to Huddy, neither an entry-level nor a high-end product - it is meant to sit in the product range above 100 USD.

Below the video interview with Richard Huddy you can find an example of the DirectX 11 Tessellation Wireframe Mode.
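For those wondering what the compute-shader half of those demos amounts to in practice, here is a minimal sketch of a DirectX 11 compute-shader dispatch in C++. It is illustrative only, not code from AMD's demos; the doubling kernel, the buffer size, and the thread-group shape are arbitrary choices for this example.

```cpp
// Minimal sketch of a DirectX 11 compute-shader dispatch (illustrative only).
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>
#pragma comment(lib, "d3d11.lib")
#pragma comment(lib, "d3dcompiler.lib")

// HLSL compute kernel: doubles each float in a read/write buffer.
static const char* kKernel =
    "RWStructuredBuffer<float> data : register(u0);\n"
    "[numthreads(64, 1, 1)]\n"
    "void main(uint3 id : SV_DispatchThreadID) {\n"
    "    data[id.x] *= 2.0f;\n"
    "}\n";

int main() {
    // Ask for a feature-level 11.0 device; this is what requires
    // DX11-class hardware such as the Evergreen parts.
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    D3D_FEATURE_LEVEL fl = D3D_FEATURE_LEVEL_11_0;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 &fl, 1, D3D11_SDK_VERSION, &dev, nullptr, &ctx)))
        return 1;  // no DX11-capable GPU/driver present

    // Compile the kernel against the cs_5_0 profile and create the shader.
    ID3DBlob* bytecode = nullptr;
    if (FAILED(D3DCompile(kKernel, strlen(kKernel), nullptr, nullptr, nullptr,
                          "main", "cs_5_0", 0, 0, &bytecode, nullptr)))
        return 1;
    ID3D11ComputeShader* cs = nullptr;
    dev->CreateComputeShader(bytecode->GetBufferPointer(),
                             bytecode->GetBufferSize(), nullptr, &cs);

    // A 1024-element structured buffer the kernel writes through a UAV.
    D3D11_BUFFER_DESC bd = {};
    bd.ByteWidth = 1024 * sizeof(float);
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.BindFlags = D3D11_BIND_UNORDERED_ACCESS;
    bd.MiscFlags = D3D11_RESOURCE_MISC_BUFFER_STRUCTURED;
    bd.StructureByteStride = sizeof(float);
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&bd, nullptr, &buf);
    ID3D11UnorderedAccessView* uav = nullptr;
    dev->CreateUnorderedAccessView(buf, nullptr, &uav);

    // Bind and launch: 16 groups x 64 threads covers all 1024 elements.
    ctx->CSSetShader(cs, nullptr, 0);
    ctx->CSSetUnorderedAccessViews(0, 1, &uav, nullptr);
    ctx->Dispatch(1024 / 64, 1, 1);
    std::puts("compute shader dispatched");
    return 0;
}
```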
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
I hope this is true, but looking back at AMD's track record, I don't put much stock in their not-so-subtle hints at performance. I'd love to be proven wrong, however.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
i hate pleasant surprises
- in marketspeak it sounds like they are setting us up for a slow GPU

shock me; blow me away .. don't pleasantly surprise me


2 X 4870 performance in 1 GPU would be a nice surprise; like 4870 was
:p

now i am thinking 50% faster - which may be what AMD *wants* Nvidia to think


You used to be able to tell with ATi; not with AMD

 

uclaLabrat

Diamond Member
Aug 2, 2007
5,623
3,025
136
Usually when they have a slow card, they try to hype its performance (2900 series), but when they have a kickass card, they don't say a word (4800 series). I'm a tad worried.
 

Jacen

Member
Feb 21, 2009
177
0
0
Being first out of the gate with DX11 hardware is an obvious selling point, even if few games support it at the time. Considering ATI's success over the past couple of generations, it would take a 2xxx-series-style mistake to botch things up at this point. I am hearing good things about this rumored 5xxx series.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: apoppin
i hate pleasant surprises
- in marketspeak it sounds like they are setting us up for a slow GPU

shock me; blow me away .. don't pleasantly surprise me


2 X 4870 performance in 1 GPU would be a nice surprise; like 4870 was
:p

now i am thinking 50% faster - which may be what AMD *wants* Nvidia to think


You used to be able to tell with ATi; not with AMD

You can't blame either company for keeping their cards close to their chests. I know we'd all like to know now of course. :)

With AMD it's hard to tell if they're about to launch something like the Phenom / Radeon 2900s or something like the A64 / Radeon 4800s... they just don't say much until launch. The only real indicator with them is whether it launches on time; their late products usually end up appearing rushed and incomplete at launch. So my guess is that if the next Radeons end up getting pushed back for some reason, they'll be 'meh'.
 

jandlecack

Senior member
Apr 25, 2009
244
0
0
I don't think there's any doubt that the new series will be a worthwhile upgrade over all current cards, but the unknown is whether they aim to (and can) beat nVidia's new high-end and overall lineup.
 

ochadd

Senior member
May 27, 2004
408
0
76
I don't think AMD is willing to do what it takes to beat them on the high end. Nvidia will unleash an unprofitable card to keep the single-card performance crown while AMD has the common sense not to. I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
"I would say we don't make money by delivering slow hardware. Our expectation is that we'll give you a really pleasant surprise this year when we ship our DX11 hardware."

I don't remember the last time AMD made money.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
If it's not based on a new architecture (as rumors suggest) they will get crushed.

Which I believe will be what happens. AMD has already suggested that they no longer wish to compete at the high end.

I've always thought that AMD was not too interested in video cards at all. Buying ATI was more for things like integrated graphics, chipsets, and a leg up on "Fusion".

 

jandlecack

Senior member
Apr 25, 2009
244
0
0
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
 

GundamSonicZeroX

Platinum Member
Oct 6, 2005
2,100
0
0
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

What about the FX-55?
 

Mogadon

Senior member
Aug 30, 2004
739
0
0
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

I guess you haven't been around too long then ... ;)
 

Paratus

Lifer
Jun 4, 2004
17,406
15,250
146
Originally posted by: Wreckage
If it's not based on a new architecture (as rumors suggest) they will get crushed.

Which I believe will be what happens. AMD has already suggested that they no longer wish to compete at the high end.

I've always thought that AMD was not too interested in video cards at all. Buying ATI was more for things like integrated graphics, chipsets, and a leg up on "Fusion".

Really?

I'm more worried about NV.

From the rumors it sounds like another huge monolithic die with a bunch of the transistor budget going to features that don't support gaming.

It seems NV only wants to improve their GPGPU capability while gaming prowess and cost are secondary.

:shrug:

However, we should probably add "for the price" to AMD's "pleasantly surprised" comment.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: uclaLabrat
Usually when they have a slow card, they try to hype its performance (2900 series), but when they have a kickass card, they don't say a word (4800 series). I'm a tad worried.

NVIDIA has been really quiet. Not sure what that means...

Originally posted by: Jacen
I am hearing good things about this rumored 5xxx series.

Yeah, from AMD/ATI. :)

Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

Not sure if I'd go that far, but yeah, I'd like to see AMD be more competitive. What I'd really like to see is AMD and NVIDIA agreeing on a chipset that will support both Crossfire and SLI. That is a HUGE selling point IMO for X58. Sure, you pay more for your platform, but your video card options/configurations are virtually limitless.

That is nice peace of mind to have heading into an uncertain GPU generation like the one we have now. Say the way to go turns out to be the single big NV chip: you're covered, and you can buy another one when the price drops. On the other hand, if dual ATI cards turn out to be the way to go, again, you're covered. AMD doesn't have a platform that can compete with X58/i7 in either performance or video card flexibility.

Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

Yeah, I don't think this has ever happened. Even when AMD was king with their Athlon 64 X2 while Intel had the horrible Pentium D, SLI was much more mature and more reliable with the nForce 4 SLI chipset than Crossfire with its dongles and crappy ATI chipset (with the ULi SB). Single card was a different story though. An A64 X2 + X1950XTX was good stuff, but the best chipsets were still nForce.

People forget what a good match AMD and NVIDIA made against Intel. NV helped put the Athlon XP on the map with the awesome nForce2 (Asus A7N8X Dlx, anyone?), then they launched A64 S754 with nForce 3, and finally A64 S939, X2, and SLI with nForce 4. Before NV started making chipsets for AMD, all you had was VIA and ALi. I know they are in competition on the GPU side, but damn, I wish they could work together on the platform side of things. I've had a number of really sweet AMD CPU/NV chipset-based rigs. The Intel/nForce boards just never had that magic.

...anyway, I'm totally wandering off topic. :)
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: Wreckage
Which I believe will be what happens. AMD has already suggested that they no longer wish to compete at the high end.

I've always thought that AMD was not too interested in video cards at all. Buying ATI was more for things like Integrated graphics, chipsets and a leg up on "fusion".
And it's a good strategy. I like how my AM2+ motherboard came with relatively good ATI graphics, but jesus christ, guys, at least put some effort into the drivers. Nvidia doesn't always have the best hardware, but they dominate the market because their drivers are always better. I had to put my spare Nvidia card into a server because the Catalyst Control fucks with Remote Desktop.

Originally posted by: Paratus
It seems NV only wants to improve their GPGPU capability while gaming prowess and cost are secondary.
They probably have a good reason. If business software like Photoshop, AutoCAD, and Maya can use GPGPU, there's a lot of money to be made by selling $500 workstation cards. Gamers don't spend that kind of money, but engineers and artists do.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ShawnD1
I had to put my spare Nvidia card into a server because the Catalyst Control fucks with Remote Desktop.

Just curious, why are you installing CCC on a server?
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Originally posted by: nitromullet
Originally posted by: ShawnD1
I had to put my spare Nvidia card into a server because the Catalyst Control fucks with Remote Desktop.

Just curious, why are you installing CCC on a server?

The computer is occasionally used as an extra desktop computer. Even basic things like web browsing are incredibly laggy without installing drivers.
 
Scholzpdx

Apr 20, 2008
10,065
984
126
Originally posted by: Wreckage
If it's not based on a new architecture (as rumors suggest) they will get crushed.

Are you kidding me? Their current architecture is 100% scalable in single-card solutions. Overclocking even the 4890 brings great results. There are no real flaws.

Nvidia really needs to hop on the GDDR5 bandwagon as GDDR3 is ridiculously dated and far less efficient.
 

thilanliyan

Lifer
Jun 21, 2005
12,010
2,231
126
Originally posted by: ShawnD1
The computer is occasionally used as an extra desktop computer. Even basic things like web browsing are incredibly laggy without installing drivers.

Why not install just the driver and something like ATI Tray Tools if you want to control any 3D functions? That's what I've been doing since January and have had no problems with it.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ShawnD1
Originally posted by: nitromullet
Originally posted by: ShawnD1
I had to put my spare Nvidia card into a server because the Catalyst Control fucks with Remote Desktop.

Just curious, why are you installing CCC on a server?

The computer is occasionally used as an extra desktop computer. Even basic things like web browsing are incredibly laggy without installing drivers.

You need drivers, no argument from me there. You don't have to install CCC though. I believe you can omit CCC by selecting a custom install, and I'm sure you can download the drivers without CCC straight from ATI. IIRC, this also negates the .NET requirement.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: Scholzpdx
Originally posted by: Cookie Monster
Originally posted by: Scholzpdx
Nvidia really needs to hop on the GDDR5 bandwagon as GDDR3 is ridiculously dated and far less efficient.

How so?

Bandwidth/power consumption. Easy.

I'm sure that at the time the GT200 cards launched, GDDR3 was also a whole lot less expensive than GDDR5. Given that NV was using a 512-bit bus instead of a 256-bit bus, they could make it work. Although, I imagine that GT300 will use GDDR5.
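For what it's worth, the bandwidth half of the GDDR3-vs-GDDR5 argument is simple arithmetic. A quick sketch using the published GTX 280 and HD 4870 memory figures (treat the exact clocks as approximate):

```cpp
// Rough arithmetic behind the GDDR3-vs-GDDR5 bandwidth point:
// bandwidth (GB/s) = effective rate (MT/s) * bus width (bits) / 8 / 1000
#include <cstdio>

double bandwidth_gbs(double mega_transfers_per_s, int bus_bits) {
    return mega_transfers_per_s * bus_bits / 8.0 / 1000.0;
}

int main() {
    // GTX 280: 1107 MHz GDDR3 (double data rate -> ~2214 MT/s), 512-bit bus
    std::printf("GTX 280 (GDDR3, 512-bit): %.1f GB/s\n",
                bandwidth_gbs(2214.0, 512));   // ~141.7 GB/s
    // HD 4870: 900 MHz GDDR5 (quad data rate -> ~3600 MT/s), 256-bit bus
    std::printf("HD 4870 (GDDR5, 256-bit): %.1f GB/s\n",
                bandwidth_gbs(3600.0, 256));   // ~115.2 GB/s
    return 0;
}
```

So the 512-bit GDDR3 setup actually delivers more raw bandwidth here; the case for GDDR5 is getting comparable bandwidth out of a narrower, cheaper 256-bit bus, which is where the efficiency argument comes from.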
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
A few things to note from the posts in this thread:

Architecture scalability: How is ATI's architecture any more or less scalable than their competitor's?

ATI will get crushed if they don't change their arch: While I don't think they will get crushed, and will probably just double or triple the shaders and implement DX11 support, they still end up with an arch that nobody wants to code for. Even if technically ATI's architecture is superior (subjective) on paper, it won't be realized in real-world apps and games. But then again, we don't know how NV's MIMD arch will perform either. Total wait and see.

NV more focused on GPGPU than on gaming: It would appear that this statement has no teeth. NV has been extremely focused on the CUDA architecture since G80, and has been leading ATI in gaming performance ever since. It doesn't appear they have lost sight of the gaming aspect of their GPUs. Big die, small die, transistor budgets really shouldn't matter. I think that "Idontcare" did a rough calculation of what the die size should be for the rumored transistor count of the GT300 core on 40nm. Correct me if I'm wrong, IDC, but I think you said somewhere around 220-250 mm².
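For anyone curious how that sort of estimate is done, here is a back-of-the-envelope sketch. Every input is an assumption - the GT200 density baseline, the ideal-shrink factor, and a placeholder transistor count (not IDC's actual input) - so the output is only as good as those guesses and won't necessarily match the 220-250 mm² figure quoted above.

```cpp
// Back-of-the-envelope die-size estimate of the kind referenced above.
// Assumptions: GT200 (65 nm) as the density baseline, ideal
// area-scales-with-feature-size^2 shrink to 40 nm, and a purely
// hypothetical rumored transistor count. Real designs rarely shrink
// ideally, which is why such estimates vary so widely.
#include <cstdio>

int main() {
    const double gt200_transistors = 1.4e9;  // GT200, 65 nm
    const double gt200_area_mm2    = 576.0;  // widely reported die size
    const double density_65nm = gt200_transistors / gt200_area_mm2;

    const double shrink = (65.0 / 40.0) * (65.0 / 40.0);  // ideal scaling
    const double density_40nm = density_65nm * shrink;

    const double rumored_transistors = 2.4e9;  // placeholder rumor figure
    std::printf("Estimated GT300 die: %.0f mm^2\n",
                rumored_transistors / density_40nm);   // ~374 mm^2
    return 0;
}
```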

ATI isn't interested in competing in the high end: This is a pleasant yet nonsensical spin on the real statement, "We can't best them, so we'll say we never intended to. Yeah, we'll go with that. And also fellow board members, we have opted to adopt Havok as our Physics method of choice. This will give us about two years to actually create a GPU that can actually be programmed for efficiently and actually be able to run GPU Physics. It saves us the embarrassment of the public actually finding out that we can't run a whole lot more than just games well on our current architecture. Meeting adjourned. Sushi anyone?"

By the way, anyone planning on getting insulted by these comments needs to understand that they are directed at AMD/ATI. Getting personally insulted over it would be kind of silly. Don't let it happen to you. :)