AMD beliefs: DirectX 11 Radeons pleasantly fast


Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Paratus


It seems Nv only wants to improve their GPGPU capability while gaming prowess and cost are secondary.

Secondary? They have the gaming performance crown. :confused:

 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Originally posted by: Keysplayr
A few things to note from the posts in this thread:

Architecture scalability: How is ATI's architecture any more or less scalable than their competitors'?

ATI will get crushed if they don't change their arch: While I don't think they will get crushed (they will probably just double or triple the shaders and implement DX11 support), they still end up with an arch that nobody wants to code for. Even if technically ATI's architecture is superior (subjective) on paper, it won't be realised in real world apps and games. But then again, we don't know how NV's MIMD arch will perform either. Total wait and see.

NV more focused on GPGPU than on gaming: It would appear that this statement has no teeth. NV has been extremely focused on the CUDA architecture since G80, and has been leading ATI in gaming performance ever since. Doesn't appear they have lost sight of the gaming aspect of their GPUs. Big die, small die, transistor budgets really shouldn't matter. I think that "Idontcare" did a rough calculation of what the die size should be for the number of rumored transistors of the GT300 core based on 40nm. Correct me if I'm wrong IDC, but I think you said somewhere around 220-250 mm2.

ATI isn't interested in competing in the high end: This is a pleasant yet nonsensical spin on the real statement, "We can't best them, so we'll say we never intended to. Yeah, we'll go with that. And also fellow board members, we have opted to adopt Havok as our Physics method of choice. This will give us about two years to actually create a GPU that can actually be programmed for efficiently and actually be able to run GPU Physics. It saves us the embarrassment of the public actually finding out that we can't run a whole lot more than just games well on our current architecture. Meeting adjourned. Sushi anyone?"

By the way, anyone planning on getting insulted by these comments needs to understand that they are directed at AMD/ATI. Getting personally insulted over it would be kind of silly. Don't let it happen to you. :)

I don't know about them getting crushed. Everyone keeps harping on AMD's arch, but is it really 'that' bad for gaming? Sure, they have 800 shaders to compete with Nvidia's 216-240 shaders, but they also have a huge performance per mm2 of silicon advantage right now. It'd probably be a lot easier for AMD to make a 2400 shader card than for Nvidia to make a 720 shader card based on the current GPU sizes (just multiplying each current arch by 3x here). If AMD pulls something like that off on 40nm (not saying they will), then I think they can be more than competitive and probably still have a smaller GPU. <- Speculating.
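As a very naive back-of-envelope (a sketch only, assuming die area scales linearly with shader count and ideally with the process shrink, and using the commonly cited approximate die sizes of 256 mm2 for RV770 and 470 mm2 for GT200b, both at 55nm):

# Naive scaling sketch: triple the shader count, then shrink 55nm -> 40nm.
# Die sizes are commonly cited approximations, not official figures.
def scaled_area(die_mm2, shader_multiplier=3.0, old_node=55.0, new_node=40.0):
    shrink = (new_node / old_node) ** 2  # ideal area scaling with the node
    return die_mm2 * shader_multiplier * shrink

print("RV770 x3 at 40nm:  ~%.0f mm2" % scaled_area(256))  # ~406 mm2
print("GT200b x3 at 40nm: ~%.0f mm2" % scaled_area(470))  # ~746 mm2

Real chips would not scale anywhere near this cleanly (shaders are only part of the die), but it illustrates the relative-size argument.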
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
I got one thing from your post:

Originally posted by: Keysplayr
A few things to note from the posts in this thread:

Total wait and see.

yes, agreed



 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.

Rumours say that GT300 will be GDDR5 AND 512-bit, which would amount to roughly twice the bandwidth of today's high-end cards.
 

Henrah

Member
Jun 8, 2009
49
0
0
Originally posted by: Scali
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.

Rumours say that GT300 will be GDDR5 AND 512-bit, which would amount to roughly twice the bandwidth of today's high-end cards.

Linky please ^_^
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
Originally posted by: Wreckage

If it's not based on a new architecture (as rumors suggest) they will get crushed.
You really can't make that inference. Remember the 5800 Ultra?

Or how about the 2900? ATi got back in the game by refining it into the 4xxx series, not by shipping a brand new architecture.
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: Keysplayr
NV more focused on GPGPU than on gaming: It would appear that this statement has no teeth.

If you ask me, nVidia's main problem has been poor timing (or perhaps they were deliberately taking it slow to avoid risks that AMD was willing to take).
nVidia has been late with the move to 55 nm, and has not adopted GDDR5 yet either. They're still late with 40 nm, although the difference with AMD doesn't seem to be that big this time (we have yet to see nVidia's 40 nm parts actually on the market, though).
And nVidia was also late with the adoption of DX10.1.

It could be that nVidia just was too far along in the design trajectory to incorporate these technologies as they became available... or perhaps they just 'played it safe'.
At any rate, they let AMD close the gap again, which AMD probably couldn't have done if nVidia also had DX10.1, GDDR5 and 55/40 nm at the same time as AMD.

But, new DX version, new architectures, new chances. Wait and see.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Henrah
Originally posted by: Scali
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.

Rumours say that GT300 will be GDDR5 AND 512-bit, which would amount to roughly twice the bandwidth of today's high-end cards.

Linky please ^_^

http://forums.anandtech.com/me...=2317475&enterthread=y


oh .. you mean RUMORS besides here :p

http://www.hardware-infos.com/news.php?news=2930

http://www.brightsideofnews.co...cated-controllers.aspx

. . .nVidia decided to stick with 512-bit memory controller for the GT300 architecture. If that unconfirmed information is true, this would mean that nVidia has a playing field of eight 32-bit memory controllers connecting to multiple 32-core clusters for a total of 512-cores . . .

512-bit GPU connected to 1-2 GB of GDDR5 memory at 1000 MHz QDR [Quad Data Rate]- we are looking at memory bandwidth of 256GB/s per single GPU, more than dual-GT200b GPU in the form of GeForce GTX295 [896-bit x 1GHz DDR = 224 GB/s]. That would be a conservative estimate, but given the development of GDDR5 memory, if the GT300 chip ends up connected to GDDR5 memory at 1050 or 1150 [alleged 5870/5870X2 clocks] - we are looking at memory bandwidth anywhere between 268.8-294.4 GB/s.
. . .
512-bit GPU with 512 cores? If these numbers are true, we're looking at one beast of a GPU.
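Those bandwidth figures check out with the usual formula, bus width in bytes times effective transfer rate (a quick sketch; the clocks are the rumored ones from the article, not confirmed specs):

# Peak memory bandwidth = bus width (bytes) * effective transfer rate.
def bandwidth_gbs(bus_bits, mem_clock_mhz, transfers_per_clock):
    effective_mts = mem_clock_mhz * transfers_per_clock   # million transfers/s
    return (bus_bits / 8.0) * effective_mts / 1000.0      # GB/s

print(bandwidth_gbs(512, 1000, 4))  # rumored GT300: 512-bit GDDR5, 1000 MHz QDR -> 256.0 GB/s
print(bandwidth_gbs(896, 1000, 2))  # GTX 295: 2 x 448-bit GDDR3, ~1000 MHz DDR -> 224.0 GB/s
print(bandwidth_gbs(512, 1150, 4))  # upper rumored GDDR5 clock -> 294.4 GB/s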
 

jandlecack

Senior member
Apr 25, 2009
244
0
0
Originally posted by: GundamSonicZeroX
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

What about the FX-55?

Are you just saying that ATi may have had a better generation than nVidia and at the same time AMD did too?

Because back then that wasn't at all related to ties between AMD and ATi, and it still isn't even if it happens nowadays. It's coincidence at most.

Plus, whichever company has the fastest CPU on the market will make the better combo with the best GPU on the market. At the moment that would be Intel, and it doesn't matter if the GPU comes from ATi or nVidia, because AMD is smart enough not to alienate the Intel userbase by releasing a card that purposely performs better with AMD systems.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I see much the same as this generation happening again.

ATI will release a cheaper little chip with 50% more of everything at higher clocks. It will prove great for gaming, but a bit weak for GPU compute.

Nvidia will produce a bigger, more expensive chip that's even faster than ATI's. Part of the reason the chips are so big is the extra emphasis on GPU compute - these things will be folding monsters.

Fans of both will sing their praises; cards will be priced competitively against each other, so market share basically stays about the same.

The difference might be that GPU compute will start to get used more and more - ATI are well behind here and are going to be playing catch-up, which is going to be painful for them.

Nvidia also might take their eye off the ball somewhat as they get distracted by Tegra - both in smartphones and Google Chrome OS netbooks - with lots of aggressive marketing to try to get any CPU-intensive software running on these things to use CUDA and hence make good use of the Tegra GPU.
 

alyarb

Platinum Member
Jan 25, 2009
2,425
0
76
You guys speak as if CUDA will take center stage next year rather than OpenCL. If you are right, then the Radeon is disadvantaged compute-wise. If you are wrong, then everything will be fine.

You can't extrapolate flops per area (despite AMD's large advantage here) from our current generation, because AMD and NV are reinventing their shaders for DX11. DX11 isn't merely about tessellation and supporting upcoming games. It is a generational hop like DX10, and the big thing is the new shader model/compute shader and open-platform GPGPU.

AMD's shader currently allows them higher "best case scenario" compute performance. The Radeon right now is not as great at thread management, and dependencies will slow it down far more than they do the GeForce. I think we can expect reconsiderations on this from both parties for the next generation. I highly doubt AMD is going to squeeze in more shaders and call it a day; DX10.1 does not architecturally segue them into DX11. But the way this thread is going, it sounds like that's what you all expect.
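For a rough sense of that flops-per-area gap, here is a quick sketch using commonly cited peak single-precision numbers and approximate die sizes for the current parts (ballpark figures, not exact specs):

# Peak GFLOPS = shaders * flops per shader per clock * shader clock (GHz).
def peak_and_density(shaders, flops_per_clock, clock_ghz, die_mm2):
    gflops = shaders * flops_per_clock * clock_ghz
    return gflops, gflops / die_mm2

# RV770 (HD 4870): 800 SPs, MAD = 2 flops/clock, 750 MHz core, ~256 mm2 die.
print(peak_and_density(800, 2, 0.750, 256))   # ~1200 GFLOPS, ~4.7 GFLOPS/mm2
# GT200 (GTX 280): 240 SPs, MAD+MUL = 3 flops/clock, 1296 MHz shader clock, ~576 mm2 die.
print(peak_and_density(240, 3, 1.296, 576))   # ~933 GFLOPS, ~1.6 GFLOPS/mm2

Again, these are peak figures only; sustained throughput is a different story once dependencies and thread management come into play.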
 

Scali

Banned
Dec 3, 2004
2,495
1
0
Originally posted by: alyarb
you guys speak as if CUDA will take center stage next year rather than OpenCL.

That's a common misconception...
People think Cuda and OpenCL are mutually exclusive.
In reality, Cuda means: Compute Unified Device Architecture.

http://en.wikipedia.org/wiki/CUDA
http://www.nvidia.com/object/cuda_what_is.html
http://developer.download.nvid...hitecture_Overview.pdf

In short, Cuda is the name for the hardware and driver set, and OpenCL and DirectX Compute run on top of Cuda.
So yes, if the Cuda architecture is superior to AMD's GPGPU architecture, it will be an advantage to nVidia, even with OpenCL and DirectX Compute.
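You can see that layering directly if you enumerate the OpenCL platforms on an nVidia box; a small sketch using the pyopencl bindings (assuming pyopencl and an OpenCL-capable driver are installed), where the platform typically reports itself as "NVIDIA CUDA":

import pyopencl as cl

# List every OpenCL platform the installed drivers expose, with its devices.
for platform in cl.get_platforms():
    print(platform.name, "-", platform.vendor)
    for device in platform.get_devices():
        print("   ", device.name)

# On a GeForce system the platform name is typically "NVIDIA CUDA",
# i.e. OpenCL is exposed as a layer on top of the CUDA driver stack.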
 

ochadd

Senior member
May 27, 2004
408
0
76
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.
 

jandlecack

Senior member
Apr 25, 2009
244
0
0
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.

As I said, that was because AMD had the upper hand in CPUs at the time. It had nothing to do with AMD and ATI actively pursuing anything. The next generation, it could have gone either way and AMD + nVidia could be the better combo.

It's always just a matter of whether Intel or AMD and ATI or nVidia have the faster chips at any given point, and AMD hasn't had that in a long time.
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
Originally posted by: jandlecack
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.

As I said, that was because AMD had the upper hand in CPUs at the time. It had nothing to do with AMD and ATI actively pursuing anything. The next generation, it could have gone either way and AMD + nVidia could be the better combo.

It's always just a matter of whether Intel or AMD and ATI or nVidia have the faster chips at any given point, and AMD hasn't had that in a long time.


When the 9700 came out, I believe Intel was on the Pentium 4B series (Northwood?) and it was before AMD started overtaking Intel with the A64. The XP series were good at gaming but were a bit behind Intel, and those came after the 9700.

Now, the 9700/9800 ran roughshod over Nvidia. Initially people thought the FX was competitive with the 9 series, but then there was a scandal about NVIDIA rigging tests and cheating in games.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.

I doubt you were running an ATI or AMD chipset though. The best performing chipset for AMD at that time was nForce 2:

http://www.anandtech.com/mb/showdoc.aspx?i=1872&p=10
 

ochadd

Senior member
May 27, 2004
408
0
76
Originally posted by: nitromullet
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.

I doubt you were running an ATI or AMD chipset though. The best performing chipset for AMD at that time was nForce 2:

http://www.anandtech.com/mb/showdoc.aspx?i=1872&p=10

I was probably running my old Asus A7v333 but don't recall. Via? My goodness people are relentless. Only wishing AMD CPUs and ATI GPUs were the top dogs, that's all I'm getting at.
 

jandlecack

Senior member
Apr 25, 2009
244
0
0
Originally posted by: ochadd
Originally posted by: nitromullet
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.

When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.

I doubt you were running an ATI or AMD chipset though. The best performing chipset for AMD at that time was nForce 2:

http://www.anandtech.com/mb/showdoc.aspx?i=1872&p=10

I was probably running my old Asus A7v333 but don't recall. Via? My goodness people are relentless. Only wishing AMD CPUs and ATI GPUs were the top dogs, that's all I'm getting at.

I'm not saying they weren't, but any connection between AMD and ATI at the time was coincidence, and it still is today. One can't afford to alienate another market; that's why the best GPU will always perform better with the stronger CPU. The new ATI card won't perform better with a PH2 than it will with a Core i7.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: ochadd
I don't think AMD is willing to do what it takes to beat them on the high end. Nvidia will unleash an unprofitable card to keep the single card performance crown while AMD has the common sense not to. I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.

I must be too young to understand where this die-hard hate for anything not AMD comes from.


nV will have margins that they must meet for GT300. I don't think they will intentionally release an "unprofitable card."


As far as the monolithic garbage goes, isn't the die on the 5xxx supposed to be the same size as the one on GT200?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: ochadd
Originally posted by: nitromullet

I was probably running my old Asus A7v333 but don't recall. Via? My goodness people are relentless. Only wishing AMD CPUs and ATI GPUs were the top dogs, that's all I'm getting at.

It's not a matter of being relentless; it's more that (IMO) part of the reason for the current lack of success of AMD CPUs is that AMD just wanted to be a CPU maker, while Intel went after the whole platform. Aside from gaming, Intel is killing AMD in the laptop segment with Centrino. NV didn't help their laptop case any with the recent snafu where their mobile chips were apparently cooking themselves to death.

Platform is the war between Intel, NV, and AMD right now. Graphics, CPU, and chipset individually are just pieces of the puzzle. Ultimately, it's control of the dominant platform that is the major concern. It's why we have NV focusing on CUDA (maybe they think it can surpass x86 some day), Intel working on Larrabee, and why AMD bought ATI in the first place. At the rate we're going now, Intel is set to win, even if Larrabee isn't as good as ATI/NV's competition, because right now they have the best two out of the three. Nothing suits Intel better than to have NV and AMD squabble over graphics while they dominate the CPU/chipset arena and continue to work on their GPU. AMD and NV really needed to be together on this, but with the purchase of ATI, AMD made that a slim chance.

So, sorry if I came off as attacking you. It wasn't my intention. Just discussing stuff.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
About the CCC thing, I'll try the custom install next time I'm dicking with the computer. Thanks for the suggestion.

Originally posted by: Stoneburner
When the 9700 came out, I believe Intel was on the Pentium 4B series (Northwood?) and it was before AMD started overtaking Intel with the A64. The XP series were good at gaming but were a bit behind Intel, and those came after the 9700.

IIRC, AMD never had a clear performance advantage after the Pentium 4B. AMD products were competitive because they were cheap. While the Pentium 4 was sometimes faster, it was always ridiculously expensive. I remember buying an Athlon 1700+ because it was at least a hundred dollars cheaper than the Intel equivalent. This would be similar to something like the Phenom II 955 against the Intel i7 920 as competing DDR3 platforms. The i7 is hands down a better processor, but I would probably buy the AMD platform for $100 less.

Now, the 9700/9800 ran roughshod over Nvidia. Initially people thought the FX was competitive with the 9 series, but then there was a scandal about NVIDIA rigging tests and cheating in games.
Even the 9 series wasn't a clear win for ATI. They had better D3D performance, but Nvidia had the advantage in OpenGL. Lots of games like Doom 3, Call of Duty, Jedi Outcast, Jedi Academy, and Wolfenstein used OpenGL, so there was still a reason to buy Nvidia products. I had a Radeon 9600XT at the time and it had some major problems with Doom 3.

I must be too young to understand where this die-hard hate for anything not AMD comes from.
AMD is the only company keeping Intel and Nvidia in check. Without AMD, we wouldn't have $50 Intel processors or $100 Nvidia graphics cards.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
Originally posted by: ShawnD1
Originally posted by: nitromullet
Originally posted by: ShawnD1
I had to put my spare Nvidia card into a server because the Catalyst Control fucks with Remote Desktop.

Just curious, why are you installing CCC on a server?

The computer is occasionally used as an extra desktop computer. Even basic things like web browsing are incredibly laggy without installing drivers.

What he (probably) means is that the Catalyst Control Center is optional; the drivers can be installed without CCC.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: BFG10K
Originally posted by: Wreckage

If it's not based on a new architecture (as rumors suggest) they will get crushed.
You really can't make that inference. Remember the 5800 Ultra?

Or how about the 2900? ATi got back in the game by refining it into the 4xxx series, not by shipping a brand new architecture.

I remember the 8800, seemed pretty good. Nvidia only really had one bad launch and it actually managed to recover during that generation fairly well.

ATI's track record has not been as stellar, and the 4800 could not take the crown from NVIDIA's old architecture, so it seems logical that it won't stand a chance against a new one.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: Wreckage
Originally posted by: BFG10K
Originally posted by: Wreckage

If it's not based on a new architecture (as rumors suggest) they will get crushed.
You really can't make that inference. Remember the 5800 Ultra?

Or how about the 2900? ATi got back in the game by refining it into the 4xxx series, not by shipping a brand new architecture.

I remember the 8800, seemed pretty good. Nvidia only really had one bad launch and it actually managed to recover during that generation fairly well.

ATI's track record has not been as stellar, and the 4800 could not take the crown from NVIDIA's old architecture, so it seems logical that it won't stand a chance against a new one.

Oh really?

In that generation, it was 9800 XT > 9800 Pro > 9800 non-Pro = 9700 Pro > 9700 non-Pro > 5950 Ultra > 5900 Ultra > 5800 Ultra. This is for DX8 and lower games. In DX9 it is not even remotely close; it's like a generation of difference.

The 9600 Pro was faster than everything below that. In DirectX 9 games, the 9600 Pro was faster than the 5950 Ultra.

Doesn't sound like Nvidia managed to recover very well.

In the next generation, the X800 cards were consistently faster than the 6800 cards. In the generation after that, the 7800s were faster than the X1800s, but the X1900s were faster than the 7800s and 7900s. Then came ATi's poor launch, the HD 2900. The 8800s beat those soundly. Then the HD 3800s came out, and while they didn't take the performance crown, they were a good value and good performers. Then the GTX 200 series and the 4800 series came out, and we all know how poorly that turned out for Nvidia.