"AMD’s next-generation family of high-performance graphics cards is expected to ship

Page 10 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

RussianSensation

Elite Member
Sep 5, 2003
Can't estimate performance improvements on Tahiti by comparing its specs with those of Cayman, because Tahiti stream processors and Cayman stream processors will likely be different (Graphics Core Next and all that).

What about the doubling of the ROP count? Doesn't that usually make a huge difference?

Agreed with what you said. Since Frostbite 2.0 will be a foundation for some future games (Need for Speed: The Run, Mirror's Edge 2), we'll definitely need a lot more powerful cards soon. I just hope there is no FX5800 or HD2900 flop from either brand. The last 3 rounds were very competitive with excellent offerings from both camps (HD4800 vs. GTX200 series, HD5800 series vs. GTX400 series and HD6900 series vs. GTX500 series). I am hoping this continues.
 
Feb 19, 2009
Also, I would NEVER take anything AMD says as set in stone, let alone unconfirmed leaks. If the BD launch proved anything, it's that AMD will lie to its customers and has no problem hanging its marketing department out to dry to take the heat for it.

AMD also lies to their AIBs, with different briefs to different partners. They even send them different BIOSes in engineering samples, so they come out with different numbers and performance.

Taking a cautious approach as suggested by Keys is a good idea.

SA definitely has insiders at AMD (not just insiders at AIBs, a different scenario). If Charlie confirms the Rambus VRAM rumor is rubbish, then I'm inclined to lean that way also.
 
Feb 19, 2009
Agreed with what you said. Since Frostbite 2.0 will be a foundation for some future games (Need for Speed: The Run, Mirror's Edge 2), we'll definitely need a lot more powerful cards soon.

I am actually very impressed with BF3; its performance-to-visuals ratio is amazing. Run everything on Ultra, disable MSAA, and you can have a smooth 50-60 fps at 1080p on a single GPU.

Moving forward, games on Frostbite 2 should have similar characteristics.

Crysis 2, however, runs like a dog relative to its visuals. So hardware is important, but efficient game engines are even more important.
 

AtenRa

Lifer
Feb 2, 2009
You clearly haven't been reading, then. LP (Low-Power) and HPM (High-Performance Mobile) are the ones that will be used for mobiles, while HPL and HP are for desktops.

http://www.tsmc.com/english/dedicatedFoundry/technology/28nm.htm

The 28nm Process Family

TSMC's 28nm technology delivers twice the gate density of the 40nm process and also features an SRAM cell size shrink of 50 percent.

The low power (LP) process is the first available 28nm technology. It is ideal for low standby power applications such as cellular baseband. The 28LP process boasts a 20 percent speed improvement over the 40LP process at the same leakage/gate.

The 28nm high performance (HP) process is the first option to use high-k metal gate process technology. Featuring superior speed and performance, the 28HP process targets CPU, GPU, FPGA, PC, networking, and consumer electronics applications. The 28HP process supports a 45 percent speed improvement over the 40G process at the same leakage/gate.

The 28nm low power with high-k metal gates (HPL) technology adopts the same gate stack as the HP technology while meeting more stringent low leakage requirements with a trade-off in performance speed. With a wide leakage and performance spectrum, N28HPL is best suited for cellular baseband, application processors, wireless connectivity, and programmable logic. The 28HPL process reduces both standby and operating power by more than 40%.

TSMC also provides high performance for mobile applications (HPM) technology to address the needs of applications requiring high speed as well as low leakage power. This technology provides better speed than 28HP and similar leakage power to 28LP. With such wide performance/leakage coverage, 28HPM is also ideal for many applications, from networking and tablets to mobile consumer products.

It is clear that HPL is not suited for high-performance desktop chips and that only HP will be used for desktop.
 

tviceman

Diamond Member
Mar 25, 2008
Hmm, who do I believe more? A leaked roadmap from AMD, or some guy with unfounded skepticism?

Do YOU want to place a bet? You know you'll lose. These are leaked specs from AMD.

Yes, I'd love to bet you that those specs are not accurate and will differ in the number of cores, memory bus, memory type, memory bandwidth, and/or operating core frequency.

I will put up a $20 Steam game. Care to match?
 

Skurge

Diamond Member
Aug 17, 2009
There may be a lesson learned here! Even with statements from two of the most prominent AMD graphics division employees and comments from executives -- this is why it is wise to treat forward-looking statements with caution.

So I guess if you believe that the article is accurate about the delay (not saying that you are), then you should also believe this:

The good news is that the wait might be worth it, with the new high-end single GPU offering performance levels close to the HD 6990 dual-GPU flagship card.
 

bryanW1995

Lifer
May 22, 2007
There may be a lesson learned here! Even with statements from two of the most prominent AMD graphics division employees and comments from executives -- this is why it is wise to treat forward-looking statements with caution.

There is just a ton of speculation going on right now. As they mentioned in the report, there are other factors involved here. Apparently in this case it is TSMC's fault (as usual) for the delay.

The best thing for the AMD graphics division would be to get spun back off (ATI v2.0, maybe?) and start using Intel as their fab. Clearly TSMC and GF just can't handle it.

The good news is that the wait might be worth it, with the new high-end single GPU offering performance levels close to the HD 6990 dual-GPU flagship card.

They just used FUD-like terms: "the wait might be worth it" means that the wait might also NOT be worth it. If they had true inside info on this, they would have said something more like: "the new high-end single GPU from AMD will be extremely close to the performance of the 6990".
 

apoppin

Lifer
Mar 9, 2000
AMD also lies to their AIBs, with different briefs to different partners. They even send them different BIOSes in engineering samples, so they come out with different numbers and performance.
How do you know they're lying?

Clearly different BIOSes are sent out because they each need testing. And the briefing to Sapphire (a manufacturing partner) would not be the same as the one to Axle (a slap-on-a-sticker AIB vendor). What is your point there?

And the brief to Sapphire would be sent long before the one to Axle (for example), during which time features (including specs) may change.
:\

I do not see why anyone is surprised; back in June on the ABT forum we declared that there would be no 28nm GPUs until 2012, certainly not high end.
:whistle:
http://alienbabeltech.com/abt/viewtopic.php?f=6&t=23192
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
OK, well, you keep believing the leaks and I'll wait for an official announcement then. You should probably look up all the "leaks" from the 6xxx and 5xxx launches, though, and see how many of them were complete BS.

Those weren't leaks. Sorry to disappoint.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Nothing official, it comes from a "source", but I'm passing it along.

"AMD Delays Next Generation Radeon launch to Q1 2012"
http://www.rage3d.com/index.php?cat=75#newsid33983358

That news makes me sad.

Um, why would it? It's pretty obvious it'd be harder to introduce the GPUs using the GCN architecture. The delay is only for those, and was to be expected. The only ones that have a good chance of launching in December are the VLIW4 GPUs, meaning the HD 7800 and everything below.

There may be a lesson learned here! Even with statements from two of the most prominent AMD graphics division employees and comments from executives -- this is why it is wise to treat forward-looking statements with caution.

What lesson? I'm pretty sure very few people were expecting the GPUs with the GCN architecture to launch so soon. Launching the die-shrunk Cayman would be a lot easier. The delay is only for the HD 7900 series, the ones using GCN.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
Yes, I'd love to bet you that those specs are not accurate and will differ in the number of cores, memory bus, memory type, memory bandwidth, and/or operating core frequency.

I will put up a $20 Steam game. Care to match?

I don't care about your Steam game, or your money. I don't have Steam; I use physical, tangible games.
 

n0x1ous

Platinum Member
Sep 9, 2010
"The good news is that the wait might be worth it, with the new high-end single GPU offering performance levels close to the HD 6990 dual-GPU flagship card."

It's now just a matter of a one-quarter delay before the entire Bulldozer CPU line-up is obsolete for gaming.


It's already obsolete. I wonder if AMD will recommend reviewers test their GPUs on Intel systems from now on.

Bulldozer = most epic fail of all time.
 

arredondo

Senior member
Sep 17, 2004
"The good news is that the wait might be worth it, with the new high-end single GPU offering performance levels close to the HD 6990 dual-GPU flagship card."

It's now just a matter of a one-quarter delay before the entire Bulldozer CPU line-up is obsolete for gaming.

How many cores of each CPU does that game make use of? In other words, if there were an ARMA III released that better maximized the abilities of each processor, would the distance results be more even or perhaps in BD's favor?

I have no idea, so that's why I'm asking. Maybe Bulldozer was fighting with one arm tied behind its back.
 

RussianSensation

Elite Member
Sep 5, 2003
It's already obsolete. I wonder if AMD will recommend reviewers test their GPUs on Intel systems from now on.

Bulldozer = most epic fail of all time.

I am not an engineer, but I don't understand why they didn't just do a quick 32nm spin of a 1-1.5GHz single-module Bulldozer prototype back in 2010 and see that it had worse IPC than their dual-core Phenom II. Did they expect most programs to use 8 threads in 2011, or for Bulldozer to launch at 7-8GHz clock speeds? :hmm:

Surely they could have postponed the Bulldozer architecture until they could do major revisions and simply spent a full year shrinking Phenom II?

How many cores of each CPU does that game make use of? In other words, if there were an ARMA III released that better maximized the abilities of each processor, would the distance results be more even or perhaps in BD's favor?

I have no idea, so that's why I'm asking. Maybe Bulldozer was fighting with one arm tied behind its back.

Perhaps. Most modern games don't use more than 2-4 cores. The thing is, it's the fault of AMD's design engineers/management, who likely knew that most non-professional apps want 4 fast cores, not 8 slow cores, but STILL went with this design choice. They were aware of the risks of making an 8-core CPU that would be underutilized in 4-threaded apps. AMD's engineers basically went all out on cores and spent almost no time improving the performance of each of those cores. It's like gambling all or nothing, hoping that software will be heavily multi-threaded by the time you launch.

The Bulldozer design is too forward-looking, IMHO, probably at least 3-4 years ahead of software. I see games becoming a lot more multi-threaded after the Xbox 720/PS4 launch in 2013-2014. Even then it can take developers another 2-3 years to start utilizing the potential of next-generation consoles. I don't see 6/8-threaded games being mainstream until 2014-2015 at the earliest.
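To put rough numbers on the "4 fast cores vs. 8 slow cores" point, here's a quick back-of-the-envelope Amdahl's law sketch (the per-core speeds below are made up purely for illustration, not real Bulldozer/Phenom II figures): if only a fraction of a game's work runs in parallel, the extra cores barely help, and the faster 4-core chip wins.

# Back-of-the-envelope Amdahl's law illustration.
# Per-core speeds are hypothetical, chosen only to show the trade-off.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Speedup over one core when only part of the work can run in parallel."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def relative_performance(per_core_speed: float, cores: int, parallel_fraction: float) -> float:
    """Throughput relative to a baseline core with speed 1.0."""
    return per_core_speed * amdahl_speedup(parallel_fraction, cores)

# Hypothetical chips: 4 "fast" cores vs. 8 cores that are each 20% slower.
for p in (0.25, 0.50, 0.75, 0.95):
    fast4 = relative_performance(1.0, 4, p)
    slow8 = relative_performance(0.8, 8, p)
    print(f"parallel fraction {p:.2f}: 4 fast cores = {fast4:.2f}x, 8 slow cores = {slow8:.2f}x")

With these made-up numbers the two designs are roughly tied when about 75% of the work is parallel, and the 8 slower cores only pull clearly ahead past ~90%, which is exactly the "years ahead of the software" problem.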
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
It's already obsolete. I wonder if AMD will recommend reviewers test their GPUs on Intel systems from now on.

Bulldozer = most epic fail of all time.

I won't say "of all time", but it's definitely a huge failure, especially for gaming. Kinda funny seeing the AMD fanboys saying "hey, they're the same in gaming!" when they're using single-GPU setups. If you care about extending your system's gaming use, you'll probably do CF/SLI, and there Bulldozer falls flat in its face in comparison to Sandy Bridge. That's not counting future GPUs, either. The HD 7900 series will be released in Q1 next year, and by then we'll start to see CPUs bottlenecking more. If a Radeon HD 7970 is comparable to, say, a Radeon HD 6870 or 6950 CF then Bulldozer will start to fall behind by a very good margin.
 

SirPauly

Diamond Member
Apr 28, 2009
Um, why would it? It's pretty obvious it'd be harder to introduce the GPUs using the GCN architecture. The delay is only for those, and was to be expected. The only ones that have a good chance of launching in December are the VLIW4 GPUs, meaning the HD 7800 and everything below.



What lesson? I'm pretty sure very few people were expecting the GPUs with the GCN architecture to launch so soon. Launching the die-shrunk Cayman would be a lot easier. The delay is only for the HD 7900 series, the ones using GCN.

AMD stated they were planning on Q4. Unless Eric Demers and Rick Bergman didn't know what they were talking about at the time -- I think they have a clue.
 

SirPauly

Diamond Member
Apr 28, 2009
So I guess if you believe that the article is accurate about the delay (not saying that you are), then you should also believe this:

I tend to place some merit in Jim's sources, but as with any speculation, rumor, or forward-looking statement, I stay cautious. Jim has an outstanding relationship with AMD, in my mind.
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
I am not an engineer, but I don't understand why they didn't just do a quick 32nm spin of a 1-1.5GHz single-module Bulldozer prototype back in 2010 and see that it had worse IPC than their dual-core Phenom II. Did they expect most programs to use 8 threads in 2011, or for Bulldozer to launch at 7-8GHz clock speeds? :hmm:

Surely they could have postponed the Bulldozer architecture until they could do major revisions and simply spent a full year shrinking Phenom II?

AMD saw they had trouble manufacturing it from the beginning and that IPC wasn't as high as hoped. Because of the huge issues early on and the fact they needed so many revisions, they expected it to be released in 2020, and they stuck with it because by then 95% of desktop workloads would be multi-threaded, they had no money for anything else, and it'd look better than whatever Intel had at that time. They didn't expect us to complain so much about delays, so they decided to release what they had, because fanboys would still buy it and some consumers would think "8 COAREZ!!! TWICE THAT OF INTEL!!!". They knew they'd still need more revisions and time to make it competitive with Intel in the eyes of enthusiasts, so they'd make a version with 10-15% higher performance/watt and higher overclockability each year. That, coupled with 95% of desktop applications being completely multi-threaded by 2020, would ensure their dominance over Intel.

The model would be the FX-81990, denoting its amazing 80 integer cores, 9100MHz base clock speed, and its low 750W PSU requirement. A 10% performance improvement over the 32-core/64-thread Intel Core i7-16000K. A truly forward-looking architecture. AMAZING!!!

:awe:
 

RussianSensation

Elite Member
Sep 5, 2003
The model would be the FX-81990, denoting its amazing 80 integer cores, 9100MHz base clock speed, and its low 750W PSU requirement. A 10% performance improvement over the 32-core/64-thread Intel Core i7-16000K. A truly forward-looking architecture. AMAZING!!!

:awe:

:D Thanks for making me laugh today!! hahahah

If the FX-81990 comes with a fanless Platinum Seasonic PSU and a $100 air cooler bundled for free, I might start saving my $$ for that.
 

kdubbs

Member
Jan 26, 2011
It's already obsolete. I wonder if AMD will recommend reviewers test their GPUs on Intel systems from now on.

Bulldozer = most epic fail of all time.

Dare I say it?

Tebow > Bulldozer

That's right, I said it.