
Digital Foundry: next-gen PlayStation and Xbox to use AMD's 8-core CPU and Radeon HD

I doubt the speed will be as low as 1.6 GHz. I'm betting on closer to 2, if not 2.4 or something like that. An 8-core Jaguar would fix the 360's anemic IPC performance while retaining and expanding on its very high relative GFLOPS. It wouldn't be that big either: probably half the size of Bulldozer, with the same GFLOPS per clock.
 
It's not needed.

That being said, tons of research shows the majority of console gamers are happy with today's graphics.

Just for fun I fired up my Xbox 360 version of GTA IV side by side with my PC version (no mods, but all settings maxed). It looked like a PlayStation (1) game by comparison. Hideous.

Kills me that GTA V is only coming to a vastly inferior platform.
 
It seems people have missed this little nugget

However, there's a fair amount of "secret sauce" in Orbis and we can disclose details on one of the more interesting additions. Paired up with the eight AMD cores, we find a bespoke GPU-like "Compute" module, designed to ease the burden on certain operations - physics calculations are a good example of traditional CPU work that are often hived off to GPU cores. We're assured that this is bespoke hardware that is not a part of the main graphics pipeline but we remain rather mystified by its standalone inclusion

If true, it sounds like an APU configuration where the IGP is dedicated to compute rather than attempting hybrid CrossFire. It will be interesting to find out, hopefully in March if those rumors are true.
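A toy sketch of what "hiving physics off" to a dedicated compute block could look like structurally, using Python's multiprocessing as a stand-in for the rumored compute module. Every name and number here is invented for illustration; nothing below comes from the article.

```python
# Toy model of offloading physics work to a separate compute resource.
# A worker pool stands in for the bespoke "Compute" module, so the main
# loop (the "graphics pipeline") never does the physics itself.
from multiprocessing import Pool

def integrate_physics(body):
    """One Euler step for a single body: (position, velocity, dt)."""
    pos, vel, dt = body
    return pos + vel * dt

def main():
    # Hypothetical bodies: position, velocity, one 60 Hz timestep.
    bodies = [(float(i), 1.0, 0.016) for i in range(8)]
    with Pool(processes=2) as compute_unit:  # stands in for the compute module
        new_positions = compute_unit.map(integrate_physics, bodies)
    print(new_positions)

if __name__ == "__main__":
    main()
```

The point is only the shape of the division of labor: the main path hands the whole batch off and gets results back, rather than interleaving physics with rendering work.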
 
I wouldn't be so sure. We're talking about a console, after all. There's something to be said for having known, deterministic performance, as opposed to the uncertainty turbo would introduce.

Found this; it seems Turbo functionality was already baked into the PowerPC core they tested before dumping it for AMD's Jaguar.

 
Your reference to "a decent gpu to run hl2" itself is making a point: the game came out over EIGHT years ago.

Hehehe... I forgot that the game is actually that old XD

But still, my point remains... they are trading CPU power for GPU power, and IMO that works better, actually better than a balanced system.
 
Turbo also depends on temperature, something you can't predict at the customer's end.

I wouldn't say that it is that deterministic, since it likely depends on the ambient temps of the console.

AMD's TurboCORE is deterministic.


John 'IPC' Fruehe said:
AMD Turbo CORE is deterministic, governed by power draw, not temperature as other competing products are. This means that even in warmer climates you’ll be able to take advantage of that extra headroom if you choose. This helps ensure a max frequency is workload dependent, making it more consistent and repeatable.


I'm really excited about these future consoles. I like games, but I really don't want to spend tons of cash on hardware to play them. At this rate, the only thing I'll have to upgrade is my graphics card 😀
 
No.

Again I'll reiterate what I posted:

From Anandtech's article about the A10 (emphasis mine)
Power is still estimated based on workload, which AMD claims has less than a 1% error rate, but the new model gets accurate temperatures from those estimations.

If you had read podspi's link, the AMD engineer also affirms that the local temperature will not affect Turbo.
 
This is going to be a wonderful starting point, not only for future consoles but for games ported to the PC. Developers will now have their hands forced to tune for multithreaded workloads. A couple of years beyond the initial release, coding for eight cores will be second nature at the talented development houses.

Note that I expect these chips to have lower-latency tweaks to ("sorta") make up for the lack of pure speed. That is generally the case for consoles.
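The kind of restructuring that targeting eight slower cores pushes developers toward can be sketched like this: split a frame's CPU-bound work into chunks and farm them out to a pool of workers. The workload and split below are illustrative, not anything from the thread.

```python
# Sketch: spreading a CPU-bound per-frame workload across eight cores,
# the structure an 8-core Jaguar design would encourage. The "work"
# (sum of squares) is a made-up stand-in for AI, animation, audio, etc.
from concurrent.futures import ProcessPoolExecutor

def simulate_chunk(chunk):
    """Stand-in for the per-core slice of game work."""
    return sum(x * x for x in chunk)

def run_frame(data, cores=8):
    """Split the frame's data into roughly core-sized chunks and combine."""
    size = max(1, len(data) // cores)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=cores) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    print(run_frame(list(range(10_000))))
```

The design point is that the per-chunk function has no shared state, so adding cores scales the work without locking, which is the tuning habit an 8-core baseline would force.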
 
I misspoke. AMD representative.

Still, are you going to attack trivial mistakes in light of your complete error?

No, I just wondered.

Anyway, AMD's slides keep talking about electrical limits and TDPs, and those limits vary from CPU to CPU and depend on heat.

http://www.anandtech.com/show/3641/amd-divulges-phenom-ii-x6-secrets-turbo-core-enabled

And I don't think newer Turbo Core methods would be less advanced.

Maybe we can get IDC to actually test it?
 
It's not needed.

It's a closed system, much more optimized than PC gaming. Games are designed specifically for the hardware found in the console, which allows lower-end hardware to shine compared to PCs. Performance from low-end parts is much greater than it was in 2004/2005.

That being said, tons of research shows the majority of console gamers are happy with today's graphics.

The gamers who genuinely want major upgrades in console hardware are PC gamers. That's a tiny percentage of the market.

It's not happening.

Microsoft and Sony are not going to sell consoles at a loss again. They need better margins.

I was thinking about cost also. How much does a 7950M cost? It seems like it would be pretty expensive to put into a console.
 
Most likely they would qualify the chips in reverse from the maximum TDP and just have the OEMs make sure to design the cooling around that. Since it's workload based, the maximum power draw is known.

This method doesn't push as close to the envelope as doing it by temperature would, but it's completely deterministic.
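A minimal model of the power-governed turbo described above, with made-up power constants: frequency climbs step by step only while the workload-based power estimate stays under the TDP, and temperature plays no role, so the result is repeatable for a given workload.

```python
# Sketch of deterministic, power-governed turbo. The power model and all
# constants are invented for illustration; real silicon uses calibrated
# per-unit activity estimates.
TDP_WATTS = 100.0
BASE_MHZ, MAX_TURBO_MHZ, STEP_MHZ = 1600, 2400, 100

def estimated_power(freq_mhz, activity):
    """Workload-based estimate: idle floor plus a term scaling with
    frequency and core activity (0.0 = idle, 1.0 = fully loaded)."""
    return 20.0 + 40.0 * (freq_mhz / 1000.0) * activity

def turbo_frequency(activity):
    """Raise frequency one step at a time while the estimate fits the TDP.
    Same activity in, same frequency out -- no temperature term anywhere."""
    freq = BASE_MHZ
    while (freq + STEP_MHZ <= MAX_TURBO_MHZ
           and estimated_power(freq + STEP_MHZ, activity) <= TDP_WATTS):
        freq += STEP_MHZ
    return freq
```

Under this toy model a fully loaded chip tops out where the estimate meets the TDP, while a half-loaded one reaches the full turbo ceiling, which is the "workload dependent, consistent and repeatable" behavior the quote describes.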
 
Just saying:
"That being said, tons of research shows the majority of console gamers are happy with today's graphics."
By who? Not the PC gamers who get the third-party ports.

A game only has one budget. You have to wonder what percentage in the past went into gameplay versus 2010-2012 console graphics, leaving us PC gamers with crap ports with neither graphics nor gameplay.
- Maybe these new consoles will shift the budget percentage towards gameplay.
- I'm getting bored with AAA games that can only offer 2006 graphics but let you replay the game four times with four different identities. WTF.
 

The projection bets on PC gaming.

[graph_1.jpg: gaming revenue projection]
 
But I'm certain that includes most "social gaming", a.k.a. Facebook and Pogo, not 3D FPS games, which are a minority of PC gaming (even though they are a pervasive genre).

Some additional telemetric numbers.

[Picture21.png, Picture12.png: telemetry charts]


And an example of the US gaming industry circa 2010:


Consoles don't seem to age well in terms of gaming revenue.
 
Huh. Isn't the 360 supposed to have the greatest attach rate? How come it's being beaten by both the Wii and PS3? If those are accurate figures, the Wii did pretty well even in software unlike the prevailing "wisdom".
 

I think the key phrase is active install base, not units shipped. Many consoles have died, been thrown out, or been hidden away over the last seven years. From 2005 to 2007 the RROD rate for the Xbox 360 was around 33%, for example, and in 2009 a survey showed up to 54.2%.
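For reference, attach rate is just software units sold per console in the install base, and as the post above suggests, dividing by the active base rather than total shipped can change the ranking. The figures below are made-up round numbers, not real sales data.

```python
# Attach rate: software units sold per console in the chosen install base.
# All numbers here are hypothetical, chosen only to show how the choice
# of denominator (shipped vs. active) moves the result.
def attach_rate(software_units, install_base):
    return software_units / install_base

shipped, active = 70_000_000, 50_000_000  # hypothetical console counts
software = 400_000_000                    # hypothetical games sold

print(attach_rate(software, shipped))  # per console shipped
print(attach_rate(software, active))   # per active console: higher
```

With a large gap between shipped and active units (dead or shelved consoles), the same software sales produce a noticeably higher attach rate on the active base.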
 