AMD Jaguar : the great hope for next-gen PC gaming?

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
The thread title seems preposterous, but hear me out.

For years now, we've been stuck in a somewhat stale situation where most games struggle to make efficient use of multicore systems. It goes far enough that even titles that run better on quads / hexes / octos have a hard time loading more than a couple of cores to any meaningful degree. Couple that with the undeniable fact that much of game and engine development is inextricably tied to PS3/360 console capability, because porting games and engines across platforms is a lucrative thing to do. This seems to be a relatively recent thing when you consider the totality of console and PC history; I don't remember ports being very common in the 70s-90s era for PC. But what we mostly get now is even historical PC gaming heavyweights such as id and Epic focusing on making engines that are equally applicable to the now-ancient 360/PS3 as well as the PC. When it's done poorly, you get trash like the CoD games: watered-down, low-resolution, trash-textured drivel that takes next to zero advantage of more contemporary PC hardware. When it's done well, such as BF3 (the engine, not necessarily the game, for those who don't care for it), you get something that is both scalable and can keep taking advantage of higher and higher-end PCs along the way. I've seen it play well on an Athlon II X2 with a GTS 250, and of course it scales up as high as you want to go. Still, that's not the usual cross-platform title.

The AMD Jaguar is essentially an 8-core netbook processor. Because the Jaguar is fairly weak per core, and because console devs eventually squeeze every last drop of performance out of the consoles over their lifespan (look at Uncharted 1 vs. 3, for example), I think we will start to see a lot more heavily threaded titles coming down the pipe. I don't think laziness will cut it for long on the new consoles, and there is no significant single-core IPC to rely on. I don't even think that just going with existing engines that somewhat take advantage of duals and quads will be passable for AAA titles for all that long.

Previously, the industry had little legitimate financial reason to even bother making games that depended on a quad to run optimally, let alone a hex or beyond. That's understandable given the dominance of dual-core processors over the past five years. I think it's time we start seeing a change in that area as many old duals die out, and new games will start to push quad-core as the minimum recommended spec for PC ports. Just developing those engines will allow for better game worlds, AI, and so forth, even on PC-only games.

What say AT?

EDIT : To clarify, I'm not saying that PC gamers should use AMD Jaguars as their gaming CPUs, but rather proposing that our full-fledged FX/i5/i7/etc. will begin to be utilized more, due to console development opening the door to much more heavily threaded engines and game worlds. :)
 
Last edited:

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
they should have gone for fewer cores at higher GHz in my opinion, but they went the route that uses the least power draw, which makes sense.
 

darkewaffle

Diamond Member
Oct 7, 2005
8,152
1
81
There's as much a logical and technical barrier to multithreading as there is a hardware barrier. It's not just a switch you flick, the code itself and the underlying design logic have to be built specifically for it.

It can be done, but you can't just parallelize everything. Particularly when, in games, the 'order' actions are performed in can actually have tangible effects. Not to mention, the current crop of programmers out there have by and large been trained to see operations as serial processes; transitioning from single to multithread thinking is not an easy task.
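To make that concrete, here's a rough C++11 sketch (made-up names, not from any real engine) of the difference between work that splits across threads cleanly and work where the order matters:

Code:
// Toy example: per-entity movement is independent, so it can be split across
// threads; the damage-then-death-check at the end only makes sense in one order.
#include <cstddef>
#include <thread>
#include <vector>

struct Entity {
    float x = 0.0f, vx = 1.0f;
    int hp = 100;
};

// Safe to run on multiple threads: each entity only touches its own data.
void integrate(std::vector<Entity>& ents, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        ents[i].x += ents[i].vx * dt;
}

int main() {
    std::vector<Entity> ents(10000);
    const float dt = 1.0f / 60.0f;

    // Parallel phase: split the integration in half, one half per thread.
    std::size_t mid = ents.size() / 2;
    std::thread worker(integrate, std::ref(ents), std::size_t(0), mid, dt);
    integrate(ents, mid, ents.size(), dt);
    worker.join();

    // Order-dependent phase: apply damage, then check for death. Run the
    // check on another thread before the damage lands and you get a
    // different answer -- the "order actions are performed in" problem.
    ents[0].hp -= 150;
    bool dead = ents[0].hp <= 0;
    (void)dead;
    return 0;
}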

Interesting summary on some of the boundaries to multithreading code.
 

CuriousMike

Diamond Member
Feb 22, 2001
3,044
543
136
transitioning from single to multithread thinking is not an easy task.

Audio has historically run on its own thread because it's fairly straightforward to offload.
Some graphics stuff has/can be offloaded.

When you get to the meat of the physics/core gameplay code, it becomes a much harder problem to solve (the linked article above gives a few examples of why it's "hard").
Even if you're designing the engine thinking about multi-threading up front, you have to get all the engineers on your project on the same page.

Updates to the development tools (and newer C++ standards) should help engineers tackle this, but it'll still be a challenge.
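For the "easy to offload" case, something like C++11's std::async is roughly the idea. A toy sketch with made-up function names, not real engine code:

Code:
// Toy sketch: mix audio on another thread while the main thread runs gameplay,
// then sync at the end of the frame. mix_audio/update_gameplay are stand-ins.
#include <future>
#include <vector>

std::vector<float> mix_audio(int frames) {
    std::vector<float> buffer(frames * 2, 0.0f);  // stereo buffer, silence
    // ... mix the active voices into the buffer ...
    return buffer;
}

void update_gameplay(float dt) {
    // ... physics, AI, game logic -- the hard-to-parallelize part ...
    (void)dt;
}

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        // Audio doesn't depend on this frame's gameplay state, so it's
        // easy to hand off to another thread.
        auto audio = std::async(std::launch::async, mix_audio, 512);

        update_gameplay(1.0f / 60.0f);

        std::vector<float> buffer = audio.get();  // sync point
        (void)buffer;  // would be handed to the audio device here
    }
    return 0;
}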
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
It was an awful choice imo.

Shoulda gone 4 core 3 ghz but they wanted to save a little money and use 8 cores as a marketing tool.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
It was an awful choice imo.

Shoulda gone 4 core 3 ghz but they wanted to save a little money and use 8 cores as a marketing tool.

More like saved TONS of money. Less cost for the APU, less cost for cooling, less cost for the power supply, less cost for the motherboard, less cost for everything. Frankly, I think game devs will definitely make use of 8 cores, and that is a GOOD thing. No more dual-threaded games! It's 2013!
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
Well IMO it's not such a good thing if my 2500K is way stronger than these console CPUs but never gets utilized properly by games because they are optimized for 8 weak cores.
 

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
I'm sure there will be a push to use more cores, but I also think there will be a push to frame lock engines to make syncing threads easier. That might translate into more powerful PC hardware using more threads, but not really seeing the benefit. You already see this on occasion with lazy ports to PC where the dev team doesn't really have PC experience.
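Something like this is what I mean by frame locking, as a hand-wavy C++11 sketch (made-up names): every worker gets joined at the frame boundary and the loop sleeps to hold a fixed tick, so a faster PC just finishes early and waits.

Code:
// Toy frame-locked loop: spawn workers, hard-sync them every frame, then
// sleep until the fixed 33 ms budget is up. Extra CPU speed only buys idle time.
#include <chrono>
#include <thread>
#include <vector>

void simulate_region(int region) {
    // ... update AI/physics for one chunk of the world ...
    (void)region;
}

int main() {
    const auto frame_budget = std::chrono::milliseconds(33);  // ~30 fps lock

    for (int frame = 0; frame < 5; ++frame) {
        auto start = std::chrono::steady_clock::now();

        std::vector<std::thread> workers;
        for (int region = 0; region < 8; ++region)   // e.g. one per console core
            workers.emplace_back(simulate_region, region);
        for (auto& w : workers)
            w.join();                                 // sync every thread, every frame

        std::this_thread::sleep_until(start + frame_budget);
    }
    return 0;
}

On a PC with a faster CPU, that structure means the extra headroom mostly turns into sleep time, which is the "not really seeing the benefit" part.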
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It was an awful choice imo.

Shoulda gone 4 core 3 ghz but they wanted to save a little money and use 8 cores as a marketing tool.

There was a rumor a while back that the PS4's CPU had a max clock speed of 2.75 GHz..

Source

Since two cores are going to be reserved for the OS, I think the gaming cores are going to be clocked much higher than 1.6 GHz.

*Edit* It's possible that the 2.75 GHz figure was referring to the memory speed, and not the processor speed.
 
Last edited:

lord_emperor

Golden Member
Nov 4, 2009
1,380
1
0
This makes me happy because I basically won't have to spend a penny upgrading my gaming PC for years to come.
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
I'm sure there will be a push to use more cores, but I also think there will be a push to frame lock engines to make syncing threads easier. That might translate into more powerful PC hardware using more threads, but not really seeing the benefit. You already see this on occasion with lazy ports to PC where the dev team doesn't really have PC experience.

They will do anything in their power to make the console version the better experience. I'm sure all sorts of shit will get done on console versions that won't on PC.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
Hopefully it'll push Intel to FINALLY release octa-core CPUs. Kinda silly when smartphones & tablets in 2013 have octa-cores and desktops are still stuck with quad-cores.

Quick question: would a 2600K with its HT be able to run games like an octa-core? So if a game on next-gen consoles is hexa/octa-core optimized, will the HT help in running them?
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
There was a rumor a while back that the PS4's CPU had a max clock speed of 2.75 GHz..

Source

Since two cores are going to be reserved for the OS, I think the gaming cores are going to be clocked much higher than 1.6 GHz.

*Edit* It's possible that the 2.75 GHz figure was referring to the memory speed, and not the processor speed.

Turbo clock? If I'm not mistaken, the Jaguar cores can all have their core clocks independently controlled.


Also, poor AMD. There are still people who see their APUs as netbook CPUs on the same level as the anemic Intel Atoms. :(
I had an older HP DMZ1 laptop with an E-350 AMD APU in it, one of the first models to use it. It definitely qualified as a netbook, but it would run rings around any Atom-based netbook at the time without breaking a sweat. The Jaguar is several generations ahead, with better cores, a better GPU, built-in OCing, more cache, etc. Nothing 'netbook' about it.
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
How about an 8 GHz dual-core processor? Would that ever be possible?

Would be able to make some badass games for it.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Turbo clock? If I'm not mistaken, the Jaguar cores can all have their core clocks independently controlled.


Also, poor AMD. There are still people who see their APUs as netbook CPUs on the same level as the anemic Intel Atoms. :(
I had an older HP DMZ1 laptop with an E-350 AMD APU in it, one of the first models to use it. It definitely qualified as a netbook, but it would run rings around any Atom-based netbook at the time without breaking a sweat. The Jaguar is several generations ahead, with better cores, a better GPU, built-in OCing, more cache, etc. Nothing 'netbook' about it.

No, I don't consider their APUs nearly as horrible as the completely worthless Atom.

However, when the quad-core version of this same thing (with a much more cut-down iGPU, but that doesn't matter for the CPU performance aspect I'm talking about here) loses easily to a dual-core non-HT Intel chip (SB-based, no less) in every single CPU test, you know you're not talking legit CPU performance.

I've used the E-350, and it's still stunningly painful to me.

http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked/3

Losing to ancient P4s, Celerons, VIA Nano, the cut-down Conroe-based Pentium Dual Cores, etc.

Don't get me wrong, on a per-watt basis it's very impressive, and it does make very cool-running, compact devices possible. But personally, I can't even stand using them for browsing the web. That's even after upgrading one with a cheap 120GB SSD (Samsung 840 non-Pro), which I really regret recommending, as performance is still sluggish and the machine is irritating to use.

Of course, the contemporary Atom products were EVEN WORSE. But the Jaguar, in my mind, is just a very wide netbook CPU that would get thrashed in almost everything by even an i5 (CPU perf).
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
It was an awful choice imo.

Shoulda gone 4 core 3 ghz but they wanted to save a little money and use 8 cores as a marketing tool.
What 4 cores at 3GHz? Not AMD's, that's for sure. If they were lucky, that would be 60-80W, and it would make the whole chip far bigger, adding a great deal to the cost. Based on notebook Jaguars, theirs are probably not using more than 25W, and maybe not even that, looking just at CPU+cache.

Nearly 8GB of memory, though (7GB, was it?)? That should be quite nice. The last consoles had not only CPUs that are slow even with Jaguar as the benchmark, but also just 256+256 MB and 512 MB of RAM, which was restrictive from the very start, on top of low bandwidth and a lack of ROPs.

Well IMO it's not such a good thing if my 2500K is way stronger than these console CPUs but never gets utilized properly by games because they are optimized for 8 weak cores.
Because, as we all know, a faster CPU will not actually run code faster. Nope. Never happens that way. We've all moved to pure VLIW CPUs, now. :rolleyes:

They will do anything in their power to make the console version the better experience. I'm sure all sorts of shit will get done on console versions that won't on PC.
So don't buy a shitty port, even if the game is decent. Easy. That has nothing to do, however, with what hardware goes into the console. That is purely a business decision by the developer and/or publisher.

How about an 8 GHz dual-core processor? Would that ever be possible?
Yes. It would be a total POS, though, so nobody is going to bother. Notice how craptastic BD is at 5GHz? Imagine that, but worse. To reach 8GHz, one needs either faster xtors (higher voltage) or fewer xtors switched per cycle (and then the wires will necessitate higher voltage and current anyway), and in either case, lower density.
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
Well, they better figure something out. Sick of the compromises and bad ports. Also, I want open-world games with some real persistence in the environments.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Well, they better figure something out. Sick of the compromises and bad ports. Also, I want open-world games with some real persistence in the environments.
Like Gothic, various Bethesda games, and who knows how many RPGs in eons past (i.e., the 90s :))? Not a hardware issue, and it never has been one. It takes time and care to make that happen, so unless it is a crucial element of the game, it's not likely to be there.

Bad ports have nothing to do with the hardware, and everything to do with the bean counters wanting to save money. They know it'll look bad not to have a PC version, but the console market is larger, so they don't put enough emphasis on the quality of it.

Now, the poor-quality graphics do have to do with the hardware, by making the PC version basically the same game, visually. It's also quite possible that some complexity of the environments and AI was held back by the craptastic CPUs in the consoles as well. Even Jaguar is going to be a monumental step up from the PPU(s) in every way but on-paper FLOPS.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
No, I don't consider their APUs nearly as horrible as the completely worthless Atom.

However, when the quad-core version of this same thing (with a much more cut-down iGPU, but that doesn't matter for the CPU performance aspect I'm talking about here) loses easily to a dual-core non-HT Intel chip (SB-based, no less) in every single CPU test, you know you're not talking legit CPU performance.

I've used the E-350, and it's still stunningly painful to me.

http://www.anandtech.com/show/4023/the-brazos-performance-preview-amd-e350-benchmarked/3

Losing to ancient P4s, Celerons, VIA Nano, the cut-down Conroe-based Pentium Dual Cores, etc.

Don't get me wrong, on a per-watt basis it's very impressive, and it does make very cool-running, compact devices possible. But personally, I can't even stand using them for browsing the web. That's even after upgrading one with a cheap 120GB SSD (Samsung 840 non-Pro), which I really regret recommending, as performance is still sluggish and the machine is irritating to use.

Of course, the contemporary Atom products were EVEN WORSE. But the Jaguar, in my mind, is just a very wide netbook CPU that would get thrashed in almost everything by even an i5 (CPU perf).

In those charts, there are no Atom chips in the comparison. Comparing Brazos to Conroe-based Pentium Dual-Cores is not even a remotely fair comparison. When you compare an Atom-powered, non-Ion machine to an E-350 with its 6310 IGP, it's night and day.

This is the article that paints a more complete picture, as it's actually a Jaguar-based CPU:
http://www.anandtech.com/show/6974/amd-kabini-review/5

And even then, it's not complete, as the part in the Xbone/PS4 contains double the cores, is clocked higher, and boasts a Radeon 7790/7870-class GPU on the same die. If I'm not mistaken, these GPUs have compute capabilities as well, so there's a very likely possibility that they'll be handling some computational tasks.

Still, an i5 will easily trounce the Jaguar part in raw performance; there's no question about that. One of the big complaints against these new consoles is that they are using low-to-midrange CPUs and midrange GPUs from 2012. I don't see these new devices enjoying a ten year life cycle like their predecessors. With die shrinks, I'd expect them to be excellent in performance per watt, but I'd bank on a refresh in 4-5 years.

For game development, they're still a big jump over the 360 and PS3, and they will definitely jerk upward the minimum standard we've all been stuck at for years. We'll just plateau again in a fairly short time frame.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I don't see these new devices enjoying a ten year life cycle like their predecessors.
Their predecessors haven't exactly enjoyed their life cycles thus far. That was a grand ivory-tower idea that was destined not to pan out. Not being EOLed for 10 years doesn't mean they'll have been worth using for more than 4-5.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Their predecessors haven't exactly enjoyed their life cycles thus far. That was a grand ivory-tower idea that was destined not to pan out. Not being EOLed for 10 years doesn't mean they'll have been worth using for more than 4-5.

That's technically true of every console generation. At launch, there's nothing better out there. A year in, PC tech has leapfrogged the consoles. At the end of the console life cycle, the console is an archaic paperweight running highly optimized software. With this generation, the consoles are launching with hardware that's significantly weaker than even a midrange gaming PC today.


Will they be using the GPUs in these consoles to process physics, or is that all handled by Jaguar?

I'm not certain, but I would think it would be up to the developers whether or not to utilize the GPU and CPU capabilities in whatever way is most conducive to their game.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
That's technically true of every console generation.
PS1: 1995
PS2: 2000
PS3: late 2006 (due to being delayed; and it was probably delayed before the delays we knew about, too)

A 5-year cadence is probably still a good way to do it, allowing devs to make the most of it, since they'll have to invest time in it, and allowing customers to get decent value from it. Try to go too fast, and you might misstep as far as consumer wants go, and/or piss off devs and distributors (see the historical cluster**** that was the 32X, Saturn, and Dreamcast :)).

The hardware always lags. That's fine, as long as it is on track to be replaced in a few years (which it wasn't, this last time). The idea that they could make a <$1000 machine that would last longer than 3-5 years, competitively, was beyond asinine, and denied historical reality.