AMD GrassFX: APU & GPU vs CPU & GPU

monstercameron

Diamond Member
It is possible that in the future APUs will be better for gaming than even Intel EE CPUs... then it's a good thing that Intel makes APUs too 😉

GrassFX.jpg


https://www.youtube.com/watch?v=2ktG7QX2BH4

http://www.extremetech.com/computin...us-better-pricing-and-lower-power-consumption

Darn, it's taking sooooo long, I want the future now!
 
I won't hesitate to pull the trigger if a 3-module Carrizo part arrives on the desktop. 2 modules is still too low for my workloads.
 
I won't hesitate to pull the trigger if a 3-module Carrizo part arrives on the desktop. 2 modules is still too low for my workloads.
OT: If there is going to be a change in Excavator, it will be the addition of 2-way SMT in the cores and 4-way SMT in the FPU. (Yep, I'm in the SMT crew now.) No need to increase core count when you increase thread count and performance.
ALL HAIL THE MIGHTY LINKEDIN
Indeed, I've lost track of the voxel raytracing engine built for the VI and PI profiles. It is out there somewhere, floating on the near-infinite web. Upcoming tech demos from AMD should be big, and all feasible in mass production via game engines.
 
I'm still open to the idea that those with newer GCN APUs may get more gaming longevity than those with current quad-core Intels or quad-module AMDs, because of the inherent performance benefits of using the APU purely for non-graphics workloads. While it may sound far-fetched, it's certainly plausible, since both main consoles this generation use AMD APUs and are both GPGPU-centric. In order to really move things along graphically and interactively on consoles, devs will have to use GPGPU on both new consoles, which the PS4 in particular has been tailored for. Such technologies should trickle down to the PC, or vice versa if AMD really pushes PC game developers toward GPGPU.

I don't think we'll see the same kind of hardware exclusion as PhysX, but the possibilities are certainly more vast with an APU in terms of environmental interactivity and manipulation on the fly. It just took a new console generation to make it happen.
 
In order to really move things along graphically and interactively on consoles, devs will have to use GPGPU on both new consoles, which the PS4 in particular has been tailored for, and that should trickle down to the PC or vice versa.

Yeah, I keep hoping for such to come to fruition, but it sure is moving at a snail's pace. There's zero reason why GPGPU functions couldn't have been running on integrated graphics since Llano and Sandy Bridge over 3 years ago. Having the consoles offer that capability might get things moving, or it might just result in the status quo continuing and this generation of consoles having a reduced lifecycle due to software's inability or refusal to adopt the necessary programming models to make the chosen SoCs keep up.
 
Yeah, I keep hoping for such to come to fruition, but it sure is moving at a snail's pace. There's zero reason why GPGPU functions couldn't have been running on integrated graphics since Llano and Sandy Bridge over 3 years ago. Having the consoles offer that capability might get things moving, or it might just result in the status quo continuing and this generation of consoles having a reduced lifecycle due to software's inability or refusal to adopt the necessary programming models to make the chosen SoCs keep up.

These consoles will have at least 5 years to wow people, and to compete with one another, devs will need to use GPGPU if they want dense interactive particles in the form of smoke, water, or whatever else. The eight x86 Jaguar cores may provide the same single-precision FLOPS rate per clock as the PS3's Cell, but they run at only half the speed (1.6 GHz vs 3.2 GHz for Cell). I don't think any PS4 devs want to take a step backwards, so GPGPU is necessary if they hope to improve upon what they did on the PS3, even if efficiency and hit rate get a big boost thanks to much-improved general IPC and OoO processing.
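The per-clock parity claim checks out on paper if you assume 8 SP FLOPs/cycle for both a Jaguar core (128-bit add + 128-bit multiply) and a Cell SPE (4-wide FMA) — an assumption for illustration, not a measured figure — so only the clock speed separates them:

```python
# Rough peak single-precision throughput: 8 Jaguar cores vs the Cell's 8 SPEs.
# The 8 FLOPs/clock/core rate below is an assumed round number, not a benchmark.
FLOPS_PER_CLOCK_PER_CORE = 8

jaguar = 8 * FLOPS_PER_CLOCK_PER_CORE * 1.6e9  # 8 Jaguar cores @ 1.6 GHz
cell   = 8 * FLOPS_PER_CLOCK_PER_CORE * 3.2e9  # 8 SPEs @ 3.2 GHz

print(jaguar / 1e9)   # 102.4 GFLOPS
print(cell / 1e9)     # 204.8 GFLOPS
print(cell / jaguar)  # 2.0 -- half the speed, as stated above
```

Same FLOPs per clock, double the clock: the Cell comes out exactly 2x ahead in peak terms, which is why the post argues the GPU has to pick up the slack.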
 
These consoles will have at least 5 years to wow people, and to compete with one another, devs will need to use GPGPU if they want dense interactive particles in the form of smoke, water, or whatever else. The eight x86 Jaguar cores may provide the same single-precision FLOPS rate per clock as the PS3's Cell, but they run at only half the speed (1.6 GHz vs 3.2 GHz for Cell). I don't think any PS4 devs want to take a step backwards, so GPGPU is necessary if they hope to improve upon what they did on the PS3, even if efficiency and hit rate get a big boost thanks to much-improved general IPC and OoO processing.

In that case, where are all the new games written specifically for the new consoles which make use of both CPU multi-threading and GPGPU? It's not like they haven't had ample time to do so, given that, for example, the Xbox One dev kit was available over 2 years ago. I fully expect that they'll be able to make decent games for the consoles, and it might indeed be enough to push developers into making better use of multi-threading/GPGPU resources... but I'm certainly not holding my breath for it. After all, the only thing that matters is making money, which is why "good enough" has a pretty good chance of winning out.
 
This is why you aren't on my ignore list... I have faith in your LinkedIn skills!

Because people can't create fake LinkedIn profiles, or add a ton of fluff BS to their otherwise legitimate profile whilst job-hunting, knowing that it is all "hush hush, top secret, etc." unverifiable bunk when interviewing?

You guys are so thirsty for some news, any news, that you are willing to drink the sand of the desert for the sake of somebody telling you it is water...
 
Because people can't create fake LinkedIn profiles, or add a ton of fluff BS to their otherwise legitimate profile whilst job-hunting, knowing that it is all "hush hush, top secret, etc." unverifiable bunk when interviewing?
https://plus.google.com/114750392537331559870/posts

Don't worry, for every person I look at I do an in-depth illegal (<-- this is a joke) background check. COMPUTERS!
You guys are so thirsty for some news, any news, that you are willing to drink the sand of the desert for the sake of somebody telling you it is water...
Stop being jelly all the time.
It is possible that in the future APUs will be better for gaming than even Intel EE CPUs... then it's a good thing that Intel makes APUs too 😉
I don't think AMD's CPUs will ever match the speed or performance of Intel's. Intel has already caught up with GT3e, supporting a lot of the same APIs as Kaveri. (Waits patiently for Intel to announce Mantle, TressFX, and GrassFX support for Haswell Iris onwards.)
 
Because people can't create fake LinkedIn profiles, or add a ton of fluff BS to their otherwise legitimate profile whilst job-hunting, knowing that it is all "hush hush, top secret, etc." unverifiable bunk when interviewing?

You guys are so thirsty for some news, any news, that you are willing to drink the sand of the desert for the sake of somebody telling you it is water...

yes.
 
Putting GPGPU on the APU while rendering on the GPU? Finally. AMD should have been pushing this since 6 months before Llano.
 
Putting GPGPU on the APU while rendering on the GPU? Finally. AMD should have been pushing this since 6 months before Llano.

Aye, this should have been the selling point of AMD's APUs for those using dGPUs.

Budget build? Just use the iGP for rendering.
Have enough $ for a dGPU? Use the iGP to offload compute.
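That two-line split boils down to a tiny selection policy. A toy sketch over mock device records (the names and the `assign_roles` helper are illustrative, not a real driver query):

```python
# Toy sketch of the iGP/dGPU role assignment described above.
# Devices are mock {"name", "kind"} records, not a real enumeration.
def assign_roles(devices):
    """Return {'render': dev, 'compute': dev} given the GPUs present."""
    igp  = next((d for d in devices if d["kind"] == "integrated"), None)
    dgpu = next((d for d in devices if d["kind"] == "discrete"), None)
    if dgpu is None:
        # Budget build: the iGP handles both rendering and compute.
        return {"render": igp, "compute": igp}
    # dGPU present: render on it, offload compute to the otherwise idle iGP.
    return {"render": dgpu, "compute": igp or dgpu}

budget = [{"name": "Kaveri iGP", "kind": "integrated"}]
gaming = budget + [{"name": "R9 290X", "kind": "discrete"}]
print(assign_roles(budget))  # iGP renders and computes
print(assign_roles(gaming))  # dGPU renders, iGP computes
```

The point of the sketch is that the decision is trivial on the host side; the hard part, as the thread keeps noting, is that engines have to ship a compute path for the iGP at all.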
 
Is this a better option than CrossFire with the iGPU?

Much better. You can have mismatched GPUs, for a start: say an AMD APU and an Nvidia GPU, or an Intel APU with an AMD GPU. Imagine using a Haswell IGP to offload physics work while rendering flat out on a 290X.
 
Aye, this should have been the selling point of AMD's APUs for those using dGPUs.

Budget build? Just use the iGP for rendering.
Have enough $ for a dGPU? use the iGP to offload compute.

I think developers just didn't care, and I'm sure AMD didn't provide any kind of toolkits, middleware, etc. to make it even remotely feasible for anyone on a budget. It would've been like PhysX, since not that many people probably had top-end Llanos, and it might've shut Intel and non-Llano AMD users out of the "secret sawce", only creating disparity between gamers and the developers who might've used it. However, now that there are consoles with APUs, it might be feasible, and such a solution could ship in a game with both APU and CPU-only code paths, so people with Kaveri APUs could see the same kind of physics performance as, say, a quad-core Haswell, or even better.
 
I think developers just didn't care, and I'm sure AMD didn't provide any kind of toolkits, middleware, etc. to make it even remotely feasible for anyone on a budget. It would've been like PhysX, since not that many people probably had top-end Llanos, and it might've shut Intel and non-Llano AMD users out of the "secret sawce", only creating disparity between gamers and the developers who might've used it. However, now that there are consoles with APUs, it might be feasible, and such a solution could ship in a game with both APU and CPU-only code paths, so people with Kaveri APUs could see the same kind of physics performance as, say, a quad-core Haswell, or even better.

OpenCL can do this sort of mixed-hardware approach, and devs have had OpenCL tools for a long time. I blame lack of foresight.
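The reason OpenCL fits here is its platform model: each vendor ships its own platform, the host enumerates all of them (via `clGetPlatformIDs`/`clGetDeviceIDs` in C, or pyopencl in Python), and one program can create contexts on devices from different vendors. A toy walk-through over mock platform data (the device names are made up for illustration):

```python
# Mock of an OpenCL-style enumeration: one platform per vendor, and a single
# host program can pick devices from different platforms at the same time.
platforms = [
    {"vendor": "Intel",  "devices": [("CPU", "i7-4770K"), ("GPU", "HD 4600")]},
    {"vendor": "NVIDIA", "devices": [("GPU", "GTX 780")]},
]

# Flatten to (vendor, type, name) triples, as the host app would see them.
devices = [(p["vendor"], t, n) for p in platforms for (t, n) in p["devices"]]

# Mixed-vendor split: physics/compute on the integrated Intel GPU,
# rendering stays on the discrete NVIDIA card.
compute = next(d for d in devices if d[0] == "Intel" and d[1] == "GPU")
render  = next(d for d in devices if d[0] == "NVIDIA" and d[1] == "GPU")
print("compute on", compute[2], "/ render on", render[2])
```

Nothing vendor-specific is needed on the host side, which is why the "Haswell IGP for physics, 290X for rendering" combination mentioned earlier is possible in principle.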
 
With IGPs occupying ~50% of the die of a modern CPU, I am just glad that we are starting to see some hint of being able to use them for gaming when you have a dGPU.
 