Core i3 4330 vs Kaveri in dGPU 1080p Gaming

Page 2 - AnandTech Forums

2is

Diamond Member
Apr 8, 2012
4,281
131
106
That is the whole point of getting a higher-performance GPU in the first place: to raise the image quality settings. I don't understand why people have this mentality that in order to use a high-end GPU you also have to pair it with a high-end CPU.
You can use the Athlon 860K with an R9 390 and play every game at 1080p with Ultra settings. What's wrong with that?

Shit minimum frames for one. Shit average frames in demanding games for another. Stupid way to configure a gaming system for a 3rd reason. DX12 isn't going to be the answer either.
 
Aug 11, 2008
10,451
642
126
That is the whole point of getting a higher-performance GPU in the first place: to raise the image quality settings. I don't understand why people have this mentality that in order to use a high-end GPU you also have to pair it with a high-end CPU.
You can use the Athlon 860K with an R9 390 and play every game at 1080p with Ultra settings. What's wrong with that?

Or to raise the minimum and average framerates, which are dependent on both CPU and GPU.

For instance, I have been playing Witcher 3 on an HD 7770. I get framerates in the high 20s at the lowest settings at 1080p. According to your argument, if I got a better GPU, I should raise the quality until I was still only getting 25-30 FPS, because the purpose of a high-end GPU is to raise image quality; isn't that what you said? But I am done with this. Feel free, for whatever reason, to continue to advocate cranking settings until you are GPU-limited no matter what abysmal framerate you get, in order to justify using an inferior CPU.

And in case you think this is some AMD hate thing, I specifically said I would not pair an i3 with a high end gpu either.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Shit minimum frames for one. Shit average frames in demanding games for another. Stupid way to configure a gaming system for a 3rd reason. DX12 isn't going to be the answer either.

Source for your DX12 wisdom? Seems like zero (stupid?) reasoning to me.

On topic: I've got a 7600 system. I get my low-end gaming fix (CS and such) at 1080p with good fps while drawing less power than the vast majority of systems. Last night I threw in my dGPU (a ~2 min job) and can now play the nifty new titles I purchase in the winter sale. Once I've clocked the games (or get busy) I'll rip the dGPU out again.

OP's results look in line with what I was expecting, & plenty fast enough for what I need atm.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Source for your DX12 wisdom? Seems like zero (stupid?) reasoning to me.

On topic: I've got a 7600 system. I get my low-end gaming fix (CS and such) at 1080p with good fps while drawing less power than the vast majority of systems. Last night I threw in my dGPU (a ~2 min job) and can now play the nifty new titles I purchase in the winter sale. Once I've clocked the games (or get busy) I'll rip the dGPU out again.

OP's results look in line with what I was expecting, & plenty fast enough for what I need atm.

My source is 10 years' worth of AMD hopefuls pinning their dreams on a future technology to make their CPUs competitive, and not once having it ever happen.

My source is developers talking about how much more they can do because of DX12's lower overhead... Read that again. It's not simply lower overhead and that's the end of it. It's taking those resources freed by the lower overhead and doing something tangible with them within the game itself. In other words, it's not simply going to be lower CPU usage; it's going to be less wasted CPU usage.

So, there's your answer. Now let's hear your rebuttal.
 
Last edited:

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
If I can look at which account posted a thread and not have to read it to know what the conclusion will be, what would you say the value of that content is?
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
My source is 10 years' worth of AMD hopefuls pinning their dreams on a future technology to make their CPUs competitive, and not once having it ever happen.

My source is developers talking about how much more they can do because of DX12's lower overhead... Read that again. It's not simply lower overhead and that's the end of it. It's taking those resources freed by the lower overhead and doing something tangible with them within the game itself. In other words, it's not simply going to be lower CPU usage; it's going to be less wasted CPU usage.

So, there's your answer. Now let's hear your rebuttal.

So you have no source. You're just another biased poster making claims without a source. And you deserve to be reported like any other.
 

john5220

Senior member
Mar 27, 2014
551
0
0
Looks to me like the Kaveri chips are overall decent competitors for an i3. However, when the i3 loses, it isn't by much, and it sometimes wins by a landslide while drawing a lot less power (e.g. Metro, Formula 1).

The 7850K has some added value in that you can overclock, but it's slightly more expensive and looks like you'd need to bump up your power supply by ~100 watts and probably also use a more expensive motherboard than what you could get away with when not overclocking (i3 or Kaveri).

What about the fact that FM2+ provides no upgrade path? That's the most important part, actually; a lot of people keep their mainboard for many years and just swap in a new CPU when the time is right, and save themselves the trouble of reformatting the HDD, rewiring the mainboard, etc.

You must remember we have no idea if AMD will ever release an upgrade for FM2+. They are also losing out on a good market there, say a 6-core CPU for FM2+ for those who decided they want a dGPU instead but are now stuck on FM2+.

AMD is such a shit company, I am happy I left them for Intel. Really stupid people run that company; the fact that they bought ATI and then sell APUs that compete with their own dGPUs is amazingly stupid.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
So you have no source. You're just another biased poster making claims without a source. And you deserve to be reported like any other.

It's not biased to say AMD's processors are weak compared to Intel. It's a verifiable fact.

It's not biased to say pairing a high end GPU with an Athlon is stupid, it's my opinion and I think it's equally stupid to do so on a Celeron or Pentium processor too.

If you think I have no source, please google DX12. I have yet to find a single article talking about how weak processors will suddenly be revived. It's all about getting more out of your hardware so they can do more with the games.

The genesis of DX12 can be found in technology trends. GPUs have continued to rapidly increase in performance, while single-core CPU performance has been gated by power limits. Multi-core CPUs have provided some advancement but still trail GPUs in peak performance. In parallel, applications have embraced task-parallelism, adopting sophisticated scheduling systems to scale performance with the number of CPU cores. This has in turn driven the need for an API that scales similarly with core count. GPU performance can be exploited three ways: drawing better pixels, more pixels and more objects. We have reaped much of what can be gained from pixels. DX12’s focus is on enabling a dramatic increase in visual richness through a significant decrease in API-related CPU overhead. Historically, drivers and OS software have managed memory, state, and synchronization on behalf of developers. However, inefficiencies result from the imperfect understanding of an application’s needs. DX12 gives the application the ability to directly manage resources and state, and perform necessary synchronization. As a result, developers of advanced applications can efficiently control the GPU, taking advantage of their intimate knowledge of the game’s behavior. - See more at: http://blogs.nvidia.com/blog/2014/03/20/directx-12/

If you need a translation to what that means, see my last post.
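For anyone who wants the idea in concrete terms, here is a toy sketch of the overhead shift that quote describes. All costs are made-up unit weights and the class/method names are hypothetical illustrations, not the real Direct3D API: the implicit model pays driver validation on every call, while the explicit model pre-records a command list whose state the application manages itself.

```python
# Toy model of "lower API overhead" (hypothetical names, made-up costs).

class DriverManagedAPI:
    """DX11-style implicit model: the driver validates state on every call."""
    def __init__(self):
        self.cpu_work = 0

    def draw(self):
        # Per-call validation, state tracking and synchronization in the driver.
        self.cpu_work += 10


class ExplicitAPI:
    """DX12-style explicit model: the app records commands and manages state."""
    def __init__(self):
        self.cpu_work = 0
        self.command_list = []

    def record_draw(self):
        # The app already knows its own state, so recording is cheap.
        self.command_list.append("draw")
        self.cpu_work += 2

    def submit(self):
        # One inexpensive submission for the whole pre-built list.
        self.cpu_work += 1


implicit = DriverManagedAPI()
explicit = ExplicitAPI()
for _ in range(1000):
    implicit.draw()
    explicit.record_draw()
explicit.submit()

print(implicit.cpu_work, explicit.cpu_work)  # 10000 vs 2001 in this toy
```

The gap between the two totals is the freed CPU budget that, per the argument above, developers would spend inside the game itself rather than on driver bookkeeping.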

Feel free to report me for telling you a truth you don't want to hear.
 
Last edited:

john5220

Senior member
Mar 27, 2014
551
0
0
My source is 10 years' worth of AMD hopefuls pinning their dreams on a future technology to make their CPUs competitive, and not once having it ever happen.

My source is developers talking about how much more they can do because of DX12's lower overhead... Read that again. It's not simply lower overhead and that's the end of it. It's taking those resources freed by the lower overhead and doing something tangible with them within the game itself. In other words, it's not simply going to be lower CPU usage; it's going to be less wasted CPU usage.

So, there's your answer. Now let's hear your rebuttal.

I feel your pain; I too know what garbage AMD is. Do you remember their CEO who was caught up in internal corruption, just to name one of the embarrassing things about that company?

I bought Intel and never looked back. Now I have an upgrade path; if my mainboard lasts 20 years then I won't even need to upgrade, ROFL. In 10 years, when games really need high core counts, I could buy a 6-core, 12-thread i7 Devil's Canyon, which will be cheap by then, and have me going another 20 years.

This is how legendary Intel is: the first thing to go would be the mainboard; I would have zero need to change it otherwise. The upgrade path for Haswell is so incredible.

Now let's take FM2+, which my good friend bought against my advice. AMD has abandoned both FM2+ and AM3+, platforms which delivered the same if not less performance than the almost decade-old Phenom II platform. Let's just allow that to sink in for a minute.

One could have simply not bothered to upgrade with AMD at all after 2007.

In fact, watch this: the day games use 8 threads, those with the very old i7 (what's the name of the one before Sandy Bridge? the first-gen Core i series, on DDR2 I think) will be set. That uses 8 threads, right? So by the time games need more than 8 threads, those people would have gotten 20 years out of their system, assuming the mainboard even lasted that long.

And AMD will still be trying to play catch-up. They can't even match the power of Sandy Bridge; AMD is still playing catch-up with the old first-gen Core i series.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
In fact, watch this: the day games use 8 threads, those with the very old i7 (what's the name of the one before Sandy Bridge? the first-gen Core i series, on DDR2 I think) will be set. That uses 8 threads, right? So by the time games need more than 8 threads, those people would have gotten 20 years out of their system, assuming the mainboard even lasted that long.

And AMD will still be trying to play catch-up. They can't even match the power of Sandy Bridge; AMD is still playing catch-up with the old first-gen Core i series.

Before Sandy Bridge came Nehalem/Lynnfield/Bloomfield; they all used DDR3. There are actually no Core i3, i5, or i7 chips that use DDR2.

Before that were the Core 2 Quads. The memory controller for those was on the motherboard and could be used with either DDR2 or DDR3 depending on the chipset.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Or to raise the minimum and average framerates, which are dependent on both CPU and GPU.

For instance, I have been playing Witcher 3 on an HD 7770. I get framerates in the high 20s at the lowest settings at 1080p. According to your argument, if I got a better GPU, I should raise the quality until I was still only getting 25-30 FPS, because the purpose of a high-end GPU is to raise image quality; isn't that what you said?

No, what I said is that you can use a faster GPU in order to increase the image quality of the games you play. I haven't said you will need to play at 20 fps.

Take, for example, GTA V, one of the most CPU-demanding games of 2015.

The A10-7870K paired with the R9 285 produces 33.86 fps:

http://anandtech.com/show/9307/the-kaveri-refresh-godavari-review-testing-amds-a10-7870k


Now you do an upgrade and install a GTX 980 in your system.
The A10-7870K now produces 52.53 fps, higher even than what the A10-7870K or the Core i5-4690 could do with the R9 285 above.
And that is in a very CPU-demanding game; in other games you will gain much more.



People should understand that raising the image quality when they have a faster GPU will not result in lower fps than before.
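The size of the GTA V uplift described above can be checked with quick arithmetic (the fps figures are the ones quoted from the linked review):

```python
# GTA V averages quoted from the linked A10-7870K review.
fps_a10_r9_285 = 33.86   # A10-7870K + R9 285
fps_a10_gtx980 = 52.53   # same A10-7870K + GTX 980

# Relative gain from swapping only the GPU, CPU unchanged.
uplift = fps_a10_gtx980 / fps_a10_r9_285 - 1
print(f"GPU upgrade alone: +{uplift:.0%}")  # +55% even on a modest CPU
```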

Another example,

BioShock Infinite: the A10-7850K paired with the HD 7790 at Medium settings averages 79.64 fps.



Now you upgrade your GPU to the HD 7950 1 GHz; you can increase the image quality to Ultra DDOF and have almost the same fps as before.



Here is one more demanding game: Metro Last Light Redux.

1080p Medium Settings AF 16x, Tess = off


And now, with the HD 7950 1 GHz, you increase the image quality to High and tessellation to Normal.
Even the A8-7600 with the HD 7950 1 GHz can do better.



Just because the Core i5 will get you higher fps than the A10-7850K/Athlon 860K doesn't mean you will play at 20 fps.

Anyway, I will try to do the same review with a faster GPU to showcase that point when I get the time.
 
Last edited:

richaron

Golden Member
Mar 27, 2012
1,357
329
136
Shit minimum frames for one.

Wrong (specify games). Biased. Subjective: Define shit.

Shit average frames in demanding games for another.

Biased. Subjective: Define shit. Zero proof: must define "demanding games".

Stupid way to configure a gaming system for a 3rd reason.

Wrong (I gave you one legitimate example). Biased. Subjective: Define stupid.

DX12 isn't going to be the answer either.

Zero proof. None. A few claims of cherry picked comments from dx12 devs are nothing. They also talk about offloading more tasks to GPU, but meh.

Combine those quotes and you get the entirety of a post. I didn't want this to escalate, and I will not let our discussion move forward past the post I quoted. If you write something this stupid in a technical forum, someone deserves to point it out.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Wrong (specify games). Biased. Subjective: Define shit.



Biased. Subjective: Define shit. Zero proof: must define "demanding games".



Wrong (I gave you one legitimate example). Biased. Subjective: Define stupid.



Zero proof. None. A few claims of cherry picked comments from dx12 devs are nothing. They also talk about offloading more tasks to GPU, but meh.

Combine those quotes and you get the entirety of a post. I didn't want this to escalate, and I will not let our discussion move forward past the post I quoted. If you write something this stupid in a technical forum, someone deserves to point it out.

You wanted a source, and you got exactly that, and now you're claiming shens lol.

You can call it "stupid" all you want, but I told you the premise behind DX12; you didn't want to believe it and insisted on a source, and you got it. A source, plus a decade's worth of evidence to boot. What have you got besides getting upset and crying me a river?

Go ahead and keep thinking they're investing time and money into DX12 development to make weaker processors viable, if it makes you feel better. Reality will hit you soon enough, not that you'd recognize it based on your posts here.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Zero proof. None. A few claims of cherry picked comments from dx12 devs are nothing. They also talk about offloading more tasks to GPU, but meh.

Well, if you bother to read up on it, you will see that the focus is on power consumption; it is all about giving the masses that (will) use Win10 tablets more playtime on the next Crappy Bird.
http://blogs.msdn.com/b/directx/arc...-high-performance-and-high-power-savings.aspx
Yes, it also shows that you can get better performance at the same power draw, but the main thing you see is that everything starts with the driver thread having lower needs. Translation: just like in Mantle games, there will be one less thread, not more threads; fewer threads = less performance (per $) for multi-core processors.

And if you think that DX12 is multithreaded and will give much better results on more cores... well, not really.
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4
Keep in mind that this is only draw calls, with no game logic running at all.
That's the full power of all cores going straight to the VGA, so even this best-case, imaginary-dream-world scenario is only ~20% faster:
66.9 fps on 4 and 6 cores (not much scaling going on there) versus 55.7 fps on 2 cores.
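The scaling those Star Swarm numbers imply is easy to verify with quick arithmetic (fps values as quoted above):

```python
# Star Swarm DX12 draw-call fps by core count, as quoted above.
fps = {2: 55.7, 4: 66.9, 6: 66.9}

gain_2_to_4 = fps[4] / fps[2] - 1   # ~20% from doubling cores
gain_4_to_6 = fps[6] / fps[4] - 1   # no gain at all past 4 cores
print(f"2->4 cores: +{gain_2_to_4:.0%}, 4->6 cores: +{gain_4_to_6:.0%}")
```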
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
If I can look at which account posted a thread and not have to read it to know what the conclusion will be, what would you say the value of that content is?

Nailed it.

Funny thing, we had this before: claiming some low-end AMD CPU was good enough based on a prescripted benchmark. Unfortunately, there was up to a 400% performance difference between the benchmark and in-game playing.

Oh yeah, and the usual "the next thing will make it much better for AMD." And we all know how that turns out.
 

gorion

Member
Feb 1, 2005
146
0
71
I bought intel and never looked back now I have an upgrade path, if my mainboard lasts 20 years then I won't even need to upgrade, ROFL. In 10 years when games really need high CPU count I could buy a 6 Core 12 threaded I7 devils canyon which will be cheap by then. And have me going another 20 years.

You are fooling yourself with your upgrade path. Chances are that most of the time it is better to simply change the MB and CPU rather than drop a more powerful (probably used) CPU into an old mobo.

You'll either be stuck on an ancient platform (no latest USB, SATA, or whatever the hottest thing will be), or you'll drop a new CPU in there only to see the mobo fail after a few more months or years, at which point the money would have been better invested elsewhere, since you'll be looking at expensive used MB replacements.

This, unless you plan to upgrade your CPU quite frequently, but that doesn't really make sense compared to buying the right CPU from the start.

Intel or AMD, I think you must buy the right CPU for your needs. AMD has some niches in which perf/$ makes it a good choice, while Intel is often the better choice overall.