What budget CPU would you use for repurposing a 28nm Nvidia or 28nm AMD dGPU?

  • FX-8300 (Stock speed)

    Votes: 1 5.9%
  • Core i3 6100

    Votes: 16 94.1%

  • Total voters
    17

cbn

Lifer
Mar 27, 2009
12,968
221
106
Both these CPUs go on sale for around the same money at Newegg:

$100 (FX-8300); normal price $115, free shipping

$105 (Core i3 6100); normal price I believe is $115, free shipping

The FX-8300 also comes with a Deus Ex: Mankind Divided game code.

The platform cost of AM3+ with the 970 chipset and USB 3.0 tends to be a tiny bit higher than a low-end LGA 1151 board. There is also only one mATX board using the 970 chipset, in contrast to the many LGA 1151 boards available in mATX. Furthermore, idle (and total) power consumption on AM3+ is higher than on LGA 1151.

In addition, LGA 1151 offers a greater level of CPU upgradability.

However, outside of CPU upgradability, I think the FX-8300 with a 970 chipset motherboard could be a good CPU and platform for repurposing a midrange or high-end 28nm dGPU, based on AtenRa's testing here. (The stock speed FX-8300 is about equal to the stock speed FX-8150 used in AtenRa's testing in multi-thread, and a bit faster in single thread due to slightly higher IPC.)

In fact, with a 28nm Nvidia card under DX11 I think the FX-8300 would do even better. This is due to Nvidia's graphics driver being multi-threaded (as compared to AMD's single-threaded DX11 driver), thus reducing some of the single-thread demand on the CPU. (DX12 also fixes the driver issue.)

SIDE NOTE: AtenRa did use DDR3 1866 for this Core i3 6300 testing, but I don't think the performance difference would be that much compared to DDR4 2133 (unlike DDR4 @ 1866, DDR3 1866 has tighter timings). Also consider that I am referring to a Core i3 6100 for the purposes of this thread, not the 100 MHz faster Core i3 6300.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
I have a GTX 760 in my closet from upgrading to a GTX 970 last year. I suppose if I was building a new system for someone I could throw it in.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I have a GTX 760 in my closet from upgrading to a GTX 970 last year. I suppose if I was building a new system for someone I could throw it in.

And let me guess you also have a spare 2 x 4GB DDR3 kit lying around after upgrading your Haswell Core i7 system to 2 x 8GB DDR3 (or better).

Do you see where this is going?

(Pretty soon the average person has a fairly good pile of assorted spare parts they could possibly make another PC out of)
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Comparing LGA 1151 boards with DDR3 vs. LGA 1151 boards with DDR4:

http://www.newegg.com/Product/Produ...70 8000&IsNodeId=1&bop=And&order=PRICE&page=1

http://www.newegg.com/Product/Produ...4735 601187447&IsNodeId=1&bop=And&order=PRICE

I noticed the DDR4 LGA 1151 boards are actually cheaper (starting at $38 AR with free shipping) than the DDR3 LGA 1151 boards (starting at $51 shipped). There is also a much greater selection of DDR4 boards than DDR3 boards.

Interestingly, those DDR3 LGA 1151 boards are actually more expensive than the 970 chipset AM3+ boards (starting at $42 AR shipped):

http://www.newegg.com/Product/Produ...60 8000&IsNodeId=1&bop=And&order=PRICE&page=1

This might be something to consider for someone who has spare DDR3 to re-purpose as well.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
The i3 has massively better per-thread performance, and its overall multithreaded performance is not far behind the FX. It's a much better combination for games. Just go read the AnandTech review: http://www.anandtech.com/show/10543/the-skylake-core-i3-51w-cpu-review-i3-6320-6300-6100-tested It beats the 8370 in almost every single gaming test. The lower-clocked 8300 would not stand a chance.

The fact that the i3 has a much more modern chipset and far lower power consumption is just icing on the cake.
 
  • Like
Reactions: Ken g6 and Yuriman

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
And let me guess you also have a spare 2 x 4GB DDR3 kit lying around after upgrading your Haswell Core i7 system to 2 x 8GB DDR3 (or better).

Do you see where this is going?

(Pretty soon the average person has a fairly good pile of assorted spare parts they could possibly make another PC out of)
Nope just the Video card and a spare PSU.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The i3 has massively better per-thread performance, and its overall multithreaded performance is not far behind the FX. It's a much better combination for games. Just go read the AnandTech review: http://www.anandtech.com/show/10543/the-skylake-core-i3-51w-cpu-review-i3-6320-6300-6100-tested It beats the 8370 in almost every single gaming test. The lower-clocked 8300 would not stand a chance.

I've seen the charts (from various websites) comparing the FX-83xx vs. the Core i3 over the last 3 years. Historically, even the Haswell Core i3 usually wins in FPS. A notable exception is some of the newer games, like Fallout 4:

http://gamegpu.com/rpg/rollevye/fallout-4-test-gpu.html

[Image: proz.jpg — GameGPU Fallout 4 CPU benchmark chart]


In the above case even the FX-8150 (about the speed of an FX-8300) does well against the Core i3 4330.

However, AtenRa's testing went beyond FPS and measured game smoothness and energy consumption of the Core i3 6300 (with DDR3 1866), Core i5 2500K (both stock and OC), and FX-8150 (both stock and OC) over 60 minutes, paired with a rather powerful 28nm dGPU (HD 7950 @ 1000 MHz). What he found was that even the stock speed FX-8150 did surprisingly well on smoothness, with the exception of one game (Civilization: Beyond Earth in DX11 mode).

P.S. Taking my own experience into account, I noticed that my 4.3 GHz G3258 was about 10 FPS faster than my stock speed Athlon X4 860K in BF4 64-player. That was very surprising considering the multi-threaded nature of the game. The gameplay was also extremely smooth with the G3258 using the rather small R7 250X dGPU. When switching to my GTX 660, however, I remember the G3258 still having the same frame rate advantage over the Athlon X4 860K....but now the game stuttered. With the Athlon X4 860K and the GTX 660 the game was fine, though. Maybe some of this was due to my using Mantle with the R7 250X, but then again that card only has 1GB VRAM. I believe a more likely cause of the stutter with the 4.3 GHz G3258 and GTX 660 was a CPU-to-dGPU imbalance, as illustrated by the example in this post.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
I've seen the charts (from various websites) comparing the FX-83xx vs. the Core i3 over the last 3 years. Historically, even the Haswell Core i3 usually wins in FPS. A notable exception is some of the newer games, like Fallout 4:

http://gamegpu.com/rpg/rollevye/fallout-4-test-gpu.html

[Image: proz.jpg — GameGPU Fallout 4 CPU benchmark chart]


In the above case even the FX-8150 (about the speed of an FX-8300) does well against the Core i3 4330.

However, AtenRa's testing went beyond FPS and measured game smoothness and energy consumption of the Core i3 6300 (with DDR3 1866), Core i5 2500K (both stock and OC), and FX-8150 (both stock and OC) over 60 minutes, paired with a rather powerful 28nm dGPU (HD 7950 @ 1000 MHz). What he found was that even the stock speed FX-8150 did surprisingly well on smoothness, with the exception of one game (Civilization: Beyond Earth in DX11 mode).

P.S. Taking my own experience into account, I noticed that my 4.3 GHz G3258 was about 10 FPS faster than my stock speed Athlon X4 860K in BF4 64-player. That was very surprising considering the multi-threaded nature of the game. The gameplay was also extremely smooth with the G3258 using the rather small R7 250X dGPU. When switching to my GTX 660, however, I remember the G3258 still having the same frame rate advantage over the Athlon X4 860K....but now the game stuttered. With the Athlon X4 860K and the GTX 660 the game was fine, though. Maybe some of this was due to my using Mantle with the R7 250X, but then again that card only has 1GB VRAM. I believe a more likely cause of the stutter with the 4.3 GHz G3258 and GTX 660 was a CPU-to-dGPU imbalance, as illustrated by the example in this post.

The G3258 is a two-thread CPU; the i3 6100 is a four-thread CPU. It doesn't have the same stuttering problems.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
The G3258 is a two-thread CPU; the i3 6100 is a four-thread CPU. It doesn't have the same stuttering problems.

I agree that the Core i3 6100 would be more resistant to stuttering than an OC'd G3258, but it can stutter even when using the Mantle API:

https://forums.anandtech.com/threads/60-minutes-–-cpu-performance-and-energy-consumption-in-gaming-atenra.2459963/#post-37949265

Core i3 6300 - The lack of CPU cores/threads on the Core i3 6300 is evident even when Mantle is used. There is a lot of stuttering, and fps performance is lower than even the Bulldozer FX-8150. The Core i3 6300 is not a good CPU if you have a 120Hz monitor and a fast GPU, since this game in multiplayer mode needs a lot of cores/threads.

Core i5 2500K – The fps performance for the quad core Sandy Bridge is very good; the Core i5 2500K's stuttering is good enough, and it can maintain high fps close to the 120fps cap.

FX8150 - At default clocks the FX8150 has even less stuttering than the Core i5 2500K, but the fps performance is lower than the quad core Sandy Bridge. Mantle helps a lot, but the low single thread performance has an impact on fps. Although it cannot reach the same fps performance as the Core i5 2500K, the low stuttering makes it better than the latest Skylake Core i3.
 
Last edited:

PontiacGTX

Senior member
Oct 16, 2013
383
25
91
If you plan to upgrade to a quad core later, get the i3; if not, then try to get a used i7 3770K/4770K/4790K, 3930K/4930K, or Xeon E5 1650/1660 (v2).
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
I agree that the Core i3 6100 would be more resistant to stuttering than an OC'd G3258, but it can stutter even when using the Mantle API:

https://forums.anandtech.com/threads/60-minutes-–-cpu-performance-and-energy-consumption-in-gaming-atenra.2459963/#post-37949265

That is a single game, with the CPU running sub-optimal memory. And yes, memory performance makes a difference: Eurogamer measured performance with DDR4-2133 vs. DDR4-2666 and found a noticeable difference http://www.eurogamer.net/articles/digitalfoundry-2015-intel-core-i3-6100-review Imagine how much worse it would run with 1866 memory!

So: a single game, running a codepath co-developed by AMD, in suboptimal conditions. Against the large number of games tested by AnandTech. And they did not just measure framerate; they also measured smoothness:

[Image: 83103.png — AnandTech chart]

[Image: 83079.png — AnandTech chart]
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
And yes, memory performance makes a difference: Eurogamer measured performance with DDR4-2133 vs. DDR4-2666 and found a noticeable difference http://www.eurogamer.net/articles/digitalfoundry-2015-intel-core-i3-6100-review Imagine how much worse it would run with 1866 memory!

Yes, it would be slower with DDR4 1866, but he wasn't using that. He was using DDR3 1866.

I think that makes a difference due to DDR3 1866 having tighter timings than what DDR4 1866 would have.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
Yes, it would be slower with DDR4 1866, but he wasn't using that. He was using DDR3 1866.

I think that makes a difference due to DDR3 1866 having tighter timings than what DDR4 1866 would have.

It will have comparable latency to DDR4-2133, but the bandwidth will be significantly lower.
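A quick back-of-the-envelope check of this (a rough sketch only: first-word latency from CAS and data rate; tRCD/tRP and the memory controller matter too):

```python
def first_word_latency_ns(cas, data_rate_mts):
    """First-word latency: CAS cycles times the clock period.
    The memory clock runs at half the transfer rate, so one
    clock period is 2000 / data_rate nanoseconds."""
    return cas * 2000.0 / data_rate_mts

def peak_bandwidth_gbs(data_rate_mts, bus_bytes=8):
    """Peak bandwidth of one 64-bit (8-byte) channel in GB/s."""
    return data_rate_mts * bus_bytes / 1000.0

# DDR3-1866 CL9 vs. DDR4-2133 CL15
print(round(first_word_latency_ns(9, 1866), 1))   # ~9.6 ns
print(round(first_word_latency_ns(15, 2133), 1))  # ~14.1 ns
print(round(peak_bandwidth_gbs(1866), 1))         # ~14.9 GB/s per channel
print(round(peak_bandwidth_gbs(2133), 1))         # ~17.1 GB/s per channel
```

By this crude measure, DDR3-1866 CL9 actually comes out with the lower first-word latency, while DDR4-2133 has roughly 14% more peak bandwidth per channel.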
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
It will have comparable latency to DDR4-2133, but the bandwidth will be significantly lower.

Here is some Skylake memory testing done by AnandTech using DDR3-1866 (CAS 9) vs. DDR4-2133 (CAS 15):

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/7

The DDR3 1866 (CAS 9) did well:

[Image: DDR4%20DDR3L_575px.png — AnandTech DDR3 vs. DDR4 benchmark chart]


Comparing default DDR4 to a high performance DDR3 memory kit is almost an equal contest. Having the faster frequency helps for large frame video encoding (HandBrake HQ) as well as WinRAR which is normally memory intensive. The only real benchmark loss was FastStone, which regressed by one second (out of 48 seconds).

End result, looking at the CPU test scores, is that upgrading to DDR4 doesn’t degrade performance from your high end DRAM kit, and you get the added benefit of future upgrades, faster speeds, lower power consumption due to the lower voltage and higher density modules.

And in the gaming part of the testing, DDR4 2133 (CAS 15) vs. DDR3 1866 (CAS 9) essentially tied in the CPU tests (perhaps a hair better for the DDR4 2133), with the DDR4 2133 edging out the DDR3 1866 in integrated graphics:

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/8
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
That is a single game.

(Snip)

Against the large number of games tested by AnandTech. And they did not just measure framerate; they also measured smoothness:

[Image: 83103.png — AnandTech chart]

[Image: 83079.png — AnandTech chart]

One advantage the FX-8xxx has in BF4, which it doesn't have in that very CPU-intensive game above (GTA V), is Mantle.

A low-level API is going to help reduce the single-thread CPU demand, as I understand things.
 
Last edited:

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
One advantage the FX-8xxx has in BF4, which it doesn't have in that very CPU-intensive game above (GTA V), is Mantle.

A low-level API is going to help reduce the single-thread CPU demand, as I understand things.

A low level API is helpful, but the engine written on top of it also needs to be fantastically multithreaded. Certain tasks such as AI are notoriously difficult to multithread, and you need to load-balance that task with 7 others appropriately to saturate all 8 cores.

With the FX you are always dependent on the game developer doing a fantastic job of multithreading, or your 8-core CPU isn't going to shine. The i3 doesn't have the same peak multithreaded performance, but it has enough single threaded performance to push through single-thread bottlenecks. Given that the majority of games out there will at some point in the frame have a single-thread bottleneck, I'd take the i3.
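The trade-off above can be sketched with Amdahl's law. The numbers here are illustrative assumptions, not measurements: roughly 1.5x per-thread performance for the i3, and HT threads counted as full threads (which flatters the i3):

```python
def amdahl_speedup(parallel_fraction, n_threads):
    """Amdahl's law: the serial fraction of the work caps overall
    speedup no matter how many threads are available."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_threads)

# Relative throughput under different degrees of game multithreading
for p in (0.5, 0.8, 0.95):
    fx = 1.0 * amdahl_speedup(p, 8)   # 8 slower cores
    i3 = 1.5 * amdahl_speedup(p, 4)   # 4 faster threads (2C/4T), ~1.5x per thread
    print(f"parallel fraction {p}: FX-8 ~{fx:.2f}, i3 ~{i3:.2f}")
```

Under these assumptions the i3 stays ahead until the game is very heavily parallel, which matches the point that the FX only shines when the engine is fantastically multithreaded.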
 
Feb 25, 2011
16,788
1,468
126
And let me guess you also have a spare 2 x 4GB DDR3 kit lying around after upgrading your Haswell Core i7 system to 2 x 8GB DDR3 (or better).

Do you see where this is going?

(Pretty soon the average person has a fairly good pile of assorted spare parts they could possibly make another PC out of)
Sell the spare parts on eBay or toss them. No point in spending money on, and no spare room for, a second desktop computer that isn't as good as my primary. LAN parties don't happen anymore, grandma PCs don't need dGPUs, etc.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126
LAN parties don't happen anymore

Sniff, sniff. Dave, you're breaking my heart.

You mean my quest to build many budget gaming PCs for a LAN party is in vain?

(That was my original motivation for building multiple gaming PCs at my old place, and why I have not just one, but several computer desks.)

Sigh.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
A low level API is helpful, but the engine written on top of it also needs to be fantastically multithreaded. Certain tasks such as AI are notoriously difficult to multithread, and you need to load-balance that task with 7 others appropriately to saturate all 8 cores.

With the FX you are always dependent on the game developer doing a fantastic job of multithreading, or your 8-core CPU isn't going to shine. The i3 doesn't have the same peak multithreaded performance, but it has enough single threaded performance to push through single-thread bottlenecks. Given that the majority of games out there will at some point in the frame have a single-thread bottleneck, I'd take the i3.

Looking at CPU utilization charts over at GameGPU, many recent games do appear to spread the load across eight FX cores. However, as I understand things, the AMD DX11 graphics driver (which adds to the load imposed by the game engine) is single-threaded (and has overhead problems?).

Some threads on this:

https://forums.anandtech.com/threads/amd-radeon-directx-11-multithreaded-rendering.2346619/

http://forums.guru3d.com/showthread.php?t=398858

Therefore, I'd imagine that extra single-thread stress (from the AMD DX11 driver) is going to reduce the effectiveness of the FX-8300's turbo in handling sudden game engine load increases across 1 or 2 cores.

In contrast, Nvidia under DX11 should result in less peak single-threaded stress, most likely helping the FX-8300 more than the Skylake Core i3.
 
Last edited:

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
None of the above. I buy cheap used board or cheap used workstations off of eBay and FS/T forums.
 
Feb 25, 2011
16,788
1,468
126
Sniff, sniff. Dave, you're breaking my heart.

You mean my quest to build many budget gaming PCs for a LAN party is in vain?

(That was my original motivation for building multiple gaming PCs at my old place, and why I have not just one, but several computer desks.)

Sigh.
And when was the last time you had a half dozen of your friends over to play?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Overall, I think there is a good future for AM3+ with respect to game scaling.

However, at the same time, an LGA 1151 system with a Core i3 can be substantially upgraded....whereas a system with an FX-8300 doesn't have much left in it (though the FX-8300 can be overclocked).

With that said, it would take a Core i7 to substantially beat a stock speed FX-8300 in a game with this type of scaling.
 

HexiumVII

Senior member
Dec 11, 2005
661
7
81
Threw a GTX 460 into an HP with a Core 2 Quad Q8200 that I got off the street (literally, three of them were sitting on the curb). They play Overwatch amazingly well.
 
  • Like
Reactions: cbn