AMD FX 4350 or 6300 for Gaming?


bononos

Diamond Member
Aug 21, 2011
3,883
142
106
Good luck milking that system for 10 years, because it is fairly likely that 4 GB video cards will be the minimum requirement for AAA games within 5 years. Plus, you'll likely be burning up a ton of money in electricity running an inefficient 15-year-old computer, while I will likely get much better performance upgrading to more efficient video cards and CPUs. Penny-wise, and very pound-foolish. I'm actually a little shocked.... Electricity in America is relatively cheap; it's probably way more expensive where you live.

The 660 Ti is actually pretty damn power hungry:
.......

It's strange that you bring up the power consumption issue when the AMD FX CPUs are less efficient and more power-hungry than Intel Ivy Bridge/Haswell/Broadwell. You also talked up the advantage of being able to overclock the FX earlier in the thread.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
^Yeah, I admit a G3258 at 4.4GHz on an aftermarket cooler isn't exactly your everyday dual-core CPU.

Nevertheless, I am proud of my purchase of the MSI ECO B85 board, which is TÜV (German) certified for a 25-year life expectancy.

I can finally now build my 20-year dream PC. Yes, I have been trying for some time now to put the funds together to build a PC that will last me 20 years. My current CPU on order is the i3 4170, and the next CPU I'll order, 7 years from now, is the 4790K -- which will run at 4GHz max, since this board doesn't support the auto boost clock thingy to 4.4GHz.

I expect close to 20 years out of this system.

I am not your everyday gamer. I play fighting games, MOBAs, RTS, and Valve FPS like CS:GO and TF2 -- all of which are targeted at very low-end PCs and last over a decade.

I don't buy new games just like that; I stick with one game for over 5,000 hours. So this new PC will last me as long as the mainboard holds out. If it's 25 years as they claim, then so be it. I don't need upgrades like you guys. The main reason I picked Intel was the upgrade path: I'm pretty sure that 7 to 8 years from now I'll be able to buy a used 4790K at a much cheaper price.

If I still had my old Athlon 64 CPU from 10 years ago, I would be running CS:GO, TF2, and DOTA 2 just fine and dandy. Games that are targeted to last 20 years.

I can guarantee you all, Valve and Blizzard aren't going to require a 4790K in the next 30 years. Their games still run just fine on a single-core CPU.

I just had to get this Intel system because I really like DayZ, which I'll be playing for a couple of decades to come as well, and any game made by Bohemia just flat out DOES NOT run on AMD systems when you're talking online play on big servers with lots of people.

My Athlon 760K at 4.1GHz with an XFX HD 7850 Black OC used to get 10 FPS in Arma 3, DayZ, etc. on the lowest settings, while my friend's Pentium G3220 gets 30 FPS. The best games out there are always Intel-favored games for some odd reason.

A 25yr life expectancy on a mobo? :eek: And how is it so durable when it doesn't have MCE to auto-boost that 4790K? And you bought a last-gen B85 chipset?

Nothing tech related lasts 20yrs.

None of this rambling makes any sense. In 7yrs the 4790K won't even be produced, and the equivalent "i" model is what you should be looking at, not a relic from 2015. Who buys Q6600s (even used) now? They are relics.

MOBAs and MMOs (if they are even around in 20yrs) are built to make the most money, so they target the low-end hardware that most people are running. AAA games will murder that i3 you bought in 2yrs or less. Picture GTA VII (if it even comes to PC, or even still has single-player) or the next CD Projekt title.

You would have been better off buying an i5 at minimum with a better GPU, and upgrading that in 2yrs.
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
A 25yr life expectancy on a mobo? :eek: And how is it so durable when it doesn't have MCE to auto-boost that 4790K? And you bought a last-gen B85 chipset?

Nothing tech related lasts 20yrs.

None of this rambling makes any sense. In 7yrs the 4790K won't even be produced, and the equivalent "i" model is what you should be looking at, not a relic from 2015. Who buys Q6600s (even used) now? They are relics.

MOBAs and MMOs (if they are even around in 20yrs) are built to make the most money, so they target the low-end hardware that most people are running. AAA games will murder that i3 you bought in 2yrs or less. Picture GTA VII (if it even comes to PC, or even still has single-player) or the next CD Projekt title.

You would have been better off buying an i5 at minimum with a better GPU, and upgrading that in 2yrs.

Some food for thought: used 4-year-old 2500Ks are still selling at 50% of their new price. If market conditions hold for the 4790K, one would only have saved $170 over 4 years, without the benefit of the 4790K in that timeframe -- and let's not forget the initial cost of the i3. Penny-wise, pound-foolish.
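A quick sketch of that resale arithmetic (the ~$340 4790K price here is an assumption for illustration; the 50% resale ratio is taken from the 2500K observation):

```python
# Sketch of the resale argument: buy a 4790K new, then sell it used
# after 4 years at 50% of the new price (assumed to mirror the 2500K).
new_price = 340.0      # assumed 4790K price in USD (illustrative)
resale_ratio = 0.5     # 4-year-old 2500Ks sell at ~50% of new

resale_value = new_price * resale_ratio
net_cost = new_price - resale_value  # what 4 years of ownership really cost
print(f"Net 4-year cost of the 4790K: ${net_cost:.0f}")  # → $170
```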
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
It's strange that you bring up the power consumption issue when the AMD FX CPUs are less efficient and more power-hungry than Intel Ivy Bridge/Haswell/Broadwell. You also talked up the advantage of being able to overclock the FX earlier in the thread.
PCs 15 years ago had lower-TDP parts: the Pentium III was 15W, my dual Pentium Pros were only 35W each, many GPUs were ~80-100W, etc. A power supply, on the other hand, loses efficiency as it ages and then starts drawing a lot more power than needed. I dunno what Middle's talking about.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
PCs 15 years ago had lower-TDP parts: the Pentium III was 15W, my dual Pentium Pros were only 35W each, many GPUs were ~80-100W, etc. A power supply, on the other hand, loses efficiency as it ages and then starts drawing a lot more power than needed. I dunno what Middle's talking about.

Idle consumption has dropped significantly. My overclocked Q6600 + HD 4870 drew nearly 200W idle at the wall, compared with closer to 40W for my overclocked i5 + 7850.
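For a rough sense of what that idle gap costs, here is a small sketch (the 24/7 idle duty cycle and the $0.12/kWh rate are assumptions, not figures from the thread):

```python
# Estimate the annual electricity cost of a system's idle draw at the wall.
def annual_idle_cost(idle_watts, rate_per_kwh=0.12, hours=8760):
    """Dollars per year to leave a system idling around the clock."""
    return idle_watts * hours / 1000 * rate_per_kwh

old_rig = annual_idle_cost(200)  # overclocked Q6600 + HD 4870, ~200W idle
new_rig = annual_idle_cost(40)   # overclocked i5 + 7850, ~40W idle
print(f"~${old_rig:.0f}/yr vs ~${new_rig:.0f}/yr idle")  # → ~$210/yr vs ~$42/yr idle
```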
 

TeknoBug

Platinum Member
Oct 2, 2013
2,084
31
91
Idle consumption has dropped significantly. My overclocked Q6600 + HD 4870 drew nearly 200W idle at the wall, compared with closer to 40W for my overclocked i5 + 7850.
Yeah, that's bad for idle. Did you not have C5 or C6 turned on?
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
It's strange that you bring up the power consumption issue when the AMD FX CPUs are less efficient and more power-hungry than Intel Ivy Bridge/Haswell/Broadwell. You also talked up the advantage of being able to overclock the FX earlier in the thread.

Not particularly -- I really don't give a rat's ass about power consumption for my desktop.
It does matter a bit for my laptop, as I like to maximize my battery life between charges.

But penny pinchers like him certainly shouldn't complain about the cost of upgrades, then fire up their 200+ watt video card, adding a bunch of dollars to their electricity bill each month. For the extra money he's wasting on electricity, it would probably cost the same, if not less, to upgrade to a more energy-efficient video card over a 15-year period.

Electricity is cheap in the USA, which is why most people here don't worry about it.
But if he's over in Europe, he's burning up his upgrade budget by running an outdated video card for 15+ years......
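The 15-year claim can be sanity-checked with some rough numbers (the 3 h/day of gaming and both per-kWh rates below are illustrative assumptions, not figures from the thread):

```python
# Long-run electricity cost of gaming on a ~200W video card,
# comparing a cheap US rate with a pricier European one.
def gpu_energy_cost(watts, hours_per_day, years, rate_per_kwh):
    """Total dollars spent powering the card over the given period."""
    kwh = watts * hours_per_day * 365 * years / 1000
    return kwh * rate_per_kwh

us_cost = gpu_energy_cost(200, 3, 15, 0.12)  # assumed US rate: $0.12/kWh
eu_cost = gpu_energy_cost(200, 3, 15, 0.30)  # assumed EU rate: $0.30/kWh
print(f"15-year cost: ~${us_cost:.0f} (US) vs ~${eu_cost:.0f} (Europe)")
```

At European rates the difference alone is in the hundreds of dollars, which is the poster's point about it eating into an upgrade budget.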

[Attached image: electriccost1.gif]
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
PC's 15 years ago had lower TDP parts, the Pentium III was 15W, my dual Pentium Pro was only 35W/ea, many GPU's were ~80-100W, etc. The Power supply on the other hand as it ages it loses efficiency then starts drawing a lot more power than needed. I dunno what Middle's talking about.

Very true.... Power consumption didn't spike until the Pentium 4s showed up. That's when things went nuts.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
I can guarantee you all, Valve and Blizzard aren't going to require a 4790K in the next 30 years. Their games still run just fine on a single-core CPU.

You're completely insane if you actually think this is true. Go back to using your Commodore 64 and leave the PC to the actual normal people. A single core already can't run many games, like GTA 5. You're very misinformed.
 

throwa

Member
Aug 23, 2015
59
0
0
I'm a young whipper-snapper at 27, and the curiosity is killing me, so I have to ask you older guys..... is it true that some of the first Pentium 4s literally melted themselves from running so hot?

I'm afraid the Pentium 4 era was a little before my time; I just barely missed it. My first ever "gaming" PC was a Best Buy store-bought AMD 3200+ machine that I modified by adding a graphics card and swapping out the PSU. That was around 2006 or 2007, when I first got into WoW (good times back then!).
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I'm a young whipper-snapper at 27, and the curiosity is killing me, so I have to ask you older guys..... is it true that some of the first Pentium 4s literally melted themselves from running so hot?

I'm afraid the Pentium 4 era was a little before my time; I just barely missed it. My first ever "gaming" PC was a Best Buy store-bought AMD 3200+ machine that I modified by adding a graphics card and swapping out the PSU. That was around 2006 or 2007, when I first got into WoW (good times back then!).

P4 was still around in 2007.

Early Northwood chips would slowly die of electromigration if given a little too much voltage to get a good overclock. Sudden Northwood Death Syndrome.

I don't recall any melting.

I have a 3.06 Northie with HT still going strong.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Northwood had thermal protection built into the CPU. It would shut down before killing itself due to high temps.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Northwood had thermal protection built into the CPU. It would shut down before killing itself due to high temps.

Right, this. AMD CPUs would die from fan failures or heatsink mis-mounts, though.

Before Prescott, high-end P4s were in the 70-90W range. The industry hadn't really figured out how to cool them yet, though, and it wasn't until a few years later that heatsinks really started to grow and heatpipes became a thing.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Before Prescott, high-end P4s were in the 70-90W range. The industry hadn't really figured out how to cool them yet, though, and it wasn't until a few years later that heatsinks really started to grow and heatpipes became a thing.

Even before Prescott, in the days of Northwood, there were heatsinks with heatpipes, like the outstanding Thermalright XP-94 and SP-97, released in 2003.