Question: The FX 8350 revisited. Good time to talk about it because reasons.


Magic Carpet

Diamond Member
Oct 2, 2011
3,477
231
106
@DAPUNISHER

Value aside, both the Phenom II X6 and the FX rev2 were a bit too advanced for the mass market, at a time when everybody was using Windows XP/7 and games mostly ran okay even on dual-core (2c/2t) CPUs. We didn't have Windows 10, which is very multi-core aware, let alone the web browsers you are using today. At release, these processors were not good enough, and because of that, they were heavily discounted. On my Thuban, I couldn't get Turbo Boost to work properly until I upgraded to Windows 8.1, for example. Now I use it occasionally with Windows 10, and the overall experience is ten times better than it was.

Hardware is nothing without proper software support. Years later, the software has caught up with these processors; oh well, better late than never. The same can be said about those X5650s people were getting for ~$50 a pop for LGA1366, though.

2) IMHO, a Haswell i7 for ~$300 was a far better investment if you look at it that way: lower power, AVX2 support, an iGPU, strong legacy single-threaded (Windows XP+) performance, and still-decent multi-threaded (Windows 10) performance. All of that straight from the release date.
 
Last edited:

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,446
20,452
146
@DAPUNISHER

Value aside, both the Phenom II X6 and the FX rev2 were a bit too advanced for the mass market, at a time when everybody was using Windows XP/7 and games mostly ran okay even on dual-core (2c/2t) CPUs. We didn't have Windows 10, which is very multi-core aware, let alone the web browsers you are using today. At release, these processors were not good enough, and because of that, they were heavily discounted. On my Thuban, I couldn't get Turbo Boost to work properly until I upgraded to Windows 8.1, for example. Now I use it occasionally with Windows 10, and the overall experience is ten times better than it was.

Hardware is nothing without proper software support. Years later, the software has caught up with these processors; oh well, better late than never. The same can be said about those X5650s people were getting for ~$50 a pop for LGA1366, though.

2) IMHO, a Haswell i7 for ~$300 was a far better investment if you look at it that way: lower power, AVX2 support, an iGPU, strong legacy single-threaded (Windows XP+) performance, and still-decent multi-threaded (Windows 10) performance. All of that straight from the release date.
Agreed, and I covered most of this in my posts in this thread. I feel the same way about the old i7s; as I posted, I think they may be the best investment I've made to date.
 

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
Hahah, no they don't. Many of the people on these forums who were FX-bashers are gone or completely mum about Intel's problems.
Obviously, I did not mean the same people.
My point is, the 10900K is being bashed just as... enthusiastically (to put it mildly) as the FX ever was. At least it gives much more competitive performance than the FX did.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
Obviously, I did not mean the same people.
My point is, the 10900K is being bashed just as... enthusiastically (to put it mildly) as the FX ever was. At least it gives much more competitive performance than the FX did.
Well, they seem alike: both run hot, aren't that fast compared to the competition, and draw a lot of power. So why wouldn't they be bashed just as much?
 

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
Obviously, I did not mean the same people.
My point is, the 10900K is being bashed just as... enthusiastically (to put it mildly) as the FX ever was. At least it gives much more competitive performance than the FX did.

Payback is a witch. ;)


I've been pleasantly surprised by how well my 8350 has held out. I sort of regretted doing the drop-in upgrade in 2014 from a Phenom II 955. Knock on wood, the Gigabyte 970 board is 9 years old this year, despite its various minor issues (USB 3.0 is slow, and it's cranky or bad with UEFI GPUs). At the time I paid $170 for the 8350 plus $60 for the Noctua cooler that is still on it. Under load it tops out at 54C playing Conan Exiles in 72F/22C ambient.

As much as I loved my 2500K, I've regretted not going with the i7 or trying a drop-in upgrade to a 3770K (in 2014/15 anyway, as I just bought the 4790K).

Also, my idle total system wattage (minus monitor), according to my UPS, is generally 60-70W.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Obviously, I did not mean the same people.
My point is, the 10900K is being bashed just as... enthusiastically (to put it mildly) as the FX ever was. At least it gives much more competitive performance than the FX did.

Actually, the FX was more competitive than the Core i9 10900K is.

As of today, the Core i9 10900K manages higher performance only in gaming, and at a higher price vs the competition.

The FX-8350, at the same or lower price than the Core i5, had higher multi-threaded performance, and later the FX-8350's price dropped to Core i3 levels.
Intel will never drop the price of the Core i9 10900K to compete against cheaper Ryzens, ever.

Back in 2012:
FX-8350 launch price: $199
Core i5 3570K launch price: $212

The FX-8350 was faster in MT loads at a lower price.

Today:
Core i9 10900K launch price: $488
Ryzen 9 3900X launch price: $499, current price: $431 (Newegg)

The Core i9 10900K is faster in gaming, slower in multi-threaded loads, at a higher price.
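
To put that value argument in rough numbers, here is a minimal back-of-envelope sketch. The prices are the ones listed above; the relative MT scores are hypothetical placeholders, not measured benchmark results:

```python
# Back-of-envelope MT value comparison for the prices quoted above.
# Prices come from the post; the relative multi-threaded (MT) scores
# are hypothetical placeholders, NOT measured benchmark data.
def mt_per_dollar(price_usd: float, relative_mt: float) -> float:
    """Relative MT performance per dollar (higher is better)."""
    return relative_mt / price_usd

# 2012: FX-8350 ($199) vs Core i5 3570K ($212), assuming the FX is
# modestly faster in MT loads (placeholder ratio 1.15 vs 1.00).
print(mt_per_dollar(199, 1.15) / mt_per_dollar(212, 1.00))  # ~1.23 -> FX wins on MT value

# 2020: Core i9 10900K ($488 launch) vs Ryzen 9 3900X ($431 current),
# assuming the 3900X is faster in MT (placeholder ratio 1.20 vs 1.00).
print(mt_per_dollar(488, 1.00) / mt_per_dollar(431, 1.20))  # ~0.74 -> 10900K loses on MT value
```

Whatever the exact scores turn out to be, the conclusion depends only on the performance-to-price ratio, which is the point being made here.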
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
Funny how just a few years ago, Conroe was considered a masterpiece of engineering while Bulldozer (and Piledriver) was a complete epic fail.

Now, more and more people seem to be agreeing that Bulldozer was actually quite a cool concept that was too far ahead of its time and really needed another revision or two (and probably a better fab partner than GloFo), while Conroe in retrospect was another of Intel's mid-2000s hackjob solutions that just happened to work out really well thanks to AMD getting too ambitious for their own good with K10.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
Funny how just a few years ago, Conroe was considered a masterpiece of engineering while Bulldozer (and Piledriver) was a complete epic fail.

Now, more and more people seem to be agreeing that Bulldozer was actually quite a cool concept that was too far ahead of its time and really needed another revision or two (and probably a better fab partner than GloFo), while Conroe in retrospect was another of Intel's mid-2000s hackjob solutions that just happened to work out really well thanks to AMD getting too ambitious for their own good with K10.

LOL What a perverse interpretation of either this thread or history itself.
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
I don't think anyone considers Conroe to be a hackjob.
Maybe not so much the original dual-core Conroe itself - though it's worth remembering that it was supposed to be a mobile-only chip until Intel realised that Tejas wasn't going to be even remotely viable - but Core 2 Quad definitely was. It was two laptop CPUs slapped onto the same package, going up against an advanced native quad-core CPU with a shared last-level cache and an on-board memory controller. The fact that the C2Q actually won that match-up proves how badly AMD screwed up the original Phenom.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Actually, the FX was more competitive than the Core i9 10900K is.

As of today, the Core i9 10900K manages higher performance only in gaming, and at a higher price vs the competition.

The FX-8350, at the same or lower price than the Core i5, had higher multi-threaded performance, and later the FX-8350's price dropped to Core i3 levels.
Intel will never drop the price of the Core i9 10900K to compete against cheaper Ryzens, ever.

Back in 2012:
FX-8350 launch price: $199
Core i5 3570K launch price: $212

The FX-8350 was faster in MT loads at a lower price.

Today:
Core i9 10900K launch price: $488
Ryzen 9 3900X launch price: $499, current price: $431 (Newegg)

The Core i9 10900K is faster in gaming, slower in multi-threaded loads, at a higher price.

I tend to agree, but I'll add two points:

1) Don't forget all this started earlier with the FX-8150 vs the 2500K. I'll never forget that day: on the FX-8150's launch date I went running to buy a 2500K, as I was already pissed at AMD about the AM3+ thing making my 770 AM3 board obsolete. The 2500K went out of stock at almost all stores here that day.

2) The FX's ST performance really made it look bad with the software and games in use back in the day; it took a really long time for the FX to become competitive vs Intel CPUs, and when that happened, Haswell was already out. This is not something that applies to the 10900K.

But yeah, as I said, in a few years 10900K owners will be in the same spot as FX owners are today: something that performs like an entry-level mainstream CPU while using a lot of power and generating a lot of heat.
 

DrMrLordX

Lifer
Apr 27, 2000
21,617
10,826
136
It was two laptop CPUs

Ehhhhh, that's revisionist history. If you told me that Yonah was a laptop CPU, I would agree. There was enough differentiation between Core and Banias or Yonah that I would not have qualified it or its younger Yorkfield siblings as laptop CPUs.

slapped onto the same package, going up against an advanced native quad-core CPU with an shared last-level cache and on-board memory controller. The fact that the C2Q actually won that match-up proves how badly AMD screwed up the original Phenom.

C2Q was a beast for its time. Agena, on the other hand, was on a broken process, saddled with the TLB bug and severe clock-speed limitations.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
231
106
But yeah, as I said, in a few years 10900K owners will be in the same spot as FX owners are today: something that performs like an entry-level mainstream CPU while using a lot of power and generating a lot of heat.
That's why I stopped buying CPUs with huge TDPs; they are just not economical in the long run, unless you *must* have the fastest. I hated my Phenom II X4 965 140W CPU with a vengeance; it had my fans on full speed all the time. Never again. A 95W PL1 is about the highest I can tolerate these days.

I'd been on a dual-socket P3 1.4-S setup from 2001 till about 2008. The two of them together consumed less than modern CPUs do. So I kinda hated all that TDP increase over the years; a desktop P3's TDP is what a mobile CPU gets now. Can't wait for the real evolution of this stuff in 2-4 years: 8C-12C at ~3GHz with ~50-80% IPC over Skylake, which I will gladly take @ 45W or so.
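
As a rough sanity check on what that wish would amount to, here is a back-of-envelope sketch. The Skylake baseline (an assumed 4-core at ~4.0GHz, IPC normalized to 1.0) and the simple throughput model are illustrative assumptions, not measurements:

```python
# Crude MT throughput model: cores * clock (GHz) * relative IPC.
# Baseline is an assumed Skylake-era 4-core at ~4.0 GHz, IPC = 1.0.
def relative_throughput(cores: int, ghz: float, ipc: float) -> float:
    return cores * ghz * ipc

baseline  = relative_throughput(4, 4.0, 1.0)    # assumed Skylake 4C
wish_low  = relative_throughput(8, 3.0, 1.5)    # 8C @ ~3 GHz, +50% IPC
wish_high = relative_throughput(12, 3.0, 1.8)   # 12C @ ~3 GHz, +80% IPC

print(wish_low / baseline)    # 2.25x the baseline
print(wish_high / baseline)   # 4.05x the baseline, at a claimed ~45 W
```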
 
Last edited:
  • Like
Reactions: The red spirit

chrisjames61

Senior member
Dec 31, 2013
721
446
136
That's why I stopped buying CPUs with huge TDPs; they are just not economical in the long run, unless you *must* have the fastest. I hated my Phenom II X4 965 140W CPU with a vengeance; it had my fans on full speed all the time. Never again. A 95W PL1 is about the highest I can tolerate these days.

I'd been on a dual-socket P3 1.4-S setup from 2001 till about 2008. The two of them together consumed less than modern CPUs do. So I kinda hated all that TDP increase over the years; a desktop P3's TDP is what a mobile CPU gets now. Can't wait for the real evolution of this stuff in 2-4 years: 8C-12C at ~3GHz with ~50-80% IPC over Skylake, which I will gladly take @ 45W or so.
I think you are exaggerating things a bit. Unless you are running Prime95, there is no reason to have all your fans on "full speed". That is, unless they were 80mm fans or something. Also, a GPU will throw off a lot more heat than a CPU. Where are you getting this from? "Can't wait for the real evolution of this stuff in 2-4 years: 8C-12C at ~3GHz with ~50-80% IPC over Skylake, which I will gladly take @ 45W or so." Certainly not from Intel, I hope.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
231
106
I think you are exaggerating things a bit. Unless you are running Prime95, there is no reason to have all your fans on "full speed". That is, unless they were 80mm fans or something.
The stock fan liked to rev up to 5,000 RPM even under regular loads, producing an incredibly annoying high-pitched sound; I think the fan was indeed 80mm. So I upgraded my cooling to a Noctua tower, and even then the 140mm fan had to run at about 1,000 RPM to keep Deneb under its recommended 62C (yes, that low!) temperature. I wasn't comfortable with that, as it was still quite audible, so I ditched that power-hungry chip and got a Thuban: six cores in the same power envelope, which I undervolted to suit my acoustic needs. That chip I loved, and still do.


Also, a GPU will throw off a lot more heat than a CPU. Where are you getting this from?
Indeed, a GPU can consume twice as much, but that's the point: the cooler your CPU runs, the more power/heat budget can go to your GPU. Ultimately, you choose the balance you think is best for you.

"Can't wait for the real evolution of this stuff in 2-4 years. 8C-12C ~3Ghz each with ~50-80% IPC over Skylake I will gladely take @ 45w or so." Certainly not from Intel I hope.
Maybe AMD with AM5, who knows. They've made up the deficit in recent years; now we need a huge leap from them to rapidly move things forward.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,326
10,034
126
I don't think anyone considers Conroe to be a hackjob.
You don't think using their P4 FSB tech for multi-processor setups and simply "gluing" 2x dual-core dies together, as if it were a dual-processor mobo (yet on-package, in the same physical socket), was a bit of a hack-job, as opposed to AMD actually taking the time to engineer a "true quad-core"? (Too bad about the TLB bug... which really overshadowed the true technological innovations that AMD came up with, as opposed to "glue".)
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
You don't think using their P4 FSB tech for multi-processor setups and simply "gluing" 2x dual-core dies together, as if it were a dual-processor mobo (yet on-package, in the same physical socket), was a bit of a hack-job, as opposed to AMD actually taking the time to engineer a "true quad-core"? (Too bad about the TLB bug... which really overshadowed the true technological innovations that AMD came up with, as opposed to "glue".)

Sometimes the hack job works better. Had AMD released Deneb as the first K10, the hack job would've lost.
 

DrMrLordX

Lifer
Apr 27, 2000
21,617
10,826
136
Sometimes the hack job works better. Had AMD released Deneb as the first K10, the hack job would've lost.

Pretty much. Plus, if you look at the core and cache of Conroe and Kentsfield, they're quite sophisticated. Yes, the FSB and external memory controller did hold the platform back overall; not that it mattered.