> Seriously??? They do it in thread after thread.

You and I are not referring to the same "they".
> You and I are not referring to the same "they".

Huh??? Your definition must be a very selective subset of posters then.
> Huh??? Your definition must be a very selective subset of posters then.

You joined in 2018; this was all before your time here. If you were here, there would be no confusion.
@DAPUNISHER Agreed, and I covered most of this in my posts in this thread. I feel the same way about the old i7s; I posted that I think it may be the best investment made to date.
Value aside, both the Phenom II X6 and the FX rev. 2 were a bit too advanced for the mass market at a time when everybody was using Windows XP/7 and games mostly ran okay even on 2c/2t dual cores. We didn't have Windows 10, which is very multi-core aware, let alone the web browsers you are using today. At their release dates, these processors were not good enough, and because of that they were heavily discounted. On my Thuban, for example, I couldn't get Turbo Core to work properly until I upgraded to Windows 8.1. Now I use it occasionally with Windows 10, and the overall experience is 10 times better than it was.
Hardware is nothing without proper software support. Years later, the software has caught up with these processors; oh well, better late than never. The same can be said about those Xeon X5650s people were getting for ~$50 a pop for the LGA1366 socket, though.
IMHO, a Haswell i7 for ~$300 was a far better investment if you look at it that way: lower power, AVX2 support, an iGPU, strong legacy single-threaded (Windows XP era) performance, and still-decent multi-threaded (Windows 10) performance. All of that straight from the release date.
> Hahah, no they don't. Many of the people on these forums who were FX-bashers are gone or completely mum about Intel's problems.

Obviously, I did not mean the same people.
> Obviously, I did not mean the same people.

Well, they seem to be alike: both run hot, are not that fast compared to the competition, and draw a lot of power. So why wouldn't they be bashed just as much? My point is, the 10900K is being bashed just as... enthusiastically (to put it mildly) as the FX ever was. At least it gives much more competitive performance than the FX did.
Funny how just a few years ago, Conroe was considered a masterpiece of engineering while Bulldozer (and Piledriver) was a complete epic fail.
Now more and more people seem to agree that Bulldozer was actually quite a cool concept that was too far ahead of its time and really needed another revision or two (and probably a better fab partner than GloFo). Conroe, in retrospect, was another of Intel's mid-2000s hackjob solutions that just happened to work out really well, thanks to AMD getting too ambitious for their own good with K10.
> I don't think anyone considers Conroe to be a hackjob.

Maybe not so much the original dual-core Conroe itself - though it's worth remembering that it was supposed to be a mobile-only chip until Intel realised that Tejas wasn't going to be even remotely viable - but Core 2 Quad definitely was. It was two laptop CPUs slapped onto the same package, going up against an advanced native quad-core CPU with a shared last-level cache and an on-board memory controller. The fact that the C2Q actually won that match-up proves how badly AMD screwed up the original Phenom.
Actually, the FX was more competitive than the Core i9-10900K is.
As of today, the Core i9-10900K manages higher performance only in gaming, and at a higher price than the competition.
The FX-8350, at the same or a lower price than the Core i5, had higher multi-threaded performance, and later the FX-8350's price dropped to Core i3 levels.
Intel will never drop the price of the Core i9-10900K to compete against cheaper Ryzens, ever.
Back in 2012:
- FX-8350 launch price: $199
- Core i5-3570K launch price: $212
- The FX-8350 was faster in MT loads at a lower price.

Today:
- Core i9-10900K launch price: $488
- Ryzen 9 3900X launch price: $499, current price $431 (Newegg)
- The Core i9-10900K is faster in gaming, slower in multi-threaded loads, at a higher price.
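To put numbers on those match-ups, here is a quick back-of-the-envelope sketch (a minimal illustration in Python; the prices are the launch and current figures quoted above, and the output is plain arithmetic, not a value-for-money verdict):

```python
# Launch prices (USD) as quoted above; purely illustrative arithmetic.
matchups = {
    "2012": {"FX-8350": 199, "Core i5-3570K": 212},
    "2020": {"Core i9-10900K": 488, "Ryzen 9 3900X": 499},
}

for year, prices in matchups.items():
    (name_a, price_a), (name_b, price_b) = prices.items()
    diff = price_b - price_a
    cheaper = name_a if diff > 0 else name_b
    print(f"{year}: {name_a} ${price_a} vs {name_b} ${price_b} "
          f"-> {cheaper} was ${abs(diff)} cheaper at launch")

# At current (Newegg) pricing the 3900X is $431, i.e. $57 below the
# 10900K's launch price, which is where the "higher price" claim comes from.
```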
> If you bought an i7 Sandy or Ivy, it was the best investment maybe to date. Still do the job for most games.

Indeed... I still use my i7-3770K as my daily driver. Never thought that it would last this long.
> But yeah, as I said, in a few years 10900K owners will be in the same spot as FX owners today: something that performs like an entry-level mainstream CPU while using a lot of power and generating a lot of heat.

That's why I stopped buying CPUs with huge TDPs; they are just not economical in the long run, unless you *must* have the fastest. I hated my 140 W Phenom II X4 965 with a vengeance - it had my fans on full speed all the time. Never again. 95 W PL1 is about the highest I can tolerate these days.
> That's why I stopped buying CPUs with huge TDPs; they are just not economical in the long run, unless you *must* have the fastest. I hated my 140 W Phenom II X4 965 with a vengeance - it had my fans on full speed all the time. Never again. 95 W PL1 is about the highest I can tolerate these days.

I think you are exaggerating things a bit. Unless you are running Prime95, there is no reason to have all your fans on "full speed" - that is, unless they were 80mm fans or something. Also, a GPU will throw off a lot more heat than a CPU. Where are you getting this from? "Can't wait for the real evolution of this stuff in 2-4 years. 8C-12C at ~3 GHz each with ~50-80% IPC over Skylake I will gladly take @ 45 W or so." Certainly not from Intel, I hope.
I'd been on a dual-socket P3 1.4-S from 2001 till about 2008. Both of those CPUs together consumed less than modern ones do, so I kinda hated all that TDP increase over the years; a desktop P3's TDP was what a mobile CPU's is now. Can't wait for the real evolution of this stuff in 2-4 years. 8C-12C at ~3 GHz each with ~50-80% IPC over Skylake I will gladly take @ 45 W or so.
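For a rough sense of what that wish adds up to, here is a naive cores x clock x relative-IPC calculation (a sketch only: it assumes perfect multi-threaded scaling, and the 4C/4 GHz Skylake baseline and the +65% IPC midpoint are illustrative numbers, not anything from this thread):

```python
# Naive throughput model: cores * clock * IPC relative to a Skylake core.
# Assumes perfect multi-threaded scaling, which real workloads never reach.
def relative_throughput(cores: int, ghz: float, ipc_vs_skylake: float) -> float:
    return cores * ghz * ipc_vs_skylake

baseline = relative_throughput(4, 4.0, 1.00)   # assumed 4C Skylake @ 4 GHz
wished = relative_throughput(12, 3.0, 1.65)    # 12C @ 3 GHz, +65% IPC (midpoint of 50-80%)

print(f"~{wished / baseline:.1f}x the baseline MT throughput")  # ~3.7x
```

Even under that generous model, the wished-for part works out to roughly 3.7x a quad-core Skylake's multi-threaded throughput, which shows how aggressive a ~45 W target for it would be.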
> I think you are exaggerating things a bit. Unless you are running Prime95, there is no reason to have all your fans on "full speed" - that is, unless they were 80mm fans or something.

The stock fan liked to rev up to 5000 rpm even under regular loads, producing an incredibly annoying high-pitched sound - I think the fan was indeed 80mm. So I upgraded my cooling and got a Noctua tower, and even then the 140mm fan had to run at about 1000 rpm to keep Deneb under the recommended 62°C (yes, that low!) limit. I wasn't comfortable with that, as it was still quite audible, so I ditched that power-hungry chip and got a Thuban: six cores in the same power envelope, which I undervolted to suit my acoustic needs. That chip I loved and still do.
> Also, a GPU will throw off a lot more heat than a CPU. Where are you getting this from?

Indeed, a GPU can consume twice as much, but that's the point: the cooler-running your CPU is, the more power and heat budget can go to your GPU. Ultimately you choose what you think is best for you, balance-wise.
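As a toy illustration of that balance (all figures below are made-up assumptions for the example, not numbers from this thread):

```python
# Toy system power budget: whatever the CPU doesn't burn, the GPU can use.
# Both the 450 W budget and the 75 W "everything else" figure are assumptions.
SYSTEM_BUDGET_W = 450  # what the PSU and case cooling comfortably handle
OTHER_PARTS_W = 75     # motherboard, RAM, drives, fans

for cpu_w in (65, 95, 140):
    gpu_headroom_w = SYSTEM_BUDGET_W - OTHER_PARTS_W - cpu_w
    print(f"{cpu_w:>3} W CPU -> {gpu_headroom_w} W of headroom for the GPU")
```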
> "Can't wait for the real evolution of this stuff in 2-4 years. 8C-12C at ~3 GHz each with ~50-80% IPC over Skylake I will gladly take @ 45 W or so." Certainly not from Intel, I hope.

Maybe AMD with AM5, who knows. They've reduced the deficit in recent years; now we need a huge leap from them to rapidly move things forward.
> I don't think anyone considers Conroe to be a hackjob.

You don't think using their P4 FSB tech for multi-processor setups and simply "gluing" two dual-core dies together, as if it were a dual-processor mobo (yet on-package, in the same physical socket), was a bit of a hack-job, as opposed to AMD actually taking the time to engineer a "true quad-core"? (Too bad about the TLB bug... which really overshadowed the true technological innovations that AMD came up with, as opposed to "glue".)
Sometimes the hack job works better. Had AMD released Deneb as the first K10, then the hack job would've lost.