
How many cores and threads do you think are too many for a mainstream desktop?



  • 6C/12T

    Votes: 9 6.9%
  • 8C/16T

    Votes: 17 13.1%
  • 10C/20T

    Votes: 41 31.5%
  • 12C/24T

    Votes: 13 10.0%
  • 14C/28T

    Votes: 2 1.5%
  • 16C/32T

    Votes: 5 3.8%
  • 18C/36T

    Votes: 16 12.3%
  • 20C/40T

    Votes: 1 0.8%
  • 22C/44T

    Votes: 0 0.0%
  • 24C/48T and greater

    Votes: 26 20.0%

  • Total voters
    130

VirtualLarry

No Lifer
Aug 25, 2001
51,929
6,887
126
Sandy doesn't use much power;
you shouldn't be using hardware decode
Irony much?

The key to using "less power" watching videos online, is "hardware decode". In case you hadn't noticed.

Edit: And while I've heard that online streaming (uploading) is better-quality using software encoding, instead of hardware ENCODING, I've never heard anyone say bad things about software DECODING, when using a format that the hardware is capable of decoding.
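For anyone who wants to check this on their own machine, a rough sketch: time a decode-only ffmpeg pass with and without a hardware-acceleration flag and watch CPU use in a monitor. (Assumes ffmpeg is installed; `sample.mp4` is a placeholder, and the `vaapi` accelerator is Linux-specific — substitute e.g. `dxva2` or `d3d11va` on Windows.)

```python
import subprocess
import time

def decode_cmd(src, hwaccel=None):
    """Build a decode-only ffmpeg command (output discarded via the null muxer)."""
    cmd = ["ffmpeg", "-v", "error"]
    if hwaccel:
        cmd += ["-hwaccel", hwaccel]  # e.g. "vaapi", "dxva2", "d3d11va"
    return cmd + ["-i", src, "-f", "null", "-"]

def time_decode(src, hwaccel=None):
    """Wall-clock time for one decode pass of the whole file."""
    start = time.perf_counter()
    subprocess.run(decode_cmd(src, hwaccel), check=True)
    return time.perf_counter() - start

# Example usage (file name is a placeholder):
# sw = time_decode("sample.mp4")                    # software decode
# hw = time_decode("sample.mp4", hwaccel="vaapi")   # hardware decode
```

On a Sandy-class CPU both passes will finish easily for anything it can decode; the interesting number is the CPU utilization difference, not whether it plays.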
 

Thunder 57

Golden Member
Aug 19, 2007
1,676
1,724
136
Your mouse and keyboard don't need USB 3.x; Sandy doesn't use much power; component failure is along a bathtub curve meaning you're worse off buying new than a proven performing system; you shouldn't be using hardware decode and Sandy has the horsepower to decode anything this side of 8k without issue.
So moms don't share large high quality original photos of a wedding or any number of events, or large video files of their family?
Power management has gotten a lot better over the years, and you have to factor in the whole system.
We're talking about using it through 2024, the hardware will be at least 12 years old at that point. Things will fail.
And VirtualLarry covered the hardware decode part.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
With no USB 3? Using much more power? Used components that are likely to fail much sooner (PSU in particular). An iGPU that doesn't support nearly as many video formats for decode? You have to wonder if the hard drive is trustworthy. I could go on.
I agree systems based on the Sandy Bridge CPUs are rather long in the tooth and could fail at any time.
 
  • Like
Reactions: Thunder 57

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Irony much?

The key to using "less power" watching videos online, is "hardware decode". In case you hadn't noticed.

Edit: And while I've heard that online streaming (uploading) is better-quality using software encoding, instead of hardware ENCODING, I've never heard anyone say bad things about software DECODING, when using a format that the hardware is capable of decoding.
Spend some time in the madVR thread on Doom9. madshi is forever going back and forth with Intel, AMD, and Nvidia over their decode issues. If you care about accuracy you should not be using hardware decode.

So moms don't share large high quality original photos of a wedding or any number of events, or large video files of their family?
Power management has gotten a lot better over the years, and you have to factor in the whole system.
We're talking about using it through 2024, the hardware will be at least 12 years old at that point. Things will fail.
And VirtualLarry covered the hardware decode part.
Why do people continually seem to think that Sandy is the same as a dual-socket LGA 771 server? An HP DL380 G5 will pull ~500W; that doesn't mean Sandy does.




Anandtech's review doesn't have system power for the 2200G, but package power alone is 53W, which is in the same range as the i5 2500 CPU.

Also, in my experience AMD systems last around half as long as Intel ones, so buying a new AMD system will not likely see you more longevity than a used Intel.
I have a stable full of Lynnfield, Sandy, and Ivies all running like new, while my Phenom II's are all junked.
 
Last edited:
  • Like
Reactions: pcp7

VirtualLarry

No Lifer
Aug 25, 2001
51,929
6,887
126
If you care about accuracy you should not be using hardware decode.
I thought that you cared about power consumption; now the goalposts are "accuracy", with mention of MadVR decode (in which case, we'll need to add a dGPU's power-consumption to your i5-2400 numbers...)
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
I thought that you cared about power consumption; now the goalposts are "accuracy", with mention of MadVR decode (in which case, we'll need to add a dGPU's power-consumption to your i5-2400 numbers...)
Thunder 57 brought up hardware decode. I pointed out that you shouldn't be using it in the first place.
It's the same as if someone pointed out that Intel has Quick Sync while AMD doesn't. Who cares when Quick Sync is crap and you should only be using x264/x265.
 

Thunder 57

Golden Member
Aug 19, 2007
1,676
1,724
136
Spend some time in the madVR thread on Doom9. madshi is forever going back and forth with Intel, AMD, and Nvidia over their decode issues. If you care about accuracy you should not be using hardware decode.
Opinion. Most people, especially a mom who doesn't need USB 3, will be fine with hardware decode.


Why do people continually seem to think that Sandy is the same as a dual-socket LGA 771 server? An HP DL380 G5 will pull ~500W; that doesn't mean Sandy does.




Anandtech's review doesn't have system power for the 2200G, but package power alone is 53W, which is in the same range as the i5 2500 CPU.
Under load the difference won't be as much, but at idle, which is where a "mom PC" will spend most of its time, there has been a lot of improvement in power usage.

Also, in my experience AMD systems last around half as long as Intel ones, so buying a new AMD system will not likely see you more longevity than a used Intel.
I have a stable full of Lynnfield, Sandy, and Ivies all running like new, while my Phenom II's are all junked.
Well, either you are very unlucky, or you are making mistakes somewhere. I've never had longevity issues with Intel or AMD. I still see K6-2 systems that work.

Thunder 57 brought up hardware decode. I pointed out that you shouldn't be using it in the first place.
It's the same as if someone pointed out that Intel has Quick Sync while AMD doesn't. Who cares when Quick Sync is crap and you should only be using x264/x265.
Quicksync is more than acceptable in most cases. Keep a high quality source using x264, and use quicksync to transcode for mobile devices or streaming.

You also left out any arguments regarding USB 3 (or USB Type-C) and the age of the hardware. I suppose that is because there really is no argument to be made. Don't let your poor experience with AMD influence others' decisions when they clearly make a fine, long-lasting product.
 
Last edited:

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Opinion. Most people, especially a mom who doesn't need USB 3, will be fine with hardware decode.
The capability isn't relevant. It's not the difference between being able to play a video or not; it's the difference between playing the video at 1% CPU utilization vs 10%.
If we were talking about an Atom D510 vs a Raspberry Pi 3 then the question of hardware decoding would be a little more pertinent. An i5 2500's capabilities render the question moot.


Under load the difference won't be as much, but at idle, which is where a "mom PC" will spend most of its time, there has been a lot of improvement in power usage.
40W.
6 hours a day @ 12¢/kWh, that comes out to $10/yr for a ~$200 reduction in price.
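That figure checks out; the arithmetic (using the 40 W delta, 6 h/day duty cycle, and 12¢/kWh rate assumed above) works out like this:

```python
def annual_cost(watts, hours_per_day, dollars_per_kwh):
    """Yearly electricity cost of a constant extra power draw."""
    kwh_per_year = watts * hours_per_day * 365 / 1000
    return kwh_per_year * dollars_per_kwh

# 40 W extra draw, 6 h/day, $0.12/kWh
print(round(annual_cost(40, 6, 0.12), 2))  # prints 10.51
```

So roughly $10.50/yr, and the payback period on a ~$200 price difference is on the order of 20 years.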



Well, either you are very unlucky, or you are making mistakes somewhere. I've never had longevity issues with Intel or AMD. I still see K6-2 systems that work.
That you've seen one doesn't mean anything statistically. There are almost no X2's out there compared to C2D's, pretty much zero Phenom I's compared to C2Q's, almost no Athlon II's/Phenom II's vs Lynnfields, and almost no FX's vs Sandy, Ivy, and Haswell.
AMD seems to [redacted] before making it to the used market.




No profanity in the tech forums.


esquared
Anandtech Forum Director
 
Last edited by a moderator:

Thunder 57

Golden Member
Aug 19, 2007
1,676
1,724
136
The capability isn't relevant. It's not the difference between able to play a video or not, it's the difference between playing the video at 1% CPU util vs 10%...
It would be more than 10%. And you are proposing a solution that has less capability; sound logic :rolleyes:.

40W.
6 hours a day @ 12¢/kWh, that comes out to $10/yr for a ~$200 reduction in price.
OK, that's assuming energy is cheap. Fair enough. Even then though, you have to factor in the heat output. In some places, it may not matter or even be a benefit. Where I am, efficiency is king because I want as little waste heat as possible. Since you bring up costs, you are still ignoring my other points. $200 now, or $200 later, for a less capable system?

That you've seen one doesn't mean anything statistically. There are almost no X2's out there compared to C2D's, pretty much zero Phenom I's compared to C2Q's, almost no Athlon II's/Phenom II's vs Lynnfields, and almost no FX's vs Sandy, Ivy, and Haswell.
AMD seems to shit the bed before making it to the used market.
You are letting your own bias blind you. Intel has had a much larger market share, so of course there will be fewer of the AMD variants. Also, the FX chips required a dGPU, so of course you will see fewer of them. Don't let facts get in the way of your opinion, though.
 

Thunder 57

Golden Member
Aug 19, 2007
1,676
1,724
136
Sounds like FUD to me.
Right? I guess I've never sold an Athlon XP 2000+ and Epox motherboard (Epox was awesome). Or an Athlon 64 3500+ and Athlon 64 X2 3800+. Or a Phenom II 940. I guess everyone who received those had them crash and burn within 3 months.
 

SPBHM

Diamond Member
Sep 12, 2012
4,998
356
126
A lot fewer AMD CPUs were sold over the past 10 years; that's why they are much rarer on the used market...
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
A lot fewer AMD CPUs were sold over the past 10 years; that's why they are much rarer on the used market...
I'm of the opinion that AMD would have fared far better if they had simply made steady improvements to the Athlon 64 instead of wasting resources on the Bulldozer/Piledriver-based designs.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
It would be more than 10%. And you are proposing a solution that has less capability; Sound logic :rolleyes:.
I don't think you know what the word "capability" means.

OK, that's assuming energy is cheap. Fair enough. Even then though, you have to factor in the heat output. In some places, it may not matter or even be a benefit. Where I am, efficiency is king because I want as little waste heat as possible. Since you bring up costs, you are still ignoring my other points. $200 now, or $200 later, for a less capable system?
As long as you aren't running Faildozer heat isn't an issue. My i7 4790 is at 100% load for days at a time and I can't tell. It's only when I add a couple i7 3770's that the rooms start to warm up.
It would take 20 years to cost you that $200. In 5 years you could upgrade to a 5W NUC for $50, and your i5 2500 would only be worth a couple bucks less than what you bought it for.

Now the question of the hour is, do I REALLY need a 4th i7 3770 system?

 
Last edited:

Thunder 57

Golden Member
Aug 19, 2007
1,676
1,724
136
I don't think you know what the word "capability" means.
Fine. It is less capable. Want to insult me more?

As long as you aren't running Faildozer heat isn't an issue. My i7 4790 is at 100% load for days at a time and I can't tell. It's only when I add a couple i7 3770's that the rooms start to warm up.
It would take 20 years to cost you that $200. In 5 years you could upgrade to a 5W NUC for $50, and your i5 2500 would only be worth a couple bucks less than what you bought it for.
Your hatred of AMD is noted. When did the Bulldozer line come into this? You still fail to answer my questions about USB 3/C and hardware. I'm not even going to bother debating with you anymore. Stop being so abrasive.
 

arandomguy

Senior member
Sep 3, 2013
542
168
116
Regarding "mom boxes": maybe other people have different experiences, but at least in my experience, people who have been exposed to what more streamlined modern form factors look like (including my mom) don't want those typical ATX towers as their "mom boxes", because they are eyesores. And this follows the general trend of the home desktop market declining outside of high-end gaming.

Not to mention the general issue of comparing prices between used and new goods.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,391
31
91
Your hatred of AMD is noted. When did the Bulldozer line come into this?
Bulldozer or the i9 9900k would really be the only recent desktop processors where the heat pouring off of them would be an issue, so you brought them up when you mentioned heat.

You are letting your own bias blind you. Intel has had a much larger market share, so of course there will be fewer of the AMD variants.
Is AMD under 1% market share? Looking at my craigslist, of ~300 PCs, I see two Ryzen gaming PCs that were obviously built to resell, one FX-6300, and one broken X2.
So one actually working used AMD PC.

*wait, there's a third Ryzen PC there. A Cyberpower PC they're trying to sell for the same price it sold at Best Buy in December of 2017. But that's probably a legitimate used PC. So two.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,309
404
126
So moms don't share large high quality original photos of a wedding or any number of events, or large video files of their family?
Do they specifically go out and buy external storage that not only has a USB 3 interface but is actually fast enough to take advantage of speeds above USB 2?
USB 3 is a marketing gimmick right now and will remain one for years, at least for moms.
You really still want it? Get a PCI card; you want better hardware decoding, get a cheap dGPU.
Also the cloud; even moms know about it.

Just saying.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
Do they specifically go out and buy external storage that not only has a USB 3 interface but is actually fast enough to take advantage of speeds above USB 2?
USB 3 is a marketing gimmick right now and will remain one for years, at least for moms.
You really still want it? Get a PCI card; you want better hardware decoding, get a cheap dGPU.
Also the cloud; even moms know about it.

Just saying.
Dude, USB 3 is considerably faster than USB 2 when it comes to doing backups. So it isn't a marketing gimmick now, is it?
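A rough sketch of why that matters for backups (the ~35 MB/s and ~300 MB/s sustained throughputs below are illustrative assumptions for a USB 2 drive and a decent USB 3 flash drive, not spec maxima, and the 100 GB backup size is hypothetical):

```python
def transfer_minutes(gigabytes, mb_per_s):
    """Minutes to move a backup at a given sustained throughput (1 GB = 1000 MB)."""
    return gigabytes * 1000 / mb_per_s / 60

backup_gb = 100  # hypothetical photo/video backup
print(round(transfer_minutes(backup_gb, 35), 1))   # USB 2-ish drive: prints 47.6
print(round(transfer_minutes(backup_gb, 300), 1))  # USB 3-ish drive: prints 5.6
```

Whether a given mom's drive actually sustains USB 3 speeds is a separate question, which is TheELF's point below.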
 

scannall

Golden Member
Jan 1, 2012
1,807
1,302
136
Easy enough to put together a build. If I were getting a computer together for a mom someplace, it wouldn't be with used parts. Moms deserve something better than that, do they not? I added an optical drive, since moms seem to still have a lot of pictures, etc., on optical media. Probably some still have things on floppy disks, but I'm not willing to go that far. But that setup should easily last until 2024.

Basic Mom Box
 

TheELF

Diamond Member
Dec 22, 2012
3,309
404
126
Dude, USB 3 is considerably faster than USB 2 when it comes to doing backups. So it isn't a marketing gimmick now, is it?
Even with slow USB storage devices that usually don't even reach USB 2 speeds?
 

Thunder 57

Golden Member
Aug 19, 2007
1,676
1,724
136
Do they specifically go out and buy external storage that not only has a USB 3 interface but is actually fast enough to take advantage of speeds above USB 2?
USB 3 is a marketing gimmick right now and will remain one for years, at least for moms.
You really still want it? Get a PCI card; you want better hardware decoding, get a cheap dGPU.
Also the cloud; even moms know about it.

Just saying.
I was thinking more like flash drives which have no problem blowing past USB 2 speeds. Also, there is no cheap modern GPU since the iGPU has rendered them obsolete.
 
Jul 24, 2017
93
25
61
2/4 even the OS these days lags
If your OS is lagging on 2c/4t, that's an OS problem. I'm running Ubuntu on a 2c/4t i7-3520M @ 2.9 GHz and the OS does not lag. It takes a while to boot up, but that is due to this laptop having a 5400 RPM HDD, not the CPU.

At the risk of sounding overly "Linuxmasterrace," if Windows is lagging for basic web browsing and document editing on 2c/4t then Windows is too heavy.
 
  • Like
Reactions: maddie

TheELF

Diamond Member
Dec 22, 2012
3,309
404
126
I was thinking more like flash drives which have no problem blowing past USB 2 speeds. Also, there is no cheap modern GPU since the iGPU has rendered them obsolete.
Yeah, but flash drives with those kinds of speeds are very expensive and very unlikely to be bought by moms.
There is no cheap new GPU, but depending on what you need you could get an old one, or even be OK with getting a new one even if it's expensive. It's just an alternative: if you need it you can get it, just like very expensive external storage.
 

TheELF

Diamond Member
Dec 22, 2012
3,309
404
126
Edit: And while I've heard that online streaming (uploading) is better-quality using software encoding, instead of hardware ENCODING, I've never heard anyone say bad things about software DECODING, when using a format that the hardware is capable of decoding.
The only difference between the two is that one writes the result to the screen (GPU memory) while the other writes the result to a file (I/O memory). If you're not OK with the quality of one, you're also not OK with the quality of the other, because it's exactly the same quality (if the same settings are used).
 
