How many cores and threads do you think are too many for a mainstream desktop?



  • 6C/12T: 9 votes (6.9%)
  • 8C/16T: 17 votes (13.1%)
  • 10C/20T: 41 votes (31.5%)
  • 12C/24T: 13 votes (10.0%)
  • 14C/28T: 2 votes (1.5%)
  • 16C/32T: 5 votes (3.8%)
  • 18C/36T: 16 votes (12.3%)
  • 20C/40T: 1 vote (0.8%)
  • 22C/44T: 0 votes (0.0%)
  • 24C/48T and greater: 26 votes (20.0%)

  Total voters: 130

VirtualLarry (No Lifer, joined Aug 25, 2001)
Even with slow USB storage devices that usually don't even reach USB 2.0 speeds?
I don't know what kind of storage you're using (maybe flash drives?), but in my experience, USB 3.0 becoming more-or-less standard on newer computers, as well as on portable storage devices (1-2TB 5400RPM HDDs, mainly), has been a real godsend.

USB 2.0 would meander along at 28-30MB/sec; USB 3.0 will go at 90-100, maybe even 110-120 if the drive is good, for a portable external HDD. That's a 3-4x reduction in the time needed for effective system backups.

USB 3.0 is FAR from a "marketing gimmick", at least for portable external HDDs.
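For a rough sense of what that throughput difference means in practice, here is a quick back-of-the-envelope sketch in Python. The 500GB backup size is an illustrative assumption; the MB/s figures are simply the rough numbers quoted above, not new measurements.

```python
# Back-of-the-envelope backup-time comparison (illustrative numbers only).
BACKUP_SIZE_GB = 500   # assumed size of a full system backup
USB2_MBPS = 30         # rough USB 2.0 throughput to an external HDD (per the post above)
USB3_MBPS = 100        # rough USB 3.0 throughput to a portable 5400RPM HDD

def hours(size_gb: float, rate_mb_per_s: float) -> float:
    """Transfer time in hours for size_gb at rate_mb_per_s (using 1 GB = 1000 MB)."""
    return (size_gb * 1000) / rate_mb_per_s / 3600

print(f"USB 2.0: {hours(BACKUP_SIZE_GB, USB2_MBPS):.1f} h")   # ~4.6 h
print(f"USB 3.0: {hours(BACKUP_SIZE_GB, USB3_MBPS):.1f} h")   # ~1.4 h
```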
Reactions: Thunder 57 (Like)

maddie (Diamond Member, joined Jul 18, 2010)
If your OS is lagging on 2c/4t, that's an OS problem. I'm running Ubuntu on a 2c/4t i7-3520M @ 2.9GHz and the OS does not lag. It takes a while to boot up, but that is due to this laptop having a 5400RPM HDD, not the CPU.

At the risk of sounding overly "Linuxmasterrace," if Windows is lagging for basic web browsing and document editing on 2c/4t, then Windows is too heavy.
Well said.

The average, low-end home user can function very well with Linux on machines that are considered marginal if used with Windows. Just imagine: a free OS also allows the use of cheaper machines.
 

DominionSeraph (Diamond Member, joined Jul 22, 2009)
The only difference between the two is that one writes the result to the screen (GPU mem) while the other writes the result to a file (IO mem). If you're not OK with the quality of the one, you're also not OK with the quality of the other, because it's exactly the same quality (if the same settings are being used).

You're assuming that the hardware decoders are properly implemented and software decode is a mere emulation. This is not the case. Onboard hardware decoders are designed to be as cheap as possible, not to properly process every feature.
 

VirtualLarry (No Lifer, joined Aug 25, 2001)
You're assuming that the hardware decoders are properly implemented and software decode is a mere emulation. This is not the case. Onboard hardware decoders are designed to be as cheap as possible, not to properly process every feature.
I am not assuming that "software decode is a mere emulation". These codecs have mathematical algorithms for how to process the bit-stream; it's in the codec standard. Sure, hardware may take some shortcuts, but... so might the software decoders. I don't believe, a priori, that hardware is somehow automatically "less accurate" than software decode. If anything, I would assume that hardware decoders might be more accurate, to a point, because software decoders don't have unlimited CPU cycles to work with; they have a "latency deadline", otherwise you get frame-skips.

And I disagree with your assertion that hardware decoders "don't properly process every feature". You know this how? You've spoken to hardware codec designers?

It's true that hardware codecs are perhaps "brittle", in the sense that the incoming video stream needs to be standard-compliant and within the defined parameters of the specified profile level of the codec. They aren't as flexible as a software codec in terms of handling streams that are non-compliant. Perhaps that's what you really meant to say? That hardware decoders can't process some streams that more forgiving software decoders can work around? Sure, I'll give you that.

But these things, codecs and profile levels, are written in standards documents. And if a CPU supports "HEVC Main12 profile" in hardware, then sure, I expect it to be supported as well as a software codec is, in terms of features.

Edit: I mean, if you're in the anime community, I know that they use some advanced encoding stuff that's often incompatible with mainstream hardware video decoders. So I can see where you might get the idea that hardware decoding is "missing features", or otherwise a "lesser creature" than certain software decoders or codecs. But I don't really believe that to be true. Take a good, mastered video stream, say a Blu-ray or HD-DVD raw rip (into MKV), and play it back using hardware decoding... to my eyes, it looks amazing, beautiful, and thanks to the hardware decoding support, perfectly fluid.

I don't pretend to understand people who think that video isn't encoded "correctly" unless they run it through 20-50 passes of the CineCraft encoder, or something.

But hey, I still love 'em for their releases.
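One way to test the "same quality" claim rather than argue it is to decode the same file once in software and once with a hardware decoder and compare per-frame checksums. Below is a minimal sketch of that idea using ffmpeg's framemd5 muxer from Python; the file name "input.mkv" and the choice of VAAPI as the hwaccel are assumptions for the example, and it presumes an 8-bit stream (some hardware decoders legitimately differ by a bit or two per sample without being "wrong").

```python
# Compare software vs. hardware decode output of the same file using ffmpeg's
# framemd5 muxer. Identical hash lists => bit-exact decoded frames; differences
# show where the two decode paths diverge. Assumes ffmpeg is on PATH and a
# VAAPI-capable GPU is present ("input.mkv" is a placeholder file name).
import subprocess

def frame_hashes(extra_args):
    cmd = ["ffmpeg", "-v", "error", *extra_args, "-i", "input.mkv",
           "-an",                      # ignore audio
           "-pix_fmt", "yuv420p",      # common pixel format, so hashes compare pixels, not memory layout
           "-f", "framemd5", "-"]      # one MD5 line per decoded frame, to stdout
    out = subprocess.run(cmd, capture_output=True, text=True, check=True).stdout
    return [line for line in out.splitlines() if not line.startswith("#")]

sw = frame_hashes([])                     # software decode (ffmpeg default)
hw = frame_hashes(["-hwaccel", "vaapi"])  # hardware decode, frames copied back to system memory
mismatches = sum(1 for a, b in zip(sw, hw) if a != b)
print(f"{len(sw)} frames compared, {mismatches} differing frame hashes")
```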
 
Last edited:

TheELF (Diamond Member, joined Dec 22, 2012)
I don't know what kind of storage you're using (maybe flash drives?),
Yes, I was replying to someone who was talking about flash drives.
You're assuming that the hardware decoders are properly implemented and software decode is a mere emulation. This is not the case. Onboard hardware decoders are designed to be as cheap as possible, not to properly process every feature.
I was talking about hardware decoding and hardware transcoding, not about software encoding at all.