Are CPUs advancing faster than software requires?

lamedude

Golden Member
Jan 14, 2011
1,230
68
91
Going by MS's requirements you can run Win3.0 on an 8086 and Win95 on a 386DX. But then the gigahertz race got started: the fastest CPU would be slower than the cheapest CPU released a year later.
Fast forward to the era of moar cores. The cheap Dell is a 1.8GHz Celeron. We've made little progress on the low end in the past decade, and if you include Atom we've gone backwards.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
None of this works the way many of the comments imply. As a software developer, what you do is look at the hardware currently available, look at what is expected to become available around your release date, and write software that limits its capabilities to the hardware your users are likely to have.

So if Intel/AMD are going to give you about 60 giga-ops across a maximum of 6 cores, then that is what you target. There is no point writing software that needs a 600 giga-op CPU, because the hardware isn't available and thus your software won't sell. Your features are therefore driven entirely by the capability of the available hardware, along with the team's ability to deal with the complexity of the software being written. Developers aren't so much lazy as they are overwhelmed by the sheer complexity of what they do. Most products we use today are millions of lines of code; finding the exact line you're looking for is like trying to find the right line in a Word document that is 40,000 pages long.

Software today trails hardware for this very reason. Microsoft, just like every other software company, is producing products within the limits of what they are given by the hardware manufacturers. They would love to give us touch interfaces, voice recognition, holographic displays and all sorts of other goodies no one has imagined yet, but the computers just aren't quick enough.

There is actually an interesting trend in the industry to use significantly more servers than ever before to solve problems that desktops simply cannot handle. A lot of software has moved to the web not because it's better there, but because it lets a company deliver a solution that no user's computer could run quickly enough; split across 100 or more servers, it's perfectly usable. Google is a great example of a company that does exactly that for your searches, each of which hits thousands of boxes and searches hundreds of gigabytes of data, yet returns in well under a second.
 

pantsaregood

Senior member
Feb 13, 2011
993
37
91
Win8 is NT 6.2; Win7 is NT 6.1.

Hence: wait for the real McCoy, NT 7.0.

Win8 is also downgraded in terms of functionality.

1.5GB ain't fun if you work a little and/or browse. Just because you can boot the OS doesn't mean it runs OK. It's far from OK.

And I think you're rewriting history. XP SP0 ran very well on a Pentium MMX.

Windows 8 isn't a downgrade in functionality - it can do everything Windows 7 can. As for XP on a Pentium MMX, it could run, but leaving the themes service running (among other things) would make it a painful experience.

Vista and 7 don't need anywhere near 2 GB of RAM to run smoothly. The OS itself uses under 512 MB of RAM. At 1.5 GB, you'd have plenty of space to work with.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Ignoring the off-topic discussion above...

No, CPUs are not advancing faster than software requires. Until every task that I'd ever want to run is completed in such a short amount of time that I cannot perceive it, a CPU will not be fast enough. Besides, HPC will always need faster processing power.

This. Although until SSDs came along the bottleneck was disk performance; fast SSDs have sort of put the bottleneck back on the CPU.
 

N4g4rok

Senior member
Sep 21, 2011
285
0
0
It might depend on your definition of faster. Higher IPC in a CPU doesn't require software changes to take advantage of, whereas high-thread-count CPUs require software written to accommodate them.

When developing software, the goal is to make it usable by everyone, but allow newer hardware to expand on performance. If we're talking strictly IPC, then no, software isn't behind, because we will see performance benefits with current software as IPC improves. If we look at more efficient multi-threaded processors, then yes, most software is behind, and may remain so until pre-Nehalem-era CPUs are almost phased out.
 

CPUarchitect

Senior member
Jun 7, 2011
223
0
0
Your features are therefore driven entirely by the capability of the available hardware, along with the team's ability to deal with the complexity of the software being written. Developers aren't so much lazy as they are overwhelmed by the sheer complexity of what they do.
I couldn't agree more. And this is why I'm excited about AVX2. It brings the GPU's throughput computing technology straight into the CPU cores, without requiring you to rewrite anything the way GPGPU does. Some code loops will even run up to eight times faster, with no developer effort beyond recompiling.
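
To make the "recompile and it gets faster" point concrete, here is a minimal sketch in C of the kind of loop a vectorizing compiler can map onto 256-bit AVX2/FMA registers, processing eight 32-bit floats per iteration instead of one. The function name and build flags are purely illustrative.

```c
/* Plain scalar loop that a compiler can auto-vectorize for AVX2/FMA.
 * Illustrative build (GCC/Clang): gcc -O3 -mavx2 -mfma -c saxpy.c
 */
#include <stddef.h>

void saxpy(float *restrict y, const float *restrict x, float a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];   /* candidate for 8-wide vectorization */
}
```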

Haswell's TSX technology is also mainly intended to make the developer's life easier. We've seen great increases in theoretical performance from multi-core in recent years, but it's deceptively hard to make use of that extra performance. TSX should make doing so a lot more attractive.
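
For the sake of concreteness, below is a minimal sketch of the lock-elision pattern TSX enables, using the RTM intrinsics and assuming GCC or Clang with -mrtm on a TSX-capable CPU; the counter and lock names are illustrative, not taken from any real code base. The transaction reads the fallback lock word so it aborts cleanly if another thread actually holds the lock.

```c
/* Sketch of lock elision with Intel RTM (the instruction-set part of TSX). */
#include <immintrin.h>
#include <stdatomic.h>

static long counter;
static atomic_int lock_word;            /* 0 = free, 1 = held */

static void locked_increment(void)
{
    while (atomic_exchange(&lock_word, 1))  /* plain spinlock fallback */
        ;
    counter++;
    atomic_store(&lock_word, 0);
}

void increment(void)
{
    if (_xbegin() == _XBEGIN_STARTED) {
        if (atomic_load(&lock_word))    /* someone holds the real lock */
            _xabort(0xff);              /* abort, take the fallback path */
        counter++;                      /* runs transactionally */
        _xend();
    } else {
        locked_increment();             /* aborted: use the conventional lock */
    }
}
```

The point is that the fast path needs no fine-grained locking at all; the hardware detects conflicts and the code falls back to a real lock only when it has to.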

So I believe we're on the verge of a revolution in the CPU's practical computing power. Of course it will take several more years for this technology to become ubiquitous, but the ROI is too high to ignore.
There is actually an interesting trend in the industry to use significantly more servers than ever before to solve problems that desktops simply cannot handle. A lot of software has moved to the web not because it's better there, but because it lets a company deliver a solution that no user's computer could run quickly enough; split across 100 or more servers, it's perfectly usable. Google is a great example of a company that does exactly that for your searches, each of which hits thousands of boxes and searches hundreds of gigabytes of data, yet returns in well under a second.
While there is definitely a movement toward more connectivity, I don't think computing itself is moving to the web. Even a Google search doesn't use a whole lot of computing power per user; it just needs many systems to service many users. So there's still a lot of demand for more computing power in the hands of the user. Literally. Even mobile devices are in a fierce race for higher performance. I don't see this changing any time soon, and throughput computing technology and hardware transactional memory add new fuel to it, letting developers create new user experiences.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Windows 8 isn't a downgrade in functionality - it can do everything Windows 7 can. As for XP on a Pentium MMX, it could run, but leaving the themes service running (among other things) would make it a painful experience.

Vista and 7 don't need anywhere near 2 GB of RAM to run smoothly. The OS itself uses under 512 MB of RAM. At 1.5 GB, you'd have plenty of space to work with.

512MB? Lol....

Anyway:
http://forums.anandtech.com/showpost.php?p=33518374&postcount=42

Not to mention the DPC:
http://forums.anandtech.com/showthread.php?t=2250925
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
None of this works the way many of the comments imply. As a software developer, what you do is look at the hardware currently available, look at what is expected to become available around your release date, and write software that limits its capabilities to the hardware your users are likely to have.

So if Intel/AMD are going to give you about 60 giga-ops across a maximum of 6 cores, then that is what you target. There is no point writing software that needs a 600 giga-op CPU, because the hardware isn't available and thus your software won't sell. Your features are therefore driven entirely by the capability of the available hardware, along with the team's ability to deal with the complexity of the software being written.

Um, tell that to the people who developed TES: Oblivion and Crysis. Both games brought the absolute top-end systems of their day to a crawl. It wasn't until 1-2 years after each was released that mid-range systems could comfortably play them (meaning they passed into the realm of mainstream, appealing to the 'masses' of PC gamers).
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Just to point out...

some software has already reached its "maximum" utility... I mean, what else can be done with Excel? Or Word?

Ray-tracing letters and numbers is just silly :p
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I think tablets/smart phones have shown that most people are happy with 'good enough'. There was a time when a processor upgrade made a HUGE difference in how your computer ran software. These days I bet the average person wouldn't be able to tell the difference between an i3 and a top of the line i7 in a given system for what most people do.

I kind of miss the days where hardware upgrades really felt substantial. I remember going from a 700MHz Duron (bumped to 805MHz) to a 1.8GHz Duron (ran around 2GHz) to a socket 939 system with an A64 3000+ that I ran at ~2.2GHz. Each upgrade made a real difference.
 

BenchPress

Senior member
Nov 8, 2011
392
0
0
I think tablets/smart phones have shown that most people are happy with 'good enough'.
I have to strongly disagree. They're only "good enough" for their form factor, and for a short period of time. Nobody expects to be playing Crysis 3 on a tablet any time soon. But if one manufacturer released such a tablet, while still achieving good battery life and not costing a kidney, it would sell better than the competition's tablets.

Don't forget that we actually get used to higher performance really quickly. If you've used an iPhone 4S for a while, and then for some reason have to switch back to an iPhone 4, it feels noticeably slower. So it's only what you've experienced before that determines what is "good enough".

Mobile chip manufacturers are pushing the limits really hard, because speed sells. The CPU rating is often the first thing you see below the picture of a mobile phone in an advertisement. It would be unacceptable to ship a single core or less than 1 GHz, even though that was "good enough" just a year ago!

We also typically replace our phones more often than our desktop PCs, which means that while each generation of phone is only slightly faster than the previous one, over the course of several years they've become way faster!

And as mentioned above, the desktop/laptop market is temporarily stagnating because it's hard to easily extract all the performance offered by modern CPUs. This will change dramatically with Haswell, which has an instruction set suitable for SPMD (read: easy vectorization), and TSX for efficient multi-core scaling. Once applications make use of these capabilities, you'll want to upgrade to keep up.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Nobody expects to be playing Crysis 3 on a tablet any time soon. But if one manufacturer released such a tablet, while still achieving good battery life and not costing a kidney, it would sell better than the competition's tablets.

Just try Wolfenstein 3D on an iPad to see what an exercise in futility any task that requires more than the simplest of inputs becomes.

I always snicker to myself at people who carry an iPad plus a case with a keyboard. My inner monologue snidely remarks, "Oh look, now you've made an underpowered laptop."
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
CPUs aren't at all overpowered, especially in IPC or per-core performance. There isn't a CPU available that can keep BF3 at over 60fps in all circumstances unless HIGHLY overclocked, and even then it's unlikely.
I know there are people who will claim minimums in the 80s, but after all the mess I've gone through I must say I don't believe you. We need faster CPUs.
 

Eeqmcsq

Senior member
Jan 6, 2009
407
1
0
Why, though? Why does hardware stay effective for so long now?

I think that's because there's only so much processing power an OS kernel and UI need to get their tasks done. In fact, the OS and UI SHOULD be lightweight enough that the CPU's processing power can be used for whatever apps you want to run.

It's similar to storage space and RAM: the OS only needs so much storage space and RAM to install. And even though 2TB HDDs and 16GB of RAM are now easily available, the OS should not hog 1TB of disk and 8GB of RAM just to install and run.

So hardware can stay effective for so long because the needs of an OS haven't grown along with processing power.
 
Last edited:

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Windows 8 isn't a downgrade in functionality - it can do everything Windows 7 can. As for XP on a Pentium MMX, it could run, but leaving the themes service running (among other things) would make it a painful experience.

Vista and 7 don't need anywhere near 2 GB of RAM to run smoothly. The OS itself uses under 512 MB of RAM. At 1.5 GB, you'd have plenty of space to work with.

Right... I'd like to see you run a Vista box with 512MB of RAM as your primary machine and see how long before you start developing suicidal thoughts.
 

happysmiles

Senior member
May 1, 2012
340
0
0
The CPU is fast enough for most things, but with lots of multitasking and rendering it could still be better.

I think that's partly why Ultrabooks and touch screens are coming along to generate interest, since affordable hardware can do most of what people need.

Personally, I'm glad that Win 8 has been made with mobile in mind, because it runs better than previous OSes and gives longer life to my older hardware.
 
Last edited:

bononos

Diamond Member
Aug 21, 2011
3,939
190
106
Easy. Your OS was full of bloatware. Install a fresh copy of Vista and it would be just as quick.
That's not true at all.

Windows 8 isn't a downgrade in functionality - it can do everything Windows 7 can. As for XP on a Pentium MMX, it could run, but leaving the themes service running (among other things) would make it a painful experience.

Vista and 7 don't need anywhere near 2 GB of RAM to run smoothly. The OS itself uses under 512 MB of RAM. At 1.5 GB, you'd have plenty of space to work with.
Only 512MB? That doesn't sound right. I'm certain it's 800MB+, and it literally crawls with 1GB of RAM. I made a thread about this in the OS subforum. I also made sure Vista was patched to SP2; it was slow before and just as slow after.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I think tablets/smart phones have shown that most people are happy with 'good enough'. There was a time when a processor upgrade made a HUGE difference in how your computer ran software. These days I bet the average person wouldn't be able to tell the difference between an i3 and a top of the line i7 in a given system for what most people do.

I kind of miss the days where hardware upgrades really felt substantial. I remember going from a 700MHz Duron (bumped to 805MHz) to a 1.8GHz Duron (ran around 2GHz) to a socket 939 system with an A64 3000+ that I ran at ~2.2GHz. Each upgrade made a real difference.

You get that same feeling upgrading a phone now, though. I replaced my Droid X with a Bionic, and the difference between the two is staggering. The Bionic isn't even anything cutting edge (it was free from Verizon a few weeks ago), but the dual-core OMAP processor is way faster than the single-core chip I had in the Droid X.

Contrast that to my desktop with an i7 920... I could probably get higher scores in benchmarks if I were to upgrade to a Sandy/Ivy Bridge CPU, but would I really notice a difference?
 

nenforcer

Golden Member
Aug 26, 2008
1,779
20
81
Don't forget the proportion of users who actually own a quad-core or greater CPU is very small, despite the fact that they have been available for several years now.

The majority of PC users are still on single- and dual-core CPUs.

This is why major game developers like Blizzard still only optimize for dual-core CPUs, and games like GTA IV and Battlefield 3 multiplayer are the only ones which truly require or can take advantage of a quad core.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Don't forget the proportion of users who actually own a quad-core or greater CPU is very small, despite the fact that they have been available for several years now.

The majority of PC users are still on single- and dual-core CPUs.

This is why major game developers like Blizzard still only optimize for dual-core CPUs, and games like GTA IV and Battlefield 3 multiplayer are the only ones which truly require or can take advantage of a quad core.

The issue is entirely different. You can't just spin up x threads because you want to. Plus, if there is no performance gain, or the development time and cost outweigh the performance gain, you don't do it.
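
To put a rough number on the "no performance gain" point, Amdahl's law is the usual back-of-the-envelope check; the 70% parallel fraction below is an assumed figure chosen purely for illustration.

```c
/* Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction
 * of the work that can run in parallel and n is the core count.
 * With p = 0.70, going from 4 to 8 cores adds only ~23% - the kind of
 * result that makes extra threading work hard to justify.
 */
#include <stdio.h>

static double amdahl(double p, int n)
{
    return 1.0 / ((1.0 - p) + p / n);
}

int main(void)
{
    const double p = 0.70;              /* assumed parallel fraction */
    for (int cores = 1; cores <= 8; cores *= 2)
        printf("%d cores: %.2fx speedup\n", cores, amdahl(p, cores));
    return 0;
}
```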
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Don't forget the proportion of users who actually own a quad-core or greater CPU is very small, despite the fact that they have been available for several years now.

The majority of PC users are still on single- and dual-core CPUs.

This is why major game developers like Blizzard still only optimize for dual-core CPUs, and games like GTA IV and Battlefield 3 multiplayer are the only ones which truly require or can take advantage of a quad core.

I'm sure there is a more recently published version of the following graph, but I'll be damned if I can find one online at the moment. Does anyone know of an updated one?

[Image: kaigai2_small.jpg]
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
I think Steam's Hardware Survey is an interesting set of statistics.
Of course the average Steam customer is not like the average PC user.
But this data says a lot about the PC users who do care about the performance of their systems.

The latest survey is from May 2012.
I am always amazed that gamers have much better hardware than people would guess.

http://store.steampowered.com/hwsurvey

1 core - 7.3%
2 cores - 50.5%
4 cores - 37.8%
6 cores - 2.5%

Another noticeable stat is RAM:
1 GB - 4.8%
2 GB - 16.8%
3 GB - 22.7%
4 GB - 23.6%
5 GB or more - 30.7%

Combine that with the version of OS running.
64-bit Windows - 60%
32-bit Windows - 35% (the rest is Mac OS)

It looks to me like game developers can start making 64-bit applications that use more than 4GB of RAM. There is a market.
 
Last edited:

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
I'm sure there is a more recently published version of the following graph, but I'll be damned if I can find one online at the moment. Does anyone know of an updated one?

[Image: kaigai2_small.jpg]

That's an interesting graph, although I'm surprised they didn't adjust the forecast based on the actual data. Did they really expect single-core market share to grow, contrary to the preceding quarters?
 

PaGe42

Junior Member
Jun 20, 2012
13
0
0
Going back to the original question: "Are CPUs advancing faster than software requires?", the answer is basically: Yes!

CPUs are advancing on an exponential scale (a doubling every two years), whereas software development is linear (adding x million lines of code every year).

And yes, I know the exponential rate is slowing down (so it is not purely exponential anymore), and that software has historically been advancing faster.

The latter was due to various factors: better tools allowing a programmer to be more productive, better project management allowing for larger teams, better hardware allowing programmers to focus more on functionality instead of tuning, etc. But this is now reaching its ceiling, if it hasn't been reached already. Besides, software is not purely additive: a lot of old code gets replaced, reducing the linearity.

So now we have hardware advancing at (almost) exponential speed and software getting along at (basically) linear speed. Guess who gets there first?
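
As a toy illustration of that closing argument, the snippet below compares the two growth curves; the starting values and increments are arbitrary and only meant to show how quickly the exponential term pulls away from the linear one.

```c
/* Toy comparison: hardware capability doubling every two years versus
 * software demand growing by a fixed amount per year. Units are arbitrary.
 */
#include <math.h>
#include <stdio.h>

int main(void)
{
    for (int year = 0; year <= 10; year += 2) {
        double hardware = pow(2.0, year / 2.0); /* doubles every two years */
        double software = 1.0 + 0.5 * year;     /* fixed linear growth */
        printf("year %2d: hardware %5.1fx  software %4.1fx\n",
               year, hardware, software);
    }
    return 0;
}
```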