If a revolutionary CPU came out today, what would software look like tomorrow?

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Let's imagine Intel shocked the world with the announcement of a brand-new technology. It's cheap, easy to produce, and absolutely ingenious. It's also all theirs, with no plans to share it. Anyway, a new CPU is coming, and the first generation of these chips will have per-core performance that is, now get this, 50X faster than the best available today.
With CPU power being practically unlimited for today's programs and games, what changes would we see in the software and gaming world to make use of it?
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Other than HPC, video editing/encoding, and the like, who would even notice?
 

[DHT]Osiris

Lifer
Dec 15, 2015
14,076
12,172
146
With CPU power being practically unlimited for today's programs and games, what changes would we see in the software and gaming world to make use of it?

Exactly what happened last time CPU power went up 50x over the 'best available': programs would get less efficient and bloat to fill the processing void created. We'd have browsers eating up tens of GHz (or the hypothetical equivalent).
 
Feb 25, 2011
16,788
1,468
126
Somebody would write a compiler for it; there'd be a few months of delay until somebody got Linux running on it, then Windows, then Java, then a bunch of badly ported software. There would be a lot of op-eds about how it could never displace x86.

In 2-3 years, people would actually be familiar enough with the new architecture to write decent software for it, and it would come down to cost/benefit.
 
Feb 25, 2011
16,788
1,468
126
Exactly what happened last time CPU power went up 50x over the 'best available': programs would get less efficient and bloat to fill the processing void created. We'd have browsers eating up tens of GHz (or the hypothetical equivalent).
Also that.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I'd like to think the tech would make its way to GPUs, and transfer rates and RAM would catch up, and the resulting ecosystem would yield the long-awaited photorealistic gaming scenario. Also higher resolutions. Also better physics. Probably a few more "also"s in there somewhere. But you guys say nope, just lazier programmers. I guess I can accept that.
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
Other than HPC, video editing/encoding, and the like, who would even notice?
AI
big data (high-frequency trading, other financial analysis, etc.)
the military
encryption (making and cracking)
natural language processing
server market
weather/seismic modeling
scientists doing various other types of research

etc etc

Your comment is in the league of "640K ought to be enough for anyone," along with the comment implying that the main thing we see over time is bloat rather than added features. Usually, what is called bloat is really the feature of quick development time (which lowers cost and frees resources to add more features), although sometimes it is a feature that benefits others more than the individual user (all the processing required for telemetry; file scanning and indexing, which is arguably more useful for snooping than for user searches; distributed processing for updates; etc.).

Basically, look at what supercomputing is doing and put it on the desktop, something that's been going on since microcomputing started. In addition, take into account the opportunities offered by the small form factors afforded by microprocessors, and the benefits of low-cost deployment options.

If I had the funding and the right programmers, I could create a video game that would use that processor to the max, and not just by making the code inefficient. It's not that hard to think of ways to do that.
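
One hypothetical sketch of what I mean (toy placeholder logic, not from any real game): give the AI a fixed time budget per turn and let it search as deep as the core allows. The same code on a 50X faster core simply plans much further ahead, so the extra power shows up as smarter opponents rather than wasted cycles.

Code:
# Toy sketch with placeholder game logic: a turn-based AI on a fixed per-turn
# time budget. A faster core finishes more iterative-deepening passes in the
# same budget, so raw per-core speed turns directly into deeper planning.
import time
import random

def evaluate(state):
    # Placeholder evaluation; a real game would score the position here.
    return sum(state)

def moves(state):
    # Placeholder move generator: nudge one element up or down.
    return [(i, d) for i in range(len(state)) for d in (-1, 1)]

def apply_move(state, move):
    i, d = move
    child = list(state)
    child[i] += d
    return child

def negamax(state, depth, deadline):
    # Stop at depth 0 or when the per-turn budget is spent.
    if depth == 0 or time.perf_counter() > deadline:
        return evaluate(state)
    return max(-negamax(apply_move(state, m), depth - 1, deadline) for m in moves(state))

def best_move(state, budget_seconds):
    deadline = time.perf_counter() + budget_seconds
    best, depth = None, 1
    while time.perf_counter() < deadline:
        scored = [(negamax(apply_move(state, m), depth, deadline), m) for m in moves(state)]
        best = max(scored)[1]
        depth += 1  # a 50X core gets through ~50X more work, i.e. several plies deeper
    return best, depth - 1

if __name__ == "__main__":
    state = [random.randint(-3, 3) for _ in range(4)]
    move, reached = best_move(state, budget_seconds=0.1)
    print(f"chose {move} after searching {reached} plies deep")

The same pattern applies to physics substeps, pathfinding for more agents, procedural generation, and so on: fix the frame budget and let quality scale with the hardware.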
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
AI
big data (high-frequency trading, other financial analysis, etc.)
the military
encryption (making and cracking)
natural language processing
server market
weather/seismic modeling
scientists doing various other types of research

etc etc

Your comment is in the league of "640K ought to be enough for anyone," along with the comment implying that the main thing we see over time is bloat rather than added features. Usually, what is called bloat is really the feature of quick development time (which lowers cost and frees resources to add more features), although sometimes it is a feature that benefits others more than the individual user (all the processing required for telemetry; file scanning and indexing, which is arguably more useful for snooping than for user searches; distributed processing for updates; etc.).

Basically, look at what supercomputing is doing and put it on the desktop, something that's been going on since microcomputing started. In addition, take into account the opportunities offered by the small form factors afforded by microprocessors, and the benefits of low-cost deployment options.

If I had the funding and the right programmers, I could create a video game that would use that processor to the max, and not just by making the code inefficient. It's not that hard to think of ways to do that.
Yes, I know many folks would see huge benefits from this. My point is that the majority of users at this time wouldn't. Most modern CPUs are already more powerful than most users need.
 

Nothingness

Platinum Member
Jul 3, 2013
2,405
735
136
With more CPU power, CPU designers would design, simulate, and validate even more complex and powerful CPUs :D
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Imagine all the adware, viruses, and bloatware your computer could run without any noticeable lag.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,830
136
With a CPU that much faster than . . . let's say Kaby Lake, we'd be back to the days when there was a huge disconnect between CPU power and bus/RAM speed. Arguably you could say we're there now. It would just get worse. That super-fast CPU would be (at first) wholly inadequate for consumer-level systems, since the necessary interconnects to support even one of them would be prohibitively expensive.
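
Rough back-of-the-envelope, using my own assumed numbers (dual-channel DDR4-2400, one core doing roughly 32 single-precision FLOPs per cycle at ~4 GHz), just to show how lopsided the balance gets:

Code:
# Back-of-the-envelope only; the constants below are my assumptions, not measured specs.
DRAM_GBPS = 2 * 8 * 2400 / 1000   # dual-channel DDR4-2400: 2 ch * 8 B * 2400 MT/s = 38.4 GB/s
CORE_GFLOPS_TODAY = 4.0 * 32      # ~4 GHz * ~32 SP FLOPs/cycle (2x 256-bit FMA) = 128 GFLOP/s

for label, gflops in (("today", CORE_GFLOPS_TODAY), ("50X core", 50 * CORE_GFLOPS_TODAY)):
    print(f"{label}: {gflops:.0f} GFLOP/s vs {DRAM_GBPS:.1f} GB/s "
          f"= {DRAM_GBPS / gflops:.3f} bytes of DRAM bandwidth per FLOP")

One core can already out-compute what dual-channel DDR4 can feed it; multiply the compute by 50 and the memory system would need roughly 50X the bandwidth just to hold today's (already poor) balance.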

In the short term, it would kill GPUs once system builders/integrators could agree on a new memory standard to replace DDR4, or once they just started rolling out quad-channel (or better) setups for consumer-level systems.
 

daxzy

Senior member
Dec 22, 2013
393
77
101
In the desktop space, nothing really dramatic would happen for years, IMO. Desktops have already reached a position where additional CPU power is only marginally beneficial.

In the mobile space, it would be a game changer. 50X the performance of a Goldmont Atom at the same TDP would probably tempt a lot of phone companies to switch to x86. Tablets and laptops would also get a big boost.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Intel will charge an arm and a leg for it even if it is cheap to produce, just because they can!
 

master_shake_

Diamond Member
May 22, 2012
6,430
291
121
No one would care, because they use tablets now for everyday computing.

But put it in an iPad, and woohoo, you've got something.
 

cytg111

Lifer
Mar 17, 2008
23,175
12,837
136
Cyberwarfare domination.
VR/AR 2.0
AI 2.0

But will it get us to Mars? Then who cares :).
 

superstition

Platinum Member
Feb 2, 2008
2,219
221
101
In the desktop space, nothing really dramatic would happen for years, IMO. Desktops have already reached a position where additional CPU power is only marginally beneficial
This is only true because of artificial market bottlenecks:

1) Consoles have weak CPUs, so big developers are focusing on pretty graphics instead of deep AI and such. This isn't new. Remember the Cell processor? It was more powerful than PCs for pretty graphical streaming and worse for things like AI and physics.

2) It's easier, traditionally, to sell games to people with pretty graphics and comparatively vapid gameplay. Games like DOOM are about walking/running around and shooting things. Thrilling.

3) Things like high-frequency trading haven't trickled down to the masses, even though one trader told Samantha Bee that it works just like your e-mail program.

4) Corporations are focused on delivering the least product for the most money, so they haven't been very aggressive about putting demands on CPUs. They want all the people with low-end and midrange CPUs to buy their product. They're more focused on collecting people's files and profiling them, along with expanding subscriptions and microtransactions, than on expanding and deepening the desktop microcomputing software sector. Cloud computing, taken to its extreme, turns a person's PC into a dumb terminal. The Internet seems to be what has stalled CPU gains, in large part.

This is similar to the way Hollywood and television are focusing on recycling rather than new content. Shows like The Middle are blatant rip-offs of previous TV shows, and rather than getting new content we're getting Carrie Fisher 2.0. Homogenization and increased blandness of content are also the name of the game, as with the way Abrams' Star Trek is a bland conflation of Star Trek and Star Wars, and The Middle is much blander than Malcolm in the Middle was. My spouse thinks the Writers' Strike is the cause: most of the good writers changed fields, and few people pursued writing as a result.

Even gaming, particularly Nintendo, has long been like this. Once upon a time it only released a single Metroid game for the NES and only a single one for the SNES. Only a single Zelda game for the SNES, too. But since then Metroid and Zelda have been proliferating all over the place, along with Mario. Square has been pushing Final Fantasy forever and gamers have been more interested in a remake of FFVII (or VI) than in their more recent content. The Terminator franchise is a perfect example of how the remake culture degrades content. Only the first film was any good.

5) With wealth consolidation, which has been the name of the game since wages started falling in the '70s and mergers (corporate consolidation) took over, it's harder and harder for the little guy to make anything creative with a budget beyond indie. When it comes to making software that pushes the CPU envelope, indie is unlikely to get that done. Bigger companies are generally more risk-averse, resulting in bland me-too copycat content.

etc.
 

daxzy

Senior member
Dec 22, 2013
393
77
101
This is only true because of artificial market bottlenecks:

1) Consoles have weak CPUs, so big developers are focusing on pretty graphics instead of deep AI and such. This isn't new. Remember the Cell processor? It was more powerful than PCs for pretty graphical streaming and worse for things like AI and physics.

2) It's easier, traditionally, to sell games to people with pretty graphics and comparatively vapid gameplay. Games like DOOM are about walking/running around and shooting things. Thrilling.

3) Things like high-frequency trading haven't trickled down to the masses, even though one trader told Samantha Bee that it works just like your e-mail program.

4) Corporations are focused on delivering the least product for the most money, so they haven't been very aggressive about putting demands on CPUs. They want all the people with low-end and midrange CPUs to buy their product. They're more focused on collecting people's files and profiling them, along with expanding subscriptions and microtransactions, than on expanding and deepening the desktop microcomputing software sector. Cloud computing, taken to its extreme, turns a person's PC into a dumb terminal. The Internet seems to be what has stalled CPU gains, in large part.

This is similar to the way Hollywood and television are focusing on recycling rather than new content. Shows like The Middle are blatant rip-offs of previous TV shows, and rather than getting new content we're getting Carrie Fisher 2.0. Homogenization and increased blandness of content are also the name of the game, as with the way Abrams' Star Trek is a bland conflation of Star Trek and Star Wars, and The Middle is much blander than Malcolm in the Middle was. My spouse thinks the Writers' Strike is the cause: most of the good writers changed fields, and few people pursued writing as a result.

Even gaming, particularly Nintendo, has long been like this. Once upon a time it only released a single Metroid game for the NES and only a single one for the SNES. Only a single Zelda game for the SNES, too. But since then Metroid and Zelda have been proliferating all over the place, along with Mario. Square has been pushing Final Fantasy forever and gamers have been more interested in a remake of FFVII (or VI) than in their more recent content. The Terminator franchise is a perfect example of how the remake culture degrades content. Only the first film was any good.

5) With wealth consolidation, which has been the name of the game since wages started falling in the '70s and mergers (corporate consolidation) took over, it's harder and harder for the little guy to make anything creative with a budget beyond indie. When it comes to making software that pushes the CPU envelope, indie is unlikely to get that done. Bigger companies are generally more risk-averse, resulting in bland me-too copycat content.

etc.

That was a pretty in-depth response. I'll make some counterpoints (I don't disagree with you completely).

1 and 2. Although consoles are to blame for a lot of the stagnation, they aren't the only thing at fault. Creating a high-fidelity graphical and physical world (with a good story) in a game costs millions of dollars. Look at Star Citizen: it's raised $130M and it isn't even complete yet. It's no wonder a lot of game companies take the easier route and make mainstream games. LoL, DotA 2, CS:GO, and WoT are mainstream games and probably make more in monthly gross than AAA titles do over their lifetimes.

3. From my understanding of HFT, it requires a very low-latency connection to whatever network the trading system is on. That's not something your average (or even technical) person can get. I don't doubt it's rather simple to operate, as the computer does the work and the trader reaps the benefits.

4. Gaming companies are focused on their target audience. If you make a super awesome game that only 1% of your client base can run, you either assume your game is so damn awesome that people will be compelled to upgrade, or you assume they won't buy the game (or will wait years until they have the hardware, by which point the game is only selling for a fourth of the launch price). From historical trends, it's always been the latter.

5. Generally, yes. But now even the big gaming studios have seen the potential for small projects that reap large rewards (as a percentage return). Look at Blizzard's Hearthstone: it literally took a small 15-person team, and I bet it's raked in more money than a lot of AAA titles. Ubisoft had Child of Light (it sold quite well and won a lot of critical praise). EA has Unravel. Microsoft has Ori.

But OTOH, those small-time indie games won't push the CPU or GPU envelope. So we're stuck with the question, 'Why do we need a 50X better desktop to run games like Hearthstone?'

....

I think the technology that would reap the most rewards from an overnight 50X CPU speed increase is augmented reality. Imagine HoloLens, but about the size of a normal pair of glasses. The potential for that is just staggering: it won't just have one killer app, but dozens.