How will CPU processing power be increased when transistors can't be shrunk any further?

Feb 25, 2011
I am currently self-employed in the industry of foreign currency exchange. Designing algorithms that autonomously trade (buy/sell) foreign currencies without human involvement in the process.

Huh. I know somebody that lost their job to you.

In fifteen years, he went from a department of half a dozen people to working alone - manually typing data from a couple of sources into a spreadsheet that automatically generated a report he'd manually email out to clients.

He and his manager were axed the same day. (About a year ago.)
 

Compman55

Golden Member
Feb 14, 2010
If we put a freeze on hardware right now, there's no doubt performance could still be improved on the software side of things. Although I know nothing about programming, I once watched someone clean up the code of a program and rerun it, and there was a huge improvement. This was back in the XP days so maybe it doesn't count, but I'm sure the same holds today. A 2nd-gen i7 with SSDs and 32GB of RAM should never, ever slow down, yet it can. Open a zillion tabs of Flash video, let them play until they end, and leave them open for a week, and it will slow to a crawl... sloppy coding.
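As a toy illustration of the kind of cleanup being described (an invented example, not the actual program from the anecdote above), picking a better data structure alone can turn a quadratic job into a linear one on the exact same hardware:

```python
# Deduplicate a list while preserving order - two ways.

def dedupe_slow(items):
    # "Sloppy" version: membership testing on a list is a linear
    # scan, so the whole loop is O(n^2).
    seen = []
    for x in items:
        if x not in seen:
            seen.append(x)
    return seen

def dedupe_fast(items):
    # Same result, but a set makes each membership test O(1) on
    # average, so the loop is O(n) overall.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

data = [i % 5000 for i in range(20000)]
assert dedupe_slow(data) == dedupe_fast(data)
```

Same machine, same answer, orders of magnitude less work for large inputs - the kind of "tons of increase" a code cleanup can buy without touching the hardware.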
 

Idontcare

Elite Member
Oct 10, 1999
Huh. I know somebody that lost their job to you.

In fifteen years, he went from a department of half a dozen people, to working alone - manually typing data from a couple sources into a spreadsheet, that automatically generated a report that he'd manually email out to clients.

He and his manager were axed the same day. (About a year ago.)

I feel sorry for them :( I am pro-technology, but at the same time I loathe how much technology is turning out to be anti-society.

Everybody has to eat, I hope those folks found their ensuing lifestyles to be fulfilling and happy ones (turmoil aside).

Ironically I got into forex because my R&D job went to Taiwan. And now I am moving to Taiwan because life just works in wondrous and mysterious ways :)

I didn't mean to come off as attacking your message; I was just trying to clarify it a bit for some of the other readers. I admire your knowledge quite a bit - the showing-off part was a bit tongue-in-cheek, and I apologize if it didn't come off that way.

My bad :oops:, I misread your post, no need to apologize for something you didn't intend to do. My apologies for mischaracterizing your post :oops:
 

cytg111

Lifer
Mar 17, 2008
I am currently self-employed in the industry of foreign currency exchange. Designing algorithms that autonomously trade (buy/sell) foreign currencies without human involvement in the process.

- What kind of algos, if you dont mind me asking :)
 

BrightCandle

Diamond Member
Mar 15, 2007
Not all software algorithms can be improved. We make a lot of trade-offs today due to insufficient hardware, most of which people never really see, but as a professional I make countless performance trade-offs every day. Simply put, there isn't enough performance to do a lot of the things we use daily properly, so we do something that is good enough and computationally cheaper. We cheat on angle calculations in games, we reduce the colour quality, we approximate the lighting with a post-processing phase, and we compress things poorly because that's what can be afforded.

The software that exists today is the set of software that can run on today's hardware. By definition, the future software that does not yet run usably on today's machines will not arrive until the computational power to run it also exists - not least because no programmer could test that it worked.

There are so many programs I would like to give users on the desktop that are just plain impractical on today's machines. It's a real shame that anybody today thinks computers are good enough and we are done innovating. With faster hardware, I assure you much more smarts are to come (rather than pushing our computation, and our rights, out into the internet, which is bad for all of us).
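The "cheat on angle calculations" point can be made concrete with a classic trick (a generic illustration, not taken from any particular engine; the name fast_sin is my own): replace a library trig call with a cheap rational approximation whose error is invisible in graphics work.

```python
import math

PI = math.pi

def fast_sin(x):
    """Bhaskara I's approximation of sin(x) on [0, pi].

    Two multiplies and one divide instead of a full trig call;
    the worst-case error is about 0.0016 - far below anything a
    player would notice, which is exactly the kind of trade-off
    games make all the time.
    """
    t = x * (PI - x)
    return 16 * t / (5 * PI * PI - 4 * t)

# The approximation tracks the real function closely across [0, pi]:
for i in range(101):
    x = PI * i / 100
    assert abs(fast_sin(x) - math.sin(x)) < 0.002
```

"Good enough and computationally cheaper", as the post says: you give up the last three decimal places of accuracy to save the expensive part of the computation.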
 

cytg111

Lifer
Mar 17, 2008
But to a degree it is true: if hardware hit a brick wall for a decade, focus would shift towards more efficient code constructs. We have seen this many times already. You could call those enthusiasts who still code for C64 compos hardware-constrained, and what they can do with such limited resources is friggin' amazing. And here we are, running our stuff in Flash, JVMs, CLIs, JS, and browsers...
 

Yuriman

Diamond Member
Jun 25, 2004
I feel sorry for them :( I am pro-technology, but at the same time I loathe how much technology is turning out to be anti-society.

Everybody has to eat, I hope those folks found their ensuing lifestyles to be fulfilling and happy ones (turmoil aside).

That's a whole can of worms in its own right. Our social order is lagging behind the reality of modern economics.
 

Idontcare

Elite Member
Oct 10, 1999
But to a degree it is true: if hardware hit a brick wall for a decade, focus would shift towards more efficient code constructs. We have seen this many times already. You could call those enthusiasts who still code for C64 compos hardware-constrained, and what they can do with such limited resources is friggin' amazing. And here we are, running our stuff in Flash, JVMs, CLIs, JS, and browsers...


Necessity is the mother of invention. When the cheap and easy performance gains are tapped out then the next level of slightly less cheap and slightly less easy performance enablers will be tapped.

I feel, though, that in the area of HPC apps the code is already pretty well squeezed, IMO. There is only so much you can do when preparing generic code that needs to be compiled and run on a whole spectrum of hardware configurations.

- What kind of algos, if you dont mind me asking :)

I don't mind you asking, and I'd be happy to answer to the best of my abilities.

FWIW, I post on a forex forum (forum.mql4.com to be specific) under the username 1005phillip. As you can imagine from my CPU-related threads here, my posts over there are "enthusiast" rated ;) (albeit towards forex rather than CPUs :D)

At any rate, my algos are not HFT (high-frequency trading); I don't build scalper trading strategies. I design trend followers, reversion-to-the-mean arbitrage models, and other channel models.

It is decidedly pedestrian type stuff because the business of foreign currency trade has been rather stymied by the low national interest rates in the major countries since the 2008 banking meltdown. That's all about to change though :)

It is really fun though. I thought process node development was my dream job, but this is definitely more fun. A different brain teaser every day.
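For readers wondering what a "reversion to the mean" model even looks like, here is a deliberately toy sketch (my own illustration - it has nothing to do with the actual algorithms described above): flag a price that has strayed too many standard deviations from its trailing average.

```python
import statistics

def mean_reversion_signal(prices, window=20, z_entry=2.0):
    """Toy reversion-to-the-mean signal.

    Compares the latest price to its trailing moving average,
    measured in standard deviations (a z-score). A large positive
    z suggests "overbought" (sell); a large negative z suggests
    "oversold" (buy). Real models are vastly more involved - this
    only shows the shape of the idea.
    """
    if len(prices) < window:
        return "hold"
    recent = prices[-window:]
    mean = statistics.fmean(recent)
    stdev = statistics.stdev(recent)
    if stdev == 0:
        return "hold"
    z = (prices[-1] - mean) / stdev
    if z > z_entry:
        return "sell"
    if z < -z_entry:
        return "buy"
    return "hold"
```

A price that spikes well above its recent range triggers "sell" (bet on a fall back to the mean), a dip triggers "buy", and anything in between is left alone.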
 

zephyrprime

Diamond Member
Feb 18, 2001
I think FPGAs will be integrated onto chips for custom app-specific optimization. Also, eventually we'll move to graphene and some sort of quantum transistor with lower leakage. An architecture radically different from the von Neumann architecture currently in use will probably be developed at some point. Performance gains will be very slow and the computer revolution will be over after shrinking ends. Eventually, 3D chips will be invented and they will usher in a new era for Moore's law.
 

A5

Diamond Member
Jun 9, 2000
If one were interested in getting involved, would it be better to go with materials science or EE? Or is a background in both equally important?

Either one would work, honestly. But if you really, really want to get deep into this stuff, plan on an MS at the bare minimum. This stuff is all still far enough away that it isn't crammed into anyone's undergrad curriculum at this point. Physics, MSE, or EE would all work as long as you take the right classes. I don't think there are a ton of MSE undergrad programs that focus on semiconductors, FWIW. You'd probably want to start with Physics or EE and tailor your course load towards materials.

I'd take serious issue with Cogman's characterization of EEs - I'm about to get my MSEE and haven't solved an analog circuit since my junior year of undergrad (and honestly I couldn't do anything beyond the most basic circuit right now if you put it in front of me). If you go to a major university, the possibilities within EE alone are so broad that it's ridiculous.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
Yep, in the Tianmu area. We go house hunting in 3 weeks :)

...That is awfully close to where my parents grew up. My advice: look for air conditioning (going outside in the summer is like an instant bath in sweat due to the humidity) and bring a handkerchief or something to wrap around your face for the smog. If you're going around, try to freeze a 2/3-full water bottle ahead of time (top it off!) so you can have something cold to drink. Remember to use a filter!

You eventually get used to it, but it's not pretty in the interim.
 

MisterMac

Senior member
Sep 16, 2011
Necessity is the mother of invention. When the cheap and easy performance gains are tapped out then the next level of slightly less cheap and slightly less easy performance enablers will be tapped.

I feel, though, that in the area of HPC apps the code is already pretty well squeezed, IMO. There is only so much you can do when preparing generic code that needs to be compiled and run on a whole spectrum of hardware configurations.



I don't mind you asking, and I'd be happy to answer to the best of my abilities.

FWIW, I post on a forex forum (forum.mql4.com to be specific) under the username 1005phillip. As you can imagine from my CPU-related threads here, my posts over there are "enthusiast" rated ;) (albeit towards forex rather than CPUs :D)

At any rate, my algos are not HFT (high-frequency trading); I don't build scalper trading strategies. I design trend followers, reversion-to-the-mean arbitrage models, and other channel models.

It is decidedly pedestrian type stuff because the business of foreign currency trade has been rather stymied by the low national interest rates in the major countries since the 2008 banking meltdown. That's all about to change though :)

It is really fun though. I thought process node development was my dream job, but this is definitely more fun. A different brain teaser every day.

OT sorry:

Extremely interesting - and a massive change of fields.

I'm in AdTech - more specifically RTB fields.


Isn't forex/trading in general just Big Data now - pretty much huge databases with machine learning?
Which is what ad-buying is desperately trying to become.

PS: MQL4 must seem so damn simple if you're used to knowing how C becomes ASM :)
 

Cogman

Lifer
Sep 19, 2000
I'd take serious issue with Cogman's characterization of EEs - I'm about to get my MSEE and haven't solved an analog circuit since my junior year of undergrad (and honestly I couldn't do anything beyond the most basic circuit right now if you put it in front of me). If you go to a major university, the possibilities within EE alone are so broad that it's ridiculous.

Are you currently working in the field? There is nothing wrong with starting as an engineering grunt; people don't generally stay at that level. That is just what I observed happen to my classmates who graduated with a BS in EE or CompE.

As for "solving an analog circuit": AFAIK, analog engineers are a much rarer specialty (and they generally get paid better). Analog work is not grunt work. (Though you won't be doing the simple "calculate the current through this RLC circuit" stuff.)
 

Ajay

Lifer
Jan 8, 2001
I'm sure you'll like this slide deck freshly released at IDF Spring :
BJ13_SPCS006_104_ENGf.pdf, downloadable from https://intel.activeevents.com/bj13/scheduler/catalog.do

Thanks for the link!

Very interesting; I wish there were a video of the actual presentation. There is a very brief paper associated with it, which was referred to as "text", but it's too short to be the actual verbal presentation, unless Intel wanted it that way.

I can see why fabs will be sticking to CMOS for a while - there are still major problems that need to be overcome in the new technologies themselves and/or their implementation.
 

tipoo

Senior member
Oct 4, 2012
The fabrication shrink limitation is just for current silicon-based processors, if I'm not mistaken. Other materials hold promise, like graphene, which they may move to if they can shrink silicon no more even with fancy tricks like 3D transistors. And as always, they can add new layers to the memory hierarchy to make things faster, like Haswell's eDRAM.

I think they would start researching that well before the time came, because if you can't shrink, performance gains stall. Architecture rejiggering is a large part of it, sure, but nothing beats die shrinks for making it possible to simply add more logic, cache, etc. to a CPU and make it faster as a result.
 

Idontcare

Elite Member
Oct 10, 1999
OT sorry:

Extremely interesting - and a massive change of fields.

I'm in AdTech - more specifically RTB fields.


Isn't forex/trading in general just Big Data now - pretty much huge databases with machine learning?
Which is what ad-buying is desperately trying to become.

PS: MQL4 must seem so damn simple if you're used to knowing how C becomes ASM :)

Personally I enjoy the R&D in the field of forex more than the actual trading of the currencies in question. There is so much going on in terms of algorithms, pseudo-AI, etc. And then there are the statistics - just understanding what they capture and convey about trading performance.

In a lot of ways it is research and development much the same as what I did in process node development: thinking about the math, applying the same problem-solving techniques I was taught in grad school, rinse and repeat.

Oh, TSMC then!

;) :sneaky:

The fabrication shrink limitation is just for current silicon-based processors, if I'm not mistaken. Other materials hold promise, like graphene, which they may move to if they can shrink silicon no more even with fancy tricks like 3D transistors. And as always, they can add new layers to the memory hierarchy to make things faster, like Haswell's eDRAM.

I think they would start researching that well before the time came, because if you can't shrink, performance gains stall. Architecture rejiggering is a large part of it, sure, but nothing beats die shrinks for making it possible to simply add more logic, cache, etc. to a CPU and make it faster as a result.

The potential of graphene is pretty astonishing, because you can stand the graphene on end, make it vertical, and then pack the sheets laterally like a stack of pancakes laid on its side.

The xtor (transistor) density can get amazingly high because the drain (or source) can be buried under the graphene, with the other residing on top.

Think FinFET with the fin width shrunk to ~0.15nm. Crazy thin.