It's all Apple's Fault.

MisterMac

Senior member
Sep 16, 2011
777
0
0
(Rant/whine: apologies in advance before the locking.)

...Well it is - isn't it?

If that stupid touch phone and that stupid tablet hadn't come out - neither Intel nor GF nor TSMC would be tuning their processes for low-leakage CMOS designs.

There'd be a push for getting more absolute performance within the power envelopes - we wouldn't be fantasizing about sexy smartphones and on-the-fly tablets.

(I'm not saying these inventions aren't useful, mind you.)

But still - now we're in a slump where a 10% performance increase over 2 years (looks like it for both CPU and GPU, judging by the R2xx :C) is AAAA-OKAY.

What the hell.


Empowering the top end also lifts the bottom end.
There are probably millions of slow corporate 5400 RPM XP machines around that could have had another upgrade cycle were it not for tablets and expensive-as-<edited for language> phones.

No profanity in the tech forums
Markfw900
Anandtech Moderator


They started the movement - they started the focus on flashy and design-centric rather than performant and useful.

Remember Nokia brick phones? 12 days of active time? No problem!
Yes, they were simple - but they did the job.


This is Apple's fault for starting a movement that hurts the absolute performance-per-dollar aficionados.

....now if you'll excuse me my better half is calling me on my iPhone -_-.
 
Last edited by a moderator:

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Somehow people never remember that Intel was promoting Atom for MIDs before iPhone came out. A lot of players were moving the industry in this direction (Nokia phones were no doubt becoming more and more smart, as well as their internet devices like N770, then there was the evolution of PocketPCs that were also starting to merge with phones, and Android wasn't merely a reaction to Apple). Apple gets way too much credit - phones and tablets may not have been quite like they are today but there certainly would have still been a huge drive for smarter mobile devices. People who think that ARM had no relevance before iPhone are dead wrong too - it was already in most dumb/feature phones and steadily getting stronger in them. Not to mention it was in the Nintendo DS (and GBA).

Desktop processors haven't stagnated because of the emphasis on mobile, they've stagnated because exponential growth isn't sustainable in anything. They've hit very real frequency and power walls. People need to let it go, the golden age is over and nothing was going to stop that.
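A quick back-of-the-envelope sketch of why (the numbers here are made up purely for illustration, not any real chip's figures): dynamic power goes roughly as C*V^2*f, and higher clocks generally need higher voltage, so chasing frequency blows the power budget far faster than it buys performance.

Code:
# Illustrative only - rough dynamic-power scaling, P ~ C * V^2 * f.
# Numbers are invented to show the trend, not real chip figures.
def dynamic_power(cap, volts, freq_ghz):
    return cap * volts ** 2 * freq_ghz  # arbitrary units

base   = dynamic_power(1.0, 1.0, 3.0)   # a ~3 GHz part at ~1.0 V
pushed = dynamic_power(1.0, 1.3, 5.0)   # a ~5 GHz target needing ~1.3 V
print(round(pushed / base, 1))          # ~2.8x the power for ~1.7x the clock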
 
Last edited:

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
CPUs for desktops are, unfortunately, fast enough. The focus is now on IGPs, which are getting impressive. BTW, laptops are seeing some pretty nice exponential gains, as are desktop GPUs.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Well, I would think the increased purchase of flash memory for phones also drives down SSD prices overall? That's good, at least.

As for iPhone/Android (and WP, boo), that was inevitable I suppose. There are a good number of people who really need mobile communication centers; most of those people already had BlackBerrys. I know those people well in my line of work through supporting them, and the tiny screen of the BB was due for an overhaul. Going to a larger (touch) screen with higher resolution required more horsepower, hence the new focus on mobile.

And then there's the bulk of the customer base for these phones: me-too vanity customers who just like the easy text, social media, and entertainment features. They can be incredibly annoying, but they do make these companies a lot of money. My wife's younger sister is one of those people. She has a Droid something-or-other that she can't afford ($90/mo through one of her family members' plans, and she often can't pay the full $90 she owes). All she uses the phone for is Facebook and texting, 23 hours per day. And she wants to upgrade to ANOTHER phone early.

I told her: you can get a great phone for cash if you save a tiny bit of money, then get a prepaid wireless service for $35/mo, unlimited everything, if you want. That 'free' upgrade or whatever is a pure scam, because all the fees and BS that come along with dealing with AT&T/Sprint/etc. cost you MORE per year than if you just bought the phone for cash and went prepaid. But she doesn't listen because she's borderline retarded. Hence, she will get rid of a perfectly fine phone to replace it with another overpowered device that sucks batteries like mad, and that's more $$ in the bank for AT&T and the associated makers of those devices.

It's kind of like SUVs: probably 5% of smartphone users actually 'need' the device for work purposes, and probably 90% of the remaining users are wasteful tards who could save tremendous money by not buying vanity devices. Only text and make phone calls? Get a flip phone + prepaid. Only use Facebook and light Angry Birds-type gaming with some YouTube in there? Get a $149 Android or a used iPhone 4 + prepaid. But no, people love buying crap they don't need, and paying out the ass for it.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
Somehow people never remember that Intel was promoting Atom for MIDs before iPhone came out. A lot of players were moving the industry in this direction (Nokia phones were no doubt becoming more and more smart, as well as their internet devices like N770, then there was the evolution of PocketPCs that were also starting to merge with phones, and Android wasn't merely a reaction to Apple). Apple gets way too much credit - phones and tablets may not have been quite like they are today but there certainly would have still been a huge drive for smarter mobile devices. People who think that ARM had no relevance before iPhone are dead wrong too - it was already in most dumb/feature phones and steadily getting stronger in them. Not to mention it was in the Nintendo DS (and GBA).

Desktop processors haven't stagnated because of the emphasis on mobile, they've stagnated because exponential growth isn't sustainable in anything. They've hit very real frequency and power walls. People need to let it go, the golden age is over and nothing was going to stop that.


You're basically saying that if the volume was there - and the power increase was there - you don't think we'd be seeing some of that 15-core Ivy EX and 12-core Ivy EP goodness on the desktop?

Really?

If we had mainstream 8-core - you don't think the ecosystem would "up the man-BEEP" if it knew it could get absurd gains from properly threading most code?


Yes, we've had Pocket PCs and the toilet MacBooks.
We've had "tablet pocket" stuff since the '90s.


No one pushed it as far - before Apple sexed it all up.


It's a rant - I can't blame them when there are many variables - but I sure as hell can blame them for causing the massive shift towards mobile, low-power, and simple graphics/general-compute-intensity type workload needs.

If you don't believe everyone bet on more threading/more cores - go ask Dirk Meyer & Mike Butler about Bulldozer.
Go ask Intel why the 15-core Ivy EX is going to get released for the server world
(where, ironically, they dedicate ginormous resources to perfecting thread scaling).

The cycle of upgrades still exists - it's just been moved to different product categories.

And Apple has been a major factor in this.
 

inf64

Diamond Member
Mar 11, 2011
3,703
4,034
136
Desktop processors haven't stagnated because of the emphasis on mobile, they've stagnated because exponential growth isn't sustainable in anything. They've hit very real frequency and power walls. People need to let it go, the golden age is over and nothing was going to stop that.
Well said :thumbsup:
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
You're basically saying that if the volume was there - and the power increase was there - you don't think we'd be seeing some of that 15-core Ivy EX and 12-core Ivy EP goodness on the desktop?

Really?

If we had mainstream 8-core - you don't think the ecosystem would "up the man-BEEP" if it knew it could get absurd gains from properly threading most code?

So your problem is that Intel isn't putting enough cores in mainstream processors? Because it sounded more like your complaint was that if the manufacturing process hadn't become "mobile tuned" we'd see faster single-threaded performance improvements.

It doesn't add up. You wanted them to focus on peak performance at the expense of power efficiency instead of the opposite. Guess what - the number of cores on a chip is a power efficiency problem. Focusing on peak single-threaded performance would make that worse, not better.

Intel could put more cores on their mainstream chips, but they don't because they don't see the market value in it. I don't have the faintest idea what Apple or phones have to do with this decision.

You've changed your complaint to the usual griping about software not exploiting more threads, and how if Intel would just give us more, everyone would learn how to utilize it better. Sorry, but I don't think so, and chances are Intel doesn't think so either, or they'd be doing it. AMD may think so, or maybe they're just determined to capture a niche where they only get an advantage in some software. You may as well take this line of reasoning to its logical conclusion and say that all software should have been heavily optimized for GPUs by now. This is the usual kind of complaining about programmers not being good enough, from people who probably don't write an awful lot of software.

If Intel needed to make their enthusiast processors $400 instead of $600 to get people to keep buying processors they would. They don't. The market has spoken. And by the way, it's not just phones and tablets (and servers, and embedded) where power consumption matters. There has been a continual shift away from desktops and towards laptops for the past several years, and focusing on improving power consumption at every level of the perf/W curve has absolutely had an impact on laptops.

Yes, we've had Pocket PCs and the toilet MacBooks.
We've had "tablet pocket" stuff since the '90s.

No one pushed it as far - before Apple sexed it all up.

What we've had is continual improvement towards more performance and better efficiency in the mobile space, year after year, coming from several different places. Apple changed a bit how phones were designed and perceived, but they did not invent demand for mobile processing improvement. And Intel wasn't lured to follow this path by Android.

Go ask Intel why the 15-core Ivy EX is going to get released for the server world
(where, ironically, they dedicate ginormous resources to perfecting thread scaling).

I don't have to ask because it's obvious: servers run different workloads than desktops - workloads that scale well with thread count.
 
Last edited:

MisterMac

Senior member
Sep 16, 2011
777
0
0
So your problem is that Intel isn't putting enough cores in mainstream processors? Because it sounded more like your complaint was that if the manufacturing process hadn't become "mobile tuned" we'd see faster single-threaded performance improvements.

It doesn't add up. You wanted them to focus on peak performance at the expense of power efficiency instead of the opposite. Guess what - the number of cores on a chip is a power efficiency problem. Focusing on peak single-threaded performance would make that worse, not better.

Intel could put more cores on their mainstream chips, but they don't because they don't see the market value in it. I don't have the faintest idea what Apple or phones have to do with this decision.

You've changed your complaint to the usual griping about software not exploiting more threads, and how if Intel would just give us more, everyone would learn how to utilize it better. Sorry, but I don't think so, and chances are Intel doesn't think so either, or they'd be doing it. AMD may think so, or maybe they're just determined to capture a niche where they only get an advantage in some software. You may as well take this line of reasoning to its logical conclusion and say that all software should have been heavily optimized for GPUs by now. This is the usual kind of complaining about programmers not being good enough, from people who probably don't write an awful lot of software.

If Intel needed to make their enthusiast processors $400 instead of $600 to get people to keep buying processors they would. They don't. The market has spoken. And by the way, it's not just phones and tablets (and servers, and embedded) where power consumption matters. There has been a continual shift away from desktops and towards laptops for the past several years, and focusing on improving power consumption at every level of the perf/W curve has absolutely had an impact on laptops.



What we've had is continual improvement towards more performance and better efficiency in the mobile space, year after year, coming from several different places. Apple changed a bit how phones were designed and perceived, but they did not invent demand for mobile processing improvement. And Intel wasn't lured to follow this path by Android.



I don't have to ask because it's obvious: servers run different workloads than desktops - workloads that scale well with thread count.

I dare say that IF there were no mobile expansion, and 99.9% of people were still using brick phones with no mainstream tablets in sight nor cheap nettops - then yes.

I do believe Intel, AMD/GF, TSMC and others would spend more time fine-tuning process nodes dedicated to high power.
Because where else would those resources go?

Did we forget the Westmere/Sandy Bridge process, which was and is a lot more of a tank than 22nm is?


If there were no incentive to do LP processes and to design around perf/watt within a mobile envelope - then they'd design around the more traditional desktop envelope.

This is beyond the fact that they've hit an ST scaling wall.
This is an absolute-performance rant/whine - and again I must reiterate - do you, with all your knowledge, not think that the software ecosystem would adapt and learn better threading if 8 cores were standard mainstream, while 4 real cores were the low end?

If only the hardware giants pushed support for more cores - because that is the way forward for giant leaps (just as GHz scaling was for us in the '90s and '00s) - short of some graphene/carbon miracle coming along.

You seem to argue that because Intel/AMD/TSMC/ARM don't slap on extra cores at every given opportunity, the market has spoken.
And it has - I don't deny it.

Please try and grasp what I'm dreaming/ranting about.

What... if that mobile expansion weren't there?
What if Apple hadn't pushed iPhones up everyone's behind?

Would Silvermont/Bay Trail be as impressive as it is today?

I don't think so.
It doesn't make logical sense - why would anyone focus there if it weren't for the shift?

What if the shift were still 5 years off - and the only way to increase performance would mean more scaling a la GPUs?

If the hardware were out there, pushed and backed by manufacturers and OEMs alike - do you think we'd still have the same ecosystem?


It's kind of like saying we don't need more power and have no use for it.
I guarantee you people will find uses for it ;)

Just as I guarantee that if volume were expanding in the desktop segment for Intel/AMD with "moar coarz" - there'd be software people taking advantage of the added performance each release.

You missed the whole point of the argument/rant/whine.
And did your usual "the market proves you wrong" - well, yes it does.


The whole point was to rant about why there was a momentum shift over these short last 5 years (for which I still blame Apple, majorly) - enough for Intel to practically do a 180 on its development focus.


If you feel the need to point out the market has spoken - go ahead.
But then you'd be out of context.

PS:
Are you well versed enough in game engine vs. server workloads to say that - if the code development resources were the same - there's simply ZERO gain to be had from a 64-threaded Frostbite engine?

In short, you don't believe that if DICE could make money by getting to 125 fps instead of 90 - like businesses can from shaving off compute time - they'd invest in it?
 

Virgorising

Diamond Member
Apr 9, 2013
4,470
0
0
(Rant/whine: apologies in advance before the locking.)

...Well it is - isn't it?

If that stupid touch phone and that stupid tablet hadn't come out - neither Intel nor GF nor TSMC would be tuning their processes for low-leakage CMOS designs.

There'd be a push for getting more absolute performance within the power envelopes - we wouldn't be fantasizing about sexy smartphones and on-the-fly tablets.

(I'm not saying these inventions aren't useful, mind you.)

But still - now we're in a slump where a 10% performance increase over 2 years (looks like it for both CPU and GPU, judging by the R2xx :C) is AAAA-OKAY.

What the hell.


Empowering the top end also lifts the bottom end.
There are probably millions of slow corporate 5400 RPM XP machines around that could have had another upgrade cycle were it not for tablets and expensive-as-fuck phones.


They started the movement - they started the focus on flashy and design-centric rather than performant and useful.

Remember Nokia brick phones? 12 days of active time? No problem!
Yes, they were simple - but they did the job.


This is Apple's fault for starting a movement that hurts the absolute performance-per-dollar aficionados.

....now if you'll excuse me my better half is calling me on my iPhone -_-.


Above: thought-provoking, viable, brilliant. High marks.

For all his genius, Mr. Jobs is now gone. And the bottom line is a classic one: FORM MUST HONOR AND AUGMENT FUNCTION.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
We would have had touchscreen devices without Apple. They may have sucked in a bunch of R&D money from dumb yuppies, which helped speed the process up a bit. But they are by no means responsible for the creation of devices that were featured on prominent sci-fi TV shows 25 years ago. All Apple has done is ensure that the devices are completely stupefied. Hell, you can't even plug a standard charging cable into the damn thing. omgwtfruserious? Can't make this crap up. I'm surprised they let you use Bluetooth devices... If they had their way you'd have to buy "special" proprietary versions of Bluetooth devices. If they did, the iSheep would eat it up just the same.
 
Last edited:

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
We would have had touchscreen devices without Apple. They may have sucked in a bunch of R&D money from dumb yuppies, which helped speed the process up a bit. But they are by no means responsible for the creation of devices that were featured on prominent sci-fi TV shows 25 years ago. All Apple has done is ensure that the devices are completely stupefied. Hell, you can't even plug a standard charging cable into the damn thing. omgwtfruserious? Can't make this crap up. I'm surprised they let you use Bluetooth devices...

WAAAAAHHHHHH!!! :'( :'( :'( Did you get all your whining out?

Touch screens have existed for decades, but they weren't convenient to use until Apple added the swipe and pinch gestures. There would be no Android if it weren't for iOS.
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
Can't make this crap up. I'm surprised they let you use Bluetooth devices... If they had their way you'd have to buy "special" proprietary versions of Bluetooth devices. If they did, the iSheep would eat it up just the same.

What the hell does Bluetooth have to do with CPUs and Overclocking?
 

Virgorising

Diamond Member
Apr 9, 2013
4,470
0
0
What the hell does Bluetooth have to do with CPUs and Overclocking?

Bingo! Soooo many of my friends are all Apple in all things. Zealots. Including my BF. Way back, I delved in and got that my way - what's right for me - is fiddling and upgrading, always having those options.

I know all the flaws in all Windows OSes.....but still choose Windows.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
I dare say that IF there were no mobile expansion, and 99.9% of people were still using brick phones with no mainstream tablets in sight nor cheap nettops - then yes.

I do believe Intel, AMD/GF, TSMC and others would spend more time fine-tuning process nodes dedicated to high power.
Because where else would those resources go?

Did we forget the Westmere/Sandy Bridge process, which was and is a lot more of a tank than 22nm is?


If there were no incentive to do LP processes and to design around perf/watt within a mobile envelope - then they'd design around the more traditional desktop envelope.

This is beyond the fact that they've hit an ST scaling wall.
This is an absolute-performance rant/whine - and again I must reiterate - do you, with all your knowledge, not think that the software ecosystem would adapt and learn better threading if 8 cores were standard mainstream, while 4 real cores were the low end?

If only the hardware giants pushed support for more cores - because that is the way forward for giant leaps (just as GHz scaling was for us in the '90s and '00s) - short of some graphene/carbon miracle coming along.

You seem to argue that because Intel/AMD/TSMC/ARM don't slap on extra cores at every given opportunity, the market has spoken.
And it has - I don't deny it.

Please try and grasp what I'm dreaming/ranting about.

What... if that mobile expansion weren't there?
What if Apple hadn't pushed iPhones up everyone's behind?

Would Silvermont/Bay Trail be as impressive as it is today?

I don't think so.
It doesn't make logical sense - why would anyone focus there if it weren't for the shift?

What if the shift were still 5 years off - and the only way to increase performance would mean more scaling a la GPUs?

If the hardware were out there, pushed and backed by manufacturers and OEMs alike - do you think we'd still have the same ecosystem?


It's kind of like saying we don't need more power and have no use for it.
I guarantee you people will find uses for it ;)

Just as I guarantee that if volume were expanding in the desktop segment for Intel/AMD with "moar coarz" - there'd be software people taking advantage of the added performance each release.

You missed the whole point of the argument/rant/whine.
And did your usual "the market proves you wrong" - well, yes it does.


The whole point was to rant about why there was a momentum shift over these short last 5 years (for which I still blame Apple, majorly) - enough for Intel to practically do a 180 on its development focus.


If you feel the need to point out the market has spoken - go ahead.
But then you'd be out of context.

PS:
Are you well versed enough in game engine vs. server workloads to say that - if the code development resources were the same - there's simply ZERO gain to be had from a 64-threaded Frostbite engine?

In short, you don't believe that if DICE could make money by getting to 125 fps instead of 90 - like businesses can from shaving off compute time - they'd invest in it?

Still not seeing a coherent argument here. You want a focus on performance instead of perf/W, yet you want more cores. You don't understand that more cores need BETTER perf/W. Sacrificing perf/W wouldn't give you more cores. It would give you marginally better peak single-threaded performance, probably just for overclockers. You also really have no idea if the process is to blame for less clocking headroom on Haswell, when that could just as easily be down to the design of the CPU (for Ivy Bridge it's down to the cooling).

You also are somehow not acknowledging the role of power efficiency in laptops which is what most people are using and have been using for a while. They pushed past desktops a long time ago. Do you know when Intel realized they needed to stop focusing on performance at all costs and that power efficiency was king? It was after Pentium 4 and long before iPhone was released. They already established a rule that dictated that they only made changes that improved perf/W. Now what do you think was driving this? It wasn't phones, it was the realization that the limits in power consumption are a lot lower than they thought and that their 5+GHz Prescotts were never going to happen.

And no, I don't think that software would be a lot different if mainstream CPUs had 8 cores instead of 4. When Itanium rolled around the mantra was that it had all this amazing performance potential, you just needed to do the software a little differently. Same story with Cell. Same thing with GPUs. It's getting pushed like crazy but these changes don't happen like these people want because they're naive about software and software development.

That doesn't mean more stuff can't be threaded better, but blaming this on a lack of cores isn't right - if software all grew to take advantage of whatever the mainstream hardware was then we'd be seeing everything using 8 threads already - why? Because 8 threads IS fairly mainstream (4C8T from Intel) and is becoming moreso. Or they'd at least be using 4 pretty effectively, but a lot of stuff isn't. Because a lot of software does NOT scale well with more threads.
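For the skeptical, here's a minimal Amdahl's-law sketch of why piling on cores runs out of steam fast for typical desktop code (the 80%-parallel figure is just an assumption for illustration, not a measurement of any real program):

Code:
# Amdahl's law, illustrative: assume 80% of the work parallelizes perfectly.
def speedup(parallel_fraction, cores):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for cores in (2, 4, 8, 16, 32):
    print(cores, round(speedup(0.80, cores), 2))
# 2 -> 1.67, 4 -> 2.5, 8 -> 3.33, 16 -> 4.0, 32 -> 4.44 (capped at 5x, ever)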

Rakehellion said:
Touch screens have existed for decades, but they weren't convenient to use until Apple added the swipe and pinch gestures. There would be no Android if it weren't for iOS.

A lot of players added stuff to the continual development of touch interfaces. People like to give all this credit to Apple but ignore Nintendo who really kicked off touch AND motion gaming long before Apple did.

It's amazing to think that if Apple hadn't done swipe and pinch, it's guaranteed no one would have come up with a similar idea.
 

Eug

Lifer
Mar 11, 2000
23,611
1,019
126
I haven't wanted a faster desktop CPU for years. Well, maybe I'd like a CPU to be able to encode full-length HD video in seconds, but other than that it's not a big deal. I'm still using my Core i7 iMac from 2010, and my Core 2 Duo MacBook Pro from 2009. Actually, my Core i7 was from 2009, but the reason I have a 2010 now is just because the machine was busted so they gave me a free upgrade.

To put that in perspective, I'm using CPUs that are four friggin' years old and I don't really feel a big need to upgrade them.

The only reason I've been considering upgrading is for other components, like SSD, USB 3, 802.11ac, and in the case of the MacBook Pro, lower weight. Oh and for my wife's ancient 2008 MacBook, it would have been nicer to have hardware H.264 video decode acceleration on the GPU, but otherwise it's fine for what she does with it, and it's fast enough to decode 1080p via the CPU anyway. In fact, I just bought this ancient MacBook used this year... in 2013, 5 years after it was released. She thinks it just flies, with the 4 GB RAM and SSD I put in it.

OTOH, I want my iPhone 5S asap.

BTW, for interest's sake I tried running my Motorola RAZR HD Android phone like a desktop for internet surfing, and for basic stuff it was for the most part as good as or better than my Atom Win 7 machine (3.5 GB RAM plus SSD), and of course it fit in my pocket until I plugged it into the external screen. Speed was an issue in areas, but overall it wasn't. The main problem was that the Android OS isn't built for desktop use, so there are glitches in the UI and it's missing basic features like easy mouse-controlled cut-and-paste. No Flash either.

I spend a helluva lot more time surfing the net than I do encoding HD video.

You can check my sig for the hardware I own.
 
Last edited:

mavere

Member
Mar 2, 2005
186
0
76
Between the title and the specious analysis, I think you would have an excellent career in tech blogging.
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
A lot of players added stuff to the continual development of touch interfaces. People like to give all this credit to Apple but ignore Nintendo who really kicked off touch AND motion gaming long before Apple did.

The DS uses the same touch screen used in an ATM from 1992. That isn't new. And gaming consoles never gave rise to any kind of high-performance CPUs.

It's amazing to think that if Apple hadn't done swipe and pinch, it's guaranteed no one would have come up with a similar idea.

Well, they didn't.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
The DS uses the same touch screen used in an ATM from 1992. That isn't new. And gaming consoles never gave rise to any kind of high-performance CPUs.

Yeah and you think Apple invented multi-touch displays? I thought we were talking about who popularized a technology for some application, but I guess we're just talking about how we should thank Apple for everything.

Well, they didn't.

So what?
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
Yeah and you think Apple invented multi-touch displays? I thought we were talking about who popularized a technology for some application, but I guess we're just talking about how we should thank Apple for everything.

I think we were talking about the explosion in ARM performance.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
And no, I don't think that software would be a lot different if mainstream CPUs had 8 cores instead of 4. When Itanium rolled around the mantra was that it had all this amazing performance potential, you just needed to do the software a little differently. Same story with Cell. Same thing with GPUs. It's getting pushed like crazy but these changes don't happen like these people want because they're naive about software and software development.

That doesn't mean more stuff can't be threaded better, but blaming this on a lack of cores isn't right - if software all grew to take advantage of whatever the mainstream hardware was then we'd be seeing everything using 8 threads already - why? Because 8 threads IS fairly mainstream (4C8T from Intel) and is becoming moreso. Or they'd at least be using 4 pretty effectively, but a lot of stuff isn't. Because a lot of software does NOT scale well with more threads.


ARGH!! - come on, man.
Do not compare across different microarchitectures.

Designing HIGH-level code around some fixed hardware parameters IS NOT the same as designing high-level or low-level code for an ENTIRELY new micro-arch.

Bah on you for comparing that.



If you really believe the majority of people that BUY games have 8 threads (I don't, for one) - then why on earth would a game studio not cater to the majority market?

That makes no logical sense.
Are you seriously suggesting that if we had 32 cores of even the P4 uarch - we'd still have the single-thread bottleneck as the biggest problem?

Are you daft, when you keep referring to the current market - when I tried ranting/dreaming up a scenario where the market forces would be vastly different?

Come on.

Even the TDPs of the P4 do not come close to modern-day Nehalems or Sandys or even Haswells.

And they did not have the power states modern processors have - so it's STUPID to even compare power consumption to them.

The feature sets and sleep states and downclocking that modern CPUs support cut a beehole of consumed power with very little wafer real estate.
And some of those features, ironically, also require software to work properly ;)

Please wake up and see what I'm arguing.

I'm not blaming software - I'm saying software has no ROI for doing it ATM.
But if we had double or triple the threads in the low-end Pentium/Celeron/i3 crowd - you're still saying stuff would not be threaded more than it is today.

Which is like saying we don't need more bandwidth for the internet unless latency is under 10 ms.

I'm lost as to how you (I like a lot of your posts, btw) can't follow that train of thought.
 
Last edited:

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
ARGH!! - come on, man.
Do not compare across different microarchitectures.

Designing HIGH-level code around some fixed hardware parameters IS NOT the same as designing high-level or low-level code for an ENTIRELY new micro-arch.

Bah on you for comparing that.

And what, do you think everyone wrote assembly for these other uarchs? They used compilers and used the same languages. But for good performance they had to write code differently at a high level. Making code highly parallelizable and making it amenable to different types of CPUs (or GPUs!) have more in common than you realize.

If you really believe the majority of people that BUY games have 8 threads (I don't, for one) - then why on earth would a game studio not cater to the majority market?

That makes no logical sense.

I definitely didn't say the majority of people that buy games have 8 threads. But why aren't they well utilized? Because not all software threads well, and for a lot that does, it's a very difficult undertaking with a lot of compromises. That's what I keep trying to say.

Are you seriously suggesting that if we had 32 cores of even the P4 uarch - we'd still have the single-thread bottleneck as the biggest problem?

Yes, I absolutely believe the single-threaded bottleneck in this scenario would still be the biggest problem.

Are you daft, when you keep referring to the current market - when I tried ranting/dreaming up a scenario where the market forces would be vastly different?

Come on.

I think you have a much different idea of what "market" means than I do. When I say market, I mean what people WANT. You seem to think Apple created the iPhone market. I see it the other way around entirely - as far as I'm concerned, it's the market that encouraged Apple to create the iPhone. They got there by recognizing market trends and realizing what people would want.

Even the TDPs of the P4 do not come close to modern-day Nehalems or Sandys or even Haswells.

They all hit close to the same limit over 100W. The only reason the newer CPUs have grown to fill up that same limit is because they have more cores.

And they did not have the power states modern processors have - so it's STUPID to even compare power consumption to them.

Is that what you think I'm doing? If you don't understand how Pentium 4 convinced Intel that they needed to change their design philosophy then there's a lot you don't understand about the history of their CPU development...

The feature sets and sleep states and downclocking that modern CPUs support cut a beehole of consumed power with very little wafer real estate.
And some of those features, ironically, also require software to work properly ;)

These features require proper OS support, not that everyone starts writing completely different applications. I really don't see what you're getting at anyway. These power saving features have nothing to do with peak power consumption.
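As a minimal sketch of what "proper OS support" means here (Linux-only, and assuming the cpufreq sysfs interface is exposed on the machine at hand): frequency scaling is driven by the kernel's governor, not by each individual application.

Code:
# Minimal sketch, Linux-only: the OS governor controls frequency scaling.
# Assumes /sys/devices/system/cpu/cpu0/cpufreq exists on this machine.
from pathlib import Path

gov = Path("/sys/devices/system/cpu/cpu0/cpufreq/scaling_governor")
if gov.exists():
    print("cpu0 governor:", gov.read_text().strip())
else:
    print("cpufreq interface not exposed here")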

Please wake up and see what I'm arguing.

I'm not blaming software - I'm saying software has no ROI for doing it ATM.
But if we had double or triple the threads in the low-end Pentium/Celeron/i3 crowd - you're still saying stuff would not be threaded as much as possible.

Which is like saying we don't need more bandwidth for the internet unless latency is under 10 ms.

I'm lost as to how you (I like a lot of your posts, btw) can't follow that train of thought.

You're making several (more) strawman arguments here. But I'll start off with - no, I don't think that software will ever be threaded as much as possible. Or optimized as much as possible. That's outrageous. All software is limited by fixed schedules, budgets, and design targets - and past that it's limited by the skill of the people writing it.

But the argument has never really been about whether or not software will utilize threading as much as possible; it's been an argument about what "as much as possible" means. You seem to imply that most software can be a lot more parallel than it is. Or, at the very least, you're saying that it could be to the extent that justifies more cores. This is little more than a wild assumption on your part, an overly optimistic dream coming from someone who probably has no experience in making general-purpose software more parallel. And the question has never been about whether or not you could use more cores, but whether or not they're worth their cost in die area.

You're blaming Intel for having poorer judgement than you do, because they're not putting as many cores on devices as YOU think would be beneficial. When I say that the market dictates their decision, that doesn't mean they're basing this merely on what exists now. You think they don't evaluate potential? You think they don't respond to where software development can go? They ARE moving along with changes in software - by making their vectors wider and more capable (AVX2) and by making their IGPs wider, more general-purpose, and more tightly coupled. And guess what, they make these changes by evaluating the market and where they think it makes sense to move things.

AMD puts in more cores and bets more strongly on stuff like HSA, but it's not because they're more progressive or forward-thinking than Intel; it's because they're playing in a different market. One called a niche, where it's better to be faster in 10% of software than moderately slower in 100% of it. If they could do what Intel is doing they'd most likely have a different strategy.

What's really, REALLY puzzling is how you think Apple's products have ANYTHING to do with any of this. Intel was putting more cores only in server and enthusiast products before the iPhone. They were focusing on increasing perf/W over increasing peak perf before the iPhone. They were working on a lower-power (and cheaper) processor variant before the iPhone. And somehow, somehow you think the iPhone convinced Intel not to sell you a 6-core (but with no more single-threaded performance...) CPU for $300 instead of $600. Please tell me how this works. Please tell me how you think developing SoC-friendly processes has dictated how many cores they slap down on their high-end mainstream parts.
 
Last edited:

MisterMac

Senior member
Sep 16, 2011
777
0
0
My whole gripe with all of this was that the root makers of our dear computing technology seem to have lost interest in pushing the maximum performance within the market's higher envelopes.
Or in pushing envelopes with great performance gains (please, no one mention the Centurion stuff - the absolute performance increase was miserable, albeit an admirable attempt).

Whether through one thing or the other, they've instead gone for raising performance in the smallest of envelopes.

I don't particularly like this turn of events.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
The performance the OP is asking for is already there. Intel could easily give you twice the performance today. The platform performance segmentation is not so much about process now. A few things have dictated where we are now:

1. Component Integration
2. Artificial lowering of tdp
3. Market segmentation

All these are driven by profitability.

On the first point, do you remember how Intel shafted Nvidia by moving the memory controller on-die? How about the incorporation of IGPs on-die, though this was inspired more by the threat of AMD?
Again, four years ago mainstream desktop quads were operating at 130 watts with decent heatsinks. Today, we have 84-watt quads with skimpy heatsinks and paste under the IHS. Imagine an IGP-less 22nm chip from Intel with a 130-watt TDP! Easily 12 cores (as they have in servers now), and you have the performance you're asking for. Intel won't do it. You know why?
It makes no economic sense (cost); market segmentation; competition; etc. There are smart people at Intel determining how much performance is "enough" for desktop, taking all these variables into consideration. Basically, look past the desktop for the performance you're looking for, if you can find the software to take advantage of it. Be prepared to pay through the nose, unless of course someone lights a fire under Intel's a**!
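Rough per-core arithmetic behind that (illustrative numbers only, not Intel's actual figures): you only get to 12 cores inside 130 W if per-core power comes down, e.g. via lower clocks/voltage plus the area and power freed by dropping the IGP.

Code:
# Back-of-envelope only, illustrative numbers - not Intel's real figures.
quad_tdp_w = 84.0                # a mainstream 84 W quad
per_core_w = quad_tdp_w / 4      # ~21 W per core at stock desktop clocks
budget_w   = 130.0               # the old 130 W desktop envelope

print(round(budget_w / per_core_w))        # ~6 cores at the same clocks
print(round(budget_w / (per_core_w / 2)))  # ~12 cores if per-core power is halved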
 
Last edited: