AnandTech Forums > Hardware and Technology > CPUs and Overclocking

Old 02-09-2013, 11:58 AM   #101
Idontcare
Administrator
Elite Member
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,492

Quote:
Originally Posted by sontin View Post
Quote:
Originally Posted by Idontcare View Post
ARM can do all it wants, develop the world's most impressive 64-bit server processor design money can buy, but if they can't manufacture it on a process node that is remotely competitive with what Intel is pumping out, then the final product is destined to stand in Intel's shadow.
Intel needs a one node advantage over ARM. Krait has a much better perf/watt than Clover Trail.

The first 22nm Atom SoC doesn't appear until next year, and only a short time before we see the first 20nm ARM SoCs on TSMC's 20nm process.
My post was specific in discussing 64-bit server applications. Yours appears to be specific in discussing mobile SoC applications.

You can see how/why that would make your post, while accurate, irrelevant to the context that mine was addressing, yes?
Old 02-09-2013, 12:04 PM   #102
Phynaz
Diamond Member
 
Join Date: Mar 2006
Posts: 6,658

Quote:
Originally Posted by sontin View Post
Look at the Surface Pro: even with a 60% bigger battery, runtime in low-performance tasks is much worse.

A Transformer Infinity with a 25 Wh battery runs twice as long as the Surface Pro with a 41 Wh battery. And the Infinity only has a 40nm Tegra 3 chip.
The Surface Pro isn't running a ULV CPU, so battery life could be doubled right there.

And that Tegra 3 is slower than slow. It's not anywhere near the same performance class.
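A quick sketch of the arithmetic behind this exchange. The 4-hour Surface Pro runtime is taken from an estimate later in this thread and is only an assumed value, not a measurement:

```python
# What does "twice the runtime on a smaller battery" imply about
# average system power draw? Battery capacities are from the post
# above; the Surface Pro runtime is an assumption for illustration.

surface_capacity_wh = 41.0
infinity_capacity_wh = 25.0
surface_runtime_h = 4.0                # assumed light-load runtime
infinity_runtime_h = 2 * surface_runtime_h

surface_avg_w = surface_capacity_wh / surface_runtime_h      # = 10.25 W
infinity_avg_w = infinity_capacity_wh / infinity_runtime_h   # = 3.125 W

print(f"Surface Pro average draw: {surface_avg_w} W")
print(f"Infinity average draw:    {infinity_avg_w} W")
print(f"Ratio: {surface_avg_w / infinity_avg_w:.2f}x")       # ~3.28x
```

Under these assumptions the whole Surface Pro platform, not just the CPU, is drawing roughly 3x the power of the Tegra 3 tablet.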
Old 02-09-2013, 12:30 PM   #103
Exophase
Platinum Member
 
Join Date: Apr 2012
Posts: 2,263

Quote:
Originally Posted by Idontcare View Post
ARM can do all it wants, develop the world's most impressive 64-bit server processor design money can buy, but if they can't manufacture it on a process node that is remotely competitive with what Intel is pumping out, then the final product is destined to stand in Intel's shadow.
What impact can other IP on a server SoC outside of the processor (interconnects etc.) have? You'd think companies like AppliedMicro, SeaMicro via AMD, Calxeda, etc. have enough server pedigree that they can offer at least something unique and useful that Intel isn't offering. If not out of special experience, then at least out of being able to address more markets.
Old 02-09-2013, 12:38 PM   #104
SickBeast
Lifer
 
Join Date: Jul 2000
Posts: 14,224

Quote:
Originally Posted by blackened23 View Post
ARM devices are only competitive in terms of efficiency, they currently aren't in the same league in terms of performance and likely never will be. Obviously for cheap devices that are (mostly) used for media consumption, this isn't a huge deal.

Going forward, ARM devices will improve slightly in performance while efficiency really won't change in meaningful ways. Performance has a price, and as ARM tries to improve performance, its efficiency will suffer. Intel is improving performance and making massive jumps in efficiency; I can't see ARM devices doing well in the premium $500 tablet market if the claims about Haswell and especially Broadwell (which will be the true game changer) are true.

If Intel matches ARM on battery life with the same form-factor parameters, there is simply no reason for a $500 premium tablet to use an ARM chip, period. An Intel chip opens up a wide variety of additional capabilities that ARM devices just can't match; the only advantage ARM devices have is efficiency, and efficiency alone. If they lose the efficiency advantage, they will compete only in the bargain-bin, low-margin market, much like AMD.
You're forgetting about price. Since when does Intel sell CPUs for less than $20 each? Not only that, but ARM has established a massive software library for iOS and Android. I think if and when Apple switches their entire platform (laptops, desktops, everything) to ARM, it will be very bad for Intel.

I agree that in certain respects Intel holds a massive performance advantage over ARM. That said, how fast does your smartphone's CPU need to be? I have a dual-core Snapdragon and I don't really feel the need for more processing power; I don't even know what I would do with it. Price and battery life matter most for phones and tablets, and ARM simply owns this market. It's got to be affecting Intel's bottom line by now, and IMO they are too late in responding with Haswell, which looks by all accounts to be a very efficient desktop processor. It will be interesting to see how it does in a tablet, but if you look at the Surface Pro with Ivy Bridge, it barely gets 4 hours of battery life, whereas the ARM tablets can run all day if not longer.
Old 02-09-2013, 06:51 PM   #105
dagamer34
Platinum Member
 
Join Date: Aug 2005
Location: Houston, TX
Posts: 2,548

Quote:
Originally Posted by sontin View Post
Intel needs a one node advantage over ARM. Krait has a much better perf/watt than Clover Trail.

The first 22nm Atom SoC doesn't appear until next year, and only a short time before we see the first 20nm ARM SoCs on TSMC's 20nm process.
Krait is competing against a 5-year-old Atom design that I'm sure Intel has been purposely gimping/neglecting in order to sell more Core chips. Imagine if they had dedicated even half the resources their Core line gets to Atom, churning out improvements every year: the ARM threat would be toast and the iPad wouldn't be quite so dominant. But it's hard to let go of selling $225 mobile chips...
__________________
iMac with Retina 5K Display | 15" MacBook Pro with Retina Display| Mac mini | iPad Air 2 | iPhone 6 Plus
Old 02-09-2013, 07:09 PM   #106
RU482
Lifer
 
Join Date: Apr 2000
Posts: 12,597

Quote:
Originally Posted by Phynaz View Post
Surface pro isn't running a ULV CPU. So battery life could be doubled right there.

And that Tegra 3 is slower than slow. It's not anywhere near the same performance class.
Last time I checked, the Core i5-3317U is a ULV CPU.
Old 02-09-2013, 07:12 PM   #107
Charles Kozierok
Elite Member
 
Join Date: May 2012
Posts: 6,762

Intel is hyping up Haswell for the obvious reason -- new product coming out and they want to sell chips.

But I think Broadwell is the uarch that's really going to make people sit up and take notice. Just a hunch, but my guess is that more power-efficiency tweaks plus the shrink to 14 nm are going to make a huge difference.

I'm way, way overdue for a new laptop but might wait another year because of this, depending on how well Haswell devices actually perform.
__________________
"Of those who say nothing, few are silent." -- Thomas Neill
Old 02-09-2013, 07:13 PM   #108
Charles Kozierok
Elite Member
 
Join Date: May 2012
Posts: 6,762

Quote:
Originally Posted by RU482 View Post
last time I checked, the Core i5-3317U is a ULV CPU
It is, but they have an even lower class now, the "Y" chips.
Old 02-09-2013, 09:18 PM   #109
Roland00Address
Golden Member
 
Join Date: Dec 2008
Posts: 1,377

Quote:
Originally Posted by Charles Kozierok View Post
It is, but they have an even lower class now, the "Y" chips.
No devices with them are out in the wild yet. They will be soon, probably within three months, but not yet.

That said, we will probably get ULV Haswell in only six to seven months.
Old 02-10-2013, 05:52 AM   #110
carop
Member
 
Join Date: Jul 2012
Posts: 40

Quote:
Originally Posted by Exophase View Post
What impact can other IP on a server SoC outside of the processor (interconnects etc)? You'd think companies like AppliedMicro, SeaMicro via AMD, Calxeda, etc have enough server pedigree that they can offer at least something unique and useful that Intel isn't. If not out of special experience at least out of being able to address more markets.
According to the SeaMicro article by Marcus Pollice at BSN, Intel did not like the idea of putting Atom CPUs into servers when Andrew Feldman approached Intel. Eventually, they were allowed to design a microserver board based on Atom CPUs. Feldman is now running the Data Center Service Division at AMD.

http://www.brightsideofnews.com/news...-business.aspx

It seems that this reluctance of Intel may have left the door open for ARM servers:

http://www.eetimes.com/electronics-n...r-open-for-ARM

Intel will probably lead. However, there might be an ASP disruption ahead.
Old 02-10-2013, 08:03 AM   #111
Piroko
Senior Member
 
Join Date: Jan 2013
Posts: 393

Quote:
Originally Posted by Phynaz View Post
Surface pro isn't running a ULV CPU. So battery life could be doubled right there.
Pretty far-fetched that a slightly more aggressive bin of the same die will lead to a doubling of battery life.
Haswell might improve on this, but I doubt we'll see a lot of those IVB Y chips in devices until then.
Old 02-10-2013, 08:18 AM   #112
jihe
Senior Member
 
Join Date: Nov 2009
Posts: 410

Quote:
Originally Posted by SickBeast View Post
You're forgetting about price. Since when does Intel sell CPUs for less than $20 each? Not only that, but ARM has established a massive software library for iOS and Android. I think if and when Apple switches their entire platform (laptops, desktops, everything) to ARM, it will be very bad for Intel.
LOL, I assure you it will be worse for Apple.

Quote:
I agree that in certain aspects Intel holds a massive performance advantage over ARM. That said, how fast do you need your smartphone's CPU to be? I have a dual core snapdragon and I don't really feel the need for more processing power. I don't even know what I would do with it. Price and battery life are most important for phones and tablets, and ARM simply owns this market. It's got to be affecting Intel's bottom line by now and IMO they are too late to respond with Haswell which looks by all accounts to be a very efficient desktop processor. It will be interesting to see how it does in a tablet, but really if you look at the Surface Pro with Ivy Bridge, it barely gets 4 hours of battery life whereas the ARM tablets can run all day if not longer.
Old 02-10-2013, 09:02 AM   #113
blackened23
Diamond Member
 
Join Date: Jul 2011
Posts: 8,556

It all boils down to whether the capability and "slow but good enough" performance of ARM devices is enough for some people. It is for some, but not for others.

I am one who wants more functionality. I don't want two devices; I want one device that I can use for media consumption and iPad-esque toy value, and the same device to double up for Photoshop, Excel, PowerPoint, and Steam games every now and then. Currently I have both a tablet and an ultrabook. I recognize that there are some interesting productivity uses for the iPad, but I think everyone can agree that x86 is far more versatile with respect to productivity, all other things being equal.

Cheap, slow, and "good enough" ARM devices are a sizable market chunk, sure, I can agree to that. But some people definitely yearn for more out of their mobile computing experience, and Intel will deliver that WITHOUT any sacrifice in battery life (if the rumors are true). That is why I believe Intel will dominate the premium $500+ tablet market once Broadwell, possibly even Haswell, hits. Again: if Intel delivers on all fronts, ARM will only be competing on price; Intel will dominate the premium $500+ tablet space if they deliver on battery life.

Last edited by blackened23; 02-10-2013 at 09:06 AM.
Old 02-10-2013, 12:17 PM   #114
Phynaz
Diamond Member
 
Join Date: Mar 2006
Posts: 6,658

Quote:
Originally Posted by Piroko View Post
Pretty far fetched that a slightly more agressive bin of the same die will lead to doubling the battery life.
Haswell might improve on this, but I doubt we'll see a lot of those Ivb Y chips in devices until then.
14w vs 35w.
Old 02-10-2013, 01:01 PM   #115
NTMBK
Diamond Member
 
Join Date: Nov 2011
Posts: 4,594

Quote:
Originally Posted by Phynaz View Post
14w vs 35w.
i5-3317U has a 17W TDP.
__________________
Quote:
Originally Posted by Maximilian View Post
I like my VRMs how I like my hookers, hot and Taiwanese.
Old 02-10-2013, 03:07 PM   #116
jvroig
Super Moderator
Elite Member
 
Join Date: Nov 2009
Posts: 2,397

The exact TDP actually doesn't matter in this context. Even if the CPU TDPs in question were 10 W vs 35 W, instead of 17 W vs 35 W, I don't think it would lead to a doubling of battery life.

Other components together draw more power than the CPU contributes to total consumption, and unless all of those other components also shrink significantly (screen/display, Wi-Fi, radio, storage subsystem, etc.), battery life won't double from where it is now just because the CPU consumes 50+% less power.
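This argument can be made concrete with a toy power budget. All component draws below are illustrative assumptions, not measurements:

```python
# A toy system power budget: halving the CPU's share alone does not
# come close to doubling battery life, because the other components
# keep drawing the same power.

budget_w = {
    "display": 4.0,
    "cpu": 3.0,
    "wifi_radio": 1.0,
    "storage_other": 1.5,
}
battery_wh = 41.0  # Surface Pro class battery

def runtime_h(budget):
    """Runtime in hours for a given component power budget."""
    return battery_wh / sum(budget.values())

base = runtime_h(budget_w)
halved_cpu = dict(budget_w, cpu=budget_w["cpu"] / 2)  # CPU power cut 50%
better = runtime_h(halved_cpu)

# Halving CPU power alone buys ~19% more runtime, nowhere near 2x.
print(f"baseline: {base:.2f} h, CPU halved: {better:.2f} h, "
      f"gain: {100 * (better / base - 1):.1f}%")
```

With these assumed numbers, a 50% CPU power cut yields under 20% more battery life; only cutting the whole platform's draw can double it.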
Old 02-10-2013, 03:19 PM   #117
Piroko
Senior Member
 
Join Date: Jan 2013
Posts: 393

Quote:
Originally Posted by Phynaz View Post
14w vs 35w.
The 17W TDP part is probably clocked around 1 GHz at ~1 V when playing back HD streams, vs. the 35W TDP part at 1 GHz and 1-1.1 V. You don't need much more information than that to figure out the potential savings in low-load scenarios.
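The reasoning here leans on the standard dynamic-power relation for CMOS, P ≈ C·V²·f. A sketch, using the voltages assumed in the post with switched capacitance normalized to 1:

```python
# Dynamic CMOS power scales with the square of voltage at a fixed
# clock. Voltages are the ones assumed above, not measured values.

def dynamic_power(v, f_ghz, c=1.0):
    """Relative dynamic power for voltage v, clock f_ghz, capacitance c."""
    return c * v**2 * f_ghz

p_ulv_bin = dynamic_power(v=1.0, f_ghz=1.0)   # better-binned 17W part
p_std_bin = dynamic_power(v=1.1, f_ghz=1.0)   # standard-voltage 35W part

# At the same 1 GHz clock, 1.1 V costs ~21% more dynamic power than 1.0 V.
print(f"relative power: {p_std_bin / p_ulv_bin:.2f}x")
```

Leakage and uncore power don't follow this relation, so the real gap between bins at low load is smaller than a pure V² model suggests.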
Old 02-10-2013, 03:49 PM   #118
Phynaz
Diamond Member
 
Join Date: Mar 2006
Posts: 6,658

Quote:
Originally Posted by NTMBK View Post
i5-3317U has a 17W TDP.
You're correct, it's not the 7 W part.
Old 02-11-2013, 04:55 PM   #119
IntelUser2000
Elite Member
 
Join Date: Oct 2003
Posts: 3,516

First, stop with the AMD vs Intel posts!! Do it in a thread that actually discusses that. I know there are tons of them. I swear, half the population nowadays has ADD.

-TDP is absolutely irrelevant for low-load battery life.
-In real life, though, 17W chips are better binned than 35W ones, so they do exhibit lower power use in all scenarios. But beyond TDP the difference is very small.
-"TDP" refers to running at top frequency ranges, not meager 1GHz ones. TDP is for thermal design, which means worst case, which means it reflects heavy-load power use but says little about anything less.

Modern CPUs already use only a few hundred mW in deep C-states, and they reach those states often, so even eliminating that entirely won't help more than 5%.
Fortunately, despite all the focus on Haswell's CPU side, it's really about platform power management.

Of course, you won't see outrageous gains like 3x, but it should be really good. The caveat is that this only applies to Ultrabook platforms, not the ultra-high-end quad-core chips and mainstream laptops.

2012 2nd Generation Ultrabook "Ivy Bridge" guidelines state: Minimum 5 hours of battery life

2013 3rd Generation Ultrabook "Haswell" guidelines state: Minimum 8 hours of battery life

-Battery life claims are always for light-load scenarios, like optimized streaming, local video playback, or web browsing.
-Yes, I've seen slides that also claim 9 hours. But I believe 8 hours is for lower-cost systems with lower-capacity batteries, while premium systems target 9+ hours with better optimization and more battery capacity.
-You should realistically see gains under heavy load as well, because we're going from a 17W TDP chip + chipset to a 15W SoC. But probably not enough to matter for most.
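As a rough illustration of what the guideline jump implies, here is the implied average platform draw for the two minimums. The 45 Wh battery capacity is an assumed, typical ultrabook figure:

```python
# The Ultrabook guideline jump (5 h -> 8 h minimum) implies a large
# drop in average platform draw. Battery size is an assumption.

battery_wh = 45.0
ivb_min_h, hsw_min_h = 5.0, 8.0   # minimum battery life guidelines

ivb_avg_w = battery_wh / ivb_min_h   # = 9.0 W average platform draw
hsw_avg_w = battery_wh / hsw_min_h   # = 5.625 W average platform draw

print(f"Implied average draw: IVB {ivb_avg_w} W -> HSW {hsw_avg_w} W "
      f"({100 * (1 - hsw_avg_w / ivb_avg_w):.1f}% lower)")  # 37.5% lower
```

That ~37% reduction has to come from the whole platform, which is why the platform power management point above matters more than CPU TDP.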
__________________
Core i7 2600K + Turbo Boost | Intel DH67BL/GMA HD 3000 IGP | Corsair XMS3 2x2GB DDR3-1600 @ 1333 9-9-9-24 |
Intel X25-M G1 80GB + Seagate 160GB 7200RPM | OCZ Modstream 450W | Samsung Syncmaster 931c | Windows 7 Home Premium 64-bit | Microsoft Sidewinder Mouse | Viliv S5-Atom Z520 WinXP UMPC

Last edited by IntelUser2000; 02-11-2013 at 04:57 PM.
Old 02-11-2013, 05:01 PM   #120
Charles Kozierok
Elite Member
 
Join Date: May 2012
Posts: 6,762

I wish Intel or someone would put one tenth of the R&D dollars into new battery technologies that they put into lower-power chips. How long has it been since a new battery technology went mainstream? It seems like we've been using lithium-ion for at least a decade.
Old 02-11-2013, 05:06 PM   #121
IntelUser2000
Elite Member
 
Join Date: Oct 2003
Posts: 3,516

Quote:
Originally Posted by blackened23 View Post
That is why I believe intel will dominate the premium 500$+ tablet market once broadwell, possibly even haswell hits. Again -- if intel delivers on all fronts, ARM will only be competing for price; Intel will dominate the premium 500$+ tablet area if they deliver on battery life.
It may not be that simple. The biggest roadblock to their success now is Microsoft, with its Windows 8 OS.

I've bought my parents a tablet based on Windows 8, and while they are fortunate that they don't have to figure it out themselves, other people do. Various factors are putting pressure on PC manufacturers to go for lower quality (lower ASPs, for example) and to skimp on things they can't afford to sacrifice.

-Things like how the touchpad sucks on Windows devices, and Windows 8 makes it worse, with the default "gestures" interfering with regular touchpad operation.
-Modern UI is a useless version of the Desktop, with worse quality, cut features, and less compatibility. Case in point: Skype, where the Desktop version shows a 2-megapixel camera at 2-megapixel quality, but the Modern UI version shows blocky textures. "Modern" UI my ass! IE on the Modern UI is a Flash-disabled version, while the Desktop one is full-featured with home buttons, forward buttons, etc.

My parents find it complicated just to remember the three gestures needed to log into Windows 8. How will they find everything else?

Unfortunately, that's how I believe most people are. And even techies like us want an easier-to-use system, not a more complicated one.
Old 02-11-2013, 05:14 PM   #122
IntelUser2000
Elite Member
 
Join Date: Oct 2003
Posts: 3,516

Interesting thing: I overheard some co-workers talking about PC games. They say new games are "too complicated." Look at the games people play on their smartphones. They may be "3D," but the popular ones are side-scrollers! It's back to the future in real life!

Also, read about how StarCraft 2 isn't as popular in Korea as StarCraft was; some say it's because it's too complicated. StarCraft 2 has definite counters to units, so massing one unit to win isn't as viable as it was in the first StarCraft.

I'm gradually lowering my expectations of Windows devices being the key for the PC. Statistics show the decline looks just like the time when Windows took over other operating systems 20-30 years ago. That's not going to stop unless MS really pulls its head out of its ass. And that's very much relevant for Intel. I suggest they look at funding alternative operating systems even for their Core chips, like Android and Chrome OS. They may not be viable contenders now, but things are changing, very fast.

Simple wins.

Last edited by IntelUser2000; 02-11-2013 at 05:16 PM.
Old 02-11-2013, 05:59 PM   #123
Idontcare
Administrator
Elite Member
 
Join Date: Oct 1999
Location: 台北市
Posts: 20,492

Quote:
Originally Posted by Charles Kozierok View Post
I wish Intel or someone would put one tenth the R&D dollars into new battery technologies that they do into lower-powered chips. How long has it been since a new battery technology went mainstream? Seems like we've been using lithium ions for at least a decade.
Battery tech is limited by the periodic table, and the periodic table offers brutally few options.

There are few, truly few, options beyond lithium-ion for next-gen battery tech. Hence the discussion of fuel-cell alternatives.

Energy density is rather limited when it comes to chemical bonds. Now, if you go nuclear - fissile or fusion - you gain several orders of magnitude in energy density.

I'm hoping for a back-to-the-future moment with smartphones/tablets/laptops, in that we return to those solar-cell strips our desktop calculators had all the way back in the late '70s and early '80s. Now that innovation was forward-thinking.
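The "orders of magnitude" remark checks out against ballpark specific-energy figures. These are textbook-level approximations, not precise data:

```python
# Rough specific energy (MJ/kg) of a few energy stores, to show why
# chemical batteries are fundamentally limited. Values are ballpark.

MJ_PER_KG = {
    "li_ion_battery": 0.7,         # ~0.4-0.9 MJ/kg typical cells
    "gasoline": 46.0,              # chemical combustion
    "u235_fission": 80_000_000.0,  # complete fission, ~80 TJ/kg
}

base = MJ_PER_KG["li_ion_battery"]
for name, mj in MJ_PER_KG.items():
    print(f"{name:16s} {mj:>14.1f} MJ/kg  ({mj / base:,.0f}x Li-ion)")
```

Even gasoline beats Li-ion by nearly two orders of magnitude in specific energy; fission is about eight orders beyond the battery, which is the gap being alluded to here.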
Old 02-11-2013, 06:15 PM   #124
ShintaiDK
Lifer
 
Join Date: Apr 2012
Location: Copenhagen
Posts: 11,123

Quote:
Originally Posted by Idontcare View Post
Now if you go nuclear - fissile or fusion - you gain a few orders of magnitude in energy density.
That would make for some interesting news the next time a battery blows up, as they occasionally do...
__________________
Quote:
Originally Posted by Idontcare
Competition is good at driving the pace of innovation, but it is an inefficient mechanism (R&D expenditures summed across a given industry) for generating the innovation.
Old 02-11-2013, 08:13 PM   #125
Charles Kozierok
Elite Member
 
Join Date: May 2012
Posts: 6,762

Well, there are people working on potential breakthroughs. But it's taking a long time for them to get here.