
The Intel Atom Thread

Given that Windows 8.1 tablets are around $300, I have to question why Windows RT still exists, and why MS is releasing a Surface RT 2. That's just going to be a complete disaster. Nobody wanted to buy RT last year, and now they have even LESS reason to do so since Windows 8.1 tablets are the same price.

Well, it's a bad product for sure.
But MS wants the Intel tax, and Intel wants the MS tax.
Consumers don't want to pay for either tax.
Neither MS nor Intel seems to understand that.
Their business model is getting old.
They can continue in the markets where they have a monopoly. In the Android and ARM world it's not going to work.
 
My thoughts too. Bay Trail will bring a wave of very interesting ~$250-350 devices. Just look at this ASUS, an inexpensive x86 convertible with Windows 8.1 + Office and a quad-core processor that puts Clover Trail to shame. ASUS describes Bay Trail-T as the perfect balance between performance and energy efficiency. 🙂

@ podspi: $349 with the dock.

Microsoft better price the next Surface RT at $399 or lower.

Intel is practically giving BT away for free, as they did with CT. That will drive sales until they get tired of doing so. But will it make Windows take off? I doubt it. But at least Intel is going 100% for it now.
 
The price is good, but active cooling on a 7.5W TDP? Seriously? A decent heatsink design (perhaps integrated into the case) could easily handle that passively.
Fact is, TDP doesn't represent maximum power usage; the chip will go much higher for a few seconds. My non-OC 4770K, with HT and the iGPU disabled, reaches more than 140W during heavy computation (according to Intel's sensors), and that chip's TDP is supposed to be 84W. Of course BT is a different beast, but I nonetheless expect the TDP to sometimes be exceeded.
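A note on why both readings can be true: Intel-style limits (RAPL PL1) constrain a moving average of power over a time window, not the instantaneous draw, so short spikes well above TDP are expected. A toy sketch of the idea; all sample numbers here are made up:

```python
# TDP-style limits bound average power over a time window, not peaks.
# Toy exponentially weighted moving average over a made-up power trace.

def ewma(samples_w, alpha=0.1):
    """Exponentially weighted moving average of a power trace (1 sample/s)."""
    acc = samples_w[0]
    out = []
    for s in samples_w:
        acc = alpha * s + (1 - alpha) * acc
        out.append(acc)
    return out

# 3-second spike to 140 W in the middle of a sustained 60 W load:
trace = [60.0] * 10 + [140.0] * 3 + [60.0] * 17
avg = ewma(trace)
# The peak sample far exceeds an 84 W TDP, yet the windowed average never does.
print(f"peak sample: {max(trace):.0f} W, peak average: {max(avg):.1f} W")
```

So a 140 W reading on an 84 W part isn't necessarily a sensor error or a violated spec, as long as the average settles back under the limit.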
 
No argument that Bay Trail looks much better than Clover Trail - I buy my own Christmas presents these days, and this year I'm thinking I need a new Bay Trail tablet. That 8" Dell looks interesting to me as well.

But I also currently have a Clover Trail HP ElitePad - it does lag occasionally when opening an x86 program, but all in all I'm not really disappointed in it. I prefer it to my Android tablets and my lady's iPad.

CT on Windows is not a bad CPU outside of gaming. BT will certainly help with the experience, but I simply don't see the difference.

For me, the difference in CPU power between the S3 and S4 was zero. Same experience. I couldn't care less about performance. Hell, even the camera is more or less the same. What improved was a stellar screen in a light, small body.
 
It's outperforming the A4-5000 in multi-threaded tasks while using under 2.5W.

Yep, it outperforms it in single-threaded scenarios, but the GPU is where it's at for the A4-5000, not to mention the rest of the platform I/O.
 

Overall it also outperforms it in multi-threaded. And that's with the A4-5000 being a 15W part.

The only advantage left is the GPU again. And that is drifting away fast.
 
It's outperforming the A4-5000 in multi-threaded tasks while using under 2.5W.

The 2.5W is an illusion...
The DT variants are spec'd at 7W despite frequencies that are no higher, and it's not the PCIe and SATA being fused off, or the single memory channel, that will make a big difference.
 
No way. On my laptop with a 1080p 15.6" panel, platform power goes from 16 watts at lowest brightness to 20.5 watts at highest.

http://www.notebookcheck.net/Lenovo-IdeaPad-Y580-20994BU-Laptop-Review.78974.0.html

As per the review here, the max brightness of my screen is 230-240 nits.

Notebookcheck measures power as follows for notebooks. I'm assuming tablets are similar.


Idle (measurements under the Windows 8 desktop):
- energy-saving profile, minimum brightness, Wi-Fi off
- balanced profile, maximum brightness, Wi-Fi off
- maximum performance profile, maximum brightness, Wi-Fi on

The iPad 4 is 2.6 / 8 / 8.4 watts, respectively. Max brightness for the iPad 4 is 290-300 nits.

The Toshiba Excite Pro is 2.5 / 4.7 / 6.1 watts (310 nits).

The Nexus 10 is 3.7 / 8.4 / 9 watts (390 nits).

This thing at 200 nits is using more than 1W.

The number I provided, about 1W, is the one given by Intel in their Haswell mobile platform power consumption evaluation for ultrabooks. I posted a slide in a thread here about Haswell, but back then no one seemed to point out that the panel was draining too little power...

Funny, isn't it, that panels are now supposed to conveniently consume more in another Intel power consumption estimate...

Edit: actually it's 2 to 2.3W at 150 nits, but that's for an ultrabook screen, which has significantly more area than a 10"; scaled down, it should be close to 1W...

haswell-18b.jpg
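For what it's worth, that area-scaling arithmetic can be checked with a quick sketch. It assumes backlight power scales roughly with panel area at equal nits, and the 13.3" 16:9 and 10.1" 16:10 sizes are my own assumptions for "ultrabook" and "tablet":

```python
import math

def panel_area_sq_in(diagonal_in, aspect_w, aspect_h):
    """Panel area from its diagonal length and aspect ratio."""
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return (aspect_w * scale) * (aspect_h * scale)

ultrabook = panel_area_sq_in(13.3, 16, 9)    # assumed ultrabook panel
tablet = panel_area_sq_in(10.1, 16, 10)      # assumed 10.1" tablet panel

for watts in (2.0, 2.3):                     # the 2-2.3 W @ 150 nits figure
    scaled = watts * tablet / ultrabook
    print(f'{watts} W ultrabook panel -> {scaled:.2f} W at 10.1"')
```

Under those assumptions the scaled figure comes out around 1.2-1.4 W rather than 1 W, i.e. between the two estimates being argued over.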
 
The price is good, but active cooling on a 7.5W TDP? Seriously? A decent heatsink design (perhaps integrated into the case) could easily handle that passively.

It could be that the desktop parts just consume more power and require more cooling. However, after looking at Celeron 887-based nettops, once taxes and other bits are added (including Windows), where I live the cost is not that different from the NUC. This is a bit of a shame, as Bay Trail is an SoC with a much smaller die, fewer supporting chips, a less complex PCB, and lower cooling requirements than the Celeron 887, so I expect it is more a case of Intel wanting to reduce costs at this price point to increase margins.


Overall it also outperforms it in multi-threaded. And that's with the A4-5000 being a 15W part.

The only advantage left is the GPU again. And that is drifting away fast.

It is not really drifting anywhere, TBH, as the A4-5000 has a GCN-based IGP, not the ancient VLIW-based ones found in older AMD parts. GCN was a sea change for AMD, especially in compute performance.
 
The number I provided, about 1W, is the one given by Intel in their Haswell mobile platform power consumption evaluation for ultrabooks. I posted a slide in a thread here about Haswell, but back then no one seemed to point out that the panel was draining too little power...

Funny, isn't it, that panels are now supposed to conveniently consume more in another Intel power consumption estimate...

Edit: actually it's 2 to 2.3W at 150 nits, but that's for an ultrabook screen, which has significantly more area than a 10"; scaled down, it should be close to 1W...

haswell-18b.jpg

A 10.1" 1600p screen likely consumes a similar amount of energy (maybe a bit less, but more than half) as a 13" 768p screen.

I'm not sure why the display would consume more energy under office work than while watching video.

At 150 nits I'd expect a 10" 1600p screen to consume around 1.2-1.6 watts, depending on the screen.

(My laptop's 15.6" 1080p TN screen consumes about 6-7 W between off and 240 nits.) Power measured at the charger, as the delta between screen off and screen at max brightness.
 
It is not really drifting anywhere, TBH, as the A4-5000 has a GCN-based IGP, not the ancient VLIW-based ones found in older AMD parts. GCN was a sea change for AMD, especially in compute performance.

Compute performance hasn't (essentially) sold anything yet. The vast majority doesn't care, and the vast majority of software doesn't support it. It's also yet another area where AMD is losing its advantage fast.

And AMD is not expanding or holding its IGP lead; it's diminishing. AMD simply flopped in its attempt to utilize the strengths it bought from ATI. They even failed at executing the integration. But then again, it can't be fun having to give away IGPs for free at the expense of the dGPUs they could get money from. In short: internal conflicts, both of interest and of economics.
 
It is not really drifting anywhere, TBH, as the A4-5000 has a GCN-based IGP, not the ancient VLIW-based ones found in older AMD parts. GCN was a sea change for AMD, especially in compute performance.

It's drifting away. Look at the power numbers and performance. In games the difference between the two will shrink even further thanks to Bay Trail's dual-channel RAM and slightly better CPU performance (Kabini is held back by its CPU a lot).

I don't expect a successor to Jaguar for more than a year. Given the time it takes AMD to integrate its GPU IP into its APU designs, we are not going to see anything better than GCN in its APUs for some time.

And compute sucks on Kabini due to power constraints, and is of limited importance in that device class.
 
It's drifting away. Look at the power numbers and performance. In games the difference between the two will shrink even further thanks to Bay Trail's dual-channel RAM and slightly better CPU performance (Kabini is held back by its CPU a lot).

BTW, Enigmoid, the NUC (designed by Intel) I linked to uses a single-channel memory controller and also has active cooling (you said neither would be the case in this class). It would make sense due to a lower BOM and a cheaper, lower-binned part, which is what many companies are interested in anyway - witness the number of cheaper laptops and desktops that come with only a single DIMM to save on costs. Quite common in this part of the market, really. Moreover, Intel has other low-power parts it wants to sell, so good old product tiers also factor into the equation.

I don't expect a successor to Jaguar for more than a year. Given the time it takes AMD to integrate its GPU IP into its APU designs, we are not going to see anything better than GCN in its APUs for some time.

And compute sucks on Kabini due to power constraints, and is of limited importance in that device class.

Compute performance hasn't (essentially) sold anything yet. The vast majority doesn't care, and the vast majority of software doesn't support it.

And AMD is not expanding or holding its IGP lead; it's diminishing. AMD simply flopped in its attempt to utilize the strengths it bought from ATI. They even failed at executing the integration.

 
The 2.5W is an illusion...
The DT variants are spec'd at 7W despite frequencies that are no higher, and it's not the PCIe and SATA being fused off, or the single memory channel, that will make a big difference.

Notebook variants with a 2GHz base clock (not 1.46GHz), a higher-clocked GPU, and PCIe + SATA top out at 7.5W TDP. I'll give you an illusion: the illusion that any AMD APU right now could deliver comparable CPU performance (vs. the Z3770) at 1-2.5W under CPU load. 🙂
 
For Bay Trail-T it's reality, probably a hard reality for AMD fans.


The DT version at 1.46/2.4GHz is 10W TDP; only the gullible can think that stripping a few functions will drop that power to 2.5W with all four cores running at 2.4GHz.

Actually, Intel didn't provide any TDP numbers. The chip being demonstrated used far more than 2.5W, since it ran at 2.4GHz even in MT; Intel is just trying to create the impression that it's 2.5W, using clueless reporters as a mouthpiece, because what they want is the perception that it's 2.5W even if it's actually three times that.

Once real products hit the shelves we'll know better, so be prepared to bury your 2.5W myth.
 
The DT version at 1.46/2.4GHz is 10W TDP; only the gullible can think that stripping a few functions will drop that power to 2.5W with all four cores running at 2.4GHz.

Actually, Intel didn't provide any TDP numbers. The chip being demonstrated used far more than 2.5W, since it ran at 2.4GHz even in MT; Intel is just trying to create the impression that it's 2.5W, using clueless reporters as a mouthpiece, because what they want is the perception that it's 2.5W even if it's actually three times that.

Once real products hit the shelves we'll know better, so be prepared to bury your 2.5W myth.

10W including the GPU, which is clocked significantly higher in the desktop part than in the tablet.
 
10W including the GPU, which is clocked significantly higher in the desktop part than in the tablet.

The platform demonstrated by Intel was running at 2.4GHz in both the ST and MT tests, which do not use the GPU, while the BT CPU can use the available power budget when the GPU is idling...
 
The DT version at 1.46/2.4GHz is 10W TDP; only the gullible can think that stripping a few functions will drop that power to 2.5W with all four cores running at 2.4GHz.

Actually, Intel didn't provide any TDP numbers. The chip being demonstrated used far more than 2.5W, since it ran at 2.4GHz even in MT; Intel is just trying to create the impression that it's 2.5W, using clueless reporters as a mouthpiece, because what they want is the perception that it's 2.5W even if it's actually three times that.

Once real products hit the shelves we'll know better, so be prepared to bury your 2.5W myth.

What?

Every review shows SoC power consumption under general tasks comparable to Clover Trail (which used middling-to-low amounts of power compared to the ARM competition). I haven't seen SoC power consumption over 2.5W anywhere. Intel probably measured power consumption similarly to the x86 Clover Trail articles.

Techreport
What we saw was very similar power consumption from one generation to the next. Both tablets tended to idle at about 2 W of power draw, and both used 3-4 W during video playback. Total system power draw is probably a bit higher during CPU- and GPU-intensive workloads, but we didn't get any full-platform power use numbers for such scenarios.

We also got a look at the individual power consumption of our tablets' graphics and CPU components in two scenarios: gaming and SunSpider testing.

While gaming, the Clover Trail system's graphics drew about 650 mW, and the CPU drew 700 mW. The Bay Trail system's total power use wasn't far from Clover Trail's, but the mix was very different, with 1.2W going to the IGP and 100-150 mW heading to the CPU. To be fair, though, the Bay Trail IGP was driving a much higher-resolution display.

In SunSpider, the CPU/GPU split on Clover Trail was 900/350 mW, while Bay Trail's was 1000/475 mW—again, comparable total power use. Of course, Bay Trail finished the SunSpider test in half the time and then dropped back to idle, so it was easily the more power-efficient solution overall.

I think the big takeaway here is that Bay Trail's power consumption habits should make it suitable for eight-hour-plus battery run times in tablets, much like Clover Trail before it. The big change is that you'll be getting substantially higher performance at the same time.

The 2W idle and 3-4W during playback are platform figures, which are very good with a 1600p screen (perhaps too good?) on Bay Trail.

We clearly have gaming power numbers, and it seems that Bay Trail under gaming loads has tons of CPU power to spare, allowing turbo to be deactivated (similar to how my laptop deactivates turbo when gaming, as a 2.4GHz Ivy Bridge quad cannot bottleneck a 660M in almost any game at 1080p). I'm not sure about max loads, but I doubt we will see more than 2.5W (Cinebench) + 1.2W (IGP gaming load) = sub-4W power use. Bay Trail under heavy loads may use a bit more power, but under light/idle loads it should be fine.

http://www.anandtech.com/show/6529/busting-the-x86-power-myth-indepth-clover-trail-power-analysis/5

When you read this article you realize how impressive Bay Trail is: similar amounts of power at more than twice the speed.

I also don't see any tests where an inability to sustain turbo becomes readily apparent. In every 3DMark test Intel's score is dragged upwards by its CPU.
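The SunSpider point in the Techreport quote above is a race-to-idle argument: energy is power times time, so the faster chip can win on energy despite a slightly higher draw. A quick sketch using the article's CPU/GPU splits; the run time and the near-idle SoC power after finishing are my own assumptions:

```python
def energy_j(power_w, seconds):
    """Energy in joules for a constant draw over an interval."""
    return power_w * seconds

T = 10.0  # assumed Clover Trail SunSpider run time, seconds

# Clover Trail: 900 mW CPU + 350 mW GPU for the whole run.
clover = energy_j(0.9 + 0.35, T)

# Bay Trail: 1000 mW CPU + 475 mW GPU for half the time, then an assumed
# 0.5 W near-idle SoC draw for the remainder of the interval.
bay = energy_j(1.0 + 0.475, T / 2) + energy_j(0.5, T / 2)

print(f"Clover Trail: {clover:.2f} J, Bay Trail: {bay:.2f} J")
```

About 12.5 J vs 9.9 J over the same interval, so "comparable power" plus half the run time nets out to lower energy.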
 
What?

Every review shows SoC power consumption under general tasks comparable to Clover Trail (which used middling-to-low amounts of power compared to the ARM competition). I haven't seen SoC power consumption over 2.5W anywhere. Intel probably measured power consumption similarly to the x86 Clover Trail articles.


The power figures were provided by Intel's hardware; actually, all the reviewers tested these devices in the same place, at the Intel forum...
 
The power figures were provided by Intel's hardware; actually, all the reviewers tested these devices in the same place, at the Intel forum...
Marketing exaggeration and trying to confuse people with different ways of representing data (SDP) is one thing. What you're insinuating is in another league and would seriously hurt Intel's credibility.

I think the much more likely case is that Intel is simply really proud of Bay Trail and wants to show it off as much as possible.
 
The power figures were provided by Intel's hardware; actually, all the reviewers tested these devices in the same place, at the Intel forum...

Yes. And almost without exception, that kind of behavior would have gotten the exact opposite reaction if it were AMD, as we've seen in many other threads. That's viral marketing for ya.

Now you've likely got the lawyers coming out to publicly defend these practices as well. 😛
 
I'm not sure why the display would consume more energy under office work than while watching video.

Dynamic backlight control. It has traditionally been used to boost the effective contrast ratio, which is why it comes into play slightly on video content, where dark scenes occur occasionally, but not in office applications, where there's almost always a white background. Note that the Bay Trail platform actually turns this into a much greater power-saving effect through DPST, which essentially reduces the backlight so that the brightest point on screen is at full output, then adjusts the remaining pixels to compensate.
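A minimal sketch of that idea (my toy per-frame model, not Intel's actual algorithm, which works on histograms in hardware): dim the backlight so the brightest pixel just reaches full scale, then boost the pixel values to compensate, leaving perceived brightness roughly unchanged while backlight power drops.

```python
def dpst_adjust(frame, max_level=255):
    """Return (backlight_fraction, compensated_frame) for a grayscale frame."""
    peak = max(max(row) for row in frame)
    if peak == 0:
        return 0.0, frame          # all-black frame: backlight can be off
    backlight = peak / max_level   # dim backlight to the brightest pixel
    gain = max_level / peak        # boost pixel values to compensate
    comp = [[min(max_level, round(p * gain)) for p in row] for row in frame]
    return backlight, comp

# Darkish video frame: peak pixel is 120/255, so backlight drops to ~47%.
bl, comp = dpst_adjust([[40, 80], [120, 100]])
print(f"backlight: {bl:.0%}")

# Office-style frame with white pixels: no headroom, so no saving.
bl_white, _ = dpst_adjust([[255, 30], [10, 255]])
print(f"backlight: {bl_white:.0%}")
```

This is why the technique helps on video (dark frames leave headroom) but does almost nothing on a white office background.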
 