The Intel Atom Thread


Khato

Golden Member
Jul 15, 2001
1,257
327
136
he stated that...

I know. He said that I "didn't look very far" and then provided platform power consumption figures to try and refute my statement that the Baytrail SoC never used more than 2.5W in any of the published previews that I'm aware of.
 

Abwx

Lifer
Apr 2, 2011
11,855
4,832
136
I know. He said that I "didn't look very far" and then provided platform power consumption figures to try and refute my statement that the Baytrail SoC never used more than 2.5W in any of the published previews that I'm aware of.

Do you realize that all those "previews" were made under Intel's control for all tests, that the consumption numbers quoted were provided by Intel's hardware, and that no one did a measurement himself?

During part of our briefing we were taken to a room where a Clover Trail and Bay Trail system were working side by side and connected up to advanced power monitoring hardware and software.

Anyway, if the system is using 2.9 W just browsing, it's impossible that it will stay at 3 W when fully loaded; that's just being either gullible or clueless about basic electrical laws.
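The arithmetic behind that point can be sketched out; only the 2.9 W browsing figure comes from the previews, while the extra draw at full load is a hypothetical number for illustration:

```python
# Platform power while browsing, as quoted from the previews.
browsing_platform_w = 2.9
# Assumed additional CPU/GPU draw at full load (hypothetical figure).
extra_load_draw_w = 1.5

# Any positive extra draw under load pushes the full-load total past 3 W.
full_load_estimate_w = browsing_platform_w + extra_load_draw_w
print(f"estimated full-load platform power: {full_load_estimate_w:.1f} W")
```

Whatever value one assumes for the load delta, staying at 3 W would require it to be nearly zero, which is the point being made.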
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
There is no practical difference between TDP and SDP. SDP measures the typical power consumption during average application use. TDP measures the cooling capacity required during average application use.
AMD A4-5000: 17 W TDP
Intel i3-3217U: 17 W SDP
[image: power-gaming.png]


Who is more accurate?

Have fun explaining that one. You act like TDP and SDP are completely different things. They aren't.
Clearly the same...

Every silicon manufacturer has their own, different, definitions for TDP and they are not directly comparable.
So now TDP and SDP are not the same thing? Good, we have that cleared up already.

SDP measures platform power.
Power? What do you mean, GFLOPS? Because it's clearly not power consumption (see the graph above).

For all we know, AMD is (and likely are) lying through their teeth. This is why their power consumption is always worse than other competing solutions.
...
AMD can be and is dishonest about it - why else would their load power be so terrible?
WTF?
 

mikk

Diamond Member
May 15, 2012
4,292
2,382
136
Do you realize that all those "previews" were made under Intel's control for all tests, that the consumption numbers quoted were provided by Intel's hardware, and that no one did a measurement himself?


Anand did it himself.
 

liahos1

Senior member
Aug 28, 2013
573
45
91
He did not as far as I am aware.

Read the article:

"Intel left me to install and run anything I wanted to during a period of a few hours at their campus in Santa Clara."

Listen to the podcast also. They were given free rein to do whatever they wanted: install the apps, run the benchmarks they wanted.

Consumption numbers were provided by Intel hardware on site, but they weren't "given numbers" by Intel without seeing the draw themselves. This is getting ridiculous.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Read the article:

"Intel left me to install and run anything I wanted to during a period of a few hours at their campus in Santa Clara."

Listen to the podcast also. They were given free rein to do whatever they wanted: install the apps, run the benchmarks they wanted.

He didn't do any measurements is what I meant.
 

Khato

Golden Member
Jul 15, 2001
1,257
327
136
Do you realize that all those "previews" were made under Intel's control for all tests, that the consumption numbers quoted were provided by Intel's hardware, and that no one did a measurement himself?

Yes, and I can only wish that other companies would step up and provide the same level of support to reviewers as Intel... but they won't, because it would show that they're losing.

Without Intel going through the effort we'd only have as much information as we did back when Qualcomm did their preview of Snapdragon 800. Which is to say that we'd see some awesome benchmarks and have no idea about how much power it actually required to achieve that performance. At least not until actual products came out, which they now have, and surprise surprise the performance doesn't measure up to Qualcomm's preview.

I'd be all for review sites purchasing the equipment necessary to do low level power measurements, or at the very least do one battery life run while running a benchmark loop so that we could have an idea how much power SoCs actually draw under load. The closest information to that currently available comes from notebookcheck with their idle and load power consumption numbers, but it's not clear exactly where they're measured so it's difficult to say how accurate they are. Much less what percentage of load power consumption is due to the SoC.

Edit: Just for fun, assuming that the notebookcheck numbers are gathered roughly the same way as in their traditional notebook reviews, the Toshiba Tegra 4 tablet - http://www.notebookcheck.net/Review-Toshiba-Excite-Pro-AT10LE-A-108-Tablet.98828.0.html - goes from 6.1 W idle with maximum brightness/everything enabled to 13.2 W with the same settings under maximum load. So it's a pretty safe bet that the SoC is using over 5 W under full load. (Sadly notebookcheck doesn't have any Snapdragon 800 reviews yet.)
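That back-of-the-envelope estimate can be written out; the 70% SoC share of the idle-to-load delta is an assumption (leaving margin for PSU/VRM losses and memory), not a measured figure:

```python
# notebookcheck platform numbers quoted above (max brightness, all radios on).
idle_platform_w = 6.1
load_platform_w = 13.2

# The idle-to-load swing; assume a conservative 70% of it is the SoC itself,
# with the rest covering conversion losses, memory, and other rails (guess).
delta_w = load_platform_w - idle_platform_w
soc_floor_w = delta_w * 0.7
print(f"load delta: {delta_w:.1f} W, SoC estimate: > {soc_floor_w:.1f} W")
```

Even with that generous margin the estimate lands above 5 W, which is consistent with the "over 5W under full load" figure in the post.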
 

Abwx

Lifer
Apr 2, 2011
11,855
4,832
136
Anand did it himself.

Are you Anand?


Read the article:

"Intel left me to install and run anything I wanted to during a period of a few hours at their campus in Santa Clara."

Listen to the podcast also. They were given free rein to do whatever they wanted: install the apps, run the benchmarks they wanted.

Consumption numbers were provided by Intel hardware on site, but they weren't "given numbers" by Intel without seeing the draw themselves. This is getting ridiculous.

He didn't measure power consumption; he read it off Intel's monitoring hardware like everybody else.

This was a meticulously controlled review when it comes to TDP, the single most important parameter in mobile. How surprising...

Repost, from the link provided by AtenRa:

During part of our briefing we were taken to a room where a Clover Trail and Bay Trail system were working side by side and connected up to advanced power monitoring hardware and software.
 

Nothingness

Diamond Member
Jul 3, 2013
3,294
2,362
136
Anand did himself.
And the hardware setup and associated software were done by? Yes, by Intel.

I don't mean Intel cheated, but what I want to know isn't instantaneous power consumption; I want battery life under various workloads, and on an end-user device.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
GLBench: http://www.heise.de/newsticker/meldung/IDF-Intels-Tablet-Prozessor-Baytrail-startet-1953269.html

Atom Z3770 100
A6-1450 106
A4-5000 146

A4-1200 would be way slower here.

And even if it had a slight GPU advantage, the Z3770 would still beat it in actual games thanks to its huge CPU performance advantage (>3x in MT!), likely using less power too (especially platform power). At the end of the day both would suck at demanding Windows games (console ports), but one would have >3x higher CPU performance for other tasks. Easy choice.
 

liahos1

Senior member
Aug 28, 2013
573
45
91
Are you Anand?



He didn't measure power consumption; he read it off Intel's monitoring hardware like everybody else.

This was a meticulously controlled review when it comes to TDP, the single most important parameter in mobile. How surprising...

Repost, from the link provided by AtenRa:

I mean they were running any bench they wanted on those devices while measuring draw. Are you implying that the devices were somehow recalibrated to give better results? Your cynicism level is high. I guess we'll see when the product comes out next month. Again, I would point you to the Anand videocast; he makes a few comments describing how much more transparent this process is vs. anything from Qualcomm and Nvidia.
 

Abwx

Lifer
Apr 2, 2011
11,855
4,832
136
I mean they were running any bench they wanted on those devices while measuring draw. Are you implying that the devices were somehow recalibrated to give better results? Your cynicism level is high. I guess we'll see when the product comes out next month. Again, I would point you to the Anand videocast; he makes a few comments describing how much more transparent this process is vs. anything from Qualcomm and Nvidia.

As I already wrote, they could bench everything, but power consumption was benched exclusively by Intel.

Be sure that if a reviewer had brought a full bag of measurement instruments, he wouldn't have been allowed to plug them in, even if it were only between the mains and the device's PSU.

As for cynicism, well, ask the firm that faked real-time gameplay with a video when publicly demonstrating their HD4000 at the same event. They know better...
 

liahos1

Senior member
Aug 28, 2013
573
45
91
As I already wrote, they could bench everything, but power consumption was benched exclusively by Intel.

Be sure that if a reviewer had brought a full bag of measurement instruments, he wouldn't have been allowed to plug them in, even if it were only between the mains and the device's PSU.

Let me clarify: he was running any bench he wanted on the tablet while it was being measured, in real time. These results weren't just "given to him" in an email or something. In the videocast Anand mentioned how much more transparent this process is vs. other reference platforms he has tested. Do you hold Qualcomm and Nvidia to the same standard? Because by your standards they would be even more nefarious. Are you implying Intel is lying because their transistors and tech are structurally inferior? I don't get it.
 

Abwx

Lifer
Apr 2, 2011
11,855
4,832
136
Let me clarify: he was running any bench he wanted on the tablet while it was being measured, in real time. These results weren't just "given to him" in an email or something. In the videocast Anand mentioned how much more transparent this process is vs. other reference platforms he has tested. Do you hold Qualcomm and Nvidia to the same standard? Because by your standards they would be even more nefarious. Are you implying Intel is lying because their transistors and tech are structurally inferior? I don't get it.

Read my edit; Intel already tried to fool people, and it was only two years ago. Quite a standard, isn't it?

No better than Nvidia or Qualcomm; actually, perhaps even worse sometimes. As said, ask Mooly Eden...
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Let me clarify: he was running any bench he wanted on the tablet while it was being measured, in real time. These results weren't just "given to him" in an email or something. In the videocast Anand mentioned how much more transparent this process is vs. other reference platforms he has tested. Do you hold Qualcomm and Nvidia to the same standard? Because by your standards they would be even more nefarious. Are you implying Intel is lying because their transistors and tech are structurally inferior? I don't get it.
I don't think that is what he is getting at... maybe the tested components were chosen and optimized for low power consumption, far more than an average system would be... or at least that is what I think he is referring to.
 

Skolomerija

Junior Member
Sep 6, 2010
9
0
0
Again, power consumption is not the TDP.
We also don't have TDP numbers for the Z3770, and we certainly don't have power consumption numbers for either the Z3770 or the A4-1200/1250.

We really have to see those numbers first, and then we can conclude.

September 11, 2013 | 12:15 PM - Posted by brians (not verified)


What is the clock speed of the z3770 you are testing?
How come Intel will not talk about TDP... is it that bad?
-------
September 11, 2013 | 12:25 PM - Posted by Ryan Shrout


It's not great. The Z3770 bursts up to 2.4 GHz with a base clock of 1.8 GHz.
http://www.pcper.com/reviews/Proces...rail-and-Silvermont-Arrive/Bay-Trail-Power-Co
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I am sorry for the tone. But this thread has gone like this ----> Bay Trail T benchmarks revealed. Response? Comparing it to AMD laptop and convertible chips (wrong comparison). And then latching onto SDP. Come on. These aren't valid counter-arguments.

AMD has some nice stuff, but the counter-arguments against Bay Trail in this thread have been largely ridiculous. Yes, Bay Trail T has shortcomings: the graphics need to get better, and it lacks integrated LTE (Intel will have this next year). I agree with all of that. But to say it isn't quite a feat - I don't agree with that. Intel overnight has trounced the CPU performance of every ARM SoC in existence, so I find that highly impressive even if the graphics could use a little work.

Anyway, again, apologies for the tone.

Apology accepted, no problem. ;)
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
You're the one who is getting massively overexcited and talking about usage patterns of Atom in laptops and desktops, and you know very well I was talking about that. I have built loads of SFF systems, even down to those using pico-PSUs, etc.

Unfortunately for you, you will find desktop and laptop parts consume more power, as they have far more functionality enabled and are worse binned. They also use cheaper components to hit the lower price points, meaning greater power consumption. That has been shown for the last decade with such parts.

Moreover, for desktop usage, SATA and PCI-E are far more important, especially SATA, unless you think eMMC will be fine for your desktop or laptop. I thought not.

You do not seem to understand that power consumption is a function of the entire platform, not just one part, i.e. the CPU.

Moreover, you seem to have no clue about the desktop SFF systems I am talking about, like the Revo, etc., which have Atom, Brazos and Celeron offerings and are found worldwide at the low end NOW. These offerings will have Bay Trail and Jaguar based replacements. The Bay Trail low-end desktop and laptop offerings are now being marketed as Celeron and Pentium derivatives, which very obviously means Intel is making sure they plonk in a cheaper-to-make Bay Trail offering over what they have now.

Maybe in the US you have MicroCenters selling Core i7 desktops for $100, but guess what? In the scheme of things the US is not the world.

Talking about a Core i7 is of no relevance, because virtually every other AMD and Intel laptop chip is under a 45 W TDP, and more importantly the pricing of such chips is far higher than either Bay Trail or Jaguar. They are not SoCs, meaning they require more complex motherboards and more overall cooling, which adds cost.

Not what I was saying. I'm simply saying that in a desktop form factor (where even the smallest case has more volume than a typical 15.6" notebook), cooling and therefore power is not so much an issue. If power were truly so important to the consumer, then AMD would be completely busted on the desktop. In reality, lowering power is nice, but not if it introduces more cost or is harder to do. No one is going to care if their SFF PC uses 25 watts or 45 watts, because it can easily be cooled and hence is not a problem. Performance and price then take the front seat.

Sure, I understand that power consumption matters, but if an A10-6700 works fine in a SFF system, then cost and performance matter the most. Against even Celeron/Pentium chips, Bay Trail and Kabini lose out massively in single-thread performance, and Ivy and Haswell Pentium GT1 are going to equal or beat Kabini, let alone Bay Trail. For desktop use I can't see Bay Trail/Kabini as anything more than a poor man's alternative (given how cheap Pentium chips are), unless you are talking about a form factor the size of a Kleenex box, which is a definite niche market.

For a cheap office/basic-use machine, PCIe isn't important (especially as Bay Trail/Kabini will CPU-bottleneck pretty much any dGPU, and this isn't the target market for PCIe-based storage or expandability). SATA is important.

And I'm not in the states.

That i7 reference was in the context of the cooling power a 45-watt chip needs, and had nothing to do with it being an i7 chip.