"Law of physics"?
Hm. 3800 points against 3200 points at 5W: better performance with 1/3 of the power. 😵
3x the perf/watt is impossible, especially with both chips using TSMC's 28nm process. Think a little...
"Law of physics"?
Hm. 3800 points against 3200 points at 5W: better performance with 1/3 of the power. 😵
How did you calculate Tegra 4's TDP? AFAIK ARM SoCs don't reveal these numbers; Nvidia might be an exception.
But you believe the 15W number from AMD?! :sneaky: Also, 5W is not under load. In that SDP analysis on AT I saw a fully loaded Exynos 5 pulling well over 8W (it was dual-core, I believe), so 5W is not a number you should be waving around! Better do your homework next time 🙄
3x the perf/watt is impossible, especially with both chips using TSMC's 28nm process. Think a little...
Such claims should be treated with a great deal of caution.
It is possible: two different architectures with different characteristics.
It seems you are referring to the figure that appears at the top right. That is a Net Applications estimate. Further down the same page you can find a table with different estimates based on web counters. Contrast the 1% with the 2.73% given by W3Counter, which is only 0.5% behind Windows 8's share in the same row. Now look at the text before the table (bold mine):
Now take a look at
http://en.wikipedia.org/wiki/Linux_adoption
for other share estimates. They range from a high of 93.8% for supercomputers down to a low of 8% on desktops:
Check also
http://en.wikipedia.org/wiki/List_of_Linux_adopters
Nobody is saying that all Windows benchmarks are biased, only that some are; the names of some of those benchmarks were given.
Finally, look at this AMD announcement:
http://www.pcworld.com/article/2040...exclusivity-adopts-android-and-chrome-os.html
I think the era of measuring performance using only Windows-based benchmarks is gone.
Umm, no I don't, but hey, you're the one claiming 3x performance/watt, not me, so I think it's up to you to prove that Tegra 4 uses 5W vs. AMD's A4 at 15W running Geekbench.
Nvidia published the number a few months ago.
But you believe the 15W number from AMD?! :sneaky:
It is possible: two different architectures with different characteristics.
Intel isn't any better in this regard D:
Or:
Jaguar is not competitive with ARM.
The 8% figure you are fantasizing about would be pretty nice if it were true, since it's a great OS, but unfortunately it isn't true.
In estimating true worldwide desktop adoption and accounting for the Windows-distorted environment in the USA and Canada he indicated that at least 8% of the world's desktops run Linux distributions and possibly as high as 10–12%, and that the numbers are rising quickly.
So what is Linux real market share on the desktop? The best estimate for present sales is around 8%
Intel isn't any better in this regard D:
You are right. And thanks to that we see how bad AMD's performance level really is. Even Nvidia is beating AMD's x86 SoCs with 1/3 of the power consumption:
Tegra 4 - 3800 points in Geekbench 2
A4-5000 - 3200 points in Geekbench 2
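Taking the quoted figures at face value (Nvidia's ~5W for Tegra 4 and AMD's 15W TDP for the A4-5000, both of which are disputed later in the thread), the perf/watt arithmetic behind the claim can be sketched in a few lines of Python:

```python
# Geekbench 2 scores and the TDP figures quoted in the thread.
# Note: both TDP numbers are contested later in this thread;
# this is arithmetic on the claimed figures, not measured power.
tegra4_score, tegra4_tdp = 3800, 5      # Nvidia's claimed ~5W
a4_5000_score, a4_5000_tdp = 3200, 15   # AMD's official 15W TDP

tegra4_ppw = tegra4_score / tegra4_tdp      # 760 points per watt
a4_5000_ppw = a4_5000_score / a4_5000_tdp   # ~213 points per watt

ratio = tegra4_ppw / a4_5000_ppw
print(f"Tegra 4 perf/watt advantage: {ratio:.2f}x")  # 3.56x
```

So the "roughly 3x" claim only holds if both TDP numbers are taken literally, which is exactly what the rest of the exchange disputes.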
And we all know TDP isn't power consumption 🙄
I'm using the official TDP numbers from both companies.
So it's more up to you to prove that the actual power consumption while running Geekbench is not close to it.
The 8% figure is not mine. I gave you the full quote from the Wikipedia link in my previous post:
Dangerous question on a very Intel-minded forum. Here we have a saying: shooting the ball into an open goal (from soccer). For me it's obvious: I prefer my 8350 over my 3770K. But that's personal :biggrin:
And we all know TDP isn't power consumption 🙄
System and Maximum TDP is based on worst case scenarios. Actual TDP may be lower if not all I/Os for chipsets are used.
Next is the real power consumption of a 15W Kabini against an Intel chip with a claimed maximum TDP of 17W:
http://ark.intel.com/products/65697/
The Intel-based system consumes nearly double what the Kabini-based one does.
I also suspect the 15W TDP is perhaps a bit conservative, total platform power consumption with all CPU cores firing never exceeded 12W (meaning SoC power consumption is far lower, likely sub-10W).
A couple of things to remember.
There is 14 watts separating the two chips.
Intel's 17 watt ULV chips do not include the PCH (3.6 watt tdp).
tdp != power consumption
http://www.anandtech.com/show/6981/...ality-of-mainstream-pcs-with-its-latest-apu/2
Kabini doesn't really use 15 watts, so it wouldn't be wrong to assume that the 17-watt Intel chip + 3.6-watt PCH would use quite a bit more power than the sub-15-watt Kabini chip.
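The PCH point can be made concrete by adding up the nominal budgets. A quick sketch, using the TDP figures quoted above (and remembering that TDP is a thermal budget, not measured draw):

```python
# Nominal TDP budgets quoted above (not measured power draw).
intel_cpu_tdp = 17.0   # i3 ULV; PCH not included in this figure
intel_pch_tdp = 3.6    # separate chipset
kabini_tdp = 15.0      # SoC: CPU, GPU and I/O on one die

intel_platform = intel_cpu_tdp + intel_pch_tdp
print(f"Intel platform TDP budget: {intel_platform}W vs Kabini {kabini_tdp}W")
```

On paper that is 20.6W of Intel silicon budget against a single 15W SoC, before accounting for either side's real draw under load.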
With that said, perf/watt is similar; 69% more power for 50% better performance on different systems isn't that different (you can find just as great a difference between notebooks with the same CPU).
The absolute difference is unimportant. And the 3.6W does not account for the observed relative difference in performance.
Yes, the figure that I gave above precisely shows that TDP != power consumption.
I doubt that Kabini's TDP is overestimated. On the contrary, I believe it is Intel who underestimates the TDP of its ULV parts, because the figure leaves out their aggressive turbo.
The low-end A4-5000 is targeting Pentium-level performance. Comparing it with an i3 makes no sense; other AMD models will be competing with the i3.
Moreover, those scores were obtained on a Kabini prototype. Expect improvements with final hardware and the new drivers.
In any case, 69% more power consumption for a mere 50% gain in performance gives the i3 a mere 0.72 factor. And finally, F1 2012 seems to be one of those games optimized for Intel.
I am ignoring 99% of posts like this. But I will explain my signature.
I can build a top gaming PC using only AMD parts (e.g. FX and Radeon), but I cannot do that using only Intel (lacking dGPUs) or only Nvidia (lacking x86 CPUs). You need Intel plus AMD or Intel plus Nvidia.
I highly doubt that anand is wrong.
It doesn't matter where AMD positions its chips. The market decides where they will compete, and right now there are a lot of i3 ULV notebooks on sale for $400.
Your math is wrong (you have to include the base performance).
AMD: 30 fps for 20 watts, or 1.5 fps/watt.
Intel: 45 fps for 34.3 watts, or ~1.3 fps/watt.
Using those numbers, Kabini is ~15% more efficient per watt (the i3 is 87% as efficient: 1.3/1.5 = 0.87, not 0.72).
Then again, it's the difference between playable and unplayable.
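The two ways of doing this arithmetic are worth separating. A quick Python sketch using the fps and wattage figures quoted above (the reading of where the 0.72 comes from is my interpretation of the earlier post):

```python
# fps and system power figures as quoted in the posts above.
amd_fps, amd_watts = 30, 20.0       # Kabini (A4-5000) system
intel_fps, intel_watts = 45, 34.3   # i3 ULV system

amd_eff = amd_fps / amd_watts        # 1.5 fps/watt
intel_eff = intel_fps / intel_watts  # ~1.31 fps/watt

# Absolute comparison: ratio of the two efficiencies.
print(f"i3 relative efficiency: {intel_eff / amd_eff:.2f}")  # 0.87

# The disputed "0.72 factor" appears to come from a marginal view
# instead: 50% more performance divided by 69% more power.
print(f"marginal factor: {0.50 / 0.69:.2f}")  # 0.72
```

With the i3's efficiency rounded to 1.3 fps/watt, as in the post, Kabini's advantage comes out at the quoted 15%.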
F1 2012: i3 is 50% faster
Skyrim: i3 is 54% faster
Tomb Raider (720p): i3 is 76% faster
Metro LL: i3 is 66% faster
I think with other games it would have been worse from an fps-per-watt perspective.
Join date April 2013, posts pro-AMD, anti-Intel... I know, you are one of the new AMD advocates, aren't you?...
If you want a ~$200 CPU/mobo combo, the FX-6300 is better than the i3; at a $300 CPU/mobo combo, the i5 is better.
They tested a prototype; the market did not decide anything. The A4 is not aimed at competing with the i3; that is a task for the A6.
We are talking about different things: you are comparing the relative efficiency A4/i3 or i3/A4, whereas I was obtaining the efficiency factor for the i3 alone.
Apart from the driver issue mentioned before, the game selection does not look well balanced. E.g. Skyrim is one of those games optimized for Intel graphics. Moreover, if you look at Tomb Raider (768p), the gap shrinks to 21%.