Old 01-16-2013, 04:27 AM   #101
djgandy
Member
 
Join Date: Nov 2012
Posts: 78
Default

Quote:
Originally Posted by Rayb View Post
All of this speculation about an alleged benchmark on a pre-production board has no real meaning until the silicon is final. It can take multiple samplings before it is; a working silicon sample at 80% can still see cascading changes before the last 20% is finally optimized.
The fact that Nvidia is cramming that much performance into T4 silicon (>50% smaller than the A6X's) and will still match or surpass A6X performance, while cutting power consumption relative to T3 (>45%), should be a plus for battery life.

Well, according to BenSkywalker, silicon is final and T4 is a shipping product that can be compared to the A6. We can only hope that one day we will all have read as many EE books as he has. For now I'll just have to keep working in the industry itself.

The fact is everyone compares GLBenchmark, and if you have silicon back you are not in the early development stages. Either Nvidia have underclocked their dev boards massively (which is odd, since you can stick active cooling on dev boards very easily), or they must have some seriously broken hardware/drivers at this stage if they are going to pull out a 50% performance increase on such a simple benchmark, one they will have been optimising for since day one. Generally, if silicon is that bad it would never leave the premises; in fact, you wouldn't even bother going to silicon at that stage.

BSW: Where are you getting these poor iPhone battery figures from? It is consistently near the top in every test Anand has run:
http://www.anandtech.com/show/6330/t...ne-5-review/13
The only weak point of the iPhone is talk time, something that is irrelevant in terms of CPU/GPU and entirely down to its smaller battery. That is Apple's design choice, and they clearly pull it off with a far superior SoC. If Apple decide to go down the large-battery route, they have far more headroom to play with than the current battery monsters.

Look how crap the T3-based HTC One X is in GPU tests, though. Hilarious. Completely dominated in performance and power. And guess what sits on top with 3x the T3's battery life: an SGX 540. The iPhone 4S completely destroys T3 with a faster GPU, and it was available before T3 as well!

Now I await your logic that the Razr i has a 3x advantage over T3 because it has a slower GPU. Extending that logic, though, shouldn't T3 have at least 3x the battery life of the iPhone 5?
djgandy is offline   Reply With Quote
Old 01-16-2013, 05:23 AM   #102
djgandy
Member
 
Join Date: Nov 2012
Posts: 78
Default

Quote:
Originally Posted by BenSkywalker View Post
We have a bunch of Apple fans in this thread. When the Tegra 3 came out and roflstomped Apple's latest and greatest, their discussion just swapped around to "only fandroids care about specs"; the second Apple got a more up-to-date SoC, it was of course the most important thing ever.
What exactly did T3 come out and beat? The Transformer Prime could match the iPad 2 on a few things and lost on the rest. Is that your definition of winning?

BSW: When do you expect T4 to actually launch and be available in Project Shield, and beyond that, when do you expect non-Nvidia customers to have it? The reason I ask is that something weird happened with the T4 announcement. A chip that is supposedly next generation and, by my estimate, ready for mass production by March/April was announced, yet there was no mention of big licensees or of devices outside camp Nvidia. Don't you find that a little odd? Nvidia's usual stance is to shout from the rooftops about how great they are at every opportunity.

Also, since your electronic-engineering knowledge is so good, how do you expect T4 to provide memory bandwidth to all 72 shader cores? Wider memory buses? Higher memory bus clocks?

Last edited by djgandy; 01-16-2013 at 05:37 AM.
djgandy is offline   Reply With Quote
Old 01-16-2013, 08:00 AM   #103
MrX8503
Diamond Member
 
Join Date: Oct 2005
Posts: 4,530
Default

Quote:
Originally Posted by BenSkywalker View Post

So according to you, A5X was old news as soon as it came out. Funny, that sure as hell isn't close to what you were saying when it happened. I detest hypocrisy, have a backbone and stick by your stance.
Why would the A5X be old news? It beat Tegra 3's GPU by a fair margin. T4 looks like it barely beats A6X, if it does at all.

T4 is newer; it should beat the A6X by the same margin or more to be impressive.

Quote:
Originally Posted by BenSkywalker View Post
And? Apple designs/specs the whole phone out. They chose to have suboptimal battery life compared to the top-tier phones.
Let me break it down for you, since you have a hard time following.

You originally said that the iPhone's battery strengths are due to its screen size.

My rebuttal is that, yes, it has a smaller screen, but it also has a smaller battery because of its size.

Now the iPhone has suboptimal battery life? Lol!

Quote:
Originally Posted by BenSkywalker View Post
We have a bunch of Apple fans in this thread. When the Tegra 3 came out and roflstomped Apple's latest and greatest their discussion just swapped around to only fandroids cared about specs, until the second Apple got a more up to date SoC then it was the most important thing ever of course.
Too bad T3 didn't stomp Apple. T3's CPU was impressive, but its GPU wasn't. T4 seems to follow the same path.
MrX8503 is offline   Reply With Quote
Old 01-16-2013, 08:12 AM   #104
sontin
Platinum Member
 
Join Date: Sep 2011
Posts: 2,184
Default

Quote:
Originally Posted by djgandy View Post

BSW: When do you expect T4 to actually launch and be available in Project Shield, and beyond that, when do you expect non-Nvidia customers to have it? The reason I ask is that something weird happened with the T4 announcement. A chip that is supposedly next generation and, by my estimate, ready for mass production by March/April was announced, yet there was no mention of big licensees or of devices outside camp Nvidia. Don't you find that a little odd? Nvidia's usual stance is to shout from the rooftops about how great they are at every opportunity.
So how is that any different from Samsung and Qualcomm?
sontin is offline   Reply With Quote
Old 01-16-2013, 11:03 AM   #105
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default

..

Last edited by BenSkywalker; 01-22-2013 at 08:30 PM.
BenSkywalker is offline   Reply With Quote
Old 01-16-2013, 01:12 PM   #106
MrX8503
Diamond Member
 
Join Date: Oct 2005
Posts: 4,530
Default

Quote:
Originally Posted by BenSkywalker View Post
Name a percentage that something has to hit so it isn't old news. Go ahead and prove you have some credibility: before the official T4 benches hit, name the exact percentage, to the decimal point, that something has to reach in order not to be old news, so you can't backpedal later when it doesn't make Apple look good.
Now you're backpedaling. Not sure how you came to the conclusion that the A5X is old news compared to T3. If Tegra 4 is only marginally faster than the A6X, I'm not impressed.

Quote:
Originally Posted by BenSkywalker View Post
I stated someone could make a mini screen phone like the iPhone using the Tegra 4 that could easily compete with Apple's battery life. Biggest drain on batteries? Screen. Are you really going to try to deny that?
I never said that the screen isn't the biggest drain.

Quote:
Originally Posted by BenSkywalker View Post
Compared to the Maxx? Absolutely.
Because of its huge battery. *golf clap*.

Quote:
Originally Posted by BenSkywalker View Post
There is another member of your faith in this thread trying to put forth that using non Apple SoCs gives inferior battery life.
Never said that either. I only said that iPhones generally have better battery life as a whole.

Quote:
Originally Posted by BenSkywalker View Post
Given that the iPhone isn't tops on the battery charts, reality disagrees with that. I understand: in the RDF, Apple wins every bench, all the time, but that is why we have an Apple-only subforum, for people to sit around slapping high fives about how great Apple is without getting called out.

The majority of Android devices sit below the iPhone in battery life tests. If the iPhone had terrible battery life it would be at the bottom; if it were merely OK, it would be in the middle.

In tests, the iPhone's battery life is near the top. Far from suboptimal.

Last edited by MrX8503; 01-16-2013 at 01:29 PM.
MrX8503 is offline   Reply With Quote
Old 01-16-2013, 01:36 PM   #107
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default

..

Last edited by BenSkywalker; 01-22-2013 at 08:29 PM.
BenSkywalker is offline   Reply With Quote
Old 01-16-2013, 04:03 PM   #108
MrX8503
Diamond Member
 
Join Date: Oct 2005
Posts: 4,530
Default

Quote:
Originally Posted by BenSkywalker View Post
You are the one that keeps changing your standards. I say T4 isn't old news; you say it is because it isn't much faster. I say then that the A5X was old news, because it wasn't much faster either.
Uh...the A5X's GPU is a lot faster than T3's GPU.

Quote:
Originally Posted by BenSkywalker View Post
When did I say you did? That's right, I didn't.
You just asked me whether I'm denying that the screen causes the most battery drain.

I'm telling you that I never said that it didn't.

Quote:
Originally Posted by BenSkywalker View Post
The majority of Android devices have terrible battery life. Never confuse me with your kind. If something sucks I call it out; it doesn't matter who it is. I am not the lapdog of any corporation.
So the iPhone doesn't have good battery life; it's just that the majority of Android phones have terrible battery life? Lol, ok.
MrX8503 is offline   Reply With Quote
Old 01-17-2013, 02:50 AM   #109
djgandy
Member
 
Join Date: Nov 2012
Posts: 78
Default

Ah, what's the point? Apparently MWC counts as a launch for T4 despite no products being available. Apparently Tegra 3 was available six months before the Transformer Prime, despite both launching in Q4 2011.

Also, the memory interface question was a trap, but I knew you'd fall for it. Don't you think Tegra will be slightly starved of bandwidth under intense workloads, with 4 CPU cores plus a fairly primitive IMR GPU?
djgandy is offline   Reply With Quote
Old 01-17-2013, 05:40 AM   #110
grkM3
Golden Member
 
Join Date: Jul 2011
Posts: 1,398
Default

MrX, do you still think the SoC in the iPhone 5 is not an A15 design?
__________________
Main rig
2600k@4.8 24/7 1.415 WC
ASUS P67 MIVE
8GB Gskill 2200 7-10-7-1t
Corsair Force GT 120GB SSD
2 GTX 560TI in SLI
grkM3 is offline   Reply With Quote
Old 01-17-2013, 05:45 AM   #111
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default

..

Last edited by BenSkywalker; 01-22-2013 at 08:29 PM.
BenSkywalker is offline   Reply With Quote
Old 01-17-2013, 07:41 AM   #112
MrX8503
Diamond Member
 
Join Date: Oct 2005
Posts: 4,530
Default

Quote:
Originally Posted by BenSkywalker View Post
Uh....the A5X came out after Tegra 3 for starters.
We all know that. Are you really not following? T4 is coming out after the A6X, and I'm saying that if its GPU is only close to the A6X's, then it's not impressive.

Quote:
Originally Posted by BenSkywalker View Post
Want to compare the top cell phones over the last ten years and plot where the iPhone is? I'm thinking it wouldn't make the top 100.
That makes absolutely no sense. When you do comparisons, you compare within the same class, in this case SMARTPHONES. If you want to talk about dumb phones, make a thread about how they have better battery life than smartphones. I'm sure people won't laugh at ya.
MrX8503 is offline   Reply With Quote
Old 01-17-2013, 10:33 AM   #113
djgandy
Member
 
Join Date: Nov 2012
Posts: 78
Default

Quote:
Originally Posted by BenSkywalker View Post
When the Transformer Prime came out, the iPad 2 didn't exist. The Infinity is the T3 tablet that launched after the iP2.
March 2011: http://en.wikipedia.org/wiki/IPad_2
Dec 2011: http://en.wikipedia.org/wiki/Asus_Ee...nsformer_Prime

It's just too easy. And apparently it is everyone else who is in an RDF.


Quote:
Originally Posted by BenSkywalker View Post
5.3GB/sec-6.4GB/sec versus ~12GB/sec-~25GB/sec. That is the generational difference in bandwidth between T3 and T4. Is it possible T4 could still be bandwidth limited under certain situations? Of course. Does it have more than twice the bandwidth of the previous generation? Yep. It is trivial to make a 7970GE bandwidth limited with an order of magnitude more to play with. These are SoCs we are talking about. The upper limit is ~300% faster than the T3 was. If T3 was 100% bandwidth limited, then we should expect a 300% performance increase for T4; I don't think too many people would consider that poor.
Well, that's all lovely. Let's come back to the real world, where T4 has dual-channel 32-bit LPDDR3 at 800 MHz, i.e. 12.8 GB/s. How is it going to feed all those ALUs? I guess at least they have cut bandwidth requirements by ~35% by chopping all the precision out of the pixel shaders, haha.
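
(For anyone checking the arithmetic, here is a minimal Python sketch of where those figures come from. The dual-channel 32-bit 800 MHz LPDDR3 configuration is the one quoted above; FP20-vs-FP32 pixel-shader operands, and shader traffic scaling directly with operand width, are my assumptions, so treat the second number as a rough upper bound.)

Code:
# Peak theoretical DRAM bandwidth: bytes per transfer * transfers per second.
# DDR-type memory moves two transfers per clock.
def peak_bw_gbs(channels: int, bus_bits: int, clock_mhz: float) -> float:
    bytes_per_transfer = channels * bus_bits / 8
    transfers_per_sec = clock_mhz * 1e6 * 2  # double data rate
    return bytes_per_transfer * transfers_per_sec / 1e9

print(peak_bw_gbs(channels=2, bus_bits=32, clock_mhz=800))  # -> 12.8 GB/s

# Bandwidth saved by narrower pixel-shader operands (FP20 vs FP32 assumed),
# if shader traffic scaled directly with operand width:
print(1 - 20 / 32)  # -> 0.375, roughly the "~35%" figure above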

Last edited by djgandy; 01-17-2013 at 10:38 AM.
djgandy is offline   Reply With Quote
Old 01-17-2013, 01:27 PM   #114
runawayprisoner
Platinum Member
 
Join Date: Apr 2008
Posts: 2,496
Default

Quote:
Originally Posted by grkM3 View Post
Mx do you still think the soc in the iphone 5 is not there a15 design?
A15 is more power-hungry than Apple's Swift cores. Apple made a custom core that's more akin to Qualcomm's Snapdragon.

Thus the SoC in the iPhone 5 is not A15. It's just faster than whatever they had in there before.

But even going by preliminary benchmarks (of early T4 samples), it looks like the A6 in the iPhone 5 is still about on par with T4, at least in graphics performance. Everything else be damned until we get more coverage.
__________________
Poking through JPSX...
runawayprisoner is offline   Reply With Quote
Old 01-17-2013, 04:21 PM   #115
grkM3
Golden Member
 
Join Date: Jul 2011
Posts: 1,398
Default

Waiting to see how T4 does against the Exynos 5, as it's said Samsung's 28nm will consume 70% less power than the Exynos 5 dual core in the Nexus 10.

It's also getting a PowerVR GPU clocked at 533 MHz vs. the iPhone 5's 266 MHz.

If anything, the Exynos octa core is the SoC of 2013 to get.
__________________
Main rig
2600k@4.8 24/7 1.415 WC
ASUS P67 MIVE
8GB Gskill 2200 7-10-7-1t
Corsair Force GT 120GB SSD
2 GTX 560TI in SLI

Last edited by grkM3; 01-17-2013 at 04:26 PM.
grkM3 is offline   Reply With Quote
Old 01-17-2013, 10:33 PM   #116
lothar
Diamond Member
 
Join Date: Jan 2000
Posts: 6,473
Default

Quote:
Originally Posted by grkM3 View Post
Waiting to see how T4 does against the Exynos 5, as it's said Samsung's 28nm will consume 70% less power than the Exynos 5 dual core in the Nexus 10.

It's also getting a PowerVR GPU clocked at 533 MHz vs. the iPhone 5's 266 MHz.

If anything, the Exynos octa core is the SoC of 2013 to get.
Way too early to speculate about the entire year of 2013.
My personal guess is that if Samsung doesn't launch the Mali T658 with Exynos, they'll be behind somebody.
I don't know if that somebody will be Nvidia, Qualcomm, or both.

Of course, if the Mali T658 isn't ready in time for the Galaxy S IV launch, Samsung may "partially" redeem themselves by launching it with the Galaxy Note III.

Where are we on this chart?
__________________
Quote:
Originally Posted by MagnusTheBrewer View Post
If you make it to age 30 and have never been arrested, you're either still living in your parents' basement, part of the 1%, or both.
lothar is offline   Reply With Quote
Old 01-17-2013, 10:51 PM   #117
dagamer34
Platinum Member
 
Join Date: Aug 2005
Location: Houston, TX
Posts: 2,545
Default

Quote:
Originally Posted by lothar View Post
Way too early to speculate about the entire year of 2013.
My personal guess is that if Samsung doesn't launch the Mali T658 with Exynos, they'll be behind somebody.
I don't know if that somebody will be Nvidia, Qualcomm, or both.

Of course, if the Mali T658 isn't ready in time for the Galaxy S IV launch, Samsung may "partially" redeem themselves by launching it with the Galaxy Note III.

Where are we on this chart?
The chart isn't really accurate anymore. We've already got quad-core A7 chips, which weren't supposed to come out until next year, and the 4xA15/4xA7 configuration doesn't even exist yet.

Thing is, you can stick as many cores as you want on a chip (within reason), because die area is largely a function of cost, not chip design.
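
(Tangent, but to make the area/cost point concrete: a minimal sketch using a textbook Poisson yield model. Every number in it is invented purely for illustration.)

Code:
# Why die area drives cost: more area means fewer candidate dies per wafer
# AND lower yield, so cost per good die rises faster than linearly.
# All numbers below are made up for illustration only.
import math

def cost_per_good_die(wafer_cost, wafer_diameter_mm, die_area_mm2,
                      defects_per_mm2):
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    candidate_dies = wafer_area / die_area_mm2                  # ignores edge loss
    yield_fraction = math.exp(-defects_per_mm2 * die_area_mm2)  # Poisson yield
    return wafer_cost / (candidate_dies * yield_fraction)

# Doubling die area (e.g. by piling on cores) roughly triples cost per die:
print(cost_per_good_die(5000, 300, 80, 0.005))   # ~ $8.4
print(cost_per_good_die(5000, 300, 160, 0.005))  # ~ $25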
__________________
15" MacBook Pro with Retina Display (Mid 2012) | Mac mini (Late 2012) | iPhone 5 | iPad Air w/ LTE
dagamer34 is offline   Reply With Quote
Old 01-17-2013, 11:23 PM   #118
lothar
Diamond Member
 
Join Date: Jan 2000
Posts: 6,473
Default

Quote:
Originally Posted by dagamer34 View Post
The chart isn't really accurate anymore. We've already got quad-core A7 chips, which weren't supposed to come out until next year, and the 4xA15/4xA7 configuration doesn't even exist yet.

Thing is, you can stick as many cores as you want on a chip (within reason), because die area is largely a function of cost, not chip design.
Which quad-core A7 chips did we get? If you're referring to the one in the Galaxy Note II and the international Galaxy S III, I think that is a quad-core A9, not A7.

Isn't the 4xA15/4xA7 Samsung's rumored new Exynos Octacore SoC?
If so, then it does exist in some lab somewhere out there and will probably launch in Korea sometime in April.
__________________
Quote:
Originally Posted by MagnusTheBrewer View Post
If you make it to age 30 and have never been arrested, you're either still living in your parents' basement, part of the 1%, or both.
lothar is offline   Reply With Quote
Old 01-17-2013, 11:37 PM   #119
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default

..

Last edited by BenSkywalker; 01-22-2013 at 08:29 PM.
BenSkywalker is offline   Reply With Quote
Old 01-18-2013, 03:13 AM   #120
djgandy
Member
 
Join Date: Nov 2012
Posts: 78
Default

Quote:
Originally Posted by lothar View Post
Way too early to speculate about the entire year of 2013.
My personal guess is that if Samsung doesn't launch the Mali T658 with Exynos, they'll be behind somebody.
I don't know if that somebody will be Nvidia, Qualcomm, or both.

Of course, if the Mali T658 isn't ready in time for the Galaxy S IV launch, Samsung may "partially" redeem themselves by launching it with the Galaxy Note III.

Where are we on this chart?
I think that chart is dead: http://www.arm.com/products/multimed...pute/index.php

Also, it is kind of odd that they thought the Mali-T658 would still be going in 2016. You'd generally assume next gen by that point.
djgandy is offline   Reply With Quote
Old 01-18-2013, 04:33 AM   #121
djgandy
Member
 
Join Date: Nov 2012
Posts: 78
Default

Quote:
Originally Posted by BenSkywalker View Post
So they have a new revision of Tegra 4 that removed not only the low-end LPDDR2 support but also support for DDR3L, and furthermore has it all set at a fixed frequency? Got any links to that? It is actually fairly major news that Tegra 4 has your limitation; I'm sure the major tech sites would like to share that information with everyone, as it is kind of a big deal.
You think they'll spin another SoC with a DDR2 memory interface? Doesn't that defeat the object of creating a next-generation chip? Maybe they will in a year's time, but we're talking about what is being launched initially here. And who said they dropped DDR3L?


Quote:
Originally Posted by BenSkywalker View Post
I think it is a safe bet that most devices are going to use LPDDR3, nothing I have ever seen indicates that they are all going to be operating at precisely the clock speed you have come up with.
I don't think that clock speed is the maximum memory clock speed for all eternity, but the launch devices will be at those clocks. Let's focus on what is actually being launched and the product that was announced, not what might exist in five years' time, eh? LPDDR3 is in its infancy; do you think we are ready to ramp clocks already? The power penalty of running at 1066 MHz would be huge.




Quote:
Originally Posted by BenSkywalker View Post
Yeah, because there was a good chance someone was going to be using FP32 shaders on a mobile device anytime soon, heh. Even using the specs we are all waiting to see the source of, T4 worst case has double the bandwidth of T3. You speak as if T4 is going to be entirely bandwidth limited, which if true would mean T3 clearly was too, indicating we should see a direct 100% performance improvement if we believe your numbers are the final word on clock rate.
This is Nvidia's latest chip; you'd think they could have at least gone FP24. Their latest chip won't even support GLES 3.0? Compute is out of the question too, although that is not really a biggie.

How does T4 being bandwidth limited imply that T3 was? T3 has 12 ALUs vs 72 for T4; I am struggling to follow your logic here. Twice the bandwidth, six times as many ALUs: T3 has 3x the bandwidth per ALU.
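
(A quick sketch of that ratio, using only the peak figures already quoted in this thread; real sustained bandwidth and ALU utilisation will of course differ.)

Code:
# Bandwidth per ALU, from the peak numbers quoted in this thread:
# T3: ~6.4 GB/s across 12 ALUs; T4: ~12.8 GB/s across 72 ALUs.
t3_bw_gbs, t3_alus = 6.4, 12
t4_bw_gbs, t4_alus = 12.8, 72

t3_per_alu = t3_bw_gbs / t3_alus  # ~0.533 GB/s per ALU
t4_per_alu = t4_bw_gbs / t4_alus  # ~0.178 GB/s per ALU

print(t3_per_alu / t4_per_alu)    # -> 3.0: T3 has 3x the bandwidth per ALU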
djgandy is offline   Reply With Quote
Old 01-18-2013, 07:39 AM   #122
MrX8503
Diamond Member
 
Join Date: Oct 2005
Posts: 4,530
Default

Quote:
Originally Posted by BenSkywalker View Post
WHAT PERCENTAGE?

I keep asking; you keep doing an Apple. Say it now, get on the record, and show for once that you aren't the lapdog of a company. Name the percentage required for you not to consider it old news.
You need a percentage to determine the difference between marginal and significant?

Quote:
Originally Posted by BenSkywalker View Post
But not the Maxx, because, you know, they designed it to get long battery life.....
Do you want to include 10 years of phones, or just the Razr Maxx, to prove that the iPhone has suboptimal battery life? Seems like you want it both ways.
MrX8503 is offline   Reply With Quote
Old 01-18-2013, 08:49 AM   #123
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default

..

Last edited by BenSkywalker; 01-22-2013 at 08:29 PM.
BenSkywalker is offline   Reply With Quote
Old 01-18-2013, 10:44 AM   #124
dagamer34
Platinum Member
 
Join Date: Aug 2005
Location: Houston, TX
Posts: 2,545
Default

Quote:
Originally Posted by lothar View Post
Which quad-core A7 chips did we get? If you're referring to the one in the Galaxy Note II and the international Galaxy S III, I think that is a quad-core A9, not A7.

Isn't the 4xA15/4xA7 Samsung's rumored new Exynos Octacore SoC?
If so, then it does exist in some lab somewhere out there and will probably launch in Korea sometime in April.
Quad-core A7: http://www.anandtech.com/show/6604/m...ad-core-a7-soc
__________________
15" MacBook Pro with Retina Display (Mid 2012) | Mac mini (Late 2012) | iPhone 5 | iPad Air w/ LTE
dagamer34 is offline   Reply With Quote
Old 01-18-2013, 01:37 PM   #125
BenSkywalker
Elite Member
 
Join Date: Oct 1999
Posts: 8,955
Default

..

Last edited by BenSkywalker; 01-22-2013 at 08:28 PM.
BenSkywalker is offline   Reply With Quote