Sneak Peek of Tegra 4 (Codename Wayne)


Mopetar

Diamond Member
Just realized that I should probably factor in the number of pixels that are present in each device as that's not necessarily the same.

I'll have to hunt down that information and then add that into the calculation.

Edit: The Note 2 and GS3 have the same resolution, so this might not change anything. Also, a lower-resolution screen needs bigger pixels, which probably draw more power, so it might not actually matter. Then again, I've no formal education on the matter, so perhaps someone with an EE degree or who has otherwise studied display technology could comment.
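
Something like this is the calculation I have in mind. To be clear, the runtimes and battery capacities below are placeholder numbers for illustration, not measurements:

Code:
    # Back-of-the-envelope: normalize average battery draw by pixel count.
    # All figures here are placeholders, NOT real measurements.
    devices = {
        # name: (runtime_hours, battery_watt_hours, (width, height))
        "Galaxy S3": (5.0, 8.0, (1280, 720)),
        "Note 2":    (7.0, 11.8, (1280, 720)),
    }

    for name, (hours, wh, (w, h)) in devices.items():
        avg_watts = wh / hours                     # average draw over the run
        watts_per_mpixel = avg_watts / (w * h / 1e6)
        print(f"{name}: {avg_watts:.2f} W avg, "
              f"{watts_per_mpixel:.2f} W per megapixel")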
 

djgandy

Member
DDR3L hitting 2.2GHz last April in a consumer product-

http://www.overclockersclub.com/reviews/samsung_green_ddr3l/3.htm

Another hilarious example. Did you even read what you posted? An example of DDR3L running at exactly the speed I said it would run at in T4: 800MHz! You are confused once again. DDR3-1600 = DDR3 @ 800MHz = 6.4GB/s. Bless. The memory is almost irrelevant here. Your I/O bus is only dual 32-bit. That is the limitation, and that was the point. It is difficult to ramp the I/O bus past 800MHz.
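
The arithmetic, in case it helps anyone following along (peak theoretical numbers, per channel):

Code:
    # Peak theoretical bandwidth. DDR transfers on both clock edges,
    # so the transfer rate is twice the I/O clock.
    def bandwidth_gbs(clock_mhz, bus_bits, channels=1):
        transfers_per_sec = clock_mhz * 1e6 * 2
        return transfers_per_sec * (bus_bits / 8) * channels / 1e9

    print(bandwidth_gbs(800, 32))               # DDR3-1600, one 32-bit channel: 6.4
    print(bandwidth_gbs(800, 32, channels=2))   # T4's dual 32-bit interface: 12.8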

Also, the test system is not exactly a low-power device running on a battery.
Testing Setup:

Processors: Core i7 2600K @ 3.4GHz 100 x 34
Motherboard: ASUS Maximus IV Extreme
Memory: Mushkin 996805 Redline PC3-12800 6-8-6-24 1600MHz 4GB
Video Card: XFX Radeon HD6970
Power Supply: Mushkin 1000W Joule Modular power supply
Hard Drive: 1 x Seagate 1TB SATA
Optical Drive: Lite-On Blu-Ray
Case: Cooler Master HAF 932
OS: Windows 7 Professional 64-bit

So this is your proof that DDR3L is ready to run at high clock speeds and meet the power requirements of mobile devices? Is Tegra 4 going to come with a 1TB SATA hard drive, a 1000W power supply, and a Core i7?
 

djgandy

Member
It is only simple if you are a paid shill for Apple. Other than that, it is baffling to me how society can produce something as pathetic as a person who can't answer what percentage something is *IN THEIR OPINION*.

If you were a person thinking on your own, that would be a very, very simple question.

The only possible way it could be in the slightest bit difficult is if you were going back over all the previous benchmarks, reading them over, and realizing that any number you come up with is going to either make a non-Apple SoC not old news, or make Apple's offering old news.

For the sake of humanity I hope you are a paid shill.

Yawn.
 

djgandy

Member
I don't know what your problem is. I just posted relevant links that prove you are wrong about many things. I get that you like Nvidia and hate Apple, but at least post comments that are based on reality, i.e. if you think you are correct, please back up your assertions with some data.

He doesn't have data. He posts links to people benchmarking LPDDR3 with 1000W power supplies in SoC discussions. If you look through this thread, all he has done is try to derail it with arguments that are completely irrelevant to the context of the discussion.

He also tries to say GLBenchmark is a PowerVR benchmark, a conspiracy or similar, once again with no evidence. GLBenchmark is pretty compute-heavy with low overdraw; how does that favour PowerVR?

I just wonder what's next. Is it that T4 actually has infinite web browsing time when connected to a car battery and solar panel, and is therefore the most superior SoC ever, ever... ever?

I still want to know where I can buy one; they are clearly life-changing. How long after a paper launch do Nvidia products actually appear again?

Memory bandwidth doesn't matter either, except it gives a 35% increase in texel fetch on the Transformer Prime Infinity.
 

djgandy

Member
Unlike you, it appears, yes. They cleared 1100MHz; I'd say a 35% increase over what you said was possible, albeit almost a year ago, warrants acknowledgement.

That's exactly what I didn't say. You continue to demonstrate your lack of understanding of context.

No, that was proof that, despite your attempts to say otherwise, 800MHz wasn't a limitation of the RAM they were using. Obviously we all know that was wrong: it has not only been available in consumer parts for some time, but it has been hitting considerably faster speeds than what you said was possible.

You can always go higher, with more voltage, active cooling, LN2, etc. Does that make it feasible for a consumer tablet/phone?

Not once have I stated what speed T4 was going to use for RAM; that is 100% you. You made an absolute claim as to both the bandwidth and clockspeed. I merely pointed out that there are more options available than what you are claiming. It's going to be real simple: either every T4 part ships with 800MHz or slower RAM and you are correct, or some ship faster than that and you are wrong. I made no claim except to point out there are options outside of what you stated was possible.

T4 is limited by its dual 32-bit memory interface. This is the point that was being argued before you derailed as usual and used your copy-paste article "knowledge". Considering this is for a mobile device, not one hooked up to a 1KW PSU, it would make a lot more sense to put down a slower quad-channel interface at the cost of a bit of extra die space and clock it in the range of 533MHz, instead of ramping clocks to 1GHz (which is easy, if we believe you) and paying every power penalty that comes with it. Also, I didn't say every T4 part, I said every T4 launch part. Subtle difference; once again your reading skills fail you.
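
Rough numbers to illustrate the trade-off (peak theoretical bandwidth again; both configurations are hypothetical, not anything Nvidia has announced):

Code:
    # Peak GB/s = clock_MHz * 2 (DDR) * bytes_per_channel * channels / 1000
    quad_533  = 533 * 2 * 4 * 4 / 1000    # ~17.1 GB/s: wide and slow
    dual_1000 = 1000 * 2 * 4 * 2 / 1000   # 16.0 GB/s: narrow and fast
    print(quad_533, dual_1000)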


On T4 it is? I wouldn't say it is or isn't, but I haven't seen anything that backs your claim or refutes it. I guess the simple solution is that we shall see.
It's not impossible, of course not. But you'll run into the same physical limitations and power problems desktop parts had. This is why we have multi-channel memory architectures!



That was DDR3L, not LPDDR3 - two different things. T4 supports them both.
Sorry, you are right. You linked to DDR3L, which is basically irrelevant in the mobile space. Remind me again why people want higher-power-consuming memory in their mobile devices?

Here is my original quote. Once again you post before you think.
LPDDR3 is in its infancy; do you think we are ready to ramp clocks already?



GLBench is utterly useless, it is a bad sort of test, nothing more.
A bit like how any non-TWIMTBP game on the desktop is irrelevant, right?



Prime had one sixth the bandwidth that T4 devices are supposed to have according to current whitepapers. It had one sixth the GPU cores. You have repeatedly stated that T4's GPU was going to be entirely bandwidth limited; I accurately pointed out that if that is the case, then T3 was also bandwidth limited (which, obviously, it wasn't). Six times the computational resources, six times the bandwidth. Where I come from we consider that linear.

For its screen size it coped, but it was still bandwidth limited under texture fetch, by 35%. Did you read what you linked, again? The Infinity Prime with its much larger screen needed the increased memory bandwidth!
Also, don't CPU cores need bandwidth any more either? And display output: that's 2.25x going from 720p to 1080p. You even said it yourself in a post somewhere that larger display resolutions also mean larger renders, larger blits, and more data moving around!
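
The 2.25x is just the raw pixel count:

Code:
    pixels_720p  = 1280 * 720     #   921,600
    pixels_1080p = 1920 * 1080    # 2,073,600
    print(pixels_1080p / pixels_720p)   # 2.25x the pixels to render and move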

With T4's power requirements looking so high, its future in anything under a 1080p screen looks unlikely.

Now that I think a bit more about it though, T4 only has 48 pixel shaders, so it'll probably be fine on the bandwidth side. Just underpowered by 33%. Oh well, when they unify they can get a 50% gain and make some fancy marketing slides to say how great unified shaders are.
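
(For anyone checking the maths: the whitepaper's 72 shader cores split into 48 pixel and 24 vertex units, so 48/72 leaves you 33% short, while going from 48 to 72 is a 50% gain. Same gap, two marketing framings.)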
 

MrX8503

Diamond Member
MrX is an Apple fanboy, we all know that, so move along. When I read his Apple-biased posts I always think of this song in my head.

The wheels on the bus go round and round, round and round, round and round.

You are in an endless loop with him, lol.

The GPU of T3 is slower than the A5X; that's a fact. What does my preference for the iPhone have to do with it, says the Samsung fanboy? Lol

You do know that the AT battery tests have zero credibility in any real cell forum, right? Even people on MacRumors say that they can't get even half the time that AT got with their iPhone 5.

I don't know a single phone that can get 8 hours of screen time on WiFi with the screen brightness all the way down.

I'm sorry, but I don't believe those results.

Anecdotal vs. controlled tests... hmmm.

You're not getting the point. If a smaller screen is more efficient, it doesn't matter if the battery is also smaller.

Why doesn't it matter? I know that a bigger battery can't completely keep up with a larger screen. I just think there isn't that big of an advantage to a small-screen, small-battery phone. The iPhone has a ~1400mAh battery, quite small.
 

grkM3

Golden Member
AT battery tests have been called out even on this forum. Since you have an iPhone 5, please post up almost 9 hours of on-screen time while staying on 4G.

You won't be able to do it even on WiFi, and probably not even in airplane mode.

It's just not going to happen.
 

Mopetar

Diamond Member
Why doesn't it matter? I know that a bigger battery can't completely keep up with a larger screen. I just think there isn't that big of an advantage to a small-screen, small-battery phone. The iPhone has a ~1400mAh battery, quite small.

It's useful to know.

Also, the advantage is that you'll probably get better battery life with a smaller phone. I guess it's a trade-off between that and a larger screen.
 

MrX8503

Diamond Member
AT battery tests have been called out even on this forum. Since you have an iPhone 5, please post up almost 9 hours of on-screen time while staying on 4G.

You won't be able to do it even on WiFi, and probably not even in airplane mode.

It's just not going to happen.

[Attached screenshot: iOS battery usage and standby stats]


2+ hours with 75% left over, half WiFi, half LTE. That works out to about 8 hours if drained down to zero.
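
The maths is just linear extrapolation; it assumes a constant drain rate, which real usage obviously won't give you exactly:

Code:
    # Project total on-screen time from a partial drain.
    # Assumes the drain rate is constant, which is only roughly true.
    def projected_hours(hours_used, percent_drained):
        return hours_used / (percent_drained / 100.0)

    print(projected_hours(2.0, 25))   # 2 hrs used, 75% left -> 8.0 hrs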

You might wanna get your family's iPhone 5s checked if they only get 3-4 hours on WiFi.
 

grkM3

Golden Member
That does not show screen time. For all we know you could have been listening to music for 2 hours. At least on Android you get a real breakdown of actual usage.

So you're saying you can watch Netflix on 4G and get 8 hours of it?
 

thunng8

Member
There are 2 iPhone 5s in our family and they get about 3.5 to 4 hours of web browsing on WiFi with the data modem completely shut off.

8 hours on LTE is not real world, not even close.

3.5-4 hours seems awfully low. I haven't done scientific measurements (I never use the web for that long a period), but I seem to get much more than that.

I have done a test while watching a 40-minute TV show encoded in mp4 (so hardware accelerated), with auto-brightness on, while on public transport during the day (so auto-brightness would not have turned the screen down to its lowest setting; maybe medium, but I didn't check). The battery went down 8%, which equates to over 8 hours on a full battery.
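
(That works out to 40 minutes / 0.08 = 500 minutes, or about 8.3 hours, assuming the drain rate stays linear.)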
 

MrX8503

Diamond Member
That does not show screen time. For all we know you could have been listening to music for 2 hours. At least on Android you get a real breakdown of actual usage.

So you're saying you can watch Netflix on 4G and get 8 hours of it?

I'm saying I'm getting way, way more than your anecdotal 4 hours.

My usage was all web browsing, zero music. The screen was at 60% with auto-brightness.
 

badb0y

Diamond Member
There are 2 iPhone 5s in our family and they get about 3.5 to 4 hours of web browsing on WiFi with the data modem completely shut off.

8 hours on LTE is not real world, not even close.

This is bullshit lol.