Razer Edge Pro: the perfect use case for an AMD APU?


Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
It is strange that you think a "mobile" device doesn't need decent battery life, and 2 hrs with a huge 80 WHr battery is horrendous... What I am debating is the compromise between performance and battery life.
It's portrayed as a mobile device, sure. But we all know there are varying definitions of mobile: a tablet gets 10+ hours of battery life, an ultrabook 6 hours, a middle-tier casual-use laptop 4-5 hours, and a gaming laptop 2-3 hours, if you're lucky.

This product is not meant to stand in for a tablet despite the form factor. It is not trying to capture iPad gamers. It is meant to capture people moving from consoles (near zero portability, similar graphical settings) and gaming laptops (more power, same battery life, different, less portable form factor).

You can debate the compromise all you'd like, but I believe I speak for the majority when I say that a 20-80% performance hit is not a fair trade for 15 extra minutes of use, especially since that use is likely to take place near a power outlet.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
It's portrayed as a mobile device, sure. But we all know there are varying definitions of mobile: a tablet gets 10+ hours of battery life, an ultrabook 6 hours, a middle-tier casual-use laptop 4-5 hours, and a gaming laptop 2-3 hours, if you're lucky.

This product is not meant to stand in for a tablet despite the form factor. It is not trying to capture iPad gamers. It is meant to capture people moving from consoles (near zero portability, similar graphical settings) and gaming laptops (more power, same battery life, different, less portable form factor).

You can debate the compromise all you'd like, but I believe I speak for the majority when I say that a 20-80% performance hit is not a fair trade for 15 extra minutes of use, especially since that use is likely to take place near a power outlet.

But by your logic, why not just get a more performant gaming laptop with higher specs for cheaper (if they are catering to ex-console gamers)? And I don't know how much time it would really gain by going to a 17-25 W APU, but in any case it would be closer to, or slightly more than, an hour.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
Well, by your logic, why not get an Ultrabook instead, like the UX32A with Intel HD 4000 graphics, for the same price or less?

Additionally, if you look at battery life normalized for battery size (so the units are minutes per watt-hour), you'll see that the Razer Edge is actually ahead of an A10:
[Chart: gaming battery life normalized for battery capacity (minutes per watt-hour), Razer Edge vs. A10-based systems]

Two and a half hours of real gaming isn't great, but to be honest, considering the power draw and sheer amount of battery capacity on board with the extended battery, I’m not sure that anything else can top that number right now. There just isn't another system that can hit 1.8 minutes per watt-hour while gaming with a battery larger than 80Wh. The cut-down version of HD 4000 in the ultrabook platform is more power efficient, but the performance tradeoffs are simply too significant to consider it adequate for gaming unless the titles you are playing are quite old. And even then, there aren't any ultrabooks with more battery capacity than the Edge offers.

--Vivek Gowri, A Comment on PC Gaming Battery Life
The problem with the Razer Edge is that it exists, like the Surface Pro, in a gray zone--not quite enough battery life to be considered truly portable, and yet not enough power or optimization to compete with similarly priced gaming laptops/desktops.

On the other hand, it seems to be able to eke out 6 or so hours of web browsing, so it basically is competitive with $1000 Ultrabooks in that regard.
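
If you want to sanity-check the normalization yourself, here's a quick sketch; the ~150 minutes of gaming runtime and the 83 Wh extended-battery capacity are the figures from the quoted review:

```python
# Battery life normalized for capacity: minutes of runtime per watt-hour.
def minutes_per_watt_hour(runtime_minutes: float, capacity_wh: float) -> float:
    return runtime_minutes / capacity_wh

# Razer Edge Pro, extended battery: ~2.5 h of gaming on 83 Wh (review figures).
print(f"{minutes_per_watt_hour(150, 83):.1f} min/Wh")  # ~1.8, matching the quote
```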
 
Last edited:
Aug 11, 2008
10,451
642
126
It is strange that you think a "mobile" device doesn't need decent battery life, and 2 hrs with a huge 80 WHr battery is horrendous... What I am debating is the compromise between performance and battery life.

Why are you so desperate to prove it should have had an APU? Obviously the makers decided Intel plus discrete was the best choice. Does that decision really hurt you in some way?
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Why are you so desperate to prove it should have had an APU? Obviously the makers decided Intel plus discrete was the best choice. Does that decision really hurt you in some way?

No, I am not desperate, and this is called a debate, so I hear arguments for and against it; it just so happens that most responses so far have been pro Intel & NV.

Look, if you don't care, why even ask me? Are you the one hurt/threatened by this thread?

My point of view:

Intel is faster and Nvidia is faster, but combined they consume a lot of power, and the combo is expensive and complex.

AMD is slower and consumes a lot of power, but less than the Intel & NV combo; the AMD part is much less expensive and much simpler to build around.

Sleepingforest asked:
As for your final question: I have no idea how board complexity matters to the end-consumer. If you can give a concrete reason that board complexity matters in a tablet to the average consumer, I'll give more of an analysis, but it doesn't seem to matter much.
The complexity is a factor in pricing: a less complex design is easier to build, costs less, and makes more business sense, i.e. higher margins.

The situation seems to be this:

a) I will pay ~$1250 for a MOBILE gaming tablet and have it die on me in an hour on the go, but that is OK because I will only play near an outlet and definitely need a constant 60 fps experience.

b) I will compromise: sure, the AMD is slower, but I get to play tons of games at 720p at around 30 fps (much less than the Intel & NV combo, for sure) for an hour or more (could be 1-3 hours more depending on the SKU) and pay less for the system (if Razer would pass on the savings rather than keep the higher margins).

And I don't buy the argument that a mobile device isn't a mobile device, or that it is catering to cheapo ex-console gamers.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Premium products command a premium price. AMD cannot produce premium products; therefore they aren't worthy of the OEM win. This would be a perfect place for an APU, if they actually had an APU worth putting in there.
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
The situation seems to be this:
a) I will pay ~$1250 for a MOBILE gaming tablet and have it die on me in an hour on the go, but that is OK because I will only play near an outlet and definitely need a constant 60 fps experience.

b) I will compromise: sure, the AMD is slower, but I get to play tons of games at 720p at around 30 fps (much less than the Intel & NV combo, for sure) for an hour or more (could be 1-3 hours more depending on the SKU) and pay less for the system (if Razer would pass on the savings rather than keep the higher margins). And I don't buy the argument that a mobile device isn't a mobile device, or that it is catering to cheapo ex-console gamers.

The Razer Edge with the larger battery (so yes, the $1250 one) actually gets 2-3 hours of gaming, depending on the demands of the game. And it's not an "un-mobile" device. It makes a compromise that laptop gamers have accepted for years: to get decent performance (60 frames consistently), you accept only an hour or two of battery life. It's much more mobile than, say, a console or a desktop, because you don't have to cart around a keyboard/monitor/speakers/power cables/the actual huge tower computer.

Additionally, as shown before, Trinity actually gets far fewer minutes per watt-hour than the Razer Edge. So choosing Trinity is actually worse for your battery life.
 
Last edited:

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Premium products command a premium price. AMD cannot produce premium products; therefore they aren't worthy of the OEM win. This would be a perfect place for an APU, if they actually had an APU worth putting in there.
OK, fair enough, but I want to point out that neither Intel nor Nvidia on its own has a better solution for this use case, and even though the Intel & NV combo is faster, the AMD APU is competitive. Besides, premium doesn't mean expensive.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Additionally, as shown before, Trinity actually gets far fewer minutes per watt-hour than the Razer Edge. So choosing Trinity is actually worse for your battery life.

I am sure there was a disclaimer saying that was a prototype system. It isn't invalid, but we all know those numbers could easily change from system to system, so that may have been the best sample.
 

Torn Mind

Lifer
Nov 25, 2012
12,078
2,772
136
The only APU suitable for the Razer Edge is the unbridled A10. The lower-wattage parts are too emasculated as they stand, especially the A6.

In the future, APUs will probably become the superior option once they stick more powerful units on there and fix the memory issues.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
You might be right if we only compare the A10-4600M (the closest match), but if we go down one step to the A10-4655M 25 W part, you would lose 20-80% performance but get better battery life.

It actually isn't advantageous to use Trinity 25W at all.

Breaking down a few contenders by power used, in watts:

Sony Vaio T13 (HD 4000/dual-core ULV): 24.1 W
Razer Edge Pro (regular battery/41 WHr): 33.5 W
Razer Edge Pro (extended battery/83 WHr): 34.5 W
AMD Llano: 35.5 W
Asus N56VM (HD 4000 quad core): 42.6 W
AMD Trinity: 43.5 W

If you cut Trinity's power use by 10 W by using the 25 W A10-4655M, the Trinity setup would use 33.5 W. That's identical to the Razer Edge Pro, which has a vastly superior CPU and GPU.

Of course, that may not even be true, as it may not result in 10 W of savings but a little less, like 8-9 W.
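
To make that arithmetic concrete, here's a rough sketch; the power figures come from the list above, and the 8-10 W savings for a hypothetical 25 W A10-4655M build is the estimate discussed here, not a measurement:

```python
# Measured whole-platform power draw under gaming load, in watts (list above).
power_w = {
    "Sony Vaio T13 (HD 4000 ULV)": 24.1,
    "Razer Edge Pro (41 Wh battery)": 33.5,
    "Razer Edge Pro (83 Wh battery)": 34.5,
    "AMD Llano": 35.5,
    "Asus N56VM (HD 4000 quad)": 42.6,
    "AMD Trinity (35 W A10)": 43.5,
}

# Hypothetical Trinity build with the 25 W A10-4655M: assume the lower TDP
# saves 8-10 W of platform power (an estimate, not a measurement).
for savings in (8, 9, 10):
    print(f"Trinity -{savings} W: {power_w['AMD Trinity (35 W A10)'] - savings:.1f} W")

# Rough battery life: capacity (Wh) / platform draw (W) = hours of gaming.
print(f"{83 / power_w['Razer Edge Pro (83 Wh battery)']:.1f} h")  # ~2.4 h
```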

Well, it is to some. I cannot answer that question for every person here. I'm sure that some would rather get a Trinity-based tablet at, say, $700 versus the current tablet at $1000.

You are exaggerating the price difference. The much cheaper Clover Trail Atom Z2760-based tablets cost around that range. You would lower the price by about $100 by opting for Trinity over an Ivy Bridge i5, then maybe an additional $50 for the discrete GPU.

The ark.intel.com price should be a hint, but barely anyone gets the point. i3/i5 devices are priced identically on Intel's list, even when there's a real price difference between the two. The pricing page is only relevant for people who buy CPUs directly, like desktop CPU buyers and enthusiasts.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I'm fairly confident that Haswell GT3e will impress. We know that it's the same shader design as Ivy Bridge, we know that it's got 2.5x the number of them (40 vs 16), and we know that it's getting eDRAM to fight the bandwidth issues that would have otherwise crippled it. Ivy Bridge is already pretty close to playable performance, and a 2.5x boost will make all the difference. As for wattage- 47W is still a nice number compared to a current Ivy Bridge quad core plus a discrete GPU, which is what system builders like Razer are plumping for at the moment.

I'd cut my expectations, because while the paper specs say the max turbo clock for the GPU is identical to the GT2 parts, leaked info says the operating frequency is usually 800 MHz or so. And Intel claims only a 2x advantage for GT3. They've also stated before that the max frequency on GT3 is only for really needy burst scenarios, rather than anything sustained like with GT2.

Of course, the elephant in the room is cost. Rumour is Intel is asking for seriously big bucks for Haswell GT3e, i.e. more than it would cost to integrate a superior discrete GPU. But we'll have to wait and see.

Price can be explained. The HD 4600-equipped Core i7-4900MQ runs its CPU at 2.8/3.8 GHz. The HD 5200-equipped Core i7-4950HQ runs at 2.4/3.6 GHz. Considering how much price differs between mere 100 MHz speed grades, the 4950HQ looks like a much lower-end part.

I think they merely lowered the clock because TDP limits come into play with the GT3 HD 5200 parts.
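
A back-of-the-envelope way to see why ~2x is more plausible than 2.5x: scale EU count by the clock each part can actually sustain. The 800 MHz sustained figure for GT3 is the leaked number mentioned above, not an official spec:

```python
# Rough shader throughput proxy: EU count x sustained clock (GHz).
gt2_hd4000 = 16 * 1.15   # 16 EUs at ~1.15 GHz max turbo
gt3_hd5200 = 40 * 0.80   # 40 EUs at the rumored ~800 MHz sustained clock
print(f"GT3 vs GT2: {gt3_hd5200 / gt2_hd4000:.2f}x")  # ~1.74x, closer to 2x than 2.5x
```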
 

NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
I'd cut my expectations, because while the paper specs say the max turbo clock for the GPU is identical to the GT2 parts, leaked info says the operating frequency is usually 800 MHz or so. And Intel claims only a 2x advantage for GT3. They've also stated before that the max frequency on GT3 is only for really needy burst scenarios, rather than anything sustained like with GT2.

I was under the impression that the dropped base clock was only in the Ultrabook GT3s; there they're using the "lots of slower shaders" approach to get power savings with respectable performance. But I also thought that GT3e (i.e. the one in the quad-core mobile i7s) would use the full-speed shader clock. I could well be wrong, though.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I was under the impression that the dropped base clock was only in the Ultrabook GT3s; there they're using the "lots of slower shaders" approach to get power savings with respectable performance. But I also thought that GT3e (i.e. the one in the quad-core mobile i7s) would use the full-speed shader clock. I could well be wrong, though.

I could be wrong as well. But remember a couple of points: we're talking about manufacturer claims, which oftentimes aren't even met, and then we have rumor sites that inflate such claims to unrealistic heights.

Occasionally a manufacturer will give conservative estimates, like Intel with Conroe/Merom. But that rarely happens, and in graphics there have been far greater expectations around numerous Intel GPUs (and even then the claim could be best case, as with Ivy Bridge being said to have a 70% gain).

Yes, I am pro-Intel, as you pointed out and as no doubt a lot of others know too. But that doesn't mean I don't have serious doubts when my hopes look unrealistic compared to what it may really be.

Intel will need their "R300" moment before any reasonable gamer stops dismissing them. I believe Haswell could be it, but I know just as well it might not.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
The GFLOP argument is nice, but if you look at actual gaming performance,

Using GFlops to compare performance can work, if you know what you are looking for.

The HD 4000 used in Ivy Bridge ULV has a rating of 294 GFlops at 1.15 GHz. But consider that in games it usually operates at 1 GHz, and that 294 GFlops is peak, not sustained.

192 GFlops
8 GTexels/s
25.6 GB/s (shared)

The GeForce GT 640M LE used in the Razer Edge uses a Kepler core and carries a rating of 384 GFlops.

384 GFlops
16 GTexels/s
28.8 GB/s

For AMD HD 7660G:

382-527 GFlops
12 GTexels/s
25.6 GB/s (shared)

Based on gaming benchmarks, the HD 7660G is really only capable of about 382 GFlops. In 3DMark11 it seems able to go higher, but none of the gaming benches reflect 3DMark11. I assume that's because the 3DMark11 workload is so demanding on the GPU (it runs at single-digit frame rates) that the GPU gets most of the TDP while the CPU runs very low. In games that wouldn't be the case, so the GPU runs at base clocks.
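
For reference, the peak numbers above fall out of the usual shaders x FLOPs-per-clock x clock formula; the per-clock rates and clocks below are the commonly cited figures for these parts, so treat this as a sketch:

```python
# Peak single precision: shaders x FLOPs/clock x clock (GHz) = GFlops.
def peak_gflops(shaders: int, flops_per_clock: int, clock_ghz: float) -> float:
    return shaders * flops_per_clock * clock_ghz

print(peak_gflops(384, 2, 0.500))  # GT 640M LE (Kepler, FMA): 384 GFlops
print(peak_gflops(384, 2, 0.497))  # HD 7660G at 497 MHz base: ~382 GFlops
print(peak_gflops(384, 2, 0.686))  # HD 7660G at 686 MHz boost: ~527 GFlops
print(peak_gflops(16, 16, 1.15))   # HD 4000 (16 EUs x 16 FLOPs/clk): ~294 GFlops
```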

I assume the split between fillrate, flops, and memory bandwidth is about an even one-third each. Shared-memory setups, when gaming, typically behave as if they have effectively half the memory bandwidth. That's why we see the GT 640M LE, with 2x the fillrate, flops, and effective memory bandwidth, outperforming the HD 4000 by 2x on average (results that deviate are down to game-specific issues or drivers).

The GT 640M LE's fillrate advantage should account for a 10% edge over Trinity, while memory bandwidth would account for about 30%. Combined, that's 40-45%, with larger observed differences likely due to the GT 640M LE systems pairing a superior CPU compared to the Trinity one.
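
One way to formalize that heuristic: average the component ratios with equal one-third weights, halving effective bandwidth for shared-memory parts. This is just the rule of thumb above turned into code, not a validated model, and the inputs are the spec figures listed earlier:

```python
# Equal-weight heuristic: average the flops, fillrate, and effective
# memory bandwidth ratios; shared-memory parts get half their bandwidth.
def specs(gflops, gtexels, bw_gbs, shared):
    return {"flops": gflops, "tex": gtexels, "bw": bw_gbs / 2 if shared else bw_gbs}

gt640m_le = specs(384, 16, 28.8, shared=False)
hd4000    = specs(192, 8,  25.6, shared=True)
hd7660g   = specs(382, 12, 25.6, shared=True)

def estimate(a, b):
    return sum(a[k] / b[k] for k in a) / len(a)

print(f"{estimate(gt640m_le, hd4000):.2f}x")   # ~2.08x vs HD 4000, matching the ~2x observed
print(f"{estimate(gt640m_le, hd7660g):.2f}x")  # ~1.53x vs HD 7660G
```

The second number lands a bit above the 40-45% estimated above, mostly because a plain average of ratios overweights the bandwidth gap.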
 
Last edited:

erunion

Senior member
Jan 20, 2013
765
0
0
The simple answer is that APUs just aren't up to the task yet. If the advantages were large enough, APUs would see wider adoption. But they aren't.

They aren't yet at an acceptable level of GPU performance to replace discrete GPUs, and ULV versions aren't readily available. Once Razer decided that they needed a discrete GPU to deliver the performance they wanted, Intel was the obvious CPU to pair with it.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
Stop using the term "use case"! I can't read a tech article without the author spouting off "use case" at least once. It's like the techie's "just saying". Unless you have a UML design doc, please refrain from saying "use case." :p
 

Sleepingforest

Platinum Member
Nov 18, 2012
2,375
0
76
Because the Razer Edge Pro isn't a conventional iPad-style tablet. It's a fully fledged computer. Think of it as an Ultrabook with no keyboard and a touch screen.