To be honest, I was really expecting AMD to use sideport memory like their current onboard GPUs do, but with GDDR5 instead, so it wouldn't be bottlenecked by a dual-channel RAM config.
I could see them do something like this, it is technically feasible, but this would not be an inexpensive product (which Llano is supposed to be).
Also, wasn't the sideport only 32 bits wide?
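For rough scale, here's a back-of-the-envelope bandwidth comparison, a minimal Python sketch assuming a 32-bit sideport and a 4000 MT/s effective GDDR5 data rate (neither figure is a confirmed Llano spec):

```python
# Peak theoretical bandwidth = bus width (bytes) x transfer rate.
# All figures below are illustrative assumptions, not confirmed specs.

def bandwidth_gb_s(bus_bits: int, transfers_mt_s: float) -> float:
    return (bus_bits / 8) * transfers_mt_s / 1000  # MT/s -> GB/s

# Dual-channel DDR3-1600: 2 x 64-bit channels at 1600 MT/s
dual_channel_ddr3 = bandwidth_gb_s(128, 1600)  # 25.6 GB/s
# A 32-bit GDDR5 sideport at an assumed 4000 MT/s effective
sideport_gddr5 = bandwidth_gb_s(32, 4000)      # 16.0 GB/s

print(f"Dual-channel DDR3-1600: {dual_channel_ddr3:.1f} GB/s")
print(f"32-bit GDDR5 sideport:  {sideport_gddr5:.1f} GB/s")
```

If the sideport really were only 32 bits wide, even fairly fast GDDR5 wouldn't beat dual-channel DDR3-1600 on paper; it would have to be wider or clocked much higher to help.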
Low-k = reduced power consumption, high-k = better performance, right?
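Roughly, as a rule of thumb: low-k insulation between interconnects lowers parasitic capacitance, and dynamic (switching) power scales with capacitance, while high-k gate dielectrics cut leakage. A toy sketch of the switching-power side, with made-up numbers:

```python
# Classic switching-power estimate: P = alpha * C * V^2 * f.
# All values are invented for illustration, not real process parameters.

def dynamic_power_w(alpha: float, cap_f: float, volts: float, hz: float) -> float:
    return alpha * cap_f * volts**2 * hz

C_conventional = 1.0e-9          # 1 nF of aggregate switched capacitance (assumed)
C_low_k = 0.8 * C_conventional   # assume a low-k dielectric cuts C by ~20%

for name, c in [("conventional", C_conventional), ("low-k", C_low_k)]:
    print(f"{name}: {dynamic_power_w(0.1, c, 1.2, 3e9):.2f} W")
```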
And one problem that shouldn't be overlooked is how to cool that RAM. GDDR5 really isn't in the same league as DDR3, so apart from the more expensive memory you'd also need some cooling solution, I'd assume.
When do you guys think we'll start seeing reviews?
Is a 10% boost clock-for-clock too high of an expectation?
They could just make a "premium" version with performance identical to or even faster than the HD 5570. I think GDDR5 is cheap enough right now that even the HD 6450 uses it, and I'd guess it would only cost them about $20 to add 512 MB of GDDR5 RAM on board.
No, GDDR5 is not that hot; even the HD 6990 just uses a simple backplate to cool it.
So even if Llano wasn't a budget solution, what would be the advantage of this over an external GPU? I don't see much.
Btw, if adding GDDR5 RAM is so expensive, then why can AMD put it in a $40 HD 6450 card?
If adding high-speed memory on a board to speed up integrated graphics made so much sense, they'd have put a far better one on the 790-class boards. But in reality, they added memory that's slower than last-generation single-channel memory, all for a reason.
Try to put even regular-bandwidth extra memory in there and I wouldn't be surprised if the cost ended up equal to buying a $40 video card separately. Even if it only adds half that, at $20 you significantly reduce the attractiveness of having integrated video in the first place.
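Putting the thread's hypothetical numbers side by side (the $20 sideport premium and $40 card price are the figures tossed around above, not real BOM data):

```python
# Toy comparison of the thread's hypothetical costs; not real BOM figures.
sideport_premium = 20  # assumed added board cost of 512 MB of sideport GDDR5
discrete_card = 40     # rough street price of a low-end card like the HD 6450

ratio = sideport_premium / discrete_card
print(f"The sideport premium is already {ratio:.0%} of a whole discrete card")
```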
The advantage is being able to make a very slim and lightweight gaming laptop or LAN-party rig.
Yeah, a backplate that's connected to some large cooler. Not exactly the same as having to be cooled passively in a stuffed case with dozens of cables hanging around.
Just having faster memory with a slow IGP still won't be especially useful for a gaming laptop (and I'd think that when designing the Llano IGP the engineers tried to hit a good balance), and as soon as you have an external GPU with enough power, you get the memory anyway.
As for the $40 HD 6450: because they already have the design for those cards and buy GDDR5 in bulk. A discrete card doesn't need two different memory controllers (one of which would have to be designed first).
What I want to know is: how much GPGPU power does Llano have? Since that's part of AMD's claimed reason for doing an integrated GPU.
An interesting benchmark would be running F@H on both the Llano CPU + GPU cores, compared to, say, a SB 2600K running F@H on just the CPU cores. (Granted, with the -bigadv WU bonuses, that comparison might be unfair.)
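One way to frame that is peak single-precision throughput. A rough sketch, assuming the rumored 400-shader, 600 MHz desktop Llano IGP and a stock 3.4 GHz 2600K using AVX; peak numbers like these say little about real F@H output:

```python
# Back-of-the-envelope peak single-precision GFLOPS.
# Shader count and clocks are assumed/rumored figures, not official specs.

def gpu_gflops(shaders: int, mhz: float) -> float:
    # Each shader ALU can issue a multiply-add (2 FLOPs) per clock.
    return shaders * 2 * mhz / 1000

def cpu_gflops(cores: int, ghz: float) -> float:
    # Sandy Bridge with AVX: one 8-wide add + one 8-wide mul per cycle
    # = 16 single-precision FLOPs per core per cycle.
    return cores * 16 * ghz

print(f"Llano IGP (assumed 400 SPs @ 600 MHz): {gpu_gflops(400, 600):.0f} GFLOPS")
print(f"Core i7-2600K (4C @ 3.4 GHz, AVX):     {cpu_gflops(4, 3.4):.0f} GFLOPS")
```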
I don't like the name Llano, wtf is that? No wonder it's bunk. I'll stick with ASUS and EVGA.
Advantages of on-die GPUs:
1) Price (cheaper to have one chip vs. one CPU + one discrete GPU with its own PCB/RAM/heatsink, etc.)
2) Latency (things being closer together on one chip means less latency)
3) Power usage (again, one chip vs. two)
4) Avoiding bottlenecks (apparently they can remove bottlenecks this way)
So "llano" > "athlon II + radeon 5550" in quite a few things.
There are weaknesses to this approach, though:
1) Memory bandwidth issues, because system DDR3 isn't as fast as a discrete card's GDDR5 (see the sketch after this list).
2) Everything in one place means more heat to deal with.
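A quick sketch of weakness #1, assuming dual-channel DDR3-1600 shared between CPU and IGP versus a 128-bit GDDR5 card at 4000 MT/s effective; the 40% CPU share is a made-up figure:

```python
# An IGP shares system DDR3 with the CPU; a discrete card gets dedicated GDDR5.
# All rates and the CPU-traffic share are assumptions for illustration.

def bw_gb_s(bus_bits: int, mt_s: float) -> float:
    return bus_bits / 8 * mt_s / 1000

system_ddr3 = bw_gb_s(128, 1600)       # 25.6 GB/s total, shared CPU + IGP
igp_effective = system_ddr3 * 0.6      # assume CPU traffic eats ~40% of it
discrete_gddr5 = bw_gb_s(128, 4000)    # 64.0 GB/s, all for the GPU

print(f"IGP effective:  ~{igp_effective:.1f} of {system_ddr3:.1f} GB/s shared")
print(f"Discrete GDDR5: {discrete_gddr5:.1f} GB/s dedicated")
```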
Wait, we weren't discussing external GPU vs. IGP; we were discussing external GPU vs. IGP + external GDDR5 sideport memory. Which is quite different, because it more or less gets rid of all four advantages you listed. An IGP is fine for absolutely everyone who doesn't game, and even for gamers it can be "good enough" if they lower their expectations.
What you're getting at is integrating higher-end GPUs on-die, which seems quite likely in the future (although more in the form of heterogeneous cores, with no strong distinction between CPU and GPU any more).
Will it play Crysis or Metro 2033? Probably not. But it can play HL2, Portal 2, Civ 5, SC2 (both), WoW, and hundreds of other popular games perfectly fine (in theory).
Honestly, I think the heat issues and memory bandwidth issues will mean that high-end discrete GPUs won't be going anywhere.
