Ivy Bridge to be 7-25% faster than Sandy Bridge, plus 3x the GPU performance


Kevmanw430

Senior member
Mar 11, 2011
279
0
76
That DV6 is fairly good stock, but I never ever recommend overclocking them. They run hot out of the box when gaming; overclocking them is just asking for pain.


Completely disagree. My DV6z is running 2.6GHz on the CPU at 68C after 5 hours of Prime95, and the GPU is at 70C after running MSI Kombustor and Furmark simultaneously for 4 hours at 880/950. I did repaste with AS5, but still, pretty good. I could probably get 2.8 on the CPU, but it's pointless. As always, though, it depends wholeheartedly on the system.

On another note, does anyone think Trinity could make the IVB IGP irrelevant performance-wise as soon as it's released? If AMD goes 28nm on the IGP, they could get some really awesome IGP performance. If Piledriver has better power consumption, then it could be a great deal. If it doesn't, though, then AMD is just done.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Haswell, now that's supposed to be 15X what present SB is. That's a game changer.

Please come back down to earth, Nemesis. :)

Just so you don't come back in 2013 and claim that it didn't live up to your expectations: they never claimed 15x. As for Sandy Bridge, it was supposed to be a 20% performance gain at the same price point, which is different from overall performance or performance per clock/core.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
Can someone please translate 3x the GPU performance of Intel HD2000 to something tangible? What approximate frame rate will you get in e.g. Battlefield 3 @ 1920x1200 with high quality settings?
 
Last edited:

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Can someone please translate 3x the GPU performance of Intel HD2000 to something tangible? What approximate frame rate will you get in e.g. Battlefield 3 @ 1920x1200 with high quality settings?

I'd guess in the 8-15 range with a lot of chugging from terrible bandwidth.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
I'd guess in the 8-15 range with a lot of chugging from terrible bandwidth.

But could the Intel HD4000 IGP possibly reach a playable 30 fps in BF3 if you lower the resolution and/or quality? Roughly how much do you think the resolution and quality settings would have to be lowered to reach 30 fps?
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I would guess you'd have to be at least down to 1680x1050 or 1600x900 and on customized medium-ish settings.

Personally I think this kick in IGP development is great. It will be a nice pick-me-up for PC gaming if people can play the more intense PC titles on their bottom-barrel OEM boxes, even if it is with lower settings. We've been needing this boost since Quake 2; I remember someone trying their hardest to play that on Intel integrated.
 
Last edited:

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Can someone please translate 3x the GPU performance of Intel HD2000 to something tangible? What approximate frame rate will you get in e.g. Battlefield 3 @ 1920x1200 with high quality settings?

>go to YouTube and take screenshots of the game
>paste them into PowerPoint
>press F5
>profit
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
GCN is going to have a very big cache system. And DDR4 is like 2 years from now.

It won't work, because graphics workloads require FAR more memory space than CPU workloads. And SRAM takes up a big chunk of die area. Even eDRAM takes quite a bit, if you look at it in terms of a GPU workload. GCN has larger caches to improve performance in compute, not 3D.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
It won't work, because graphics workloads require FAR more memory space than CPU workloads. And SRAM takes up a big chunk of die area. Even eDRAM takes quite a bit, if you look at it in terms of a GPU workload. GCN has larger caches to improve performance in compute, not 3D.

Yes, but with a ring bus on the L3 cache like Sandy Bridge, and the memory controller of Llano, we kill both problems.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Yes, but with a ring bus on the L3 cache like Sandy Bridge, and the memory controller of Llano, we kill both problems.

The effect of L3 cache sharing on Sandy Bridge, according to Linux-side benchmarks, is about 15%, which isn't groundbreaking. Plus Bulldozer has better memory controller performance than the K10 chips (and it in turn is eclipsed by the one in Sandy Bridge).

Better that the GPU vendors focus on getting new memory standards like GDDR6, or even XDR, and eventually TSV.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
The effect of L3 cache sharing on Sandy Bridge, according to Linux-side benchmarks, is about 15%, which isn't groundbreaking. Plus Bulldozer has better memory controller performance than the K10 chips (and it in turn is eclipsed by the one in Sandy Bridge).

15%? I thought it was way more.

Yeah, I don't know where they can squeeze more bandwidth...
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,977
1,276
126
Laptops are really starting to shine these days.

Even a lower end laptop with HD4000 graphics and Ivy Bridge is going to be more than enough for 95% of people. It would even be somewhat decent for gaming. Plus the low TDP will mean good battery life.

I'll be getting an IB laptop late next year to replace my wife's power-hungry AMD Turion X2.
 

Khato

Golden Member
Jul 15, 2001
1,288
367
136
15%? I thought it was way more.

Yeah, I don't know where they can squeeze more bandwidth...

Really, they can't. At least not practically when it comes to integrated graphics. The sharing of the L3 is great for some graphics operations - the hierarchical Z buffer and render cache if I'm remembering correctly. Basically there are a few things that don't take up much space but greatly benefit from the low latency and high bandwidth. But the rest of graphics memory operations, aka the majority of them, simply require too much space for an SRAM.

Now the typical response to this is that AMD/Intel should increase memory speed/channels or add side-port memory... all of which dramatically increase costs. (Not to mention, Intel has no need for more bandwidth on their graphics for a while anyway.) And for what's meant to be 'entry level' graphics anyway, increasing platform cost doesn't make sense... it's difficult enough to get OEMs to use anything other than DDR3 1333, ya know. The only case where it's meant to potentially be something more than 'entry level' graphics is on the mobile side, and there adding more memory is a bit more feasible, but then you have a separate design from desktop...
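To put rough numbers on the "increase memory speed/channels" option (a back-of-the-envelope sketch: DDR3-1333 is the speed mentioned above, the 256-bit width is the idea floated further down the thread, and the arithmetic is mine; these are peak theoretical figures, not real-world throughput):

# Peak theoretical bandwidth = bus width in bytes * transfer rate (efficiency ignored).
def peak_gb_per_s(bus_bits, mega_transfers):
    return bus_bits / 8 * mega_transfers * 1e6 / 1e9

print(peak_gb_per_s(128, 1333))  # dual-channel DDR3-1333: ~21.3 GB/s, shared by CPU and IGP
print(peak_gb_per_s(256, 1333))  # a hypothetical 256-bit setup: ~42.7 GB/s, still modest for 3D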

The most interesting possibility for integrated graphics, in my opinion, is eDRAM. Sure, you can't get huge memory sizes with it, but getting up to even 64MB of high-bandwidth memory through it would alleviate a fair deal of the issues. Better yet, since it's an on-package thing, I'd expect that a single design could be used either with or without the eDRAM.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
The most interesting possibility for integrated graphics, in my opinion, is eDRAM. Sure, you can't get huge memory sizes with it, but getting up to even 64MB of high-bandwidth memory through it would alleviate a fair deal of the issues. Better yet, since it's an on-package thing, I'd expect that a single design could be used either with or without the eDRAM.

The Xbox has 10MB of eDRAM; for the vanilla 720p panel in a notebook, that's enough IMO...

Wouldn't redoing the memory controller as 256-bit (8x32) be better?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I would guess you'd have to be at least down to 1680x1050 or 1600x900 and on customized medium-ish settings.

Personally I think this kick in IGP development is great. It will be a nice pick-me-up for PC gaming if people can play the more intense PC titles on their bottom-barrel OEM boxes, even if it is with lower settings. We've been needing this boost since Quake 2; I remember someone trying their hardest to play that on Intel integrated.

1920x1200 high in BF3 is a pretty tough objective for a laptop, especially with integrated graphics. In fact, I stuck with the 1600x900 screen on my new dv7tqe because I was worried about how the 6770M GPU would handle 1080p.
 

Khato

Golden Member
Jul 15, 2001
1,288
367
136
The Xbox has 10MB of eDRAM; for the vanilla 720p panel in a notebook, that's enough IMO...

Wouldn't redoing the memory controller as 256-bit (8x32) be better?

The 10MB of eDRAM on the Xbox was designed for manufacture on a 90nm process. Assuming linear scaling, the same area on 22nm would allow for around 160MB. Hence 64 or 128MB of eDRAM sounds reasonable in integrated graphics 2-3 years from now.
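A quick sanity check on that estimate (the 90nm, 22nm, and 10MB figures come from the post above; the assumption of ideal linear area scaling is mine and is optimistic for real eDRAM):

# Cell area scales with the square of the process node under ideal scaling.
old_node_nm, new_node_nm = 90, 22
xbox_edram_mb = 10
scale_factor = (old_node_nm / new_node_nm) ** 2   # ~16.7x denser
print(xbox_edram_mb * scale_factor)               # ~167 MB in the same silicon area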

Simply redoing the memory controller isn't difficult in and of itself... but the increased platform cost/complexity is quite prohibitive. Now, sure, SB-E would be a bit smaller if it didn't have as many PCI-E lanes, but you're still going to have a huge number of pins, which gets to be quite problematic when it comes to package size for mobile offerings - and every pin increases costs quite a bit as well, of course. Next, memory signal paths are extremely sensitive at our current speeds (shorter signal paths and being soldered to the same PCB are the primary reasons why graphics card memory can be run at higher speeds than system memory), which is why the SB-E motherboards end up being either 8 or 10 layers thick... compared to the typical 4 layers used on a socket 1155 SB motherboard PCB. In other words, you practically need 2 layers per 64-bit memory channel for routing purposes. More layers on the motherboard PCB end up being a considerable expense.

Anyway, all of that combined makes on package edram a far more economical choice, especially if there's enough memory space on it to sate the majority of graphics high-bandwidth traffic.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
It would be nice to see AMD deploying a close cousin to next gen consoles in their APUs. Being able to run any game that was designed for consoles would be great for mobile computing.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
It would be nice to see AMD deploying a close cousin to next gen consoles in their APUs. Being able to run any game that was designed for consoles would be great for mobile computing.

This.

Even 70-80% of next-gen console performance initially would be 'good enough' to game at the lower resolutions folks usually run anyway on laptops. If almost every laptop that Tom, Dick, or Harry buys can play most games, it's like being back just before 3D became widespread. You can just see a game, buy it, and play it. Everyday folks don't want to mess with drivers, or tweak resolution/quality controls. Most people struggle just to connect their BD player to their TV successfully. No joke.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
This.

Even 70-80% of next-gen console performance initially would be 'good enough' to game at the lower resolutions folks usually run anyway on laptops. If almost every laptop that Tom, Dick, or Harry buys can play most games, it's like being back just before 3D became widespread. You can just see a game, buy it, and play it. Everyday folks don't want to mess with drivers, or tweak resolution/quality controls. Most people struggle just to connect their BD player to their TV successfully. No joke.

My personal wish, unrealistic as it is, is that game developers would build in dynamically scalable graphics quality (kinda like tessellation in philosophy) that the game engine itself adjusts in real time in response to frame rates, with the goal of keeping the eye candy at a level that enables 60fps all the time.

Kinda like profiling, only it's within the game engine itself, and it would be on by default, with the option to turn it off for the geeks who want 120fps or don't mind playing at 30fps with better eye candy.

And because it's based on monitoring actual fps, it's hardware-sensitive and thus optimizes the gameplay for the individual's mish-mash of CPU, RAM, background processes, etc., not just the GPU make and model.
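Something like this toy loop is what's being described (a minimal sketch; the quality ladder, the per-level frame costs, and the render_frame stand-in are all made up for illustration, not taken from any real engine):

QUALITY_LEVELS = ["low", "medium", "high", "ultra"]   # hypothetical quality ladder
TARGET_FPS = 60.0

def render_frame(quality):
    # Stand-in for the engine's real render call: return a pretend frame time
    # in seconds (purely illustrative costs, higher quality = more expensive).
    return {"low": 0.008, "medium": 0.012, "high": 0.016, "ultra": 0.033}[quality]

def game_loop(frames=300):
    level = 1  # start at "medium"
    for _ in range(frames):
        frame_time = render_frame(QUALITY_LEVELS[level])
        fps = 1.0 / frame_time
        # Simple feedback: shed eye candy when the target is missed,
        # add it back only when there's comfortable headroom.
        if fps < TARGET_FPS * 0.95 and level > 0:
            level -= 1
        elif fps > TARGET_FPS * 1.25 and level < len(QUALITY_LEVELS) - 1:
            level += 1
    return QUALITY_LEVELS[level]

print(game_loop())  # settles on "high" (~62 fps) with these toy numbers

A real engine would add hysteresis and smoothing so the quality level doesn't flicker frame to frame, which is part of the CPU overhead BrightCandle raises below.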
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
My personal wish, unrealistic as it is, is that game developers would build in dynamically scalable graphics quality (kinda like tessellation in philosophy) that the game engine itself adjusts in real time in response to frame rates, with the goal of keeping the eye candy at a level that enables 60fps all the time.
...

The more recent Unreal games had a go at doing this. Rage is also doing some amount of this to maintain frame rates. In both cases, however, they didn't allow you to set a target FPS, and the effect isn't very noticeable. I like the idea in principle, but I also think I want to choose which features get turned off. I, for example, might not mind being dropped from 4x AA to 2x AA between frames, but I never want my rendered resolution to drop below native.

It's got to cost CPU time to do this, plus additional memory to hold lower-quality assets, so to some extent the ability to scale the graphics results in more CPU overhead and more memory usage. That actually runs contrary to the primary purpose of scaling as low and as high as possible almost automatically.

I am of the opinion that a lot more functionality for performance testing of cards should be in Windows/DX. That, combined with some user preferences, would allow games to query and set better defaults.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Well, I'm not sure how strictly they will obey in-game quality settings. :D

I watched an i5-2105 HD 3000 vs A8-3850 APU BF3 test. Settings were all low and, where selectable, off (like HBAO and AA), 2x AF, at 1366x768 with a 70-degree field of view. The A8 averaged 30 FPS (which I didn't expect), and the HD 3000 averaged 10. I don't know if the APU was running any DX11 stuff or not, though my assumption is yes, since if I remember right low/off settings will not enable DX11 things like tessellation, but don't quote me on that.

The video.....

If the HD 4000 is supposed to be around 60% faster, you can do the math. Luckily BF3 is well coded, so it seems to take advantage of how powerful the A8 actually is, at least in comparison to the HD 3000.
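That math, for what it's worth (assuming the ~60% figure holds and the result scales purely with the GPU, which it won't exactly):

hd3000_avg_fps = 10      # from the video described above
hd4000_speedup = 1.60    # "around 60% faster" - an assumption, not a measured number
print(hd3000_avg_fps * hd4000_speedup)   # ~16 fps, still roughly half of the A8's 30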
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
This.
Everyday folks don't want to mess with drivers, or tweak resolution/quality controls. Most people struggle just to connect their BD player to their TV successfully. No joke.

Heh. Isn't that the truth. And y'know, who can blame them?