bit-tech.net reveals Ivy Bridge details


OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
Depending on the price and overclocking potential vs. the 2500K, I might grab one of these. I very much like how Intel is giving these sneak previews and discussing the technology in their future products.

Also, 22nm plus water cooling plus lots of voltage makes me happy. Depending on how good the 22nm process is, we could see some crazy overclocking headroom on these chips. Again, all speculation, but I'm still crossing my fingers.
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
All of these new CPUs coming out seem to be incremental upgrades. Ivy Bridge looks like it will be 10% faster than Sandy Bridge. Bulldozer looks to perform worse clock-for-clock compared with Phenom II, and then Bulldozer's successor is supposedly 10% faster.

It seems as though the days when we would see CPUs released that were an order of magnitude faster than anything else on the market are gone. It's too bad.

It's looking like my Phenom II setup is going to last me at least another year in terms of gaming.

Depends on your setup and what applications you care about. For me, my Phenom II setup was bottlenecking my video card setup massively. When I swapped to my 2500K setup I saw minimum frame rates double and averages increase by about 50% in some games (Metro 2033, for example).
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Metro 2033 is a bad game that just hogs system resources without looking significantly better than other games out there. In fact, I would argue that Crysis 2 has better graphics while consuming fewer resources.

I find it hard to believe that a 2500K is a significant boost over a Phenom II in terms of gaming. I'm VRAM-limited more than anything else in games. Otherwise I find my system to be a potent and balanced gaming platform.

I will admit that Sandy Bridge was a nice CPU upon its release.

The thing is, I don't upgrade my CPU unless the upgrade will at least double my current performance.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
He is certainly exaggerating the differences there. It's funny how I used to talk about real CPU bottlenecks and people would argue with me, yet now every person with a 2500K can just exaggerate like crazy and not even get criticized.

Even at just 1680 and only high detail, there is basically zero difference between the Phenom II and the 2500K.


 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
He is certainly exaggerating the differences there. It's funny how I used to talk about real CPU bottlenecks and people would argue with me, yet now every person with a 2500K can just exaggerate like crazy and not even get criticized.

Even at just 1680 and only high detail, there is basically zero difference between the Phenom II and the 2500K.



I said it depends on your setup. Single-GPU systems are rarely held back by a Phenom II, but crossfire two high-end overclocked GPUs and a Phenom II becomes a pretty significant bottleneck.

http://forums.anandtech.com/showthread.php?t=2187849

That's a thread I made a month or so ago regarding my performance after the upgrade. I had some issues with Unigine, but my Metro before-and-after results are in that thread as well. Those results are from before I overclocked my CPU to 4.5 GHz.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Yep, that's the thing. My Phenom II X4 at 3.8GHz is pretty potent for gaming, especially with my CPU-NB up at 2.4GHz. If anything I'm GPU-limited, even with a pair of GTX 460s.

Now, if I had a pair of 580s in SLI, I would probably want a 2500K at the very least; however, I don't know too many people with $1000+ to drop on a rig to play mostly second-rate games and console ports.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
All of these new CPUs coming out seem to be incremental upgrades. Ivy Bridge looks like it will be 10% faster than Sandy Bridge. Bulldozer looks to perform worse clock-for-clock compared with Phenom II, and then Bulldozer's successor is supposedly 10% faster.

It seems as though the days when we would see CPUs released that were an order of magnitude faster than anything else on the market are gone. It's too bad.

It's looking like my Phenom II setup is going to last me at least another year in terms of gaming.

Pollack's rule.

Pollack's Rule states that microprocessor "performance increase due to microarchitecture advances is roughly proportional to [the] square root of [the] increase in complexity". This contrasts with power consumption increase, which is roughly linearly proportional to the increase in complexity.

[Charts: processor performance and power consumption versus complexity, illustrating Pollack's Rule]


Changes early on are substantial. Later on, not so much.
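To put rough numbers on that rule of thumb, here is a minimal sketch (my own illustration, not from the original post; the function name and sample ratios are made up) of what Pollack's Rule predicts when you spend more transistors on a core:

```python
import math

def pollack_estimate(complexity_ratio):
    """Illustrative only: Pollack's Rule says performance grows roughly with
    the square root of core complexity, while power grows roughly linearly."""
    perf = math.sqrt(complexity_ratio)   # e.g. 2x the transistors -> ~1.41x performance
    power = complexity_ratio             # e.g. 2x the transistors -> ~2x power
    return perf, power

for ratio in (1.25, 1.5, 2.0, 4.0):
    perf, power = pollack_estimate(ratio)
    print(f"{ratio:.2f}x complexity -> ~{perf:.2f}x perf, ~{power:.2f}x power")
```

So even quadrupling the transistor budget only buys about 2x the performance by this rule of thumb, which is why the later jumps feel so incremental.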
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Perhaps God does not want us to create a computer that is more powerful than the human brain. :)

It looks as though I'll be using my Phenom II for quite some time into the future. :D
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I said it depends on your setup. Single-GPU systems are rarely held back by a Phenom II, but crossfire two high-end GPUs and a Phenom II becomes a pretty significant bottleneck.

http://forums.anandtech.com/showthread.php?t=2187849

That's a thread I made a month or so ago regarding my performance after the upgrade. I had some issues with Unigine, but my Metro before-and-after results are in that thread as well. Those results are from before I overclocked my CPU to 4.5 GHz.
My bad, I thought you had a single 6970 there. As for the minimums in the Metro 2033 benchmark, they are almost useless though. With a flyby bench like that, they do not reflect real in-game minimums at all because it's usually just ONE dip. Heck, I can sometimes get lower minimums in that benchmark running lower settings.
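For what it's worth, this is why people increasingly look at percentile lows rather than the single worst frame. A minimal sketch with made-up frame-time numbers (not data from this thread) shows how one isolated dip tanks the absolute minimum while the 1% low barely moves:

```python
# Hypothetical frame times (ms) for a ~60 fps run containing a single hitch.
frame_times_ms = [16.7] * 995 + [16.9, 17.1, 17.4, 18.0, 50.0]

fps = sorted(1000.0 / t for t in frame_times_ms)       # instantaneous fps, worst first
absolute_min = fps[0]                                   # the single worst frame
one_percent_low = fps[int(len(fps) * 0.01)]             # 1st percentile ("1% low")

print(f"absolute minimum: {absolute_min:.1f} fps")      # ~20 fps, set entirely by the one dip
print(f"1% low:           {one_percent_low:.1f} fps")   # ~60 fps, closer to what you actually feel
```

Same data, very different story depending on which "minimum" you quote.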
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
My bad, I thought you had a single 6970 there. As for the minimums in the Metro 2033 benchmark, they are almost useless though. With a flyby bench like that, they do not reflect real in-game minimums at all because it's usually just ONE dip. Heck, I can sometimes get lower minimums in that benchmark running lower settings.

True, Metro isn't exactly the best engine in the world, nor will benchmarks ever 100% reflect real-world gaming. It was the only game I had before-and-after benches for. It's possible, and likely, that the other games saw less benefit from the upgrade than the significant improvement in Metro 2033.
 
Aug 11, 2008
10,451
642
126
CPU progress does seem to have plateaued, all right. I think pretty much any technology does this, though. And even Intel seems to be emphasizing other factors instead of brute CPU power, such as graphics and new instruction sets. Personally, I wonder if Ivy Bridge is taking the right track by improving graphics much more than CPU power. For the low to mid range this may be OK, but for the high end I cannot imagine not using a discrete GPU with such a powerful CPU.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
True, Metro isn't exactly the best engine in the world, nor will benchmarks ever 100% reflect real-world gaming. It was the only game I had before-and-after benches for. It's possible, and likely, that the other games saw less benefit from the upgrade than the significant improvement in Metro 2033.
The one game that I find is bottlenecked by a Phenom II is StarCraft 2.

You're probably correct in stating that Metro 2033 is as well; however, IMO it's a bad game, and it's certainly not worth investing money in hardware just to run it properly.

I've found that my computer will run any game at 1920x1200 as though it's some sort of joke. The FPS stays pegged at 60 pretty much constantly, so long as I'm not memory-limited on my GPUs (and that can usually be solved by tweaking a few settings).
 

OVerLoRDI

Diamond Member
Jan 22, 2006
5,490
4
81
The one game that I find is bottlenecked by a Phenom II is StarCraft 2.

You're probably correct in stating that Metro 2033 is as well; however, IMO it's a bad game, and it's certainly not worth investing money in hardware just to run it properly.

I've found that my computer will run any game at 1920x1200 as though it's some sort of joke. The FPS stays pegged at 60 pretty much constantly, so long as I'm not memory-limited on my GPUs (and that can usually be solved by tweaking a few settings).

I don't play SC2, but yes, it is CPU-limited in general. That game is where SB shines in gaming vs. Phenom II, for sure.

My screen is 1920x1200 at 85Hz, so I need a bit more than being pegged at 60. I also like my eye candy ;)

Anyways this is off topic.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
A high post count, then, is indicative of a high number of irrelevant posts such as this one. It is not indicative of the quality of those posts or the intelligence and expertise of the poster.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
A high post count, then, is indicative of a high number of irrelevant posts such as this one. It is not indicative of the quality of those posts or the intelligence and expertise of the poster.
You got it. :thumbsup::thumbsup:
 

greenhawk

Platinum Member
Feb 23, 2011
2,007
1
71
however I don't know too many people with $1000+ to drop on a rig to play mostly second-rate games and console ports.

While I was thinking of upgrading to SB from my Q6600, that second part of the post is the reason I am holding off for either IB or something after it. Double the performance is on offer currently, but the tasks I do that would benefit from it come around once in a blue moon. The games I play do not appear to be affected by it. Even StarCraft 2, which is meant to be CPU-limited, runs fine as far as I can tell.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
There have been very few CPUs with the longevity of the Q6600. They cost north of $200 when they came out, but if you got 4 years out of one, more power to you.

I almost bought one but I thought they were too expensive at the time. I'm a cheapskate with CPUs. I don't typically pay more than $100 for one.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
While I was thinking of upgrading to SB from my Q6600, that second part of the post is the reason I am holding off for either IB or something after it. Double the performance is on offer currently, but the tasks I do that would benefit from it come around once in a blue moon. The games I play do not appear to be affected by it. Even StarCraft 2, which is meant to be CPU-limited, runs fine as far as I can tell.

Erm, in such a case, unless you want to spend a lot on your CPU, wait till Haswell; it brings in a load of new improvements. IB, on the other hand, has most of its improvements in the IGP, while the CPU part will see only a minor speed-up in most cases (read: 10-odd percent). You'd be better off buying maybe an HD 7xxx card. I personally have an HD 6970 paired with an E6750.
 

zlejedi

Senior member
Mar 23, 2009
303
0
0
Hmm, I wonder if the gains will be worth it over the 2500K on the desktop front.

But for laptops, I have an Asus UL30 which still runs a 45nm Core 2 CULV, so an IB ultra-mobile chip will be a perfect replacement (especially if that IGP can run World of Tanks decently).
 

carnage10

Member
Feb 26, 2010
38
0
0
As usual, Intel provides in-depth information about their next gen well in advance. Seems they aren't worried about "cannibalizing their own sales" or "handing information to their competitors" the way JF-AMD is.

It's a strategy that works well, because anyone in the market to build a new PC now will be tempted to hold off on buying BD. If BD were going to be a leap in performance, AMD would have done the same thing months ago to stem the tide of SB sales. If you have a product worth hyping, then Intel's approach makes for much happier shareholders/enthusiasts. On the other hand, if you have a product that has been delayed, set back, and will likely not live up to initial expectations, then I guess AMD's approach is better for them.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
2 months, 6 months, or 3 years is still not "delivered on schedule" and that was my point. AMD's delays are pretty ridiculous though.

This.

We need to stop comparing Intel to AMD. There is no more competition from AMD, and I don't see that ever changing, unfortunately.