
Ivy Bridge stability testing

Err... put in however many GPUs you want; your FPS isn't going to go down. CPU requirements don't really scale with resolution.

The whole point of all the recent microstutter work was that you eventually reach a latency below which people can't perceive a change. For him, it's above that threshold.

I'd also argue that, in general, someone's perceptual threshold is lower when actually playing than when sitting there trying to evaluate a threshold. I know that when I evaluated my own threshold long ago, I basically couldn't tell a difference until I was skipping more than one refresh while actually playing the game, but I could tell smaller latencies when I wasn't playing and was just evaluating my eyesight.
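A quick back-of-the-envelope on what "skipping more than one refresh" means in numbers, assuming a 60 Hz display (the refresh rate is my assumption, not stated in the thread):

```python
import math

# How many 60 Hz refresh intervals a single frame occupies. A frame that
# occupies 2 intervals has "skipped" one refresh; 3 intervals skips two.
def refreshes_per_frame(fps: float, refresh_hz: float = 60.0) -> int:
    frame_ms = 1000.0 / fps
    refresh_ms = 1000.0 / refresh_hz
    # small epsilon so exact multiples (e.g. 30 fps on 60 Hz) don't round up
    return math.ceil(frame_ms / refresh_ms - 1e-9)

refreshes_per_frame(60)  # 1 (no skipped refreshes)
refreshes_per_frame(30)  # 2 (one skipped refresh)
refreshes_per_frame(20)  # 3 (two skipped refreshes)
```

So the "more than one refresh" point above corresponds to frame rates dipping toward 30 FPS and below on a 60 Hz screen.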

Certainly there are differences between people, but I don't believe the BS that people can really tell the difference between anything above 60 FPS while actually caring about the game, rather than specifically testing their eyes. If you're claiming you can, I bet you're a terrible player.
 
"The guy" is a key phrase there, meaning exactly 1 person. There are a lot of other individuals who would notice. I'd say most PC gamers could easily appreciate the 33% improvement. Yes, there is a point where more performance doesn't really do much. 33% from Gulftown to Haswell in BF3 is far from being it IMO.

My experience is the opposite. I actually tested myself and a couple of friends a few years ago when we were playing Battlefield: Vietnam, and none of us could really tell a difference until the minimum FPS was near 30. We'd have FRAPS running, make a mental note whenever we noticed something, and then go back and check FRAPS. For none of us was that above 40.

I'm sorry, but I'm going to have to ask for something more solid than one dude telling me there are "a lot" of people who can tell.
BS.
What there are "a lot" of are people who talk on the internet like they can tell... I mean, they have to justify their spending on hardware. Or think they can, but have never actually tested themselves.

Have you actually tested your capability the way this guy did? I have. I had a matrix of under-/overclocked CPU and GPU settings and had someone else make the changes so I wouldn't know which was which. I could only tell in specific portions of certain maps where frame rates really tanked. I definitely could not tell the difference between 60 and 80.
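The blinded setup described here is easy to script. A minimal sketch (the config names and clock values are hypothetical) where a helper shuffles the run order so the player never knows which setting is active:

```python
import random

def blind_schedule(config_names, trials_per_config=3, seed=None):
    """Shuffle trial labels; the helper reveals the mapping only after scoring."""
    rng = random.Random(seed)
    schedule = [name for name in config_names for _ in range(trials_per_config)]
    rng.shuffle(schedule)
    return schedule

# Hypothetical under/overclock matrix: (CPU GHz, GPU clock multiplier)
configs = {"cpu_stock_gpu_stock": (3.4, 1.00),
           "cpu_oc_gpu_stock":    (4.2, 1.00),
           "cpu_stock_gpu_oc":    (3.4, 1.15)}

# Six runs in random order; score each run blind, then unblind and compare.
order = blind_schedule(configs, trials_per_config=2, seed=1)
```

The point of the shuffle and the second person is simply to keep expectation bias out of the "can I tell?" judgment.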
 
I was NOT referring to the gameplay experience. You are indeed correct regarding gameplay: once you get to ~45 FPS and above, it's very hard to tell the difference between 60, 90, or 120 FPS. For the few who claim they can tell such a difference: lucky you.

I was referring to the actual FPS number. With quad Titans for GPUs, Haswell will walk all over Gulftown; the FPS difference will be night and day.
 

Clearly you have your own biases, so what I say doesn't matter. One guy said he couldn't tell the difference, and that was gospel; one guy saying many can tell the difference is BS to you... That right there tells the whole story, and there's no need to go further with the discussion. If you can't tell the difference between a new CPU and one that's 3 generations old, that's great: you'll save a lot of money. To claim everyone else is BSing because they need to justify their purchase, however, is silly at best.

I could be just as silly and say that people who think Gulftown gives the same gaming experience as Haswell are simply trying to convince themselves that, since they can't afford or are unwilling to pay for a new system, it makes no difference.

Now, I won't actually say that, because it would be absurd of me, but it would be along the same lines as what you're saying, just in the other direction.
 
It wasn't enough in 2007 either. In the Barton-core Athlon XP days, I had an OC that would pass 24 hours of Prime95, Small FFT / Large FFT... it didn't matter, it would pass. But if I launched a certain application, I would reliably blue-screen in less than a minute. I forget what that application was... I think it was a game.

I don't remember if I stabilized it with more voltage or lower clocks. It doesn't really matter; the point is that Prime95 was never enough on its own. That experience is when I started backing the clock off by 5% at the same voltage from wherever my stress tests were stable.
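The 5% back-off rule of thumb is just arithmetic, but as a helper (the example frequency is made up):

```python
def backed_off_mhz(stress_stable_mhz: float, margin: float = 0.05) -> float:
    """Daily-driver clock: back off `margin` from the highest stress-test-stable clock."""
    return stress_stable_mhz * (1.0 - margin)

backed_off_mhz(4600)  # 4370.0 MHz, at the same voltage
```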

Hmm, well maybe I just got lucky (or unlucky now). The thing is, I play the RTS Forged Alliance; it's old by now, but it's a real CPU killer. I first played it on a Core 2 Duo: Prime-stable, no crashes in game. Then I upgraded to an i5 750: LinX-stable, no crashes in game. Now I have an i5 3570K: LinX-stable, lots of crashes in game.

Agreed, on its own it was never enough, nor was any one application. It was also never "quick": even when I was stress testing single-core Socket A Athlons, a 24-hour run was the recommended norm.

I meant quick as in a 24-hour test to prevent having a crash months later.

And were there any noticeable in-game improvements from the overclock, i.e. frame rate, etc.? The OP seems to be asking that question indirectly; someone else posted the answer: anything over 4.6 GHz and the gains fall off quickly.

Like I said, I play FA, and it's purely single-threaded performance that matters. If the game gets big, 4.6 GHz is still not enough. At stock speed it already starts lagging around the 30-minute mark.
 

It's not what one guy said that convinced me.
1) He actually had the experience of a live test. You only issued a statement, with no suggestion that you had actually tested it.
2) His experience backs up my own from when I did FRAPS testing with myself and 2 buddies.
3) Everyone WILL have a perceptual threshold somewhere, above which they can't really tell a difference in performance, and I totally believe that's below 60 FPS for the vast majority of people. The question is not whether a threshold exists, but where it sits for the majority of gamers. My current theory is that the median is probably around or below 40 FPS.

It was less his words that convinced me, and more my own testing experiences. When I tested myself I was surprised as hell that frame drops weren't really noticeable until they got into the mid 30s. I was so surprised, that I asked a couple friends to do the same testing. They were right around the same level of perception.

Now, 3 people doesn't prove it's everyone, but that's still more people than I know of who have tested themselves and come up with a threshold well over 60 FPS.

There's a lot of value in testing your threshold and it's not even that difficult. Download FRAPS and record frame rates while you play. When you notice a frame drop, check the frame rate log around the same time and see how low it dipped. Lower your OC until you start noticing drops.
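The self-test described here is easy to script once you have a per-second FPS log. A minimal sketch assuming one FPS sample per line (the exact FRAPS file layout is my assumption; adjust the parsing to whatever your capture tool actually writes):

```python
def find_dips(fps_samples, threshold=40.0):
    """Return (second, fps) pairs where the frame rate fell below `threshold`."""
    return [(i, fps) for i, fps in enumerate(fps_samples) if fps < threshold]

samples = [62, 60, 58, 33, 35, 61, 59, 28, 60]  # made-up per-second readings
find_dips(samples)  # [(3, 33), (4, 35), (7, 28)]
```

Cross-referencing the timestamps of the dips against when you *felt* a stutter is the whole test: if you only noticed the dip to 28 and not the one to 33-35, your threshold sits somewhere in between.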

It's pretty useful information. I know it's helped me be a lot more objective and less emotional with my own hardware purchasing decisions. I urge you to do the testing yourself and see what you can perceive. You might be as surprised as I was.

Edit:
I edited to attempt to change the tone. First version came out sounding pretty condescending when I re-read it, which wasn't intentional. I hope this version is much less so.
 
Just thought I would add: after many hours of messing around with Arena (lots of WHEA errors btw, coffee) and lots of encoding, I seem to be stable at a 0.120 V offset, 1.320 V in CPU-Z.
 
So, can your overclocked Ivy (or Sandy) CPU have these engines battle it out without BSODs, error messages, or WHEA errors? How about when you're doing some heavy-duty browsing in the meantime? Or maybe you have yet a different way of testing for stability?

Prime95 and Aida64 did it for me. Prime95 was the hardest and would reliably crash after ~3 hours, whereas Aida64 and games were stable. Of course, you also keep an eye out for unexpected crashes, and especially reboots or BSODs, during the first couple of weeks or even months after an OC. If it was 24-hour "Prime95 stable", it usually just takes a small tweak to get it stable should it crash in some other game/app.

Then there are those who simply ignore tests like Prime95 Small FFTs or Aida64 FPU-only because their temps get too high (they can easily run 20C hotter than games), and then call it a stable overclock... A stable overclock should be able to handle anything, for any amount of time.
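WHEA errors come up repeatedly in this thread as the stability signal to watch. On Windows they are logged in the System event log by the Microsoft-Windows-WHEA-Logger provider; a sketch that counts them in a plain-text export of that log (the export layout shown is an assumption for illustration):

```python
def count_whea_errors(log_lines):
    """Count event-log lines attributed to the WHEA-Logger provider."""
    return sum(1 for line in log_lines if "Microsoft-Windows-WHEA-Logger" in line)

# Hypothetical text export of the System event log
export = [
    "Information  Microsoft-Windows-Kernel-General  boot",
    "Warning      Microsoft-Windows-WHEA-Logger     corrected hardware error",
    "Warning      Microsoft-Windows-WHEA-Logger     corrected hardware error",
]
count_whea_errors(export)  # 2
```

Corrected WHEA warnings don't crash the machine, which is exactly why people miss them: an OC can look "stable" while quietly logging hardware errors.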
 

Nice, happy it helped you. Handbrake worked better than Prime for me too, but not as well as Arena. Maybe I should try encoding one of those 25GB Blu-rays.


That's the thing: for me, 24-hour Prime didn't get close at all; I still needed a large tweak. Also, different games have different requirements. For example, BFP4F (not exactly what one would consider a system killer) required more vcore than COD4 to run without WHEA errors, although both needed more than Prime/LinX. Anyway, I'd like you to try Arena; it doesn't take long and at the very least gives you another data point.
 