Am I stable?

Vlip

Member
Mar 19, 2007
32
0
0
Hello,

I've got my E4300 running at 2.97 GHz (300 FSB) and 1.3850V. I can run Orthos all day and night without any issues, but TAT (100% load on both cores) causes a system reboot in under 10 minutes.
Should I consider this a stable overclock or not? I would include all my other system info but I don't think it is relevant to the question. If anyone thinks otherwise, please let me know and I will post the rest of the BIOS settings and hardware specs.

Thanks,
Vlip
 

SerpentRoyal

Banned
May 20, 2007
3,517
0
0
You don't run Orthos or TAT under normal use. Therefore, if the PC is stable under normal use, then there should not be any stability issue. I only load Orthos for about 1 to 2 hours.

I would also run Memtest86 test #5 for 50 loops to check the RAM.
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
I would say that the most brutal, yet realistic, normal use would be video encoding. It's easy to see if you're stable with this.

Use x264, and encode a DVD overnight using the slowest (highest quality) settings. It will take a _long_ time, and it will stress the bejezus out of your CPU. If you can do that without the program exiting prematurely or hitting thermal shutdown, you're good to go :)

That, or a 3-hour-long session of SupCom :)

~Misfit
 

Vlip

Member
Mar 19, 2007
32
0
0
I have run a full pass of Memtest86+ without errors.
I am interested in the x264 test. DVD authoring is what I do most on this PC. I will Google it, as I do not know where to get it.

Thanks,
Vlip
 

SerpentRoyal

Banned
May 20, 2007
3,517
0
0
And if you failed that test, then we're back to the TAT question. You need to define what counts as stable for your rig yourself, not let some benchmark program define it. Some test programs will not work properly with your particular rig.
 

Vlip

Member
Mar 19, 2007
32
0
0
Good point. I actually haven't tried running TAT with no overclocking!

Thanks for the responses.

Now I can move on and try to get to even higher speeds. My target is 3.2 to 3.4 GHz. While I've seen a number of posts saying this is doable, I don't know how exaggerated the claims are.

Vlip
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Keep in mind that there are different ways an OC-related issue can show up.

When your CPU makes a mistake, there is some chance you notice - for example, due to a program crashing, or visual artifacts in a game - but there is also a chance that you don't notice. There are different errors a CPU can make that have different results.

One error that would not cause any problems is a mistake in a branch prediction. Since the CPU knows the branch predictor is often wrong, it checks every prediction and recovers whenever one turns out to be wrong. Since the branch predictor doesn't affect correctness, the worst-case result would be a tiny slowdown. Another error that wouldn't matter is one that corrupts information that isn't reused. Every time the CPU adds two numbers, it tracks some "flags" that note whether the result is positive, negative, or zero. If a flag is miscalculated - but isn't checked - nothing bad happens.

An error that would cause a crash might be a mistake in a TLB, or a mistake reading / decoding an instruction (which could result in the CPU trying to do something illegal). Depending on what code happens to be running when this happens, the result could be a single application crashing or the whole machine crashing.

An error that could silently corrupt data would be a bit flipping in a floating point calculation. If, say, you're doing your taxes and your CPU is overclocked, it's possible that one of the bits of a floating point calculation doesn't get back into the register file before the clock ticks, so the wrong answer is stored. You might not notice so long as the error doesn't cause an obvious discrepancy... that's silent data corruption. Another example would be a media encoding app that sometimes corrupts video frames. You won't notice until you try to watch the video later.
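
To make the silent-corruption case concrete, here's a rough C sketch (a toy of my own, not taken from any real workload; the input numbers and the choice of bit are arbitrary) showing how small a single flipped mantissa bit can look:

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void) {
    double correct = 1234.56 * 0.0725;        /* stand-in for some tax math */
    uint64_t bits;
    memcpy(&bits, &correct, sizeof bits);     /* view the double as raw bits */
    bits ^= 1ULL << 20;                       /* pretend one mantissa bit got lost on the way to the register file */
    double corrupted;
    memcpy(&corrupted, &bits, sizeof corrupted);
    printf("correct:   %.17g\n", correct);
    printf("corrupted: %.17g\n", corrupted);  /* differs only far down the decimals */
    return 0;
}

The two values differ somewhere around the eighth decimal place, which is exactly why nobody notices until the totals stop adding up.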

Given a flaky CPU (which intermittently makes a specific type of mistake, e.g. calculating an incorrect multiplication), it's very possible for some applications to crash horribly (e.g. assuming that a multiplication result won't be zero, and dividing by the result of the multiplication) and for another application to appear to work, but produce subtly incorrect results. If there is any test that fails with a certain OC, it's risky (at best!) to consider that OC "stable for other tasks".
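
And here's an equally rough sketch of the "one app crashes horribly, another just produces wrong numbers" split. flaky_mul() is a made-up stand-in for hardware that intermittently drops a bit of a product, not anything real:

#include <stdio.h>

/* Stand-in for a multiplier that intermittently drops a bit of its result.
   With inputs 4 and 8, losing bit 5 turns the product 32 into 0. */
static int flaky_mul(int a, int b) {
    return (a * b) & ~32;
}

int main(int argc, char **argv) {
    (void)argv;
    int product = flaky_mul(argc + 3, 8);   /* argc is 1, so this "should" be 32 */

    /* app B: just records the value -> runs fine, the output is silently wrong */
    printf("scale factor: %d\n", product);

    /* app A: assumes the product can't be zero and divides by it ->
       integer division by zero, which on x86 kills the process (SIGFPE) */
    printf("per-item cost: %d\n", 1000 / product);
    return 0;
}

Same wrong answer out of the multiplier both times; one caller dies on the spot, the other happily prints garbage.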
 

SerpentRoyal

Banned
May 20, 2007
3,517
0
0
I always use the 1-to-2-hour Prime/Orthos rule. Over the last 6 years of overclocking, I have never had any problem with data corruption. The worst case is a spontaneous reboot, which would only damage the data I'm working on at the time. Bad RAM can do much more harm to data than a CPU that won't pass every benchmark under the sun. Benchmarking software is designed to stress the CPU to the limit of failure, usually by introducing HEAT. I have yet to see tax software that would even load a modern dual-core CPU to 60%.
 

themisfit610

Golden Member
Apr 16, 2006
1,352
2
81
VERY true, SerpentRoyal.

Give x264 a try though - look up MeGUI, StaxRip, AutoMKV, or HandBrake. These are all nice apps that wrap a GUI around x264 and are designed primarily for DVD backup.

~MiSfit