I have a Radeon 4670.
I used to get errors in OCCT when temperatures went above 70 degrees or so. The errors were more frequent the hotter the GPU got. (And if I kept temps below 70, no errors.)
I thought it was clearly a hardware thing, so there wasn't much I could do about it.
Then I updated the video driver and now I don't get any errors, even at higher temperatures. (The temps are the same as before when testing.)
What's the explanation? Did the driver increase voltage or reduce the clock? If so, wouldn't this show in different temperatures at least?
Why would a higher temperature alone produce errors with one driver but not with the other?