
[GeForce Forums] Nvidia has officially blocked 900M overclocking

Status
Not open for further replies.
What is up with Nvidia these days? It's like they are trying to see how far they can push and how much they can abuse their own customers before the customers crack.


Disabling overclocking? My guess is that the reasoning for this is that they currently stratify mobile GPUs by clock speed (and often nothing else), so overclocking basically gives you a more expensive GPU for free. But so what? It was still binned lower.

I just don't get how a company so beloved by its fans can treat them so poorly.

Nothing will happen. Some people on forums like these will talk about it, maybe rant, but in reality nothing will change. Nvidia will play their game while their supporters downplay anything they think will damage Nvidia's reputation.
 
This really sucks.

I run my 660M at 1085/2500 (+135 MHz on core) and get a nice 10% gain in FPS. The laptop has no trouble cooling it because there is no voltage increase. This is especially crappy for Maxwell, because Maxwell overclocks like a champ with no voltage increase, and the power gain from overclocking is minimal compared to the performance gain.

It's really looking like I'm going to stay away from Nvidia for the foreseeable future.
Exactly, it blows chunks.

I've got my 650M SLI overclocked by 46% on core (1150 MHz vs. 790 MHz) with a +50 mV overvolt and 13% on VRAM (4.5 GHz vs. 4 GHz), which nets me a good 40% performance increase across the board. That's like buying a whole 'nother tier of GPU. And how much did temps increase? 5 degrees, from 70C to 75C. Big deal.

This overclocking is the reason I was able to match the performance of 770M and 680M notebooks costing twice as much as mine at its time of purchase!

It's clear this move was influenced in a big way by planned obsolescence and Nvidia's need to sell rebadged GPUs over successive generations.
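As a rough sanity check on numbers like these: for a GPU-bound workload, frame rate scales roughly with core clock, while memory-bound scenes track the VRAM clock instead. A minimal sketch using the clocks quoted above, with an assumed 70/30 core/memory weighting chosen purely for illustration (not a measured value):

```python
# Estimate the performance scaling from a core + VRAM overclock.
# The 0.7/0.3 weighting is an illustrative assumption, not measured data.

def blended_speedup(core_old, core_new, mem_old, mem_new,
                    core_weight=0.7, mem_weight=0.3):
    """Weighted estimate of speedup from core and memory clock increases."""
    return (core_weight * (core_new / core_old)
            + mem_weight * (mem_new / mem_old))

# Clocks from the post above: core 790 -> 1150 MHz, VRAM 4.0 -> 4.5 GHz
est = blended_speedup(790, 1150, 4000, 4500)
print(f"Estimated speedup: {est:.2f}x")  # ~1.36x, in line with the ~40% reported
```

Under that assumption the estimate lands close to the 40% gain the poster reports, which is why clock-for-clock scaling claims like this are at least plausible.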
 
I've been overclocking laptop GPUs (primarily Nvidia) for years. The key to preventing throttling is to use ThrottleStop to limit CPU turbo, which leaves more thermal and power headroom for the GPU to work with.

It is perfectly safe to OC a mobile GPU as long as you stay within the thermal limits of the notebook. My Macbook Pro's 650M is clocked at 775/2000 MHz, but my OC profile is 1050/2700 MHz with zero throttling. If I let my CPU turbo itself, then the GPU may throttle, because the dinky little PSUs notebooks come with are usually the limiting factor.

Because these chips have locked voltage, temperatures are not an issue unless the notebook OEM did not apply enough TIM to cover the GPU die or it has really crappy design.

The ultrabook form factor can easily house an OC'd GPU, as long as you compensate for the increased power usage by lowering the CPU clock speed.
 
My Macbook Pro's 650M is clocked at 775/2000 MHz, but my OC profile is 1050/2700 MHz with zero throttling.

D: I have the same macbook - and I've had the logic board replaced TWICE (second time was this week, actually) and they also swapped the heatsink out...and I never OCed it. On top of that, I can fry eggs on my Macbook by doing something as simple as playing a flash game. I can't imagine overclocking that thing.
 
I can't believe Nvidia apologists are even supporting this. Wow. I am graphics card agnostic and even though this doesn't affect me I will absolutely be wary of Nvidia. Greedy corporation.
 
I really didn't realize people bothered to OC a notebook GPU - for the same reason nvidia just killed it: limited heat dissipation abilities...

The vendor should be doing the OC, NOT the user. Because if someone burns out their chip, odds are they're going to RMA it for repair, and the vendor will be paying for the user's poor decisions. Don't you think this move by NV might be to benefit the OEMs?

These are my thoughts too: if you really need the best performance, a notebook is not it. FCS, a notebook has enough thermal and power envelope limits to worry about without having some inexperienced user blow the unit!
It just sounds crazy, and this thread just looks like another attempt to trash Nvidia to me. FFS
How many gamers out there OC their AMD notebook GPUs? Oh wait, AMD doesn't have a viable notebook GPU...
 
a notebook has enough thermal and power envelope limits to worry about without having some inexperienced user blow the unit!
Nope, not all of them do. This is incorrect, like most assumptions and generalizations. As a matter of fact, the new 800M and 900M Maxwell cards on which OC is disabled run extremely cool and efficient, much more so than your GTX 660.

Notebook users are humans, too. We don't deserve to have this feature suddenly taken away from us in one fell swoop.

I would like to see the rioting when Nvidia disables overclocking on desktop cards. 🙄
 
The fact that most people think overclocking a laptop will decrease its lifespan is the most concerning thing.

Simply increasing the clockspeed does LITTLE for temperatures or power use. You're looking at maybe a 2-3 degree rise on temps which can easily be tamed with a cooling pad or better thermal paste.

I've seen countless comments when I've googled for information on overclocking laptop GPUs: "OMG DONT OC THE LAPTOP YOU NOOB UR GONNA DESTROY THE WORLD"

The fact that so many people think adjusting a clock offset and not touching voltage does anything for heat/power use is simply disturbing.

In fact, why aren't people chastising Nvidia for not putting a power limit in mobile GPUs like on desktop? Or for not allowing me to undervolt my mobile GPU so I can find the lowest possible voltage for stock clocks?

My girlfriend's laptop has a 760M and I run it at +100 on the boost clock. It's FREE PERFORMANCE!

If I do something to my 760M that it doesn't like, the driver crashes. That's basically all. Worst case, I corrupt Windows if I'm SUPER unlucky, but almost a 0% chance for hardware damage.
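The physics behind posts like this one is the standard dynamic-power approximation for CMOS logic, P ≈ C·V²·f: power grows only linearly with frequency but quadratically with voltage, which is why a clock-offset-only overclock barely moves heat output. A minimal sketch with illustrative numbers (not measurements from any real GPU):

```python
# Dynamic power approximation for CMOS: P ~ C * V^2 * f.
# Shows why a frequency-only OC adds far less power than a voltage bump.
# The ratios below are illustrative assumptions, not real GPU figures.

def dynamic_power_ratio(f_ratio, v_ratio=1.0):
    """Relative dynamic power after scaling frequency (f) and voltage (V)."""
    return f_ratio * v_ratio ** 2

# +10% clock at stock voltage: power rises only ~10%
print(dynamic_power_ratio(1.10))        # 1.10
# +10% clock plus +10% voltage: power rises ~33%
print(dynamic_power_ratio(1.10, 1.10))  # ~1.331
```

This is why the voltage-locked Kepler/Maxwell mobile parts discussed in this thread saw only small temperature changes from clock offsets alone.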
 
Does anyone in this thread own a GTX 980m or GTX 970m?


[attached benchmark screenshot: both.png]



Found a few saved benches. When I first purchased it, 1490 was my best score on stock clocks. I eventually got over 1700 before the Nvidia Nerf Ninja showed up one night...
 
I must say, Nvidia seems to be getting worse and worse. While normally I would be brand agnostic, this really will steer me away from them for some time to come I suspect.

I guess I am not upgrading my Sager anytime soon, as this kind of behavior should not be encouraged. (currently with a GTX 560M)
 
It would be hilarious if after they do this, they release a GTX 980 Ti version that's basically an OCed GTX 980M.

Edit: Even better if people returned their GTX 980M Laptops for GTX 980Ti M Versions after OCing has been disabled to get an "OC".

Of course they will. It will be a rebrand. Just like 780m -> 880m
 
Nvidia needs to be knocked down a notch so they start being consumer friendly again. Seems like they have given up trying in the GPU market.
 
D: I have the same macbook - and I've had the logic board replaced TWICE (second time was this week, actually) and they also swapped the heatsink out...and I never OCed it. On top of that, I can fry eggs on my Macbook by doing something as simple as playing a flash game. I can't imagine overclocking that thing.

After I cleaned off the original TIM and reapplied it, I saw drastic temperature improvements along with higher overclocking stability.

Take your Macbook apart some day and check out how well the TIM is applied. I could see bare die on the CPU and GPU, so I wouldn't be surprised if my Macbook was the only one with poor spread.
 
D: I have the same macbook - and I've had the logic board replaced TWICE (second time was this week, actually) and they also swapped the heatsink out...and I never OCed it. On top of that, I can fry eggs on my Macbook by doing something as simple as playing a flash game. I can't imagine overclocking that thing.
Question. Is this by any chance the 2010 model?
 
Nope, not all of them do. This is incorrect, like most assumptions and generalizations. As a matter of fact, the new 800M and 900M Maxwell cards on which OC is disabled run extremely cool and efficient, much more so than your GTX 660.

Notebook users are humans, too. We don't deserve to have this feature suddenly taken away from us in one fell swoop.

I would like to see the rioting when Nvidia disables overclocking on desktop cards. 🙄

I doubt they run cooler than my GTX 660, but that is not the point. There is no space in a notebook chassis for much cooling. The more heat they generate, the bigger and louder the fan. Everything has limits. I don't see why you think you should know them better than the engineers who built them.
There are trade-offs with notebooks; accept them!

Why do you bring desktop clocking into the equation? 🙄 Sounds a bit immature to me!
 
They got sued over this not that long ago (G80). Running a GPU past its thermal limits would do such a thing.
There was never a mobile G80 card.

I doubt they run cooler than my GTX 660, but that is not the point. There is no space in a notebook chassis for much cooling. The more heat they generate, the bigger and louder the fan. Everything has limits. I don't see why you think you should know them better than the engineers who built them.
There are trade-offs with notebooks; accept them!

Why do you bring desktop clocking into the equation? 🙄 Sounds a bit immature to me!
Your argument is invalid because notebook GPUs have much lower TDPs. The overbuilt chassis and cooling on Alienware, ASUS, Clevo, and MSI machines handle them just fine under load, even with heavy overclocking. You are making blanket statements stemming from ignorance and mistaken assumptions about gaming notebooks, how they are engineered and built, and the performance and thermals they are capable of. I daresay you have never taken apart a high-end multi-GPU notebook from Alienware or Clevo, or used one for an extended period of time.
 
After I cleaned off the original TIM and reapplied it, I saw drastic temperature improvements along with higher overclocking stability.

Take your Macbook apart some day and check out how well the TIM is applied. I could see bare die on the CPU and GPU, so I wouldn't be surprised if my Macbook was the only one with poor spread.

That's quite possibly true - I'll have to do that once I'm out of warranty. I have AppleCare on this thing (which is why they're replacing the logic board for free right now); once that's gone this summer I'll open it up. If I keep getting graphics issues (it looks like switching from the on-die to the discrete GPU sometimes causes a hang), Apple has said the final fix will be a new Macbook...

Thanks for the tip!

Question. Is this by any chance the 2010 model?

2012, with a 650M.
 
That's quite possibly true - I'll have to do that once I'm out of warranty. I have AppleCare on this thing (which is why they're replacing the logic board for free right now); once that's gone this summer I'll open it up. If I keep getting graphics issues (it looks like switching from the on-die to the discrete GPU sometimes causes a hang), Apple has said the final fix will be a new Macbook...

Thanks for the tip!



2012, with a 650M.
Interesting. I know they (and many other manufacturers) had an issue in 2010. When I was working as a PC tech, we had lots of these coming in. NV used cheaper TIM, IIRC, and many motherboards died, or just the GPUs. One way of fixing those was baking the boards in the oven 🙂
OT, but the Macbooks are very thin and don't have the thermal headroom for OCing. There are other options that do allow it, as many have mentioned.
I don't understand why people don't care when companies take away things they paid for. When you buy a gasoline car, you can put diesel in it. It would be stupid and would break it, but you can. Because it's yours.
 
This is horrible customer care.
My GTX 880M is just a factory-overclocked GTX 780M, so I can see a GTX 990M coming.
 
They got sued over this not that long ago (G80). Running a GPU past its thermal limits would do such a thing.

How does that have anything at all to do with my post? Please explain this to me.

If I remember rightly, this had more to do with Nvidia lying about TDP, the industry as a whole switching to a new type of solder, and laptop manufacturers using cooling systems that weren't up to the task in the first place (partly because Nvidia lied about TDP).

Did all those laptops fail because the users added +5-10% to the clock speed? No...

My point was that increasing clock speed to the limit of stability at stock voltage has little effect on heat output, and it's basically the free performance this thread suggests Nvidia wants to take away from you.

I doubt they run cooler than my GTX 660, but that is not the point. There is no space in a notebook chassis for much cooling. The more heat they generate, the bigger and louder the fan. Everything has limits. I don't see why you think you should know them better than the engineers who built them.
There are trade-offs with notebooks; accept them!

Wait a minute, is this another post talking about the things I've just aired my massive frustrations about?

Seriously, go bench the gaming laptops you clearly own and know so much about: run benchmarks with clock speeds at stock, lower, and higher, and report the heat output.

What will make more of a difference to heat output than overclocking at stock voltage is turning the heating up in your house.

So you heard it here first: do not try to run your gaming laptop in any environment the engineers have not certified it for. Better safe than sorry, as these GPUs are running near their limits, and running your heating will probably make your laptop explode in about six months!!!!!111

And, no, I don't know them any better than the engineers who built them. But I understand the basics of how these chips operate, and increasing the clock speed at stock voltage will not shorten the life of your laptop by much. And if you were that concerned about the long life of your laptop, you'd be running it on a large cooling pad with replaced TIM, which I doubt most people are doing...
 
These are my thoughts too: if you really need the best performance, a notebook is not it. FCS, a notebook has enough thermal and power envelope limits to worry about without having some inexperienced user blow the unit!
It just sounds crazy, and this thread just looks like another attempt to trash Nvidia to me. FFS
How many gamers out there OC their AMD notebook GPUs? Oh wait, AMD doesn't have a viable notebook GPU...

So if you buy a notebook you don't want the best performance?

So instead people should buy desktops, or what?

So for a classical segmentation strategy, literally used since the stone age and perfectly valid and sound (and I don't blame NV for that, except that it came without warning), we now have every other explanation:

1. OEM wanting it
2. They are going to blow (suddenly they blow up with the new driver??)
3. Dont buy a notebook if you want best performance

What else?
Don't play games?

It still amazes me how a brand can turn even an enthusiast's brain into smoke.
 
These are my thoughts too: if you really need the best performance, a notebook is not it. FCS, a notebook has enough thermal and power envelope limits to worry about without having some inexperienced user blow the unit!
It just sounds crazy, and this thread just looks like another attempt to trash Nvidia to me. FFS
How many gamers out there OC their AMD notebook GPUs? Oh wait, AMD doesn't have a viable notebook GPU...

I know a few of you posters defend ANYTHING Nvidia does, but seriously, what does AMD have to do with this thread?! Baiting and trolling is fun, huh?

So now Nvidia dictates who and what laptops are meant for? Please cite your sources for inexperienced users blowing their gaming laptops due to overclocking.
 
There are two effects of segmentation, as here, where the difference between the 970M and the 980M is kept as two clearly separate segments:

Profit maximization, because a certain small segment will pay a lot for what is objectively "little", since their personal benefit is great and they value the money less.

From a society perspective it's also smart, because even for similar performance/functionality, rich people pay more. Imagine you are selling strawberries at a market: if you see a rich tourist coming, you bump the price up a notch, but when the local mama comes by, you give her a discount, because otherwise she will go elsewhere.
For a 980M it's smart that the small advantage over a 970M goes to the few who can pay; you get bigger revenue that can go into R&D and newer products. In the end everyone wins: the "richer" buyers get the future 980, so to speak, and the others get the future 970.

Segmentation is not something wrong. It's good.

The only problem here, in my opinion, is that it came without warning. They were basically selling a different product than what people got. There is no excuse for that. And people accepting it, inventing whatever reasons they can, just makes sure we get treated badly next time too.
 
Personally, I think Nvidia locking the GPU on a laptop is a better alternative than a complete noob overclocking and destroying their laptop. For me it is the lesser of two evils.
 