What are the chances of another bumpgate by nvidia?

Barfo

Lifer
My dad just got a laptop with an Nvidia GPU (520M) and Optimus. It seems like a pretty good balance between performance and battery life, and I'm thinking of getting one to replace my aging Core 2 with integrated graphics. However, I'm hesitant because of the Nvidia GPU. I don't see them as a pro-consumer company, what with their pathetic settlement of the class action and 8800 and 9800 cards dying a lot these days. I don't want my computer to die an early death a year from now. Do you think they've learned their lesson, or should I avoid Nvidia cards like the plague?
 
Same chances of the Xbox 720 needing rapidly-deployed "coffins" by MS.

Do we have any proof that G92 cards are dying more than other cards the same age, or could it just be that there was a metric crapton more of them sold than any other card?

I have a feeling you will be just fine.
 
Probably 0% for that particular issue. But there will always be manufacturing or design bugs.
We just saw this with Intel and the SATA issue, in an area that was already on revision 2 at production.
The Ford rollover problem that killed passengers.
The AMD Phenom TLB bug, where buyers ended up with a substantially slower platform than reviewed once it was 'fixed'.
The Xbox 360 RROD.
I could go on and on.
 
Yeah, I don't think they will make that mistake again. You should be fine.
Oh they won't make *that* mistake again, but chances of them making *some* mistake are pretty much constant.

There'll always be some problems, and nobody knows who will be next. No, the only thing that would worry me is Nvidia's attitude: if there's a defect in the product, you surely want a reaction like Intel showed with their SATA bug, not Nvidia's years of denial and costly lawsuits until you get a subpar replacement (lol @ single-core replacements in 2011 for C2Ds).
 
Considering that Nvidia introduced Fermi, aka the 4xx series, as a hot, power-hungry thing with an architecture they admitted was a misstep, I'd say the chances of a new bumpgate are high.

Not buying another laptop with Nvidia hardware in it, that's for certain. The limited cooling options basically mean you're better off with AMD Radeons, which are cooler, less power-hungry, and WAY less prone to exploding.
 
I wouldn't hesitate to buy their mobile solutions. I have a netbook (Asus 1015PN) with ION2/Optimus and it works great, drivers are good too.
 
The warning bell I saw with the G92s was that the reference cooler was single-slot. And as soon as I revved up a game, it sounded like it was choking hard.
 
The warning bell I saw with the G92s was that the reference cooler was single-slot. And as soon as I revved up a game, it sounded like it was choking hard.
Yes, the single-slot coolers are horrible.

[image: Sapphire HD 4850 1GB with single-slot cooler]

edit: we pay a premium for single-slot coolers now.
 
Yes, the single-slot coolers are horrible.

[image: Sapphire HD 4850 1GB with single-slot cooler]

edit: we pay a premium for single-slot coolers now.

The single-slot cooler wasn't adequate for the 4850 either. The 9600GT I used for a while was much louder and hotter than the 5770 I had next to it.
 
Nah, I will not buy their GPUs again after what they did with bumpgate; besides, AMD GPUs are more efficient.

Btw, Nvidia has a track record of killing GPUs with drivers. It's already happened twice; thank god I rarely install new drivers.
 
I really think some people give way too much credit to marketing. It has been ATi/AMD/Nvidia engineering prowess that has created their success when it comes to GPUs, imho!
 
Someone correct me if I am wrong, but AFAIK wasn't bumpgate caused by bad solder, which Nvidia only switched to (from their tried and tested old solder) because EU environmental regulations prevented the use of their old solder recipe?

There wasn't time for long-term testing, and they got unlucky compared to AMD when choosing their alternative solder.
 
Someone correct me if I am wrong, but AFAIK wasn't bumpgate caused by bad solder, which Nvidia only switched to (from their tried and tested old solder) because EU environmental regulations prevented the use of their old solder recipe?

There wasn't time for long-term testing, and they got unlucky compared to AMD when choosing their alternative solder.


I believe the crux of it was: Nvidia's bumps were cracking due to repeated thermal stress, leading to physical failure.

How did it come to pass that they ended up in that situation? A row of bad decisions? Not testing things long enough?

Could another situation like that happen? Probably.
 
It's just as likely that AMD will have a bumpgate next. There is no good reason to think one GPU maker is more likely to have this happen than the other.

The Xbox 360's RRoD bumpgate was from an ATI GPU.
 
I've had multiple customers with bad laptops because of Nvidia. I actually tried 'baking' the motherboard from a laptop or two, and it actually fixed it for a while. Hopefully no one will get ripped off by that junk again.
 
Someone correct me if I am wrong, but AFAIK wasn't bumpgate caused by bad solder, which Nvidia only switched to (from their tried and tested old solder) because EU environmental regulations prevented the use of their old solder recipe?

The new Pb-free solder contributed to the problem, I believe. But remember, pretty much every electronics company had the same set of rules for Pb-free solder.

The Xbox 360's RRoD bumpgate was from an ATI GPU.

It had more to do with the pathetic cooling, IMO. ATI only designed the GPU; MS was responsible for getting it manufactured as well as for the cooling.
 
The new Pb-free solder contributed to the problem, I believe. But remember, pretty much every electronics company had the same set of rules for Pb-free solder.

But they chose different ones: AMD went with one alternative to Pb, Nvidia with another.

It had more to do with the pathetic cooling, IMO. ATI only designed the GPU; MS was responsible for getting it manufactured as well as for the cooling.
Among many other corners cut. It later came out that MS engineers told management exactly what was going to happen, and the people in charge didn't believe them and went with the cost cutting anyway.
 
It had more to do with the pathetic cooling, IMO. ATI only designed the GPU; MS was responsible for getting it manufactured as well as for the cooling.
But in both cases better cooling would have prevented the solder cracks, right?

That is, if MS had used a better cooler, and if Nvidia had given laptop makers different guidance about what temps were allowed for their GPU module, all would have been well.
 
But in both cases better cooling would have prevented the solder cracks, right?

That is, if MS had used a better cooler, and if Nvidia had given laptop makers different guidance about what temps were allowed for their GPU module, all would have been well.

The Xbox 360 issue wasn't cooling. If it were cooling-related, more 360s would have failed from heavy use or in warmer climates.

The 360 issue was, as said, poor decision making in cost cutting. It wasn't only the GPU that broke; it was solder points around the GPU, around the CPU, and overall around many of the stress points where the old X-Clamp was located. This is why some RRODs were revivable via towel tricks and more torque on the heatsinks, while others wouldn't revive.
 