Delidded my GTX460...[update 9/18] results are in!

Status
Not open for further replies.

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#26
Is there a reverse correlation between core size and durability?
Yes, there is. It's intrinsic to all materials: the fundamentals are that the effective fracture toughness of a material is inversely proportional to the thickness of the sample (below a minimum threshold).



That said, my biggest concern is that the corners of the die will act as stress concentrators, easily exceeding Kic and cracking off the corners, if I am not careful.

The K_IC for the corners of the die will be independent of the die size (except for extremely small dice, <1 mm on an edge or thereabouts).
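To put rough numbers on that corner-flaw worry, here's a minimal sketch of the standard stress-intensity check, K = Y·σ·√(πa), against a typical fracture toughness for single-crystal silicon; the stress and flaw-depth values are illustrative assumptions, not measurements from this card:

```python
import math

# Stress-intensity check for a corner flaw on a bare silicon die.
# K_IC for single-crystal Si is ~0.9 MPa*sqrt(m); the load case below
# (stress, flaw depth) is an illustrative assumption, not a measurement.
K_IC_SI = 0.9e6  # Pa*sqrt(m)

def stress_intensity(stress_pa, flaw_depth_m, geometry_factor=1.12):
    """K = Y * sigma * sqrt(pi * a) for a shallow surface/corner flaw."""
    return geometry_factor * stress_pa * math.sqrt(math.pi * flaw_depth_m)

# A 50 MPa local stress acting on a 20 um chipped corner:
K = stress_intensity(50e6, 20e-6)
print(f"K = {K / 1e6:.2f} MPa*sqrt(m) vs K_IC = {K_IC_SI / 1e6:.2f}")
print("corner cracks" if K >= K_IC_SI else "corner survives")
```

The point is that K grows with both the local stress and the square root of the flaw size, so a nicked corner under a concentrated load reaches K_IC long before the bulk of the die does.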
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#27
I think those bare chips are fairly durable, or at least ATI thought so anyway. The 5770 I re-TIMed had no IHS (though it was maybe 1/3 the size of that 460 chip).
Structurally and mechanically, I'd expect the AMD GPU chips to be nearly identical to Nvidia's, considering that both are made at the same foundry, on the same wafers, with the same wafer-processing history.

They would differ if either one of them elected to have a non-standard wafer thinning process used in packaging, but I haven't seen any evidence of this so far.

Where the actual difference in propensity to crack the die will come out is in the PCB and socket structure. If one is more flexible than the other, the silicon chip itself becomes more of a rigid structural member, taking on load when the PCB is under pressure from the HSF mounting bracket.

This is the biggest unknown for me here. AMD definitely appears to ring their GPU with a metal bracket whereas the Nvidia setup does not have this.
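For a feel of that load-sharing argument, per-unit-width flexural rigidity, D = E·t³ / (12(1 − ν²)), can be compared for the board and the bare die; the thicknesses and material constants below are typical assumed values, not measurements of these specific cards:

```python
# Per-unit-width flexural rigidity D = E * t^3 / (12 * (1 - nu^2)).
# Material properties and thicknesses are typical assumed values,
# not measurements of these specific cards.
def flexural_rigidity(e_pa, t_m, nu):
    return e_pa * t_m**3 / (12.0 * (1.0 - nu**2))

d_pcb = flexural_rigidity(20e9, 1.6e-3, 0.15)   # FR4 board, ~1.6 mm thick
d_die = flexural_rigidity(160e9, 0.3e-3, 0.28)  # silicon die, ~0.3 mm thick

print(f"PCB: {d_pcb:.2f} N*m, die: {d_die:.3f} N*m, ratio ~{d_pcb / d_die:.0f}x")
```

Per unit width the bare die is more than an order of magnitude less rigid than the board, so when the board bows under the HSF bracket the thin die tends to follow that curvature, and bending stress in the silicon is where cracks would start.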
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#28
Updating thread with some pics from lapping the Accelero Extreme:

Stock surface with TIM removed (91% IPA):


Added a hatch pattern with a blue Sharpie pen to gauge the degree of flatness the stock surface had:


After a few passes on 220 grit:


The stock surface was rather flat; only the extreme right edge shows signs of convexity (it curves away from the GPU silicon surface).

Nearly finished on 220 grit:


Finished at 220 grit, nice and flat:


After 1000 grit:


After 2000 grit:


After 3000 grit:


Nice and flat :thumbsup:
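For a sense of why chasing flatness pays off, here's a back-of-envelope sketch of the thermal resistance of the residual gap each finish might leave if it went unfilled; the grit-to-roughness mapping and die area are ballpark assumptions, not measurements:

```python
# R = t / (k * A): thermal resistance of a thin uniform layer.
# Grit -> roughness values and the die area are ballpark assumptions.
APPROX_RA_UM = {220: 1.5, 1000: 0.4, 2000: 0.2, 3000: 0.1}  # assumed Ra per grit

K_AIR = 0.026  # W/(m*K), still air
K_TIM = 3.0    # W/(m*K), typical decent TIM (assumed)
DIE_AREA = (18e-3) ** 2  # m^2, ~18 mm square die (roughly GF104-sized)

def gap_resistance_c_per_w(gap_m, area_m2, k):
    return gap_m / (k * area_m2)

for grit, ra in sorted(APPROX_RA_UM.items()):
    gap = ra * 1e-6
    r_air = gap_resistance_c_per_w(gap, DIE_AREA, K_AIR)
    r_tim = gap_resistance_c_per_w(gap, DIE_AREA, K_TIM)
    print(f"{grit:>4} grit: ~{ra} um gap -> air {r_air:.3f} C/W, TIM-filled {r_tim:.4f} C/W")
```

Even these crude numbers show the finer grits shrinking the worst-case gap resistance by an order of magnitude, with the TIM doing most of the remaining work.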
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#29
Here's the best shot I could grab depicting the gap that is created by removing the IHS from the GPU:



My plan is to dremel-cut the standoff posts and then collar them with springs so there is some resistance between the PCB and the HSF mounting bracket.
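The spring collars are just Hooke's law in action; a quick sketch with made-up spring rates and compression shows how they set the clamping force (none of these numbers come from the actual springs used):

```python
# F = k * x per spring, summed over identical springs. Values are illustrative only.
def clamp_force_n(spring_rate_n_per_mm, compression_mm, n_springs):
    """Total clamping force from identical springs compressed equally."""
    return spring_rate_n_per_mm * compression_mm * n_springs

# e.g. two 2 N/mm springs, each compressed 3 mm when the bracket is screwed down:
force = clamp_force_n(2.0, 3.0, 2)
print(f"total clamp force = {force:.0f} N")
```

The appeal of springs over rigid posts is that the force stays in this predictable range even if the screws are tightened unevenly, instead of spiking wherever the PCB happens to touch first.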
 

Klavshc

Junior Member
Sep 5, 2011
12
0
0
#30
very beautiful shots

if I may ask, what did you use for your "leveling" material, a pane of glass or something else?

Also if you had pictures in a monstrously higher resolution, I'd love to drool over them.
For some reason I really really dig the macro view of heatsinks, tims etc.

Good luck with your project.
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#31
very beautiful shots

if I may ask, what did you use for your "leveling" material, a pane of glass or something else?

Also if you had pictures in a monstrously higher resolution, I'd love to drool over them.
For some reason I really really dig the macro view of heatsinks, tims etc.

Good luck with your project.
Thanks :)

For leveling I use a plate of glass that is thick and sturdy enough to serve as the desktop of a desk of mine. Not a desktop covering; the actual desktop itself is composed entirely of glass.

I take that plate-glass desktop, place it onto a second sturdy wood table, and use that for polishing.

You can see it here:


If you like what you see in this thread, you might enjoy the pics over in my Noctua NH-D14 and Corsair H100 thread in cases and cooling.

For higher res pics, I intentionally downgraded these pics to avoid thrashing my photobucket bandwidth, but there are 10MPixel (or is it 14?) originals to be found if you peruse my photobucket album directly at this link. I won't embed them or my photobucket account will definitely be trashed, but beware they are 5-6MB pics each if you do download them.

Look for the pics that end with "-1.jpg" and those are the original full resolution versions.

edit: hrmm...photobucket apparently downsized my high-res pics to 1400 x 1100...the originals I uploaded were 4320x3240 :\
 

Klavshc

Junior Member
Sep 5, 2011
12
0
0
#32
awesome thanks.

Skim-read the other post and ogled all the pictures thoroughly!
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#33
Woot, my ram came in today (G.Skill RMA turn-around time FTW :thumbsup:) so I was able to get my system powered up for some tests.

Here's what I did with the GPU: first I took out the dremel with a metal cutoff blade and cut the standoff posts in half:



^ you can see the springs I bought lying in the background of that photo. They were $0.75 each; I only used two for this project.

Here's the uncut spring jacketing the sawed off post:



I cut the ends off each spring (using the dremel cutter again), here are the spring-jacketed posts now:



Here's one final shot of the polished Accelero HSF surface:



For the first attempt, I added a very small dot of Noctua NT-H1 TIM:



^ This turned out to be too little. I failed to grab a photo showing how much the TIM spread across the silicon surface, but it did not cover corner-to-corner, and I knew something was amiss because the screen was artifacting during the BIOS initialization phase of booting the computer.

I took off the Accelero and cleaned off the NT-H1, then added a much larger dollop:



^ that turned out to be the right amount. No artifacting now :thumbsup:

I've got OCCT running now on the GPU, I'll update the thread with temp results as soon as I have them :)
 

StrangerGuy

Diamond Member
May 9, 2004
8,399
9
91
#34
Maybe I'm wrong, but to me the card shouldn't artifact at all in the BIOS phase even if there weren't any TIM. It probably means the HSF base is barely touching the GPU die.
 
Nov 19, 2004
11,817
1
0
#35
Nice IDC. Been following the NH-D14 thread as well.

You are certainly entertaining yourself and as always producing high quality informative posts for the rest of us.

[lurk]
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#37
Maybe I'm wrong, but to me the card shouldn't artifact at all in the BIOS phase even if there weren't any TIM. It probably means the HSF base is barely touching the GPU die.
Because the TIM is not zero-viscosity, even under high pressure it does not flatten out all the way to the edges, leaving an air gap between the silicon and the HSF wherever the TIM failed to cover. Those parts of the die get extremely hot.

No TIM would actually be better, a lot better, than having too little TIM.
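A quick sketch of why a dry patch is so much worse than a TIM-filled one: the temperature drop across a thin layer is ΔT = q·t/k, and still air conducts roughly a hundred times worse than a decent TIM. The heat flux and gap thickness below are illustrative assumptions:

```python
# dT = q * t / k across a thin layer; flux and gap are assumed for illustration.
Q_FLUX = 5e5               # W/m^2, ~50 W/cm^2 local GPU heat flux (assumed)
GAP = 50e-6                # m, assumed gap where the TIM failed to spread
K_AIR, K_TIM = 0.026, 3.0  # W/(m*K)

dt_air = Q_FLUX * GAP / K_AIR
dt_tim = Q_FLUX * GAP / K_TIM
print(f"air-filled gap: dT ~ {dt_air:.0f} C; TIM-filled gap: dT ~ {dt_tim:.1f} C")
```

That ratio is just k_TIM/k_air, so any uncovered region sees roughly two orders of magnitude more temperature drop across the interface, which is exactly the kind of local overheating that shows up as artifacts.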
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#38
OK...drum-roll please...and the result is: a 10C decrease in temps! :)

Here are my temps at 1.087 V, 865 MHz, and 80% fan speed (auto-controlled) before I delidded: 59C with OCCT



And after delidding, same voltage and clockspeed, but now at only 60% fanspeed (auto-controlled): temps are now only 49C



10C decrease in temps, nice. Even if I don't get a higher OC (haven't checked yet), having the fans run at 20% lower RPM is nice because I can't hear them at all at the 60% they now top out at under OCCT.
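Translating those readings into a die-to-ambient thermal resistance makes the improvement concrete. The load power and ambient temperature below are assumptions (neither was measured in this thread), so treat the absolute numbers loosely:

```python
# theta = (T_die - T_ambient) / P. Power and ambient are assumed values.
POWER_W = 150.0    # assumed GPU load power under OCCT
AMBIENT_C = 25.0   # assumed room ambient

def theta_c_per_w(die_temp_c):
    return (die_temp_c - AMBIENT_C) / POWER_W

before = theta_c_per_w(59.0)  # lidded, 80% fan
after = theta_c_per_w(49.0)   # delidded, 60% fan
print(f"before: {before:.3f} C/W, after: {after:.3f} C/W")
```

Under these assumptions the delid cut the effective resistance by roughly 0.07 C/W even with the fans spinning slower, so the true interface improvement is larger than the raw 10C suggests.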
 

blackened23

Diamond Member
Jul 26, 2011
8,556
0
0
#40
:thumbsup::thumbsup:

I wish I was brave enough to undertake such a project :cool:
 
Sep 5, 2003
19,460
0
126
#41
10*C drop is huge, esp. on such an impressive cooler already. Great work!

:thumbsup::thumbsup:
 

Idontcare

Elite Member
Oct 10, 1999
21,126
0
0
#42
10*C drop is huge, esp. on such an impressive cooler already. Great work!

:thumbsup::thumbsup:
That's the thing: once I had a peek under the IHS and saw just how poorly coupled it was to the GPU silicon, I was not surprised at all by how much improvement resulted...

What is odd is that this just goes to show how much opportunity Nvidia is leaving on the table with its IHS lidding process, and how much of the cooler-fan noise, thermals, and power consumption the consumer experiences traces back to it.

We all know the beneficial purposes of lidding the chip in terms of durability and longevity, but such a porous internal surface combined with gobs of TIM is going to result in exactly what we are seeing here.
 

dakU7

Senior member
Sep 15, 2010
515
0
0
#43
Wow. Impressive.
49C @ load is what I would expect from a water-cooled GPU.
 

Patrick Wolf

Platinum Member
Jan 5, 2005
2,443
0
0
#44
What is odd is that this just goes to show how much opportunity Nvidia is leaving on the table with its IHS lidding process, and how much of the cooler-fan noise, thermals, and power consumption the consumer experiences traces back to it.
No kidding, NV should put you on their payroll. Nice work!

Everyone who reads this thread should e-mail Nvidia and tell them to stop using an IHS.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
#46
Idontcare, impressive work.
 

WMD

Senior member
Apr 13, 2011
476
0
0
#47
Now do the same on a GTX 580. I would love to see a 1000 MHz core and a 30k Vantage GPU score on air with that card.
 

zebrax2

Senior member
Nov 18, 2007
909
10
81
#49
A 10 degree decrease in temp is awesome

How do you make sure that the lap is flat (parallel to the chip) rather than slanted?


This impressive project and its result make me ask: why is there a lid on the GPU to begin with?
We all know the beneficial purposes of lidding the chip in terms of durability and longevity, but such a porous internal surface combined with gobs of TIM is going to result in exactly what we are seeing here.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,683
1
0
#50
THIS! is just about the MOST beautiful picture ever :)





BIG thumbs up for doing this Idontcare.



And holy flying cows! 80% fan -> 60% fan use, and you're still 10 degrees colder than before!

I guess the guys that make GPUs *could* learn a thing or two from this.

Just like CPU coolers come with a better polish, it's time for GPU coolers (even factory reference designs) to get with the times.


This impressive project and its result make me ask: why is there a lid on the GPU to begin with?
I second this notion... why the hell is it there in the first place?
 