FIVR tech on future GPUs?

Soulkeeper

Diamond Member
Nov 23, 2001
6,731
155
106
Do any of you think we'll see products do this in the future?
Also, do any video card boards use the newer IR356* parts with integrated MOSFETs?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,202
126
Does AMD or NV have that sort of technology, or just Intel at this point?

Edit: And isn't the point of FIVR power savings? What use does that have for 300W GPUs?

Not to mention, it actually increases the heat coming off the die itself. GPUs already need massive heatsinks; this would require even more.
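The point about heat moving onto the die can be put in rough numbers. A minimal back-of-the-envelope sketch, assuming a ~90% efficient board VRM feeding a 300W core (both figures are illustrative assumptions, not measurements from any specific card):

```python
def vrm_loss_watts(core_power_w: float, efficiency: float) -> float:
    """Input power minus delivered power = heat dissipated in the regulator."""
    input_power = core_power_w / efficiency
    return input_power - core_power_w

# Assumed numbers: 300 W delivered at 90% conversion efficiency.
# With a board VRM this loss heats the PCB; with FIVR it lands on
# the die the GPU heatsink already has to cool.
loss = vrm_loss_watts(300.0, 0.90)
print(f"Regulator loss: {loss:.1f} W")  # ~33.3 W
```

So even a fairly efficient regulator adds tens of watts, and integrating it shifts that load onto the die's cooling budget.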
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,731
155
106
I was thinking about this the other day when my room temp hit 84F and I started getting hardware errors on a video card.

Thing is 100% stable 24/7 until the room temp goes above 80F,
yet the GPU core itself stays cooled to around 50C under full load; the VRMs just get too hot.

I was looking at two Gigabyte motherboards that use the new IR MOSFETs and was impressed.

This is the Intel board using them:
http://cdn.overclock.net/b/b3/500x1000px-LL-b381d26e_11.PNG
The review of the FM2+ board with the same IR FETs shows it beating competitors on the same socket by 20W in total system power use.

Something like that would be nice in video cards, with FIVR being the next step.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
The AIBs would love FIVR, since they're awesome at failing to cool their VRMs 100% of the time.

110-120C VRM temperatures seem to be the minimum design target for the AIBs, judging by actual stress tests on the cards.