
FIVR tech on future GPUs?

Soulkeeper

Diamond Member
Do any of you think we'll see products use this in the future?
Also, do any video card boards use the newer IR356* parts with integrated MOSFETs?
 
Does AMD or NV have that sort of technology, or just Intel at this point?

Edit: And isn't the point of FIVR power saving? What use does that have for a 300W GPU?

Not to mention, it actually increases the heat coming off of the die itself. GPUs already need massive heatsinks; this would require even more.
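To put some rough numbers on the heat point above (the efficiency and load figures here are assumptions for illustration, not from any datasheet): the regulator's conversion loss doesn't go away with FIVR, it just moves from the board's MOSFETs into the GPU package, where the die heatsink has to remove it too.

```python
# Hedged sketch with assumed numbers: where the VRM conversion loss ends up.
def vrm_loss_w(load_w, efficiency):
    """Power dissipated by the regulator stage when delivering load_w
    at the given conversion efficiency (input = load / efficiency)."""
    return load_w * (1 - efficiency) / efficiency

gpu_load = 300.0  # W delivered to the GPU core (assumed figure)

# At an assumed 90% conversion efficiency, the loss is ~33 W either way;
# with board VRMs it heats the PCB area, with FIVR it adds to the die's
# thermal budget on top of the 300 W the core already dissipates.
loss = vrm_loss_w(gpu_load, 0.90)
print(round(loss, 1))
```

So even at a decent efficiency, an integrated regulator would ask the GPU cooler to shed tens of watts it doesn't handle today.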
 
I was thinking about this the other day as my room temp hit 84F and I started getting hardware errors on a video card.

The thing is 100% stable 24/7 until the room temp goes above 80F,
yet the GPU core itself stays around 50C under full load; the VRMs just get too hot.
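The observation above follows from a simple steady-state thermal model: for a fixed dissipation and thermal resistance, component temperature tracks ambient roughly one-for-one, so a few degrees of room temperature can push a marginally cooled VRM over its limit while the well-heatsinked core barely moves. The numbers below (5 W per phase, 12 C/W path) are assumed for illustration only.

```python
# Minimal steady-state model: T_component = T_ambient + P * theta.
# All figures here are assumptions, not measurements from the card in question.
def component_temp_c(ambient_c, power_w, theta_c_per_w):
    """Component temperature given ambient, dissipated power,
    and thermal resistance to ambient (C/W)."""
    return ambient_c + power_w * theta_c_per_w

# A VRM phase dissipating 5 W through a poorly cooled 12 C/W path:
for ambient_f in (75, 80, 84):
    ambient_c = (ambient_f - 32) * 5 / 9
    print(ambient_f, "F ->", round(component_temp_c(ambient_c, 5, 12), 1), "C")
```

Every degree the room gains, the VRM gains too; a part sitting just under its thermal limit at 80F ambient fails first as the room warms.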

I was looking at two Gigabyte motherboards that use the new IR MOSFETs and was impressed.

This is the Intel board using them:
http://cdn.overclock.net/b/b3/500x1000px-LL-b381d26e_11.PNG
The review of the FM2+ board with the same IR FETs shows it beating competitors on the same socket by 20W in total system power use.

Something like that would be nice in video cards, with FIVR being the next step.
 
The AIBs would love FIVR, since they manage to fail at cooling their VRMs 100% of the time.

110-120C seems to be the minimum design target for AIB VRM temperatures, judging by actual stress-test measurements on the cards.
 