integrated + discrete CF... turn off your video card when not in use?

taltamir

Lifer
Mar 21, 2004
13,576
6
76
http://www.anandtech.com/video/showdoc.aspx?i=3178

I was just reading the Anand article about AMD's new hybrid CrossFire... and it's the first multi-GPU setup I can get behind. The idea of turning off your video card when not gaming to save electricity is phenomenal. Too bad it will take a while until it arrives. I am surprised they are limiting it to the 34xx series though... If I knew I could use it with a 3870 card, I would recommend it over an 8800GT. Sure, the performance is a little weaker, but it's also slightly cheaper, and the notion of turning off your video card when not in use is tickling my imagination.

But seeing as there are no games I haven't finished already that require anything more than the 7900GS, I will end up waiting for the next gen to upgrade anyway, and might just end up getting it.

They are saying they will start it off with the low end... giving a 40-70% boost in performance.
My problem with that is that, as usual, they are messing things up.
The people who buy that sort of hardware are not really going to know what they are getting, or to look for it.
They are also missing out on the greatest benefit of this scheme: turning off the video card.
Turning off your video card is much, MUCH more attractive when you run a beast of a video card, not when you are running a puny low-end card that works off a 2-inch fan without even a heatsink (well, there is a little heatsink-like thingie, but it's too small to call a heatsink).

I am wondering how much of a performance boost you would get with a 3870-class card. Even a 5% boost might be good if you are buying an AMD processor anyway. It would make my choice between an AMD or nVidia chipset for a Phenom lean towards AMD (assuming the Phenom is fixed and even competitive by then).


Even if it had a 0% benefit, though... if it didn't even CrossFire, but instead either the video card or the built-in GPU was working at any given time, and the video card could forward its frames to the built-in one, I would go for it... Windows runs off the built-in GPU, games run off the video card, and the video card shuts off when not in use. Brilliant.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
37 views and not a single reply a day later? Now that is a surprise..

Say, how many watts does a top-end video card use at idle anyway? I am thinking 80+ watts... right?
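For what it's worth, a quick back-of-envelope sketch of what an always-idling discrete card costs over a year. The 80 W idle draw and $0.10/kWh price are just the assumptions from this thread, not measured figures:

```python
# Rough yearly cost of a discrete GPU idling 24/7.
# Assumed figures (hypothetical): 80 W idle draw, $0.10 per kWh.
idle_watts = 80
price_per_kwh = 0.10
hours_per_year = 24 * 365

kwh_per_year = idle_watts / 1000 * hours_per_year   # watts -> kilowatt-hours
cost_per_year = kwh_per_year * price_per_kwh

print(f"{kwh_per_year:.1f} kWh/year, ${cost_per_year:.2f}/year")
# -> 700.8 kWh/year, $70.08/year
```

So at those assumed numbers, shutting the card off when it isn't rendering games would save on the order of $70 a year, which is a decent chunk of the card's price.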
 

GuitarMachine

Junior Member
Dec 9, 2007
5
0
0
I really like the idea of CF in this way, and like you stated, it would be that much more awesome if you could use whatever card you wanted, high end or not.
To be honest, I've really been holding out for a decent integrated solution that would satisfy my needs. MMORPGs usually aren't very graphics-intensive, and it would save some money if I only needed another low-end card.

From what I've seen, a lot of cards use over 100W and get super hot :)
 

fire400

Diamond Member
Nov 21, 2005
5,204
21
81
Intel still has tri-gate and Sandy Bridge to look forward to. Not that the two will line up exactly, but don't get too comfortable with what you're seeing right now with all the hype AMD is shooting for.

AMD and ATI are trying to be a regime of multicore masters? We'll see how they stack up against nVidia, Intel, and VIA.

You know, the funny thing is, I read that Fusion technology is supposed to be aimed at low-end to mid-range computers. While AMD's puttin' their teams on Fusion, let's hope it brings them to the high-end line-up pretty soon.

Intel 3GHz quad-core, 1600MHz FSB, 12MB L2
GeForce 9K discrete graphics
4GB DDR3-1800

So you'd choose that over a Fusion two years from now?
 

GuitarMachine

Junior Member
Dec 9, 2007
5
0
0
Two years is a long time, don't forget it...
And by saying this, I am not committing to anything before I see results ;)