eVGA GeForce 7900 GTO 512MB - $255 shipped from ChiefValue


blckgrffn

Diamond Member
May 1, 2003
Originally posted by: Guido06
Coolbits auto-detects 720/770 or so.

7800GTX 3DMark06 - 4683

Stock 7900GTO 3DMark06 - 5841

Changed to 700/800, ran 3DMark06 - 6354

Going to try and push the core to 715+ and hopefully get the memory up.

Does anyone know how to get the Coolbits memory overclock slider to go higher than 800?

I hit that too. n7 has a post over in Video about this beast now. 725 caused my card to fail the Coolbits test, so I am comfy @ 700; that last little bit isn't going to change much. The news of the day is that the memory goes right up to GTX speeds!
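
For anyone who hasn't set it up yet: Coolbits itself is just a registry flag the driver reads at startup. Here's a minimal Python sketch of the classic tweak; the NVTweak key path and the value 3 are the commonly circulated ones for drivers of this era, so treat them as assumptions and back up your registry first. (I don't know of a tweak that raises the 800 cap on the memory slider, unfortunately.)

```python
# Minimal sketch of the classic Coolbits tweak (Windows, run as admin).
# Key path and value are the commonly circulated ones for this driver
# era -- assumptions, so export the key first as a backup.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    # 3 unlocks the clock frequency (overclocking) page in the driver panel
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)

print("CoolBits set; reopen the NVIDIA control panel to find the sliders.")
```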

Nat
 

postmortemIA

Diamond Member
Jul 11, 2006
Originally posted by: InSuboRdiNaTioN
Originally posted by: postmortemIA
Haha, you guys don't get that this is a sign that DX10 cards are around the corner.

When a deal looks too good, chances are that it is.

DX10 cards aren't far out on the horizon. But being able to buy this card for $400+ one day and $240 the next is hot no matter how you look at it. These things pack a ton of horsepower and at these prices I don't think depreciation will be too ugly even when DX10 cards are initially launched.
You want depreciation? Buy a G80 when it first hits the market and wait a few months. Then we'll see which is the better "deal."


Well, tell me now: how much is my 6800 Ultra worth? I bought it for $350, and at the time (May 2005) it was a super deal because at most places it was well above $450-$500. Soon after, nobody could buy any of them.

Sorry for "making fun" of this hot deal, I'm just saying it might not be as great a deal as it seems at first.
 

videopho

Diamond Member
Apr 8, 2005
Originally posted by: postmortemIA
Haha, you guys don't get that this is a sign that DX10 cards are around the corner.

When a deal looks too good, chances are that it is.

I bet he's drooling as hell...
 

blckgrffn

Diamond Member
May 1, 2003
Originally posted by: postmortemIA
Originally posted by: InSuboRdiNaTioN
Originally posted by: postmortemIA
Haha, you guys don't get that this is a sign that DX10 cards are around the corner.

When a deal looks too good, chances are that it is.

DX10 cards aren't far out on the horizon. But being able to buy this card for $400+ one day and $240 the next is hot no matter how you look at it. These things pack a ton of horsepower and at these prices I don't think depreciation will be too ugly even when DX10 cards are initially launched.
You want depreciation? Buy a G80 when it first hits the market and wait a few months. Then we'll see which is the better "deal."


Well, tell me now: how much is my 6800 Ultra worth? I bought it for $350, and at the time (May 2005) it was a super deal because at most places it was well above $450-$500. Soon after, nobody could buy any of them.


Probably $125 or so. You bought at the realistic end of AGP! :p

I bought my 7800GT last June/July (launch day) for $345...

We *know* G80 is going to be huge, hot, and power hungry. I upgraded from an X850XT with an Arctic cooler. I like a good mix of performance, noise, and power consumption, and I am not so worried about staying on the latest tech curve ;)

Nat
 

msi1337

Diamond Member
Apr 16, 2003
Bah... only about 20 more hours till I have mine in my hands!!

Too bad UPS isn't delivering the rest of my new C2D rig until Thursday...

Gonna be like a crazed man staring at his video card with nothing to plug it into...

And I am an upgradeaholic as well. I bought the following:

Dec '05: 6600GT; Feb '06: 6800GT; Apr '06: X1800XT; Jul '06: X1600 (for the wife); Sep '06: 7600GT; and now Oct '06: 7900GTO (and gave the 7600GT to the wife!)

 

Navid

Diamond Member
Jul 26, 2004
Originally posted by: postmortemIA
Sorry for "making fun" of this hot deal, I'm just saying it might not be as great a deal as it seems at first.

That's how computer hardware depreciates. Nothing new! It's been like that for a long time.
 

Harmattan

Senior member
Oct 3, 2006
Uuuuhg, I couldn't resist - I've got two on the way from NewEgg. Hopefully, these will tide me over until the 1st DX10 card refresh... that's what I keep telling myself anyways.

The $1M question is... will my Antec TP 2.0 550W be able to handle them? My CPU (X2 4400) isn't OC'd, and I've only got 1 HD and nothing else extravagant sucking power. Anyone out there got advice?
 

msi1337

Diamond Member
Apr 16, 2003
Originally posted by: Harmattan
Uuuuhg, I couldn't resist - I've got two on the way from NewEgg. Hopefully, these will tide me over until the 1st DX10 card refresh... that's what I keep telling myself anyways.

The $1M question is... will my Antec TP 2.0 550W be able to handle them? My CPU (X2 4400) isn't OC'd, and I've only got 1 HD and nothing else extravagant sucking power. Anyone out there got advice?

According to the PSU calculator, yes... it reads about 400 watts needed for your setup. However, I know many people are very biased against Antec TruePower PSUs.
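
If you want to sanity-check that calculator yourself, it's basically just summing nominal per-component draws. A rough sketch; every wattage figure below is a ballpark assumption for this class of hardware, not a measured value:

```python
# Back-of-envelope PSU load estimate: sum nominal draws per component.
# Every figure below is a ballpark assumption, not a measured value.
parts_watts = {
    "Athlon 64 X2 4400+ (stock)": 90,
    "7900 GTO #1": 85,
    "7900 GTO #2 (SLI)": 85,
    "motherboard + RAM": 50,
    "hard drive": 15,
    "optical drive": 20,
    "fans / USB / misc": 20,
}

total = sum(parts_watts.values())
print(f"Estimated draw: {total} W of a 550 W unit "
      f"({total / 550:.0%} loaded)")
```

That lands in the same neighborhood as the calculator's ~400 W, with some headroom left on a 550 W unit.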

 

JACKHAMMER

Platinum Member
Oct 9, 1999
Originally posted by: postmortemIA
Originally posted by: InSuboRdiNaTioN
Originally posted by: postmortemIA
Haha, you guys don't get that this is a sign that DX10 cards are around the corner.

Well, tell me now: how much is my 6800 Ultra worth? I bought it for $350, and at the time (May 2005) it was a super deal because at most places it was well above $450-$500. Soon after, nobody could buy any of them.

Sorry for "making fun" of this hot deal, I'm just saying it might not be as great a deal as it seems at first.


Well, G80 is going to be $450 for the cheaper version; too rich for my blood. @ $250, this and the X1900XT represent the best bang for your buck till at least Jan-Feb (my guess). And that is what defines a hot deal, especially if this is limited availability and they run out soon.

Anyways, I bit. This will be replacing my ATI 9800 in an upcoming C2D build that I seem to be buying bit by bit.
 

Zap

Elite Member
Oct 13, 1999
Originally posted by: Devistater
Yeah, a number of modern graphics cards run 2D stuff at a lower speed, then crank it up to max when you start up a 3D application. Keeps the card cool and the fan noise down (since the fan slows down if it's not hot). And 2D stuff is not generally very demanding anyway.

With Vista, if you're using the Aero interface you'll always be running in 3D mode.

Originally posted by: Navid
When a graphics card runs a 2D application, it generates much less heat even if it is overclocked. If you disable the frequency switching and set them all (2D, 3D, low perf, etc.) to the same frequency (using ATITool or Riva or whatever), you will see that the temperature only rises when running a 3D application.

So, I am not sure why the card manufacturers do this. The fan can still slow down and become quiet for 2D even if all the frequencies are the same.

So then why do Pentium M chips have SpeedStep? Because it makes a difference, however large or small.

Originally posted by: Woofmeister
We're all enabling each other! Maybe we should try to start a support group like "Upgraders Anonymous." I can see it now:

Hi, my name is Woofmeister and I'm a serial upgrader. It's been two months since my last video card purchase . . .

LOL. Not "upgraders anonymous" but more "hot deals anonymous."

Hello, my name is Zap and I'm a recovering Hot Deals addict. It's been 3 weeks since the last time I posted, "THAT'S HOT, IN FOR 2!"

Originally posted by: postmortemIA
Well tell me now how much is my 6800 Ultra worth?

Considering you can get a new factory-overclocked 6800GS with roughly the same performance for $140 AR...
 

toolfan

Senior member
Oct 11, 1999
Well, I couldn't hold out any longer and ordered one. It's replacing an unlocked X800 GTO, so it should be a pretty nice upgrade.
 

suckerpunch

Junior Member
Oct 2, 2006
In for one from NewEgg. I'm upgrading from an Athlon Thunderbird 900MHz / GeForce2 MX to an E6400 / 7900GTO monster. This will be interesting.

And, in a side note, I bought memory from ZipZoomFly (about which I will post shortly) on 9/30. They still haven't shipped the memory or posted the charge to my card, even though customer reps say the order is perfectly fine. I ordered the card from NewEgg tonight; I'm betting it will be here before the memory purchased three days ago.
 

Navid

Diamond Member
Jul 26, 2004
Originally posted by: Zap
So then why do Pentium M chips have SpeedStep?

Pentium M is for laptops, isn't it? I expect they do that to minimize the power consumption to increase battery life. Something that is irrelevant for a gaming PC.


"Because it makes a difference, however large or small."

Any difference is going to be small, and it makes no difference in the noise of the fan.
Just give it a try; it is very easy. Overclock your card for 2D and see if the temperature goes up.
Recent NVIDIA cards, when overclocked, run in the 40s at idle (2D, overclocked) and in the 70s under load (3D).
Have you tried it?
 

Devistater

Diamond Member
Sep 9, 2001
Originally posted by: Navid
Originally posted by: Zap
So then why do Pentium M chips have SpeedStep?

Pentium M is for laptops, isn't it? I expect they do that to minimize the power consumption to increase battery life. Something that is irrelevant for a gaming PC.
Why yes, it is. However, Core Solo and Duo and all the current Intel desktop dual-core and quad-core chips are, SURPRISE!!, based entirely upon the Pentium M with a few additions. And they include ALL the power-saving features as well.

They abandoned the Pentium 4 line because it got too hot. They even had to cancel plans for the 4GHz P4.

Intel and its marketing department decided to go with the P4 strategy (much less work per clock even compared with the P3, but a heck of a lot of MHz headroom in terms of how high they could push it month after month) solely so they could claim the highest-MHz CPUs compared to AMD. Eventually they reached the end of the line on how high they could crank those suckers and had to bring the laptop Pentium design (based much more on the earlier, more efficient, lower-power Pentium designs than on the P4) back to the desktop.

You claim that it's irrelevant for a gaming PC, which isn't true. Reducing power, and thus heat, is ALWAYS a good thing for a gaming PC. Not only does it reduce the heat in the case and the load on the PSU, potentially increasing stability, it also leaves more headroom for overclocking.

There aren't many overclockers who aren't also gamers :)
 

kenji4life

Senior member
Jun 20, 2006
To add to that... SpeedStep, CPU throttling, Cool'n'Quiet, etc. have become the standard for computing. As Devistater said, this is a positive thing and is DIRECTLY correlated to performance in ANY PC.
 

gamefreakgcb

Platinum Member
Sep 2, 2004
The Intel SpeedStep in Conroe does not function properly; with it enabled, I get frequent restarts because the vcore drops below the needed amount.
 

ThinJ

Junior Member
Oct 25, 2005
Hey guys, can some of you post what driver revisions you're using? Or even more specifically, if the card works ok with the newest beta drivers? I believe the beta driver revision I'm using is 92.97. They've been solid as a rock, and I'd like to keep them.

Also, any first-hand impressions regarding SLI with these cards?

As for whether they'll function with the Antec Truepower 2.0 PSU mentioned a few posts up, I sure hope so since that's the supply I have too!

If not I'm prepared to pull the trigger on a PC Power & Cooling Silencer 610 watt.
 

FiLeZz

Diamond Member
Jun 16, 2000
Well, I pulled the trigger as well... I will be pissed if it does not OC to GTX speeds... LOL. I guess we will see in a few days.

 

Navid

Diamond Member
Jul 26, 2004
Originally posted by: Devistater
Originally posted by: Navid
Originally posted by: Zap
So then why do Pentium M chips have SpeedStep?

Pentium M is for laptops, isn't it? I expect they do that to minimize the power consumption to increase battery life. Something that is irrelevant for a gaming PC.
Why yes, it is. However, Core Solo and Duo and all the current Intel desktop dual-core and quad-core chips are, SURPRISE!!, based entirely upon the Pentium M with a few additions. And they include ALL the power-saving features as well.

They abandoned the Pentium 4 line because it got too hot. They even had to cancel plans for the 4GHz P4.

Intel and its marketing department decided to go with the P4 strategy (much less work per clock even compared with the P3, but a heck of a lot of MHz headroom in terms of how high they could push it month after month) solely so they could claim the highest-MHz CPUs compared to AMD. Eventually they reached the end of the line on how high they could crank those suckers and had to bring the laptop Pentium design (based much more on the earlier, more efficient, lower-power Pentium designs than on the P4) back to the desktop.

You claim that it's irrelevant for a gaming PC, which isn't true. Reducing power, and thus heat, is ALWAYS a good thing for a gaming PC. Not only does it reduce the heat in the case and the load on the PSU, potentially increasing stability, it also leaves more headroom for overclocking.

There aren't many overclockers who aren't also gamers :)

I guess I was not clear!
I meant that trying to extend battery run time is irrelevant for a gaming PC, because a gaming PC runs off a PSU plugged into the power grid.

I never said that overclocking does not increase power consumption.
Obviously it will. But by how much?
All I meant was that overclocking for 2D does not increase the temperature of the GPU enough to require extra GPU fan RPM, which is the claim of yours I replied to.

Running an LED fan instead of a normal fan also increases power consumption! So, will you remove an LED fan so that you can reduce your cooling fan RPM?

Instead of making all these posts, have any of you guys tried it to see how much the temperature increases if you overclock a GPU that is only running 2D?
Do you have figures for power consumption overclocked compared to not overclocked, for 2D and then for 3D? Seems like if you had those numbers, this discussion would be over.

I will get those numbers tonight when I get home and will post.
 

Devistater

Diamond Member
Sep 9, 2001
Originally posted by: ThinJ
Hey guys, can some of you post what driver revisions you're using? Or even more specifically, if the card works ok with the newest beta drivers? I believe the beta driver revision I'm using is 92.97. They've been solid as a rock, and I'd like to keep them.

Also, any first-hand impressions regarding SLI with these cards?

As for whether they'll function with the Antec Truepower 2.0 PSU mentioned a few posts up, I sure hope so since that's the supply I have too!

If not I'm prepared to pull the trigger on a PC Power & Cooling Silencer 610 watt.

Check NVIDIA's website for that version of the driver, then check the supported-products list for that version. It will tell you if it's supported.

As for a PSU, check the 12V rail(s) amperage. That will tell you if it can support SLI. I believe one of the pages or PDFs on the EVGA page for this product mentioned a recommended amperage for single and SLI configurations.
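
The rail math itself is just watts = volts x amps. A quick sketch; both numbers below are placeholders to be read off your PSU label and EVGA's spec sheet respectively:

```python
# Watts available on the 12 V side = 12 V x combined rail amps.
# Both numbers below are placeholders -- read them off your PSU
# label and EVGA's spec sheet respectively.
rail_amps = [19.0, 19.0]            # e.g. a dual-rail 12 V unit
recommended_sli_amps = 30.0         # hypothetical recommended figure

available_watts = 12.0 * sum(rail_amps)
needed_watts = 12.0 * recommended_sli_amps

print(f"12 V capacity: {available_watts:.0f} W, "
      f"SLI wants ~{needed_watts:.0f} W -> "
      f"{'OK' if available_watts >= needed_watts else 'too tight'}")
```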
Originally posted by: Navid
I guess I was not clear!
I meant that trying to extend battery run time is irrelevant for a gaming PC, because a gaming PC runs off a PSU plugged into the power grid.

I never said that overclocking does not increase power consumption.
Obviously it will. But by how much?
All I meant was that overclocking for 2D does not increase the temperature of the GPU enough to require extra GPU fan RPM, which is the claim of yours I replied to.

Running an LED fan instead of a normal fan also increases power consumption! So, will you remove an LED fan so that you can reduce your cooling fan RPM?

Instead of making all these posts, have any of you guys tried it to see how much the temperature increases if you overclock a GPU that is only running 2D?
Do you have figures for power consumption overclocked compared to not overclocked, for 2D and then for 3D? Seems like if you had those numbers, this discussion would be over.

I will get those numbers tonight when I get home and will post.
Extending battery runtime (aside from battery advances) is done by decreasing power usage, which decreases heat. Those are goals of every CPU design, whether it's for desktop or laptop, and they're good for gaming.
An LED consumes VERY little power, certainly far less than clocking a GPU core up.

Whenever you increase clock frequency, even if the core is mostly idle, you will increase power consumption. There is no perfect idle state. The only question is by how much. You say it's not a significant amount; I say it could be significant. I can't imagine why they'd design it that way if it wasn't beneficial to power/heat usage in at least a small degree.
 

Navid

Diamond Member
Jul 26, 2004
These power figures were measured in watts with a Kill-A-Watt device and represent total system draw at the wall.
The temperatures were reported by ATITool.
The graphics card is a 7900GT. The RAM clock rate was kept the same for all measurements.

GPU core clock | Desktop (2D, no 3D running) | Running rthdribl (3D)
520MHz         | 133W, GPU T = 45C           | 181W, GPU T = 59C
640MHz         | 133W, GPU T = 45C           | 186W, GPU T = 59C

If you want me to make any other measurements, let me know.
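
Reading the deltas straight off those numbers:

```python
# Deltas from the Kill-A-Watt figures above (whole-system watts).
idle = {"520MHz": 133, "640MHz": 133}   # desktop, no 3D
load = {"520MHz": 181, "640MHz": 186}   # running rthdribl

print(f"OC penalty at idle: {idle['640MHz'] - idle['520MHz']} W")
print(f"OC penalty under load: {load['640MHz'] - load['520MHz']} W "
      f"(~{(load['640MHz'] - load['520MHz']) / load['520MHz']:.1%})")
```

So the overclock is free at idle and costs about 5W (under 3%) at the wall under load.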
 

Devistater

Diamond Member
Sep 9, 2001
I didn't think the difference would be so small that it wasn't even measurable at idle!

You were right, I was wrong, tell your sister you were right about me.
 

Navid

Diamond Member
Jul 26, 2004
Originally posted by: Devistater
I didn't think the difference would be so small that it wasn't even measurable at idle!

You were right, I was wrong, tell your sister you were right about me.

I will not reply to you the way you deserve because there are others who read these posts whom I respect.