2900XT close in price to the 8800GTS...


Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Keysplayr2003
Originally posted by: Creig
Originally posted by: SickBeast
I agree that the 320MB GTS would be a good card for you. There is a very good chance that you would never need the extra 320MB of memory. It's cheaper and better than the 2900XT (in your case with the lower screen resolution).

Agreed.

Agreed

Originally posted by: SickBeast
IMO the 2900XT runs too hot and uses up too much power to be considered viable. You'll probably need a PSU with a rating 100w higher than the 8800 cards need.

Actually, the 2900XT only pulls about 90W more when compared against the 8800GTS 320. Against an 8800GTS 640 that narrows to 65W, and it drops to only 17W against an 8800GTX.

"Only" 90W more than a 8800GTS320? Isn't 90W about enough to run a modern Dual Core CPU? 90W is substantial a a good indicator of how much more heat a 2900XT generates as well.

Originally posted by: SickBeast
Plus, the PSU needs to have the new 8-pin connectors which is a PITA.

You do not need to have an 8-pin connector in order to use the 2900XT. The only time you need an 8-pin connector is if you intend to overclock the card and void your warranty. The 2900XT works just fine with the 6-pin connectors.

Quoted from ATI/AMD's specifications for the 2900XT 512:
"Connection to 550 Watt (750 for CrossFire™) or greater power supply with two 2x3-pin PCIe® power connectors is required.
For enhanced performance with ATI Overdrive™, a power supply with one 2x3-pin and one 2x4-pin PCIe® power connector is required."

So the 8-pin PCIe connector actually allows the card to overclock itself (and stay under warranty).

Originally posted by: SickBeast
The GTS consistently outperforms the 2900XT from what I have seen. Granted, they are usually pretty close and the 2900XT wins a few tests, but really the 8800 is the superior card.

I thought that the newer drivers put the 2900XT's performance above the 8800GTS 640 and within reach of the 8800GTX. Or did Nvidia also have some major driver improvements that pushed the 2900XT back down to 8800GTS levels?

Link to the latest full-blown review with the latest drivers from both camps?

Originally posted by: SickBeast
Also keep in mind that you will save money on electricity by going with the more efficient 8800

I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.

Do these calculations include idle time? Or just strictly gaming? Either way, if your calculations are truly indicative of what power costs are, then I never want to hear anyone complain about power usage costs again. BUT, it is still 90W more that a given PSU has to pump out. The OP has a 430W Thermaltake, but as you can see above, ATI/AMD recommends 550W for a single 2900XT.


Originally posted by: SickBeast
plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

Since both cards have DHES coolers that vent their hot air to the outside of the case, I don't think either one will raise case temps. In fact, they might even help lower temps in a poorly ventilated case.

Agreed. Both cards have very nice coolers on them. But some people, especially overclockers, have concerns about GPU core temps. More power consumption means more thermal output.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed
Originally posted by: Creig

Originally posted by: SickBeast
Also keep in mind that you will save money on electricity by going with the more efficient 8800

I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.

Yep. Anyone who thinks it's even a factor is only fooling themselves. That, or is very biased and a "fan boy." Trying way too hard. For one, it can be much less than a 90W difference if the 320 is a factory overclocked model, making the already minuscule amount even less. Most sites already show it at about a 50-60W difference. Also, at idle (where everyone's PC is most of the time) the 2900XT uses less than the GTX and the Ultra. So you can make the argument that some of NV's cards cost more over a year's time. But that would be silly too, because it would be so small it wouldn't even matter.

Not so very long ago, in the quote below, you felt that 20 to 40W differences in power consumption were worth at least something. Else why post it? Actually, this post shows you care about power consumption, heat AND noise benefits. You don't care anymore?

"Originally posted by: Ackmed


Disappointed? Not hardly. Would I have liked more performance? No doubt. Refreshes generally do not give a huge increase in performance. GF3/GF3Ti, 9700Pro/9800Pro, 5900U/5950U; the list goes on and on.

Let's take a look at a few points.

1. Performance is a little better, say a few percent, more in some cases. CF master cards are now the same speed as the normal XTX, and the same price of $449.
2. People complained that the X1900's were not HDCP compliant; the X1950 is.
3. People complained that the X1900's consumed a lot of wattage; the X1950 dropped usage by about 20W. Not a huge amount, but some. In CF, that would be about 40W less.
4. People complained that the X1900's ran "hot". The X1950's run cooler now, about 10°C less under load.
5. People complained about the noise from the X1900's. The X1950's run much quieter now.
6. People complained that prices were going too high. The X1900XTX had an MSRP of $650; the X1950XTX has an MSRP of $450. That's $200 less for a little more performance, lower power consumption, cooler running, quieter running, etc.

Why would I be disappointed if the card costs $200 less than the last launch and is better in every aspect?"
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Sorry, I know your tricks by now. Trying to start something with me, and then go run and tell your higher-ups, trying to make me the bad guy. Not going to work. It's pretty sad actually; you're really stretching on this attempt, and yet you've failed miserably again. If anyone is baiting, it's you again. You're going to get ignored unless you change your ways.

When I was talking about that then, I was replying to others who made a big deal about the X1900XT's power, noise and heat. I was not one of those people. The X1950XT's are better in all areas. I simply outlined where the X1950 was better, when people were complaining that there wasn't a huge performance boost. It was cheaper, quieter, used less power, and ran cooler. A win across the board compared to the previous card, which is what I showed.

Nowhere in that post did I mention anything about the cost difference from the lower watt usage, which is exactly what was posted in this topic and refuted. You are not comparing the same things, not even close. I didn't then, and still don't, care if one card runs at 10°C more than the other, if the heat is exhausted out of the case. I didn't then, and still don't, care if one card is slightly louder than the other. I haven't changed my stance on anything in that old post you went and dug up to try and start something. I'm not a flip-flopper, or a hypocrite like some people. I'm sure this will get reported, and you'll try to get me in trouble again. And that is probably just what you wanted. Either stop baiting, or I'll stop replying to you and simply be the bigger man and ignore you. It starts now.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
You can't have a conversation without the attitude, can you? If you have trouble dealing with the questions I ask you, then just don't respond at all. It's better for both of us.

Actually, the only SINGLE thing I was comparing was your different stances, from the old post to the one in this thread, on how minuscule or significant differences in wattage were.

In the old thread, you seemed to believe 20 to 40W meant something.
In this thread, 50 to 60W seems "minuscule".

I asked why. You wrote a book.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
I told you why, and was right on target.

You didn't even compare the same things. In the old post, I was not talking about the monthly/yearly cost difference that 20W would make. In this topic, we were. Once again, it's not comparing the same thing. And once again, I didn't change my stance on anything, as you tried to accuse me of doing. I was pointing out in the old post that there was less wattage used, for those that made a big deal about it.

It's plainly obvious you just tried to make something out of nothing, and failed yet again. I didn't think 20W to 40W meant something in the old post; I mentioned it for those who complained about the X1900's wattage use. Which is again obvious if you read the post. And in this thread I did not say that 50W to 60W was "minuscule"; I said that the cost of the wattage difference between the GTS and XT was. Going by Creig's numbers, that's 14.4 cents a month. That's minuscule to me, and I'd wager to most anyone. You tried to make something out of nothing, and failed, again.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: bryanW1995
Originally posted by: Creig
Originally posted by: SickBeast
I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.




where do you live that you get $0.05 per kWh??? that needs to double or maybe even, uh, TRIPLE! that would be 14.4x3 = 43.2 cents per month. that's a coke every two months (12 oz can). would you deny sickbeast his coke???
:D

Perhaps I should buy myself some coke now and then seeing as I run an 8800GTS 320MB. :light:
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: SickBeast
Originally posted by: bryanW1995
Originally posted by: Creig
Originally posted by: SickBeast
I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.




where do you live that you get $0.05 per kWh??? that needs to double or maybe even, uh, TRIPLE! that would be 14.4x3 = 43.2 cents per month. that's a coke every two months (12 oz can). would you deny sickbeast his coke???
:D

Perhaps I should buy myself some coke now and then seeing as I run an 8800GTS 320MB. :light:

:beer: for me :D
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: keysplayr2003
Originally posted by: Ackmed
Originally posted by: SickBeast

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

You were really trying in your whole post, but this "point" is just laughable.

Care to tell us how much per year someone would save? Let's see some hard factual numbers here, not some number you think up.

Trying to what, exactly? What is he trying to do?

And then:
Does a 2900XT require and use more power than an 8800GTS 320? (yes) (no)
Does a 2900XT run warmer than an 8800GTS 320? (yes) (no)

Aside from an exact monetary figure of how much money a given user would save/lose going with either card, one would have to agree that it would cost $x.xx per year more to run a card that requires and uses more power. Is it negligible? It sure could be. Is it substantial? It sure could be.
Yeah I agree...what was I trying to do?

Keys is right; your posting style *is* quite combative.

I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

I don't know how much it would cost per year. To me, the bigger issue is the fact that you need a beefier PSU for the X2900XT. Even 90W more can get quite expensive. I'm pretty sure a 700W PSU is at least $30 more than a 600W PSU.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed
I told you why, and was right on target.

You didn't even compare the same things. In the old post, I was not talking about the monthly/yearly cost difference that 20W would make. In this topic, we were. Once again, it's not comparing the same thing. And once again, I didn't change my stance on anything, as you tried to accuse me of doing. I was pointing out in the old post that there was less wattage used, for those that made a big deal about it.

It's plainly obvious you just tried to make something out of nothing, and failed yet again. I didn't think 20W to 40W meant something in the old post; I mentioned it for those who complained about the X1900's wattage use. Which is again obvious if you read the post. And in this thread I did not say that 50W to 60W was "minuscule"; I said that the cost of the wattage difference between the GTS and XT was. Going by Creig's numbers, that's 14.4 cents a month. That's minuscule to me, and I'd wager to most anyone. You tried to make something out of nothing, and failed, again.

Not the same thing? Wattage is wattage. And I didn't say you WERE talking about the price difference per month/year. So if you weren't talking about the price difference per month/year like you say, what was it about the wattage you were talking about? POWER CONSUMPTION.
And finally, I will thank you very much if you reduce your ever-present combative tone with me or anyone else here who, heaven forbid, disagrees with you. Yes, we are all tired of it and it has NO place here at the AT forums any longer.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: SickBeast

I don't know how much it would cost per year.

Exactly! So you shouldn't be making the claim that you would save money on your electricity bill by going the GTS route. Unless you're worried about 14.4 cents, or in all probability even less. So now that we have a number, are you honestly going to try and make the statement that a person would be better off going with the GTS to save money on their monthly bill?

Originally posted by: keysplayr2003


Not the same thing? Wattage is wattage. And I didn't say you WERE talking about the price difference per month/year. So if you weren't talking about the price difference per month/year like you say, what was it about the wattage you were talking about? POWER CONSUMPTION.
And finally, I will thank you very much if you reduce your ever-present combative tone with me or anyone else here who, heaven forbid, disagrees with you. Yes, we are all tired of it and it has NO place here at the AT forums any longer.

Yep, I was talking about wattage back then. The lower wattage of the X1950XT versus the X1900XT. To people who complained that the X1900XT consumed too much. I made several points on how the X1950XT was a better card than the X1900XT, when people said it was not a good refresh. Wattage, noise, and heat were all reduced, which is what I said. Not that I cared personally about the wattage at all. And I was talking about the price difference back then too. The price difference between the X1900XT and the X1950XT at launch was $200, which I posted about. Not the comparable NV card. Once again, I didn't change my stance on anything.

The simple and sad fact is that you went back about a year to dig up an old post to try and show that I changed my stance on something, when I didn't even do so to start with. Even if I did, a person has that right, and a year in technology is nothing. I've been to half a dozen countries in a year, halfway around the world, and been to war in that year. My opinion has changed on a lot of things. But not on what I posted way back when on this subject.

My combative tone towards you? Because you disagree with me? That's a good one. You didn't disagree with me; you tried to catch me in a flip-flop, but failed again. You used all caps, which is combative; I did not. You troll and bait me by posting after me (as always), and I even ignored your first response. So you had to post again, as always. You dig up a post that's about a year old and try to show that I "changed my stance" on a matter. Then you try to put the blame on me, saying I'm combative. I'll tell you what, I won't reply to you anymore, and I'll be the bigger man and ignore your posts. It's obvious, as I've stated before, you've got issues with me. You try to start something with me all the time, then go run and tell your higher-ups that I'm the bad guy. I'm done with it. Don't reply back, because I won't be reading or replying. But I've got $100 that says you will, though you accused me in the past of my ego forcing me to keep posting. Time for you to take some of your own advice.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Originally posted by: Ackmed
Originally posted by: SickBeast

I don't know how much it would cost per year.

Exactly! So you shouldn't be making the claim that you would save money on your electricity bill by going the GTS route. Unless you're worried about 14.4 cents, or in all probability even less. So now that we have a number, are you honestly going to try and make the statement that a person would be better off going with the GTS to save money on their monthly bill?

Originally posted by: keysplayr2003


Not the same thing? Wattage is wattage. And I didn't say you WERE talking about the price difference per month/year. So if you weren't talking about the price difference per month/year like you say, what was it about the wattage you were talking about? POWER CONSUMPTION.
And finally, I will thank you very much if you reduce your ever-present combative tone with me or anyone else here who, heaven forbid, disagrees with you. Yes, we are all tired of it and it has NO place here at the AT forums any longer.

Yep, I was talking about wattage back then. The lower wattage of the X1950XT versus the X1900XT. To people who complained that the X1900XT consumed too much. I made several points on how the X1950XT was a better card than the X1900XT, when people said it was not a good refresh. Wattage, noise, and heat were all reduced, which is what I said. Not that I cared personally about the wattage at all. And I was talking about the price difference back then too. The price difference between the X1900XT and the X1950XT at launch was $200, which I posted about. Not the comparable NV card. Once again, I didn't change my stance on anything.

The simple and sad fact is that you went back about a year to dig up an old post to try and show that I changed my stance on something, when I didn't even do so to start with. Even if I did, a person has that right, and a year in technology is nothing. I've been to half a dozen countries in a year, halfway around the world, and been to war in that year. My opinion has changed on a lot of things. But not on what I posted way back when on this subject.

My combative tone towards you? Because you disagree with me? That's a good one. You didn't disagree with me; you tried to catch me in a flip-flop, but failed again. You used all caps, which is combative; I did not. You troll and bait me by posting after me (as always), and I even ignored your first response. So you had to post again, as always. You dig up a post that's about a year old and try to show that I "changed my stance" on a matter. Then you try to put the blame on me, saying I'm combative. I'll tell you what, I won't reply to you anymore, and I'll be the bigger man and ignore your posts. It's obvious, as I've stated before, you've got issues with me. You try to start something with me all the time, then go run and tell your higher-ups that I'm the bad guy. I'm done with it. Don't reply back, because I won't be reading or replying. But I've got $100 that says you will, though you accused me in the past of my ego forcing me to keep posting. Time for you to take some of your own advice.

No need to bet, of course I'm going to reply.
Yes, I caught you in a flip-flop. So what? People in here do it all the time. But you take it personally. Your paranoia is really high. I'm just here for the conversation, not to "trick" you into being yourself. Ackmed, this isn't about being the bigger man. This is about computer hardware and the discussion of said hardware. If you won't respond to my posts, that's fine. But if you post something I want to agree with, comment on, or disagree with, I surely will continue to do so. Always have. And not just with you, as you like to believe.

By the way, I use CAPS to emphasize a point, not to be combative. Sometimes I'll use "quotes". Other times I will italicize. It's to make it stand out from surrounding text. Emphasis. On rare occasions, I'll even throw caution to the wind and underline something.

Anyway, have a root beer on me and have a good night.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: bryanW1995
Originally posted by: Creig
I do agree that you will save money, but it is a very, VERY, VERY small amount. Even if you compare the 2900XT to the 8800GTS 320, that's a 90 watt difference. The average time spent playing games is 8 hours per week. 8 hours per week x 4 weeks per month x 90 watts = 2,880 watt-hours = 2.88 kWh x $0.05 per kWh = 14.4 cents per month. That amount is so negligible that it cannot even be considered a relevant factor.

where do you live that you get $0.05 per kWh??? that needs to double or maybe even, uh, TRIPLE! that would be 14.4x3 = 43.2 cents per month. that's a coke every two months (12 oz can). would you deny sickbeast his coke???

So sorry, it looks like you're correct. The average price per kWh in the United States is 11 cents (as of May 2006). So it looks like it's actually going to cost someone 32 cents more per month to run an X2900XT versus an 8800GTS 320. So he'll actually be able to afford his coke every month and a half instead! :beer:
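For anyone who wants to plug in their own local rate, here's a rough back-of-the-envelope sketch of that same arithmetic (the ~90W load delta, 8 hours/week of gaming, and the two rates are just the numbers quoted above; swap in your own):

def monthly_cost(extra_watts=90, hours_per_week=8, weeks_per_month=4, rate_per_kwh=0.11):
    # extra energy used per month, converted from watt-hours to kWh
    extra_kwh = extra_watts * hours_per_week * weeks_per_month / 1000.0
    return extra_kwh * rate_per_kwh  # dollars per month

print(f"${monthly_cost(rate_per_kwh=0.05):.2f}/month at 5 cents per kWh")   # ~$0.14
print(f"${monthly_cost(rate_per_kwh=0.11):.2f}/month at 11 cents per kWh")  # ~$0.32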


Originally posted by: keysplayr2003
Do these calculations include idle time? Or just strictly gaming?

I used the 8 hours per week of gaming that was stated in the link I posted. Actually, they show the average adult male plays 7.6 hours per week and the average adult female plays 7.4 hours per week. I guess I could have averaged the two and used 7.5, but since I knew the monthly cost difference was going to be so minuscule I simply rounded up to 8 hours.

I think it would be next to impossible to determine an average idle time as some people use their computers only for games, some also use them for work purposes, some run F@H on their GPUs, some people shut their computers off when they're done while others let them run 24/7, etc...

Besides, the idle draw of the latest generation cards from both manufacturers is within 15 watts of each other. I would say that's close enough to be a non-factor in choosing one card of this generation over another.


Originally posted by: keysplayr2003
"Only" 90W more than a 8800GTS320? Isn't 90W about enough to run a modern Dual Core CPU? 90W is substantial a a good indicator of how much more heat a 2900XT generates as well.

The "only" I was referring to was meant to convey that the 90 watt difference only applied against the 8800GTS 320. Sickbeast's quote of "You'll probably need a PSU with a rating 100w higher than the 8800 cards need." was vague in that it sounded as if there were a 100w difference between the X2900XT and ALL 8800 series cards, so I simply clarified.

Yes, 90W is a pretty healthy power difference between the 8800GTS 320 and the X2900XT. But again, both the 8800 series cards and the X2900XT have DHES coolers, which take the hot air generated by the GPU and blow it out of the case instead of recirculating it inside. So I don't see how either card is going to substantially raise case temps.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: SickBeast
Originally posted by: keysplayr2003
Originally posted by: Ackmed
Originally posted by: SickBeast

Also keep in mind that you will save money on electricity by going with the more efficient 8800, plus your entire rig will run a little cooler, which will help out with your CPU and system temperatures.

You were really trying in your whole post, but this "point" is just laughable.

Care to tell us how much per year someone would save? Let's see some hard factual numbers here, not some number you think up.

Trying to what, exactly? What is he trying to do?

And then:
Does a 2900XT require and use more power than an 8800GTS 320? (yes) (no)
Does a 2900XT run warmer than an 8800GTS 320? (yes) (no)

Aside from an exact monetary figure of how much money a given user would save/lose going with either card, one would have to agree that it would cost $x.xx per year more to run a card that requires and uses more power. Is it negligible? It sure could be. Is it substantial? It sure could be.
Yeah I agree...what was I trying to do?

Keys is right; your posting style *is* quite combative.

I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

I don't know how much it would cost per year. To me, the bigger issue is the fact that you need a beefier PSU for the X2900XT. Even 90W more can get quite expensive. I'm pretty sure a 700W PSU is at least $30 more than a 600W PSU.

Sick, come on, this is a stretch and you know it. Keys is not right; he only tries to bait people and it shows. Consider that most of the 320MB cards are overclocked, and that when running in 2D mode the Nvidia GPU uses more power. So?

I also doubt that in the enthusiast market 15-25 cents a month is even noticeable. And no, a 2900XT runs fine on a 500W PSU.

I'll wait to see the comeback for this one...

The "comeback" is your final warning about mod callouts!
- and to everyone else that is skating on thin ice in this thread.
Graphics moderator apoppin
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Originally posted by: apoppin

The "comeback" is your final warning about mod callouts!
- and to everyone else that is skating on thin ice in this thread.
Graphics moderator apoppin

apoppin, you cannot take action against someone who is responding to general comments made by somebody who happens to also be a moderator. If the comments being made are in direct relation to actions/warnings given by a moderator, then that is another matter. But moderators are also general members of the AT forums and should not be given any more or less latitude than is extended to everybody else while they are making non-moderator related posts.

I do not see where in Zstream's post he is calling out keysplayr2003 because he is a moderator, or for any action he took while acting in a moderating role. Rather, it appears that the "comeback" is directed toward Zstream's assertion that keysplayr2003 tries to bait people while posting in a non-moderator capacity.



edit - Good God, I'm starting to sound like a lawyer.

You are *not* to edit out moderator comments in a post.

You also do not see Zstream's *pattern* of calling out Keys. I do. And IF you have a problem with my comments, then post in Personal Forum Issues.

Graphics Moderator apoppin
 

mruffin75

Senior member
May 19, 2007
343
0
0
Originally posted by: Creig
Originally posted by: apoppin

The "comeback" is your final warning about mod callouts!
- and to everyone else that is skating on thin ice in this thread.
Graphics moderator apoppin

apoppin, you cannot take action against someone who is responding to general comments made by somebody who happens to also be a moderator. If the comments being made are in direct relation to actions/warnings given by a moderator, then that is another matter. But moderators are also general members of the AT forums and should not be given any more or less latitude than is extended to everybody else while they are making non-moderator related posts.

I do not see where in Zstream's post he is calling out keysplayr2003 because he is a moderator, or for any action he took while acting in a moderating role. Rather, it appears that the "comeback" is directed toward Zstream's assertion that keysplayr2003 tries to bait people while posting in a non-moderator capacity.



edit - Good God, I'm starting to sound like a lawyer.

Maybe you do sound like a lawyer (are you?!?! :) )... but you are correct...

I didn't see any issue with what Zstream said either... Keys decided to get into this thread and post some comments, and they were replied to... just because he's a moderator doesn't mean we have to agree with them...

But back on topic (sort of!)..

Ah yes, you do need a beefier power supply for the 2900XT (most of the time), but maybe in the process of buying the beefier power supply you've actually purchased one that is more *efficient* than your previous 300-400W'er...

Then that could possibly offset some of the extra power draw from the 2900!! :)
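To put rough, purely illustrative numbers on that idea (the loads and efficiency figures below are assumptions, not measurements from either card), wall draw is roughly DC load divided by PSU efficiency:

def wall_draw(dc_load_watts, efficiency):
    # power pulled from the wall socket = DC load / PSU efficiency
    return dc_load_watts / efficiency

old_rig = wall_draw(300, 0.75)       # GTS 320 system on an older ~75%-efficient PSU (assumed)
new_rig = wall_draw(300 + 90, 0.85)  # 2900XT system on a newer ~85%-efficient PSU (assumed)
print(f"GTS 320 rig at the wall: {old_rig:.0f} W")   # ~400 W
print(f"2900XT rig at the wall:  {new_rig:.0f} W")   # ~459 W
print(f"Gap at the wall: {new_rig - old_rig:.0f} W vs the 90 W DC difference")

So in this sketch the more efficient supply narrows the gap at the wall (roughly 59W instead of 90W) rather than erasing it.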


 

Marty502

Senior member
Aug 25, 2007
497
0
0
Man, this got nasty.

Anyway, I guess it's a wise choice to wait just a little bit longer. Heck, I might even end up with an 8800GTX with some patience!

Thanks, everyone, for your support.
 

jjzelinski

Diamond Member
Aug 23, 2004
3,750
0
0
Video Cards and Graphics has historically been a turbulent place. Very strange, but lines in the sand have been drawn, and if you find yourself on the wrong side in the wrong crowd, expect a beatdown. lol, dumb :)
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?
Actually, I just did the *proper* math, and you save $78.84 per year on electricity by using 90 watts less, if the device is on all the time. Even if it's only on 8 hours a day, you're saving over $25 annually, which does add up.

I got the information for my calculations at this website.

It said to use 10 cents per kilowatt hour, so I did. The rate varies. People in Hawaii could theoretically save close to $150 per year by using a 320MB 8800GTS.

How did you perform your calculations?

I used: 90 watts = 90 watt-hours per hour the device is run. I multiplied that by 24 hours, then by 365 days, then converted to kilowatt-hours. After that I multiplied the figure by $0.10 per kWh to reflect the price of electricity.
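The same arithmetic as a quick sketch, for anyone who wants to plug in their own duty cycle or rate (the 24-hour, 8-hour and $0.10/kWh figures are just the assumptions above; the ~19 cents/kWh line is my guess at a Hawaii-like rate to reproduce the "close to $150" number):

def annual_cost(extra_watts=90, hours_per_day=24, rate_per_kwh=0.10):
    # extra energy per year, converted from watt-hours to kWh
    kwh_per_year = extra_watts * hours_per_day * 365 / 1000.0
    return kwh_per_year * rate_per_kwh  # dollars per year

print(f"${annual_cost(hours_per_day=24):.2f}/yr running 24/7")                         # ~$78.84
print(f"${annual_cost(hours_per_day=8):.2f}/yr at 8 hours a day")                      # ~$26.28
print(f"${annual_cost(hours_per_day=24, rate_per_kwh=0.19):.2f}/yr at ~19 cents/kWh")  # ~$149.80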
 

jjzelinski

Diamond Member
Aug 23, 2004
3,750
0
0
Still, this is a conversation about PC enthusiasts; it just doesn't seem probable that energy conservation has jack to do with anything an enthusiast cares about... I'd say, looking at the conversation from the outside, that you're technically correct, but your point is unfortunately irrelevant for the vast majority of peeps here.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: SickBeast
Originally posted by: Creig
Originally posted by: SickBeast
I can tell you what I was trying to do: give someone an honest recommendation based on the facts. I said he would save money on electricity. Is that not true? You should perhaps consider the fact that if 1,000,000 people used 2900XT's instead of GTS's, it could have a considerable environmental impact if nothing else.

Oh, come on now. That's a simply ridiculous statement. IF a million people used 2900XT's instead of GTS's? How about IF people all used passively cooled low-end video cards instead of powerhouses like the 8800GTX? Or IF people all used 17" LCDs instead of 24" or 30"? Or IF...

I think you get my point. You can "what if" all day long, but it won't change reality.

You brought up the subject "you will save money on electricity by going with the more efficient 8800". So I did the math and showed you how little that actually amounted to.

Or are you honestly trying to say that there is actually a gamer out there who would choose an 8800GTS 320 over an X2900XT for the sole reason that it's going to cost him 32 cents less a month for electricity?
Actually, I just did the *proper* math, and you save $78.84 per year on electricity by using 90 watts less, if the device is on all the time. Even if it's only on 8 hours a day, you're saving over $25 annually, which does add up.

I got the information for my calculations at this website.

It said to use 10 cents per kilowatt hour, so I did. The rate varies. People in Hawaii could theoretically save close to $150 per year by using a 320MB 8800GTS.

How did you perform your calculations?

I used: 90 watts = 90 watt-hours per hour the device is run. I multiplied that by 24 hours, then by 365 days, then converted to kilowatt-hours. After that I multiplied the figure by $0.10 per kWh to reflect the price of electricity.

*Yawn* if you're worried THAT much about $150 a year, maybe you should get a new hobby
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: jjzelinski
Still, this is a conversation about PC enthusiasts; it just doesn't seem probable that energy conservation has jack to do with anything an enthusiast cares about... I'd say, looking at the conversation from the outside, that you're technically correct, but your point is unfortunately irrelevant for the vast majority of peeps here.
Well, even enthusiasts have a budget, and personally I don't like having $25 of my annual hardware budget eaten up unnecessarily by using a part that draws 90W more. At this point, that equates to a 512MB stick of memory every year.

The additional heat output from the card is also a concern. People like Creig are saying that the cards exhaust the heat outside of the case, but that neglects the fact that the backside of the card will inevitably give off some extra heat to the rest of the case.

I'm confident that the average 2900XT case is at least 3°C hotter than the average 8800GTS 320MB case.

There's also the fact that quite often people will spend more than $25 extra per year. IMO that's a lowball figure.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: cmdrdredd
*Yawn* if you're worried THAT much about $150 a year, maybe you should get a new hobby
That $150 could have purchased you an 8800GTX instead of the 2900XT. :light:
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: SickBeast
Originally posted by: cmdrdredd
*Yawn* if you're worried THAT much about $150 a year, maybe you should get a new hobby
That $150 could have purchased you an 8800GTX instead of the 2900XT. :light:

but I don't want a card that doesn't work in my games. Give me a friggen break...

There is a game that I play all the time that is still broken to this day, and I see tons of people on the forums asking for a fix. Nvidia now says they are not providing one, when earlier this year they announced they would have a fix in September. It took the threat of a lawsuit to get that promise, and now it's September and Nvidia says they aren't going to fix it and blamed it on someone else. Good move :thumbsdown:

If Nvidia got their head out of their rear and actually worked on the issues they know about (the texture memory leak, for example), I would probably have considered it. Plus, at 1280x1024 I have no use for a GTX... even if it worked, I would not have spent that much.
 

jjzelinski

Diamond Member
Aug 23, 2004
3,750
0
0
Originally posted by: cmdrdredd
Originally posted by: SickBeast
Originally posted by: cmdrdredd
*Yawn* if you're worried THAT much about $150 a year, maybe you should get a new hobby
That $150 could have purchased you an 8800GTX instead of the 2900XT. :light:

but I don't want a card that doesn't work in my games. Give me a friggen break...

There is a game that I play all the time that is still broken to this day, and I see tons of people on the forums asking for a fix. Nvidia now says they are not providing one, when earlier this year they announced they would have a fix in September. It took the threat of a lawsuit to get that promise, and now it's September and Nvidia says they aren't going to fix it and blamed it on someone else. Good move :thumbsdown:

If Nvidia got their head out of their rear and actually worked on the issues they know about (the texture memory leak, for example), I would probably have considered it. Plus, at 1280x1024 I have no use for a GTX... even if it worked, I would not have spent that much.

That texture memory leak is unforgivable. I'm done with NV for a good while for that one.