
Radeon X1950 XTX gets new cooler

Originally posted by: schneiderguy
Originally posted by: Elfear
Originally posted by: schneiderguy
Originally posted by: Creig

Two ATI cores giving four NV cores a run for its money? Sounds pretty fast to me.

the two ATI cores also have more transistors than the 4 nvidia cores, take more power than the four nvidia cores, and produce more heat than the four nvidia cores. 😕

I know my buying decision always comes down to which gpu solution has less transistors. :disgust:

because more transistors = more heat and more power consumption?

im sure you'll love it when you get two of these things and it takes as much power as your refrigerator :roll:
yes surely it'll take as much as a refrigerator. Now get out.
 
Originally posted by: Praxis1452
Originally posted by: schneiderguy
Originally posted by: Elfear
Originally posted by: schneiderguy
Originally posted by: Creig

Two ATI cores giving four NV cores a run for its money? Sounds pretty fast to me.

the two ATI cores also have more transistors than the 4 nvidia cores, take more power than the four nvidia cores, and produce more heat than the four nvidia cores. 😕

I know my buying decision always comes down to which gpu solution has less transistors. :disgust:

because more transistors = more heat and more power consumption?

im sure you'll love it when you get two of these things and it takes as much power as your refrigerator :roll:
yes surely it'll take as much as a refrigerator. Now get out.

i didnt mean it'll LITERALLY take as much power as a refrigerator 😕
 
Originally posted by: nitromullet
All he was saying is that even if the X1950XTX doesn't beat two GX2's, it would still be impressive if they came close.

and why would it be impressive? because it will cost less.
 
Originally posted by: redbox
Originally posted by: beggerking
Originally posted by: nitromullet
Originally posted by: beggerking
umm..if one 1950 can't beat 1 GX2, how does 2 1950s beat 2 GX2s?

I never said they could, and we don't know either way... My point was about the relevance of mentioning power consumption and heat dissipation with regards to absolute performance... Even if it were true, it is simply not a relevant rebuttal with regards to performance. I also find it interesting that someone feels the need to begin doing damage control on something that has yet to be shown. Even more interesting is the idea that they would bring up other information (power requirements) that also isn't known to make an irrelevant argument.

I think we are talking about combined performance here (raw performance, heat, price, power consumption, feature set) to determine the value of the 1950XTX, not raw performance alone.

read Creig's post above
"Two ATI cores giving four NV cores a run for its money? Sounds pretty fast to me."

price/performance ratio. So schneiderguy's argument is relevant; heat and power are part of a video card's "performance".

Lets go to that time honored test some people keep bringing up. "If you want the fastest you are going to have to pay for it", now if that means ATI installing a power brick, ATI redesigning their coolers, or even you having to upgrade your powersupply then thats all game. I haven't seen you pull the price/performance card out of the bag for a long time now. So I thought I just might play that game.

That being said I am not in that upper echelon, that upper crust. No, I look also for the price/performance ratio. When you start putting other features under the umbrella of "performance" then you blur the line. I could just as well put the advancements in IQ that ATI has over the NV parts under this "performance umbrella" as could you also shove the NV HDCP feature under there too. However, this makes the umbrella quite large and hard to handle. I prefer to keep the definition of performance just to FPS; that is a nicely defined line that most sites bench. So do keep your heat and power issues on your mind, but keep them out of the performance equation.

you can say or think whatever you would like, but with this quote
"Two ATI cores giving four NV cores a run for its money? Sounds pretty fast to me."
he is comparing value vs performance. If not, this would be a false statement because the loser out of a pure performance benchmark cannot be "impressive".


 
Originally posted by: beggerking
Originally posted by: nitromullet
All he was saying is that even if the X1950XTX doesn't beat two GX2's, it would still be impressive if they came close.

and why would it be impressive? because it will cost less.

No, that's not what he is saying at all. He's saying it would be impressive if the performance was close because the quad-SLI rig has a total of four gpu's and the Crossfire rig would only have two.

read it again:
run for one's money, a

A close contest or a strong competition, as in We may not win the game, but let's give them a run for their money. This term probably comes from horse racing, where one may get considerable pleasure from watching the race even if one does not win much. Its first recorded use was in 1874.
 
Originally posted by: Wreckage
I post a link from the same site as the OP, that lists performance and cost and that's bad?

Sigh, I guess nothing will make some people happy.

It's not that you provided evidence from the same site that made him say that. It was the way in which you concentrated and cut out one quote that, when presented by itself, had a different context than the article as a whole. The GDDR4 was what they meant would be 7% faster, not the card as a whole. If it is going to be competing against a 7950GX2, and the 7950 is, I believe, more than 7% faster in most games, they would need to do better than 7%, I would think.
 
Originally posted by: tuteja1986
Originally posted by: Aflac
Originally posted by: tuteja1986
Amm, I think ATI needs some really strict employees who can make the whole company think fast. Why the hell even bother with an X1950 XTX release 🙁 they should be worried about G80, which will come out soon and kill all the sales of ATI's high-end R580 card. Arr, it will also steal more market share, leaving ATI three strikes down with the late launch of its next-gen GPU.

R4XX series = failed because of delay
R5XX series = failed because of horrible delays
R6XX series = looks like the same future

R520 was horribly delayed (X1800 series), but R580 was not, because it was a separate project (X1900 series). Hopefully I got those model #s right.

Anyway... 1. inq link, 2. no pics??

Yeah I get your point but i am trying to say that Nvidia may launch its G80 late next month. This would kill the sales of the ATI X1950XT 🙁 and ATI's response wouldn't come until late Q4, and by that time Nvidia would have sold craploads of its graphics cards.


far as im concerned the 1950 is barely worth a look. same core speed, new cooler and some fancy memory, that's about it.
 
Originally posted by: josh6079
Originally posted by: Wreckage
I post a link from the same site as the OP, that lists performance and cost and that's bad?

Sigh, I guess nothing will make some people happy.

It's not that you provided evidence from the same site that made him say that. It was the way in which you concentrated and cut out one quote that, when presented by itself, had a different context than the article as a whole. The GDDR4 was what they meant would be 7% faster, not the card as a whole. If it is going to be competing against a 7950GX2, and the 7950 is, I believe, more than 7% faster in most games, they would need to do better than 7%, I would think.

Actually I copied the entire sub-title over so that nothing was taken out of context. The article does not attribute the 7% increase to the memory, in fact it was just a broad (pulled outta their butt) statement. In fact the article was clearly written by someone with poor English skills.

Although the X1950 may not compete well against the 7950 in performance, it may not even try. It may be cheaper for ATI to produce and less expensive than the 7950. However it should be better than the 7900GTX and that will hopefully bring the price of that down as well. Then the G80 will come out and screw everything up, until the R600 shows up, and then Vista crashes and everyone gives up and buys a Matrox triple head.
 
Originally posted by: Wreckage
Originally posted by: josh6079
Originally posted by: Wreckage
I post a link from the same site as the OP, that lists performance and cost and that's bad?

Sigh, I guess nothing will make some people happy.

It's not that you provided evidence from the same site that made him say that. It was the way in which you concentrated and cut out one quote that, when presented by itself, had a different context than the article as a whole. The GDDR4 was what they meant would be 7% faster, not the card as a whole. If it is going to be competing against a 7950GX2, and the 7950 is, I believe, more than 7% faster in most games, they would need to do better than 7%, I would think.

Actually I copied the entire sub-title over so that nothing was taken out of context. The article does not attribute the 7% increase to the memory, in fact it was just a broad (pulled outta their butt) statement. In fact the article was clearly written by someone with poor English skills.

Although the X1950 may not compete well against the 7950 in performance, it may not even try. It may be cheaper for ATI to produce and less expensive than the 7950. However it should be better than the 7900GTX and that will hopefully bring the price of that down as well. Then the G80 will come out and screw everything up, until the R600 shows up, and then Vista crashes and everyone gives up and buys a Matrox triple head.

lol k dued we bleive u now we r no u r nto trllo tahnks fro clernag taht up
 
Originally posted by: schneiderguy

because more transistors = more heat and more power consumption?

im sure you'll love it when you get two of these things and it takes as much power as your refrigerator :roll:

We've already gone over this in another thread but just for fun let's crunch some numbers.

Let's assume that power requirements scale linearly with speed (I know they generally don't but the difference would be small). If the X1950XTX is 7% faster than an X1900XTX then power output under load will be 7% higher.

An X1900XTX draws ~27 more watts at load than a GX2. So we'll add 7% to that to get ~29W more than the GX2 (again that will be slightly off since I'm adding 7% to the difference between the two cards and not the total draw of the card itself but it will be close enough for this comparison).

So 29W x 2 for two X1950XTXs will come out to 58W. Let's say that you game for 2 hours every day of the month. That will come out to an extra 3,480 watt-hours (3.48 kWh) per month in power consumption over a pair of GX2s. With the cost of electricity at $0.1031 per kilowatt-hour you're looking at a whopping $0.3588 extra per month. Even if we double that amount to take into account the approximations I made, it's still only about 70 cents extra a month. That's a pretty convincing sum of money.

Funny thing is, if you look at power consumption at idle from the first link, the GX2 consumes 17W more than the X1900XTX. So if you include 2 hours per day of the computer being on in a non-gaming situation, you're looking at basically halving the monstrous sum of $0.3588.

As far as heat goes, 58W really isn't that big of a deal. It's like turning on a 60W bulb behind your rig (that is if the new cooler is really like an Arctic Cooling unit).
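The arithmetic above can be sanity-checked with a quick script. This is a minimal sketch assuming the post's own figures (58W of extra draw, 2 hours of gaming per day, $0.1031 per kWh, a 30-day month); the function name `monthly_cost` is just for illustration:

```python
def monthly_cost(extra_watts, hours_per_day, rate_per_kwh, days=30):
    """Extra electricity cost per month for a given additional power draw."""
    kwh = extra_watts * hours_per_day * days / 1000  # watt-hours -> kWh
    return kwh * rate_per_kwh

# Two X1950XTXs at ~29W more each than the GX2 pair -> 58W extra,
# gaming 2 hours/day at $0.1031 per kWh:
extra = monthly_cost(58, 2, 0.1031)
print(f"${extra:.4f} extra per month")  # prints $0.3588 extra per month
```

The same function also confirms the idle-time point: a 17W difference in the other direction for 2 hours a day roughly cancels half of that already tiny sum.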
 
Originally posted by: tuteja1986
Originally posted by: Aflac
Originally posted by: tuteja1986
Amm, I think ATI needs some really strict employees who can make the whole company think fast. Why the hell even bother with an X1950 XTX release 🙁 they should be worried about G80, which will come out soon and kill all the sales of ATI's high-end R580 card. Arr, it will also steal more market share, leaving ATI three strikes down with the late launch of its next-gen GPU.

R4XX series = failed because of delay
R5XX series = failed because of horrible delays
R6XX series = looks like the same future

R520 was horribly delayed (X1800 series), but R580 was not, because it was a separate project (X1900 series). Hopefully I got those model #s right.

Anyway... 1. inq link, 2. no pics??

Yeah I get your point but i am trying to say that Nvidia may launch its G80 late next month. This would kill the sales of the ATI X1950XT 🙁 and ATI's response wouldn't come until late Q4, and by that time Nvidia would have sold craploads of its graphics cards.

I've seen no concrete info about an August g80 launch, and suggesting it is no more likely than suggesting a r600 August launch.
 
News flash: ATi and NV release "speed bump" cards all the time, typically about 6 months after the release of a new card. Even if the 7% is true, its probably about average for such a card.

GF2, GF2 Pro, then GF2U. GF3, then GF3 Ti. 5900U, then 5950U. 9700 Pro, 9800 Pro, 9800XT. X800XT, X800XT/PE, etc. Both companies do it, have done it, and will continue to do it. Complaining about such a bump is pretty funny to me. Complaining about a speed increase, a better cooler, and much faster ram, is even funnier. Of course, the main ones who do that are staunch NV supporters, and dont buy highend cards anyways...

Its also funny that because ATi seems to be releasing another card, that they are not focusing on the R600. Thats just ignorant thinking. They've been doing it for years.
 
Originally posted by: Ackmed

Its also funny that because ATi seems to be releasing another card, that they are not focusing on the R600. Thats just ignorant thinking. They've been doing it for years.

wasnt one of the reasons that the 7800's came out so far ahead of the x1800's was because nvidia didnt do a refresh on the 6800's and had more resources to work on the 7800's? or was the delay because of the problems with the 90nm process ATI was having?

 
Originally posted by: schneiderguy
Originally posted by: Ackmed

Its also funny that because ATi seems to be releasing another card, that they are not focusing on the R600. Thats just ignorant thinking. They've been doing it for years.

wasnt one of the reasons that the 7800's came out so far ahead of the x1800's was because nvidia didnt do a refresh on the 6800's and had more resources to work on the 7800's? or was the delay because of the problems with the 90nm process ATI was having?

The delay for the X1800's was due to a 3rd party IP defect. As stated by AT, from ATi. It was not ATi's fault, but they got all the blame from most forum readers. I assure you NV and ATi both still work on the next card, while putting out a refresh. They have different departments.

edit, and I like that cooler. It exhausts all the air out of the case, as all highend coolers should do I think.

 
Originally posted by: schneiderguy
Originally posted by: Elfear
Originally posted by: schneiderguy
Originally posted by: Creig

Two ATI cores giving four NV cores a run for its money? Sounds pretty fast to me.

the two ATI cores also have more transistors than the 4 nvidia cores, take more power than the four nvidia cores, and produce more heat than the four nvidia cores. 😕

I know my buying decision always comes down to which gpu solution has less transistors. :disgust:

because more transistors = more heat and more power consumption?
Oh, and we all know that's true. My Prescott has 125 million transistors. And my X1800XT has 321 million transistors. Yet my Prescott is probably pumping out twice as much heat as my X1800XT :disgust:
 
Originally posted by: Praxis1452
http://www.dailytech.com/Article.aspx?newsid=3446

look at it... it's ugly as hell but I think it'll perform better at least..

Yeah, its not the greatest looking, but it should cool very well, and hopefully be a lot quieter. Looks like they got Arctic Cooling to design it for them. Hmm, is that a heatpipe I can barely see on the top edge? That thing is a behemoth, looks like all copper and look how far the fins extend.
 
Originally posted by: josh6079
Originally posted by: Praxis1452
http://www.dailytech.com/Article.aspx?newsid=3446

look at it... it's ugly as hell but I think it'll perform better at least..

NICE! Even copper voltage regulator heatsinks. I wish there were different angles.

Yeah, me too. Definitely has a heatpipe, you can see part of it on the closer side.

Looks like the memory sinks aren't connected to the main core sink like on the Silencers, so I'm guessing it has cutouts underneath the fan to give them and the voltage sink some air.

Hmm, looks like the memory sinks probably are connected. Comparing it to the HIS IceQ3 HSF, it looks like the shroud is bigger, possibly a bigger fan, and the main heatsink fins appear to extend further towards the end/vent/opening.
 
Originally posted by: Sc4freak
Originally posted by: schneiderguy
Originally posted by: Elfear
Originally posted by: schneiderguy
Originally posted by: Creig

Two ATI cores giving four NV cores a run for its money? Sounds pretty fast to me.

the two ATI cores also have more transistors than the 4 nvidia cores, take more power than the four nvidia cores, and produce more heat than the four nvidia cores. 😕

I know my buying decision always comes down to which gpu solution has less transistors. :disgust:

because more transistors = more heat and more power consumption?
Oh, and we all know that's true. My Prescott has 125 million transistors. And my X1800XT has 321 million transistors. Yet my Prescott is probably pumping out twice as much heat as my X1800XT :disgust:

where did i say anything about processors? great, your prescott idles at 90c, i dont care 😕 i was talking about graphics cards...

 