ATI price drops - via Xbit


Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
Originally posted by: SickBeast
If prices get low enough I might buy something now instead of waiting for R600. :)

I haven't seen such cut-throat competition in the video market since Matrox had its G400, nVidia had the TNT2, 3DFX had its Voodoo 3, and ATI had some kind of Rage/Radeon card.

I didn't expect a price drop from ATI. I figured that G71 was a reactionary move and that nVidia would set prices to suit what ATI was doing. I think this kind of counter-attack from ATI means a price war is coming. $500 is too much for a top-end GPU anyway. We're about to see how low they can go. :)

I still say that ATI will release a GDDR4 refresh of the X1900 in July. The memory and core clocks are very conservative.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: Cooler
Originally posted by: SickBeast
If prices get low enough I might buy something now instead of waiting for R600. :)

I haven't seen such cut-throat competition in the video market since Matrox had its G400, nVidia had the TNT2, 3DFX had its Voodoo 3, and ATI had some kind of Rage/Radeon card.

I didn't expect a price drop from ATI. I figured that G71 was a reactionary move and that nVidia would set prices to suit what ATI was doing. I think this kind of counter-attack from ATI means a price war is coming. $500 is too much for a top-end GPU anyway. We're about to see how low they can go. :)

I still say that ATI will release a GDDR4 refresh of the X1900 in July. The memory and core clocks are very conservative.

Meh, I doubt it. I think ATi is done with refreshes and is ready to move on to something other than 16 pixel pipelines. Hopefully unified shaders.
 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Originally posted by: 5150Joker
Originally posted by: munky
Originally posted by: vaccarjm
Originally posted by: 5150Joker
Originally posted by: KeithTalent
ATI's Radeon X1900 XTX to Cost $549

After the price drop, the Radeon X1900 XTX, the current flagship of ATI, will cost $549; the Radeon X1900 XT will drop to $479, while the Radeon X1800 XT products with 512MB or 256MB should cost around $299 (the 512MB version will cost more than that). Additionally, ATI is expected to introduce a Radeon X1800 XL 512MB for $299 and drop the price of the Radeon X1800 XL 256MB.

I was actually expecting a PE edition of some sort, rather than just price drops. Has anyone heard anything about a PE edition card?


Of course it will be enough. ATi vid cards have better feature support and are more "future proof". If they cost the same as the 7900 cards, then nVidia has nothing on them.


Oh, and let me guess, 5150... nVidia didn't see this happening, right? One of the biggest video companies in history didn't plan for ATI cutting prices in response to their 7900 series? Of course nVidia knows this, and I'm sure they have something planned. They just don't spend zillions of cash not knowing what to expect from ATI.

What do you do for a living, live under a rock?

This isn't so much about "knowing" as it is about "doing." People knew about the R580 for months before its launch, and what the specs were going to be. Nv knew it too, but did they do anything about it? If you still believe Nv was just sitting on a pile of G71s, waiting for ATi to release the R580, then I have a bridge to sell you. I remember Jen-Hsun Huang talking about moving all their GPUs to 90nm in the second half of 2005, and look how long it took them to release the G71... it's March 2006, and I'm still waiting!

And then if you get into the whole "I knew that they knew that I knew..." thing, it becomes just pure speculation. Suffice it to say that price drops are a good thing for the consumer, but Nv would not drop prices if they knew they could get away with higher ones, and neither would Ati.


Exactly. nVidia has known about the R580 and possible counters, but technologically they're behind, and they simply can't do much aside from dropping prices and pushing their SLI/duo/quad-SLI to get attention.

Technologically behind in whose eyes? They have differing business philosophies: ATi thinks more pixel shaders is the way to go, whereas nVidia does not. Kinda like Intel and AMD: Intel thinks the FSB is better for their procs, whereas AMD has the integrated memory controller. And all this talk of blowing each other out of the water is water beneath my bridge. I'd rather see price drops and buy the card that offers the most performance for the price than whine, bitch and moan, and stroke my e-penis that my card OCs 3 more MHz and gets 5 more 3DMarks and 4 more FPS in one game out of hundreds.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: SickBeast
If prices get low enough I might buy something now instead of waiting for R600. :)

I haven't seen such cut-throat competition in the video market since Matrox had its G400, nVidia had the TNT2, 3DFX had its Voodoo 3, and ATI had some kind of Rage/Radeon card.

I didn't expect a price drop from ATI. I figured that G71 was a reactionary move and that nVidia would set prices to suit what ATI was doing. I think this kind of counter-attack from ATI means a price war is coming. $500 is too much for a top-end GPU anyway. We're about to see how low they can go. :)


I posted a while back that ATi was set to drop prices on March 2nd-3rd, and I was right. :)

http://forums.anandtech.com/messageview...STARTPAGE=17&FTVAR_FORUMVIEWTMP=Linear


Originally posted by: 5150Joker
I've heard a rumor that ATi will be dropping the prices on the X1900 cards starting around March 2nd.


I should start playing lotto!
 
otispunkmeyer
Jun 14, 2003
10,442
0
0
Originally posted by: 5150Joker
Originally posted by: KeithTalent
ATI's Radeon X1900 XTX to Cost $549

After the price drop, the Radeon X1900 XTX, the current flagship of ATI, will cost $549; the Radeon X1900 XT will drop to $479, while the Radeon X1800 XT products with 512MB or 256MB should cost around $299 (the 512MB version will cost more than that). Additionally, ATI is expected to introduce a Radeon X1800 XL 512MB for $299 and drop the price of the Radeon X1800 XL 256MB.

I was actually expecting a PE edition of some sort, rather than just price drops. Has anyone heard anything about a PE edition card?


Of course it will be enough. ATi vid cards have better feature support and are more "future proof". If they cost the same as the 7900 cards, then nVidia has nothing on them.


Glad to see the quotes around "future proof" there; bit of a superfluous word in this realm of technology. And the only significant extra feature I see ATI having is HDR and AA (and depending on how you use it, it's either useful or it's not).

Oh, and ATI's stellar AF; it really is nice. But that's it.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: otispunkmeyer
Originally posted by: 5150Joker
Originally posted by: KeithTalent
ATI's Radeon X1900 XTX to Cost $549

After the price drop, the Radeon X1900 XTX, the current flagship of ATI, will cost $549; the Radeon X1900 XT will drop to $479, while the Radeon X1800 XT products with 512MB or 256MB should cost around $299 (the 512MB version will cost more than that). Additionally, ATI is expected to introduce a Radeon X1800 XL 512MB for $299 and drop the price of the Radeon X1800 XL 256MB.

I was actually expecting a PE edition of some sort, rather than just price drops. Has anyone heard anything about a PE edition card?


Of course it will be enough. ATi vid cards have better feature support and are more "future proof". If they cost the same as the 7900 cards, then nVidia has nothing on them.


Glad to see the quotes around "future proof" there; bit of a superfluous word in this realm of technology. And the only significant extra feature I see ATI having is HDR and AA (and depending on how you use it, it's either useful or it's not).

Oh, and ATI's stellar AF; it really is nice. But that's it.


Don't forget the 48 pixel shaders; going by current trends, I think ATi made the smart bet. The R580 also handles branching better than G70/G71 does.
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: 5150Joker
Originally posted by: SickBeast
If prices get low enough I might buy something now instead of waiting for R600. :)

I haven't seen such cut-throat competition in the video market since Matrox had its G400, nVidia had the TNT2, 3DFX had its Voodoo 3, and ATI had some kind of Rage/Radeon card.

I didn't expect a price drop from ATI. I figured that G71 was a reactionary move and that nVidia would set prices to suit what ATI was doing. I think this kind of counter-attack from ATI means a price war is coming. $500 is too much for a top-end GPU anyway. We're about to see how low they can go. :)


I posted a while back that ATi was set to drop prices on March 2nd-3rd, and I was right. :)

http://forums.anandtech.com/messageview...STARTPAGE=17&FTVAR_FORUMVIEWTMP=Linear


Originally posted by: 5150Joker
I've heard a rumor that ATi will be dropping the prices on the X1900 cards starting around March 2nd.


I should start playing lotto!

You know too much. Are you sure you're not an ATI marketing guy? :)

 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: RussianSensation
X1800XT $299 ($100 price drop) :) It will most likely beat the 7900GT, as the X1800XT beats the 7800GTX in 11 out of 17 benchmarks at Xbitlabs.
The only issue is availability (plus the 256MB X1800XT has never really been around). Less than six months ago this card cost $549 ($499 MSRP for the 256MB version). This shows that buying top-of-the-line is expensive on your wallet.

7800GTX 512mb ($399) (huge price drop) - :thumbsup: availability?

X1900XT @ $479 is nice too. Retail prices will probably hit $450 then.

Let's hope 7900GTX can outgun X1900XT.
An X1900XT for $100 less than a 7900GTX looks like a no-brainer, that is, if they are within 5% of each other in performance.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Originally posted by: crazydingo
Originally posted by: RussianSensation
X1800XT $299 ($100 price drop) :) It will most likely beat the 7900GT, as the X1800XT beats the 7800GTX in 11 out of 17 benchmarks at Xbitlabs.
The only issue is availability (plus the 256MB X1800XT has never really been around). Less than six months ago this card cost $549 ($499 MSRP for the 256MB version). This shows that buying top-of-the-line is expensive on your wallet.

7800GTX 512mb ($399) (huge price drop) - :thumbsup: availability?

X1900XT @ $479 is nice too. Retail prices will probably hit $450 then.

Let's hope 7900GTX can outgun X1900XT.
An X1900XT for $100 less than a 7900GTX looks like a no-brainer, that is, if they are within 5% of each other in performance.

I would have to double-check, but I thought the MSRP for the 7900GTX was $499.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
The 7900GT will be beaten before it even launches if the X1800XTs drop down to its price range or below.

I expect nvidia and their fans to spam the letters "SLI" from now until the G80, endlessly.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
I think ATi CLEARLY has the upper hand right now, and looking ahead it is just going to get better. Right now most games don't make extensive use of shaders the way future games will, and the X1900XTX will SHINE there, yet it STILL beats the 7800GTX 512 in current games. In future games, where performance will probably best be represented by shader-heavy F.E.A.R., ATi is just going to get a BIGGER lead. ATi has the performance advantage NOW, and it will only increase as new games come out. While ATi looks ahead to what next-gen games will require, nVidia moves to a smaller process and releases an "overclocked 7800GTX 512", aka the 7900GTX.
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Originally posted by: Extelleron
I think ATi CLEARLY has the upper hand right now, and looking ahead it is just going to get better. Right now most games don't make extensive use of shaders the way future games will, and the X1900XTX will SHINE there, yet it STILL beats the 7800GTX 512 in current games. In future games, where performance will probably best be represented by shader-heavy F.E.A.R., ATi is just going to get a BIGGER lead. ATi has the performance advantage NOW, and it will only increase as new games come out. While ATi looks ahead to what next-gen games will require, nVidia moves to a smaller process and releases an "overclocked 7800GTX 512", aka the 7900GTX.

But just because the XTX beats the 512 in most things doesn't mean that the 7900GTX, with its speed boost, won't beat the XTX. While I agree that the XTX will probably pay off better in the future if things go shader-intensive, do we know for sure that's the case? I mean, Unreal I think is being developed on an nVidia solution, isn't it? (I really am not sure here, so I could be wrong.) If that's the case, then people should head over to Epic's Unreal site; a huge number of studios and promising games are being produced on the Unreal engine.

As it stands there are four engines worth speaking about: Source, Doom 3, U3, and the Monolith engine (the Far Cry engine I don't think has any real adoption). The Doom 3 and U3 engines are nVidia-based; the Source engine is ATI-based, but the CPU is usually more important there, leaving just the Monolith engine, which history has shown to have relatively small usage outside Monolith themselves.

I really don't think the world is going to go shader-crazy, and if/when developers do make that leap it won't be for 2-3 years, because current engines are not going to be redone for a while. By that point, for people looking to be on top now, those XTXs and XTs will be worthless.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Topweasel
Originally posted by: crazydingo
Originally posted by: RussianSensation
X1800XT $299 ($100 price drop) :) It will most likely beat the 7900GT, as the X1800XT beats the 7800GTX in 11 out of 17 benchmarks at Xbitlabs.
The only issue is availability (plus the 256MB X1800XT has never really been around). Less than six months ago this card cost $549 ($499 MSRP for the 256MB version). This shows that buying top-of-the-line is expensive on your wallet.

7800GTX 512mb ($399) (huge price drop) - :thumbsup: availability?

X1900XT @ $479 is nice too. Retail prices will probably hit $450 then.

Let's hope 7900GTX can outgun X1900XT.
An X1900XT for $100 less than a 7900GTX looks like a no-brainer, that is, if they are within 5% of each other in performance.

I would have to double-check, but I thought the MSRP for the 7900GTX was $499.
7900 GTX $549
7900 GT $349
 

Topweasel

Diamond Member
Oct 19, 2000
5,437
1,659
136
Originally posted by: crazydingo
Originally posted by: Topweasel
Originally posted by: crazydingo
Originally posted by: RussianSensation
X1800XT $299 ($100 price drop) :) It will most likely beat the 7900GT, as the X1800XT beats the 7800GTX in 11 out of 17 benchmarks at Xbitlabs.
The only issue is availability (plus the 256MB X1800XT has never really been around). Less than six months ago this card cost $549 ($499 MSRP for the 256MB version). This shows that buying top-of-the-line is expensive on your wallet.

7800GTX 512mb ($399) (huge price drop) - :thumbsup: availability?

X1900XT @ $479 is nice too. Retail prices will probably hit $450 then.

Let's hope 7900GTX can outgun X1900XT.
An X1900XT for $100 less than a 7900GTX looks like a no-brainer, that is, if they are within 5% of each other in performance.

I would have to double-check, but I thought the MSRP for the 7900GTX was $499.
7900 GTX $549
7900 GT $349

I double-checked and saw this. And since the person who posted it had an anti-ATI slant, I would think that, if anything, these numbers are high.
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Topweasel
Originally posted by: crazydingo
Originally posted by: Topweasel
I would have to double-check, but I thought the MSRP for the 7900GTX was $499.
7900 GTX $549
7900 GT $349
I double-checked and saw this. And since the person who posted it had an anti-ATI slant, I would think that, if anything, these numbers are high.
That Inq article is unsure about the number of pipes, so why should I consider their pricing right? :D

And as for the person having an anti-nVidia slant (I think that's what you meant): can I say ad hominem? ;)
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: 5150Joker
Originally posted by: otispunkmeyer
Originally posted by: 5150Joker
Originally posted by: KeithTalent
ATI's Radeon X1900 XTX to Cost $549

After the price drop, the Radeon X1900 XTX, the current flagship of ATI, will cost $549; the Radeon X1900 XT will drop to $479, while the Radeon X1800 XT products with 512MB or 256MB should cost around $299 (the 512MB version will cost more than that). Additionally, ATI is expected to introduce a Radeon X1800 XL 512MB for $299 and drop the price of the Radeon X1800 XL 256MB.

I was actually expecting a PE edition of some sort, rather than just price drops. Has anyone heard anything about a PE edition card?


Of course it will be enough. ATi vid cards have better feature support and are more "future proof". If they cost the same as the 7900 cards, then nVidia has nothing on them.


glad to see the " " round future proof there, bit of a superfolous word in this realm of technology. and the only significant extra feature i see ATI having is HDR and AA (and depending on how you use it its either useful or its not)

oh and ATI's stellar AF, it really is nice. but thats it.


Don't forget the 48 pixel shaders; going by current trends, I think ATi made the smart bet. The R580 also handles branching better than G70/G71 does.

You're beating branching to death. Evidence?
 

Capt Caveman

Lifer
Jan 30, 2005
34,543
651
126
ATI or nVidia fan, you gotta love competition!!! Six more days till I order a new card; which one, who knows...
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Acanthus


You're beating branching to death. Evidence?

Next time do your own research:

R520's batch size, being only 4x4 pixels large, should be very efficient for large batch sizes, at least in relation to NVIDIA's G70 which is described as having batch sizes of 64x16 (1024) pixels. R520's Pixel Shader architecture also has a specific Branch Execution Unit which means that ALU cycles aren't burned just calculating the branch alone for each pixel.

Source: http://www.beyond3d.com/reviews/ati/r520/index.php?p=04
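To put those batch sizes in perspective, here's a toy Python sketch (purely illustrative, not from the article) of why batch granularity matters for dynamic branching: on SIMD hardware a whole batch executes both sides of a branch if even one of its pixels diverges, so a small screen-space effect wastes far less work with 4x4 batches than with 64x16 ones.

```python
def wasted_fraction(screen_w, screen_h, batch_w, batch_h, hot_pixels):
    """Fraction of all pixels forced to execute both sides of a branch.

    A batch is divergent if it contains both 'hot' pixels (taking the
    expensive shader path) and 'cold' ones; every pixel in a divergent
    batch effectively pays for both paths on SIMD hardware.
    """
    divergent = 0
    for by in range(0, screen_h, batch_h):
        for bx in range(0, screen_w, batch_w):
            batch = {(x, y)
                     for y in range(by, by + batch_h)
                     for x in range(bx, bx + batch_w)}
            hot = len(batch & hot_pixels)
            if 0 < hot < len(batch):
                divergent += len(batch)
    return divergent / (screen_w * screen_h)

# A 64x64 "expensive shader" region on a 1024x768 screen (think one
# small light halo), deliberately not aligned to batch boundaries.
hot = {(x, y) for y in range(101, 165) for x in range(202, 266)}

small = wasted_fraction(1024, 768, 4, 4, hot)    # R520-style 4x4 batches
large = wasted_fraction(1024, 768, 64, 16, hot)  # G70-style 64x16 batches
print(f"4x4 batches:   {small:.2%} of pixels run both paths")
print(f"64x16 batches: {large:.2%} of pixels run both paths")
```

With these numbers the 64x16 layout ends up running both branch paths on about ten times as many pixels as the 4x4 layout, which is the intuition behind the Beyond3D quote; real-world cost also depends on how expensive each path is and how batches are scheduled.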
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: 5150Joker
Originally posted by: Acanthus


You're beating branching to death. Evidence?

Next time do your own research:

R520's batch size, being only 4x4 pixels large, should be very efficient for large batch sizes, at least in relation to NVIDIA's G70 which is described as having batch sizes of 64x16 (1024) pixels. R520's Pixel Shader architecture also has a specific Branch Execution Unit which means that ALU cycles aren't burned just calculating the branch alone for each pixel.

Source: http://www.beyond3d.com/reviews/ati/r520/index.php?p=04

I'm not the one spouting off in a thread about it. Don't bitch about needing to back up your claims.

Do you know what branching is?

Look, I "did my own research".
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Acanthus
Originally posted by: 5150Joker
Originally posted by: Acanthus


You're beating branching to death. Evidence?

Next time do your own research:

R520's batch size, being only 4x4 pixels large, should be very efficient for large batch sizes, at least in relation to NVIDIA's G70 which is described as having batch sizes of 64x16 (1024) pixels. R520's Pixel Shader architecture also has a specific Branch Execution Unit which means that ALU cycles aren't burned just calculating the branch alone for each pixel.

Source: http://www.beyond3d.com/reviews/ati/r520/index.php?p=04

I'm not the one spouting off in a thread about it. Don't bitch about needing to back up your claims.

Do you know what branching is?


Dynamic branching is clearly spelled out in the article or do you need someone to translate it for you as well? Are you mentally disabled? You don't have to be a programmer to understand the article.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: 5150Joker
Originally posted by: Acanthus
Originally posted by: 5150Joker
Originally posted by: Acanthus


You're beating branching to death. Evidence?

Next time do your own research:

R520's batch size, being only 4x4 pixels large, should be very efficient for large batch sizes, at least in relation to NVIDIA's G70 which is described as having batch sizes of 64x16 (1024) pixels. R520's Pixel Shader architecture also has a specific Branch Execution Unit which means that ALU cycles aren't burned just calculating the branch alone for each pixel.

Source: http://www.beyond3d.com/reviews/ati/r520/index.php?p=04

I'm not the one spouting off in a thread about it. Don't bitch about needing to back up your claims.

Do you know what branching is?


Dynamic branching is clearly spelled out in the article or do you need someone to translate it for you as well? Are you mentally disabled? You don't have to be a programmer to understand the article.

I know exactly what branching is and how it works. The problem here is that the performance hit for doing it the way nVidia does is not large enough to matter. There is no evidence that says this hinders performance at all.

And you start with personal attacks. Surprise, surprise.

Quick, pull up all of those awesome effects in games that occupy 16 pixels of the screen.

Edit: And of course, because we are talking about branching, make sure it's the combination of multiple effects creating a new, unique effect.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Acanthus
Originally posted by: 5150Joker
Originally posted by: Acanthus
Originally posted by: 5150Joker
Originally posted by: Acanthus


You're beating branching to death. Evidence?

Next time do your own research:

R520's batch size, being only 4x4 pixels large, should be very efficient for large batch sizes, at least in relation to NVIDIA's G70 which is described as having batch sizes of 64x16 (1024) pixels. R520's Pixel Shader architecture also has a specific Branch Execution Unit which means that ALU cycles aren't burned just calculating the branch alone for each pixel.

Source: http://www.beyond3d.com/reviews/ati/r520/index.php?p=04

I'm not the one spouting off in a thread about it. Don't bitch about needing to back up your claims.

Do you know what branching is?


Dynamic branching is clearly spelled out in the article or do you need someone to translate it for you as well? Are you mentally disabled? You don't have to be a programmer to understand the article.

I know exactly what branching is and how it works. The problem here is that the performance hit for doing it the way nVidia does is not large enough to matter. There is no evidence that says this hinders performance at all.

And you start with personal attacks. Surprise, surprise.

Quick, pull up all of those awesome effects in games that occupy 16 pixels of the screen.


If you know what it is, then why ask for a link? Or were you just trying to troll as usual?