
What's with the hype of the X1900s?


ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
Originally posted by: RedStar
heh..don't wait and get whatever rules right now..is that it?

Are the people who went out and bought the 1800XT a few weeks ago agreeing with that?

:)

For me, i just hope this crashes the 7800gt into the $300 CDN range :)
(although those pixel shaders on the 1900 series seem to address the trend in future gaming for the next year)

the thing is, everyone has known for a month or so exactly what the x1900 would be (an x1800 with 3x the shader logic)

we don't know exactly what G71 will be. nvidia is very tight-lipped about it. memory bandwidth CAN'T improve that much over what the 512 GTX has (unless nvidia goes to a 512-bit memory bus, which is doubtful. if they do it would be killer. think 9700 Pro with AA/AF turned way up killer) because the chips simply don't exist (has samsung even started sampling GDDR4?).

added fillrate may be damn near useless without the memory bandwidth available to feed it. (a 32-pipe/430MHz G71 = 550MHz worth of G70; 750MHz would be a huge theoretical fillrate boost beyond that).

IF ati is right, and game FPS becomes even further decoupled from fillrate (because shader ops can do the same thing as texturing, but don't take up much memory bandwidth), then the x1900 could be faster than the G71 even with a slight clock rate deficit (48 pixel shaders vs. the theorized 32).

the fact of the matter is, G71 rumors are just that: rumors. remember the R520 rumors? 32 pipes! no, 24! 750 MHz! no, 500 MHz!
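The fillrate and bandwidth arithmetic in this post is easy to sanity-check in a few lines. A minimal sketch, assuming the thread's rumored G71 numbers (none of these are confirmed specs):

```python
# Back-of-envelope fillrate/bandwidth check for the rumored G71 configs.
# All G71 figures are speculation from the thread, not confirmed specs.

def fillrate_mpix(pipes, core_mhz):
    """Theoretical pixel fillrate in Mpixels/s: pipes x core clock."""
    return pipes * core_mhz

def bandwidth_gbs(mem_mhz_effective, bus_bits):
    """Memory bandwidth in GB/s: effective memory clock x bus width / 8 bits."""
    return mem_mhz_effective * bus_bits / 8 / 1000

g70_gtx512 = fillrate_mpix(24, 550)   # 7800 GTX 512: 13200 Mpix/s
g71_rumor  = fillrate_mpix(32, 430)   # 13760 Mpix/s -- "550 MHz worth of G70"
g71_750    = fillrate_mpix(32, 750)   # 24000 Mpix/s -- the optimistic rumor

# equivalent 24-pipe G70 clock for the 32-pipe/430 MHz rumor
print(g71_rumor / 24)                 # ~573 MHz

# 512 GTX: 1700 MHz effective GDDR3 on a 256-bit bus
print(bandwidth_gbs(1700, 256))       # 54.4 GB/s
# hypothetical 512-bit bus at the same memory clock
print(bandwidth_gbs(1700, 512))       # 108.8 GB/s
```

The sketch shows the post's point: the rumored fillrates double while the attainable bandwidth barely moves unless the bus width doubles.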
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: 5150Joker
I dunno Rollo, nVidia might give us a high-clocked 32-pipe monster that is as "widely" available as the 512 GTX. If and when it shows up and proves to be a better card, in mass quantity and at a fair price, I'll just buy it, as long as it's significantly faster, has angle-independent AF and possibly HDR + AA. Otherwise I'll wait it out for R600. After hearing from many on this forum how the 512 GTX would be widely available and competitively priced, I'd take rumors based on vapor hardware like G71 with a chunk of salt.

BTW some of you should read Eric Demers' interview at Beyond3D. It explains why ATi took the pixel shader route rather than slap on more texture units. From it you can infer that nVidia will likely be bandwidth-limited if they actually produce a 32-pipe card in mass quantity like rumored. Hope nVidia doesn't give us another 512 GTX press edition.

Time will tell which card will be the "best of the best" for the first half of 2006, 5150. I don't think anyone would be "sad" they bought an X1900XT, as they are great cards at a very good price point.

I just think those who want "the best" might like to wait, and those who upgrade infrequently might like to wait.

In six weeks, X1900XTs will probably be $425 instead of $525, so if you can do without an upgrade for a month and a half, you might either save $100 on an X1900 or get a much better card in a 7900.

Time will tell. I'm in no hurry to upgrade, as my rig does fine. :)

What's interesting to me is all the people who wanted to wait for the R520 to compare were willing to wait months and months, but don't seem to have six weeks to wait on the 7900.


 

mb

Lifer
Jun 27, 2004
10,233
2
71
Rollo - quick question for ya since I'm too lazy to search... G70 = 7800gt, gtx, and 512mb gtx... G71 = 7900 which will be announced in 6 weeks? And then G80 (8000gtx?) is coming after that (later this year?)? I've fallen out of the loop and all these numbers don't make much sense anymore.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: supafly
Rollo - quick question for ya since I'm too lazy to search... G70 = 7800gt, gtx, and 512mb gtx... G71 = 7900 which will be announced in 6 weeks? And then G80 (8000gtx?) is coming after that (later this year?)? I've fallen out of the loop and all these numbers don't make much sense anymore.

You're correct!
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Rollo

42 days is not a long time to wait, I think I can make it. ;)

Good for you. Some of the people here who are upgrading now to 1900 cards don't already have GTX SLI setups which were given to them at steep discounts or for free.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Cooler
you going for quad sli this time?

Wouldn't that be sweet? Not elegant, not cool, not quiet- just "I want to put it on 19X14 8X16X on all games".

I think my case could handle the airflow better than my marriage could handle me explaining to my wife why I wanted 4 video cards in my box. ;)
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
Originally posted by: Cooler
you going for quad sli this time?

Wouldn't that be sweet? Not elegant, not cool, not quiet- just "I want to put it on 19X14 8X16X on all games".

I think my case could handle the airflow better than my marriage could handle me explaining to my wife why I wanted 4 video cards in my box. ;)


LOL the key is to surprise her. Cook her dinner that night, give her some extra lovin' and then slip in the fact that you have quad SLI :D
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: M0RPH
Originally posted by: Rollo

The people who want ATI no matter what will buy X1900XTs, and rightly so. It's the best product they've put out since the 9700Pro.

The people who want to have the best card should wait, because the X1900XTX will not be it, or even close. The games that are shader limited are FEAR and FEAR between......errr, Few and Far between and the 7900s massive fillrate advantage, coupled with a large bandwidth advantage, should destroy the X1900XTX in all current games not spelled F-E-A-R. ;)

You all should definitely listen to this guy based on the great advice he gave you last time... 'wait for the 512MB GTX, it'll be competitive in price and widely available.' Lol.

The G71 is nothing but vapor at this point in time and given Nvidia's recent track record with high end parts, I wouldn't put too much blind faith in it. Leave that for the Nv fanboys.

Not to mention, since when does "best" mean "fastest"? There are several key areas where the X1900 will likely have the edge over the G71: #1, you can bet the price will be much lower; #2, HDR+AA in games NV can't do it in; #3, better AF, etc. If more games are programmed like F.E.A.R., there are serious doubts that the G71 will be fast enough. U3 uses a lot of dynamic branching as well; who knows what will happen with that game. Sweeney said the G70 would be the better card than ATi's; I can't imagine that, considering the X1800 is faster than the GTX in almost all tests.

Fastest doesn't always mean best. I think it's the most important component of the "best" card, but by no means is it the only thing that matters. The leapfrogging will continue, just like it has for the past several years, as has been pointed out several times above me.
 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
Originally posted by: 5150Joker
Originally posted by: Rollo
Originally posted by: Cooler
you going for quad sli this time?

Wouldn't that be sweet? Not elegant, not cool, not quiet- just "I want to put it on 19X14 8X16X on all games".

I think my case could handle the airflow better than my marriage could handle me explaining to my wife why I wanted 4 video cards in my box. ;)


LOL the key is to surprise her. Cook her dinner that night, give her some extra lovin' and then slip in the fact that you have quad SLI :D
You could just get two video cards at a time to soften the blow
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: 5150Joker
Originally posted by: Rollo
Originally posted by: Cooler
you going for quad sli this time?

Wouldn't that be sweet? Not elegant, not cool, not quiet- just "I want to put it on 19X14 8X16X on all games".

I think my case could handle the airflow better than my marriage could handle me explaining to my wife why I wanted 4 video cards in my box. ;)


LOL the key is to surprise her. Cook her dinner that night, give her some extra lovin' and then slip in the fact that you have quad SLI :D

"I'm cooooming into possession of quad sli tomorrow"

That type of thing? ;)

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ackmed
Originally posted by: M0RPH
Originally posted by: Rollo

The people who want ATI no matter what will buy X1900XTs, and rightly so. It's the best product they've put out since the 9700Pro.

The people who want to have the best card should wait, because the X1900XTX will not be it, or even close. The games that are shader limited are FEAR and FEAR between......errr, Few and Far between and the 7900s massive fillrate advantage, coupled with a large bandwidth advantage, should destroy the X1900XTX in all current games not spelled F-E-A-R. ;)



You all should definitely listen to this guy based on the great advice he gave you last time... 'wait for the 512MB GTX, it'll be competitive in price and widely available.' Lol.

The G71 is nothing but vapor at this point in time and given Nvidia's recent track record with high end parts, I wouldn't put too much blind faith in it. Leave that for the Nv fanboys.

Not to mention, since when does "best" mean "fastest"? There are several key areas where the X1900 will likely have the edge over the G71: #1, you can bet the price will be much lower; #2, HDR+AA in games NV can't do it in; #3, better AF, etc. If more games are programmed like F.E.A.R., there are serious doubts that the G71 will be fast enough. U3 uses a lot of dynamic branching as well; who knows what will happen with that game. Sweeney said the G70 would be the better card than ATi's; I can't imagine that, considering the X1800 is faster than the GTX in almost all tests.

Fastest doesn't always mean best. I think it's the most important component of the "best" card, but by no means is it the only thing that matters. The leapfrogging will continue, just like it has for the past several years, as has been pointed out several times above me.


Hmmmm.

Ackmed speculates worst case scenario for nVidia, go figure.

Of course, if you buy now, and 7900GTs/Us are comparably priced, widely available, and faster in upcoming games, Ackmed probably won't be refunding your money. :(
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
You do realize that all this flaming and arguing is based on the ASSumption that the g71 is a 32-pipe, 750 mhz card, and that no documentation exists to support this theory. I knew the r580 was gonna be a 48 PS / 16 TMU card as soon as the specs of the rv530 were announced; you can just extrapolate the rest from the leaked Ati docs. I wish I had some leaked Nv docs right about now, but all those who are waiting for a 32-pipe monster card from Nv might be in for a disappointment come March.

Why is the g71 slated to launch 1.5 months after the r580? Usually Nv would have a competing product released about the same time, not 6-8 weeks later. Also, if they did release a monster g71 card, how would that affect the sales of the g80 coming out later this year? A unified-shader g80 would need no less than 48 unified shaders clocked through the roof just to break even with the rumored monster g71 in most apps. My best guess is that the g71 will be either a 32-pipe (or even 28-pipe) card clocked at 500-600mhz, or a 24-pipe card clocked at 700+ mhz, but not both. Moreover, such a card would in fact be a good match for the r580 in many cases except the most shader-heavy games like FEAR. While I'd like to see a 700mhz 32-pipe card available for $600 in two months, I highly doubt that's gonna happen.
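The "one big spec or the other, not both" guess above boils down to a pipes-times-clock comparison; a quick sketch (every configuration below is the thread's speculation, nothing confirmed):

```python
# Raw pipeline throughput (pipes x core MHz) for the speculated G71 configs.
# Every number here is early-2006 forum speculation, not a confirmed spec.
configs = {
    "32 pipes @ 750 MHz (monster rumor)": 32 * 750,
    "32 pipes @ 550 MHz":                 32 * 550,
    "28 pipes @ 600 MHz":                 28 * 600,
    "24 pipes @ 700 MHz":                 24 * 700,
}

for name, throughput in sorted(configs.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {throughput} Mpix/s theoretical")

# The two "realistic" options land in the same place (28x600 == 24x700
# == 16800), while the monster rumor (24000) would leave little headroom
# for a later g80 to look like an upgrade.
```

This is why the post treats "32 pipes" and "700+ MHz" as alternatives rather than a package: either one alone already clears the 512 GTX, and both together would crowd out the next generation.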
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
Originally posted by: Rollo

"I'm cooooming into possession of quad sli tomorrow"

That type of thing? ;)
there is an image that i did not ever need to think of

*shudder*
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Rollo

Hmmmm.

Ackmed speculates worst case scenario for nVidia, go figure.

Of course, if you buy now, and 7900GTs/Us are comparably priced, widely available, and faster in upcoming games, Ackmed probably won't be refunding your money. :(

If the G71 turns out to be a horribly overpriced $750 card like the last Nvidia card you recommended everyone wait for, will you give these people the extra cash they need to buy one?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Here's my take on it:

First of all, I think the X1900XT is THE card to get for the time being anyways. It's pretty much exactly the same thing that's in the Xbox360, so developers will be quite familiar with it soon. Its design of 16 pipelines/48 pixel shaders seems to be based on the fundamentally sound notion that games are getting more and more pixel shader intensive, and it certainly raises the bar in terms of pixel shading power. Tremendously. I like the technology A LOT.

With that said, I think G71 will initially give the X1900XTX a bit of a whooping. A 32 pipeline semi-refresh of the 7800 series with similar clocks to the 7800GTX 512MB (nevermind a potential bump in core speed) will definitely spank the X1900XTX in current/past games, where pixel shaders are NOT the limiting factor. I'd expect Far Cry to be a cakewalk for the G71 (but as Rollo has said, who hasn't played Far Cry already?). Ditto Half Life 2. I'd expect ATI to get bludgeoned there.

But with that said, I think the focus is on the future, since even a 7800GT can play pretty much all but a few of the latest games (eg. Fear, COD2?) at high resolution with tons of AF and some AA. So basically it's between the combined brute force might of the G71 ('more of everything') and the focused brute force of R580 (pixel shaders all the way!).

I could see it going two ways in the future. On the one hand, nobody wants to design a game that only works well on half the cards (although idSoftware seemed to do just that initially on Doom3). So, I could understand if developers kept pixel shaders reasonable so G71 wouldn't be overloaded. In this case, the pipelines of G71 would propel it to easy victory.

On the other hand, developers may very well go for heavy shader loads, especially since Xbox360 should become a showcase for gorgeous pixel shaded games, and I doubt the PC developers would ever let console developers trump them. So a "high shader mode" for potential future games could also be a possibility, where ATI's 48 pixel shaders become of key importance to display the game in its highest detail.
-------------

My post is based on absolute conjecture, but I think it's based on some reasonable assumptions.

What we do know is that if the PC market has taught us anything, it's to buy for now, not the future. Prices will always drop, and hardware will always become obsolete. You have to jump on the wagon at some point. But at the same time, make sure there are games you want to play that will take advantage of this hardware. You don't need an X1900XTX to play Half Life 2 or World of Warcraft...
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: Cooler
you going for quad sli this time?

I know that I've already got a Renegade being held for when they release. Yes, it's Intel, and it burns my rear, but it's limited production, and it's going to be a monster (imo, anyway).

I'm not interested in a quad setup, as a pair of 512MB GTXs suffice for me right now (meaning, I'm not actively pursuing it, but the Renegade is obviously a different story).

That being said, let me throw out a little tidbit, possibly on the G71, but in very near future generations. Unified architecture isn't going away, and nVIDIA has already recognized the benefit.
 

xtx4u

Banned
Jan 1, 2006
73
0
0
Originally posted by: Rollo
Originally posted by: Corporate Thug
if i had to guess, I would say that ATi is gonna anticipate G71 with another card

just a hunch

http://www.xbitlabs.com/news/video/display/20060124235047.html

That sort of goes without saying. I think it also goes without saying that a 32-pipe, 700MHz+ 7900 sporting 1800MHz RAM will pretty much slaughter the 16-pipe X1900s at 99.99999% of the games on the market, but maybe not FEAR. ;)

I wonder how many will wait 6 weeks to see? A lot of money to take the chance today. :(

it's all about who can pwn who in the Unreal 3 engine. I wish Epic would just release a demo or something in March so we can have a real comparison in 2006.
 

xtx4u

Banned
Jan 1, 2006
73
0
0
Originally posted by: Rollo
Originally posted by: Corporate Thug
i would think the return on the x1900 should be pretty damn good, even in 6 weeks. I'm personally going to wait, but i must admit i am a bit worried based on the GTX 512 "launch".

The people who want ATI no matter what will buy X1900XTs, and rightly so. It's the best product they've put out since the 9700Pro.

The people who want to have the best card should wait, because the X1900XTX will not be it, or even close. The games that are shader limited are FEAR and FEAR between......errr, Few and Far between and the 7900s massive fillrate advantage, coupled with a large bandwidth advantage, should destroy the X1900XTX in all current games not spelled F-E-A-R. ;)

how is it the best product since the 9700 Pro? The x850 series owned the 6800 PCI-E series badly, if you exclude SLI, in almost every game, especially Source, which I know MANY ppl play.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
We got a nice in-game demo (about 5 minutes' worth) at the 7800 launch last year in San Fran. It was sexy then, and was fairly smooth on the 7800. Sweeney also used the 7800's capabilities to show off what they were doing with their engine, and that was fairly impressive, too.