So what do you all think?


BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The card is very fast in Direct3D, no doubts there. The biggest problem I have with it is OpenGL performance and its dual-slot cooler.
 

solofly

Banned
May 25, 2003
1,421
0
0
Originally posted by: Frackal
R520 might have dual core optimizations ... in 6 months

By then I will be making room in my box for next gen cards.;)

PS
Or maybe sooner...
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: southpawuni
Originally posted by: munky
x1800 - HOT
x1600 - not
x1300 - not

From a technological perspective, the x1800 is way ahead of the gtx. It runs SM3 like Nvidia only wishes its cards could run it. And it has better IQ. The only problem is you can't buy one. OpenGL performance is also lacking, but that can be fixed in the drivers. I have a feeling in Unreal3 it's gonna pwn the 7800.

It doesn't matter if the results are not there.
The NV30 was technologically superior too, and it lost as well.

I'm glad people are recognizing the OGL issues; they promised a long time ago to fix that.. as you say, "in the drivers".
Your excuse that it can be fixed in the drivers is not going to cut it. They've had years.

I don't see how you can relate this to the FX cards. Late availability, dual slot... yeah, maybe in that regard. But when it comes to running complex DX9 shaders, it beats the 7800gtx hands down, quite the opposite of what happened to the FX. If FEAR, Far Cry and BF2 are any indication of future games, I'm pretty sure the x1800 is gonna run 'em faster.

As for OGL, it is the drivers, but I'm not holding my breath for them to fix 'em. Maybe if Doom3-based games become more popular, then they'll improve the drivers. Either way, there are more DX9 games than OGL games, so it's not a deal breaker in any case.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: munky
Originally posted by: southpawuni
Originally posted by: munky
x1800 - HOT
x1600 - not
x1300 - not

From a technological perspective, the x1800 is way ahead of the gtx. It runs SM3 like Nvidia only wishes its cards could run it. And it has better IQ. The only problem is you can't buy one. OpenGL performance is also lacking, but that can be fixed in the drivers. I have a feeling in Unreal3 it's gonna pwn the 7800.

It doesn't matter if the results are not there.
The NV30 was technologically superior too, and it lost as well.

I'm glad people are recognizing the OGL issues; they promised a long time ago to fix that.. as you say, "in the drivers".
Your excuse that it can be fixed in the drivers is not going to cut it. They've had years.

I don't see how you can relate this to the FX cards. Late availability, dual slot... yeah, maybe in that regard. But when it comes to running complex DX9 shaders, it beats the 7800gtx hands down, quite the opposite of what happened to the FX. If FEAR, Far Cry and BF2 are any indication of future games, I'm pretty sure the x1800 is gonna run 'em faster.

As for OGL, it is the drivers, but I'm not holding my breath for them to fix 'em. Maybe if Doom3-based games become more popular, then they'll improve the drivers. Either way, there are more DX9 games than OGL games, so it's not a deal breaker in any case.


You don't like Quake 4!?!? Or Enemy Territory?
F.E.A.R. is going to be different, and you will see what I mean when the final version hits the stores.
BF2 probably benefits from 512MB, as it does with more system RAM. I would wait to see how the 512MB GTX compares with the X1800 XT 512MB.
As for Far Cry, judging from non-biased reviews (e.g. AT, Xbitlabs, Guru3D, HardOCP), it looks to me like they perform evenly, if not with the 7 series doing better.

And in benchmarks such as ShaderMark 2.1 and the pixel shader tests in 3DMark05, the GTX is unbeaten.

But then again, it's still an awesome card, performing like this with drivers that haven't matured yet.

 

Byte

Platinum Member
Mar 8, 2000
2,877
6
81
I was saving up for something nice. I had a 6800GT, but wasn't using it much, so I sold it for full price and stuck with a 9800 Pro. Now I game on my XPS2. Looks like I'll be skipping this gen and getting an iPod video instead :D
 

cbehnken

Golden Member
Aug 23, 2004
1,402
0
0
Lukewarm; it should've been faster and available now. I'm glad I'm satisfied with what I got, because I certainly wouldn't bother to replace it with an x1800.
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Hot, perhaps, but only in temperature :D

I'd say all-in-all extremely disappointing for a product 6+ months past due. And consider the flagship won't be available to purchase for about 2 months yet...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Pabster
Hot, perhaps, but only in temperature :D

I'd say all-in-all extremely disappointing for a product 6+ months past due. And consider the flagship won't be available to purchase for about 2 months yet...

yes, that is the "official" nVidian reply
:Q



:D
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: munky
I don't see how you can relate this to the FX cards. Late availability, dual slot... yeah, maybe in that regard. But when it comes to running complex DX9 shaders, it beats the 7800gtx hands down, quite the opposite of what happened to the FX. If FEAR, Far Cry and BF2 are any indication of future games, I'm pretty sure the x1800 is gonna run 'em faster.

Perhaps you haven't seen the same figures I have, but in BF2 the 1800XT is only marginally faster than the GTX. And the jury is still out on F.E.A.R. as well as forthcoming nVidia 80 series official drivers.

As for OGL, it is the drivers, but I'm not holding my breath for them to fix 'em. Maybe if Doom3-based games become more popular, then they'll improve the drivers. Either way, there are more DX9 games than OGL games, so it's not a deal breaker in any case.

I'm going to go out on a limb and say that if nVidia turned in such poor performances in OpenGL titles there would be a lot of complaints. OpenGL is far from dead, after all.
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: apoppin
yes, that is the "official" nVidian reply
:Q



:D

Nah, the official 'nVidian' reply is a 512MB 7800GTX clocked considerably faster than current parts. And some new drivers to boot :D
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Pabster
Originally posted by: apoppin
yes, that is the "official" nVidian reply
:Q



:D

Nah, the official 'nVidian' reply is a 512MB 7800GTX clocked considerably faster than current parts. And some new drivers to boot :D

ok, fair enough . . . thanks for giving the 2nd part of the official reply
:D

and you guys can be glad I cut my index finger today . . . can't type fast . . . :p
:frown:

[and these threads sure ain't worth reading if you can't reply]
:roll:
 

gxsaurav

Member
Nov 30, 2003
170
0
0
I just finished reading the Tech Report review, and I must say I'm impressed that the X1800XT beat the 7800GTX by as much as 12 fps in a few DirectX games at the resolution I normally play (1024x768), while still having only 16 pipelines. Could it be because they used a 512MB X1800XT and a 256MB 7800GTX? But it also generates more heat and consumes more power than the 7800GTX. Why is that? Isn't it made on a 90nm process, compared to the 110nm of the 7800GTX?

And one more thing: what difference does a lead of 6 or 12 frames really make? In all the tests, especially at the settings I play at (1024x768, 4x AA and 4x aniso), the GT and GTX already provide enough frame rate, even more than 60 fps; the rough sketch below puts such a lead in frame-time terms. Also, the GT and GTX are cheaper and available today.
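To put that in perspective, here is a rough, purely illustrative sketch (the frame-rate pairs are hypothetical, not taken from any review) of what a 6-12 fps lead works out to in frame-time terms once both cards are already past 60 fps:

```python
# Purely illustrative: how much rendering time per frame a small fps lead
# actually buys once both cards are already past 60 fps.
# The fps pairs below are hypothetical, not benchmark results.

def frame_time_ms(fps: float) -> float:
    """Time spent rendering one frame, in milliseconds, at a given frame rate."""
    return 1000.0 / fps

for slower, faster in [(60, 66), (60, 72), (75, 87)]:
    saved = frame_time_ms(slower) - frame_time_ms(faster)
    print(f"{slower} fps vs {faster} fps: only {saved:.1f} ms shorter per frame")
```

Above 60 fps the difference is only a millisecond or two per frame, which is the point: both cards are already comfortably fast at those settings.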
 

lifeguard1999

Platinum Member
Jul 3, 2000
2,323
1
0
Originally posted by: munky
The only problem is you can't buy one. OpenGL performance is also lacking, but that can be fixed in the drivers.

:p
That's two problems. (1) You can't buy one. (2) OpenGL performance is also lacking.
:p

Seriously though, I do wish ATI would get its act together with OpenGL performance. Their OpenGL performance has been behind NVidia's since the original Radeon days. Since it has been so long, I am not certain that it can be fixed in the drivers.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
I think ATI's launch was disappointing from the fanboy point of view, but a pretty safe route for ATI to take. They decided to play it conservative with a 16-pipeline architecture, which depending on your point of view was either a ballsy thing to do, or completely hubristic. With the previous generation already packing 16-pipeline cards, I'd have to say that it seems ATI was overconfident in their architecture being able to stick with 16-pipes. At the same time, though, they do have some cojones for claiming (and producing) the fastest card available using only 16-pipes. They really do have faith in this new architecture, and it seems pretty solid. The 512-bit ring bus, 8 pixel/vertex units (or whatever it is they have 8 of ;) ), etc. ... It's a solid foundation to build on.

Unfortunately, the X1800XT should have been the first stepping stone on top of this architecture, and IMO should've had at least 20 pipes. When you're coming late to the party, you better be packing a little extra to compensate. Make the X1800XL a 16-pipe card, but make the flagship truly special (aside from the wicked fast RAM).

So basically we get a quick card that has 512MB although it doesn't seem to really need it (imagine what R520 + 20 or 24 pipes could have been), competes with Nvidia in everything, bests the GTX by about 10% in D3D with settings cranked and lays down in OpenGL like before...

Hopefully ATI fills out their product line like this generation, because currently, there's no attractive card except the very top: the XL is overpriced and the X1600 and X1300, while nice parts in their own right (or at least they will be when their prices go down), don't beat the last generation 16-pipeline cards.

The next couple of months should have ATI's prices dropping quite nicely from MSRP (just like the GT and GTX have), with hopefully something attractive in the price/performance category. Unfortunately, with the XT having only 16 pipes, they don't have a lot of options for a third card; either an underclocked XL or an underclocked XT (or perhaps with a quad disabled).

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Man, they shut down my thread like this, yet this one is somehow allowed to remain alive (see my sig).

Anyhow, the X1800XT is definitely NOT hot. Such a shame after so much hype. 32 pipes would have rocked tho. :)
 

AnnoyedGrunt

Senior member
Jan 31, 2004
596
25
81
I'm not too impressed.

The only card that seems like a true "next gen" in terms of performance is the X1800XT. The problem is that the price of that card is soooo much (FWIW, I also believe the GTX is way too expensive as well).

The X1800XL is competitive with the 7800 GT, so as long as it costs a similar amount then I believe it would do well.

Everything lower than that seems overpriced relative to existing cards.

I am happy to see ATi harness some of their graphics power for increasing IQ, and hopefully it will convince NV to give an option for their old AF method.

The AVIVO stuff seems to be primarily hype, with no real functional benefits (or at least nothing different from the old AIW cards).

What I find most disappointing about the new video cards is that instead of lowering the price of the previous-gen high end (and making it the mid range), they seem to just add new higher-end stuff to the top. I guess it's just taking a bit longer for things to trickle down.

Personally, I have a very hard time spending more than $300 on a video card, and right now there is nothing in this new series that competes with an X800XL in performance and price but with the added features. Given that six months have gone by since the X800XL came out, I had expected to see something perform much better at a similar price, instead of what we have in this offering.

I would also like to add that I'm somewhat disgusted by the huge amounts of RAM being added to the low end cards, which jacks up the price but doesn't improve performance at all. There is no reason for an X1300 or X1300pro to have 256 MB of RAM. ATI should drop $30-$35 from the price of the cards and only offer them with 128 MB. Also, they seem to have 2 $150 cards and 2 $199 cards, and in both cases the one with less RAM will be vastly superior. That just seems like a very bad marketing strategy to me.

Of course, I'm glad I don't really have any incentive to go off and buy a new card, so maybe in the end that's a good thing.

-D'oh!
 

Nox1

Member
Jun 30, 2001
64
0
0
Not as hot as I was hoping.

I'll be honest, its loss of "hotness" wasn't about the benchmarks. It was the lack of product.

I was thoroughly impressed with Nvidia launching the 7800 with wide availability. ATI has struck out 3 times in a row on this front... Platinum Editions, CrossFire, and now the x1800. All no-shows at launch.

I've been faithful to Ati since the 9700. But, I believe it's time to turn to Nvidia for my upgrade.

*

By the way, there's nothing new under the sun... remember waaaay back in the 3dfx/Rendition days? Rendition dropped the ball and people went to 3dfx... then eventually 3dfx dropped the ball and Nvidia took the lead... Nvidia lost it to ATI's 9700 a few years back... now it's back in Nvidia's court.

Heck, as long as there's a great card out there (no matter who makes it), we, the consumers, are always the winners.
 

moonboy403

Golden Member
Aug 18, 2004
1,828
0
76
Since it seems to be able to overtake the 7800 GTX in most games,
I'd say it's hot and I wish I had one!!!!
I hope Nvidia will come up with the 512MB version to counter ATI so we all benefit =D
 

Steelski

Senior member
Feb 16, 2005
700
0
0
If it was available now in numbers, no one would really be complaining.
If the upgraded R580 core turns out to be a version with more pipes, then we should all be in for a real treat. I want to see some nice HDR benchmarks on something somewhere, preferably Far Cry.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Stoneburner
Not thinking in terms of victory or defeat, ATI's new architecture seems very forward-looking. First, a lot of people suspected this was just R420 on steroids, which was R300 on steroids, but that was not true. Second, the g70 is basically the nv30 architecture, which was a failure at first and then matured and is now impressing everybody.

umm.. how is r520 "new architecture" while g70 is "basically nv30"? if that's not a slanted view i am not sure what is.

So the fact ATI has managed to be more than competitive in certain games while dealing with new architecture is impressive.

multiple delays and being 6 months late to market is impressive?

and while the x1800xt preview cards are certainly "more than competitive", they're STILL not here. i would think more than a paper launch would be required unless you're easily impressed.

This is assuming they aren't cheating, though, like the 5 series NV's were.

some more mudslinging.. let's not forget ati has been guilty multiple times as well...

And of course 3dmark isn't a game, but ATI's strong showing in it, along with other newer games like FEAR, might portend a bright future for r520-based tech.

while it certainly must be tempting to grasp at the positives, the FEAR demo is hardly indicative of how the game will perform:

"We were interested in testing the FEAR demo, but after learning that the final version would change the performance characteristics of the game significantly, we decided it would be best to wait until we had a shipping game. From what we hear, the FEAR demo favors ATI's X1000 series considerably over NVIDIA hardware, but performance will be much closer in the final version."

So while if you are buying this generation you should probably go with Nvidia based on price alone, it is very possible by the time r580 and better drivers come around ATI might be pulling strongly ahead of Nvidia.

if you're buying this generation, you don't have a choice -- at least for a min. of another month.

as far as next gen and beyond, one could say it's also possible nvidia may pull strongly ahead w/ g80. i think it's likely we'll see close performance figures yet again, with some minor differences in performance and features -- but overall there will still be parity. the biggest question is who will be able to deliver their product in a timely fashion. timely delivery and good availability drive prices down -- a very good thing for consumers.

ati has not often shown they can do that in the last couple of years...

of course, any of those scenarios are possible -- it's all merely speculation at this point.

ON the other hand, WTF is with ati and open gl? They've had so many years to work with it and still can't get it right?

i certainly can't figure that one out.. perhaps they don't care and simply put more work into the mainstream api - d3d.

 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Nice try ATI
B+ for effort
After all the problems they had, putting out a very competitive part is good even 6 months late; the feature set looks great, so I say good job ATI.

C- for execution
6 months late is still 6 months late, and paper launching the high-end part (although 1 month is better than last time round) leaves plenty of room for improvement.

A for us
Who really cares if ATI is late to the party? They have a part which is competitive, which means price wars and cheaper prices for us.
 

Rage187

Lifer
Dec 30, 2000
14,276
4
81
I just can't believe Ati had 6 months to beat their closest competitor and all they could do was barely tie/nudge out the GTX.

6 months, and they had full access to their competitor's parts as a baseline, yet they still couldn't do it. :confused:

Not to mention the $100 mark up.
 

kruull

Member
Aug 12, 2005
110
0
0
Well, the 1800XT is nice and all, but as many of you already stated: LATE. Maybe not for all the ATI fans who held off buying a 7800 just to get their hands on the ATI card...
But they also have to admit that the 1800 is still a bit disappointing - where are the promised 12000 points in 3DMark05? They also said CrossFire would run on nForce4 SLI motherboards as well; now one has to buy a new board and wait for the CF cards too.
I am not an ATI or nV fan - I'm buying what is generally considered a good buy, as the 6600GT was 10 months ago and the 7800 is now (shoot me if I'm wrong, but a 7800GTX for $500 is better than a 1800XT for $600, plus nV will drop the prices of the 7800 series soon - at least I hope so).