
Official X1900 Reviews **Updated** 16 REVIEWS

What the crap, something's up with the HardwareZone review...

I score higher in 3DMark06 on my sig computer than both the 7800 GTX 512 and the X1800 XT. Come on, I know this is wrong, and I really don't think the 3500+ will do that much for such powerful cards.

My score: 3893
 
Originally posted by: Lord Banshee
What the crap, something's up with the HardwareZone review...

I score higher in 3DMark06 on my sig computer than both the 7800 GTX 512 and the X1800 XT. Come on, I know this is wrong, and I really don't think the 3500+ will do that much for such powerful cards.

My score: 3893


Quake 4: 8x6 72.4; 10x7 79.9 - did they run a virus check or what? 🙂
 
Originally posted by: Janooo
Originally posted by: Lord Banshee
What the crap, something's up with the HardwareZone review...

I score higher in 3DMark06 on my sig computer than both the 7800 GTX 512 and the X1800 XT. Come on, I know this is wrong, and I really don't think the 3500+ will do that much for such powerful cards.

My score: 3893


Quake 4: 8x6 72.4; 10x7 79.9 - did they run a virus check or what? 🙂

Maybe the card got bored and backed down to 2D clocks. Then it yawned and flipped its owner off.
 
Originally posted by: wizboy11
Don't really like the performance of the X1900XT from the HardwareZone review. It often equaled the performance of the GTX 512 or beat it by a SLIGHT margin. So they beat the GTX 512 by a little bit, how the hell are they going to beat the G71?

They're not.
 
Originally posted by: Rollo
Originally posted by: wizboy11
Don't really like the performance of the X1900XT from the HardwareZone review. It often equaled the performance of the GTX 512 or beat it by a SLIGHT margin. So they beat the GTX 512 by a little bit, how the hell are they going to beat the G71?

They're not.

If the specs prove to be true.
 
Hexus didn't have a bad thing to say about R580 except that software was holding it back.
ATI clearly designed R580 to be extremely fast in one kind of processing, while keeping the rest of the chip in pretty much the same very healthy state as R520, at the same clocks. Anticipating, which is the key word, games titles with very high reliance on fragment shader processing means that R580 can almost come off looking too much like R520!

R580, moreso than any modern GPU that meets the D3D9 spec, therefore relies on software to show it off. Even current synthetic benchmarks designed to show off theoretical rates in 3D hardware can have a hard time exploiting the tripling in fragment processing ability. That's not to say the performance increases at the same clocks as R520 are invisible. Clearly they're not without increases, especially at the higher resolutions, of up to 30% in the games we tested, clock-for-clock.

A good glimpse of shader rate throughput in our instruction issue test also gives a clue to where R580's strengths lie. Further, it has no real-world weakness when it comes to comparison with R520. Vertex processing rate is intact, performance drops for antialiasing and texture filtering are almost identical, and it shares exactly the same feature base, even besting R520 with a working implementation of ATI's Fetch4 feature.

Therefore it's fair to sum up that R580, clocked very conservatively, gives a staggering fragment shader rate first and foremost. Following that, its considered engineering means that's followed by balanced assistance to the other major facets of today's modern game rendering, general stream programming and D3D9 games still to come. Double Z-only rate sustained with MSAA, plenty of memory bandwidth in XT and XTX configuration, more than 1GiB/sec for GPU-to-host writebacks for the first time and very low penalty PS branching seal this particular deal in a big way.

At $649 for the X1900 XTX and $599 for X1900 XT, it'll push X1800 XT and XL down in price in short order, putting the two GPUs and their SKUs in the kind of price place we'd have expected over time, given an earlier R520 introduction. It's just somewhat maddening to see it happen so soon, annoying the early adopter of R520 hardware.

All-In-Wonder Radeon X1900 is due to launch within days, availability is excellent - expect to easily pick one up on launch day and the weeks and months to come - and the performance promise of ATI's new flagship GPU is compelling. It's not quite the performance monster some may have expected, but that's largely down to it being held back by software.

GeForce 7800 GTX 512 is generally bested in all modern games, and Radeon X1000-series products have enough significant image quality advantages to give X1900 XT the nod even if the performance difference was slight in either direction, comparatively speaking. We're seeing all the early XT boards come with the 1.1ns BJ11 DRAMs of the XTX, making the XTX a choice only for those with carefree finances.

A new high-end GPU with everything going for it, perhaps bar the software to really show it off. Will it come in time, especially with D3D10/Vista games programming already under way at most major developers, is the question. Even if it doesn't fully realise its potential, it's a blindingly fast 3D graphics product with the best IQ possible.

Finally, we've got Radeon X1900 Crossfire to pitch against GeForce 7800 GTX 512 SLI in the coming days, so stay tuned for that.
 
I find it somewhat hard to believe that the XTX is going to be $650. I know that's the MSRP, but that's too much, even for the high end. Then again, maybe that's why both the XTX and XT come with 1.1ns memory.
 
Originally posted by: RobertR1
If it's 1.1ns mem, the mem *should* OC easily to 1700-1800MHz with some added voltage, right?


Yep, and the best part is the XT and XTX have identical memory and core. Can't wait to get my card; gonna slap on the water block and OC that sucker!
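For what it's worth, the rated cycle time maps to an effective data rate like this. A rough back-of-envelope sketch, assuming the "1.1ns" figure is the DRAM's minimum clock cycle time, which is how GDDR3 parts are typically rated:

```python
# Back-of-envelope: GDDR3 rated cycle time -> effective DDR data rate.
# Assumes the 1.1 ns rating is the DRAM's minimum clock cycle time.

def effective_ddr_mhz(cycle_time_ns: float) -> float:
    """Effective DDR data rate ("MHz") for a rated cycle time in ns."""
    real_clock_mhz = 1000.0 / cycle_time_ns  # actual clock frequency
    return 2.0 * real_clock_mhz              # DDR: two transfers per clock

print(round(effective_ddr_mhz(1.1)))  # -> 1818
```

So ~1818MHz effective is the rated ceiling, which is where the 1700-1800MHz overclocking expectation comes from.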
 
Originally posted by: beggerking
I feel cheated by ATI... the core is basically the same as the 1800 XT, just with 32 more pixel shader processors (48 vs. 16)...
the scores weren't all that impressive...

I think it should be called the X1801XT...
I was expecting it to be a new gen...

It's not a new generation; it's a refresh, as is the 7900 series from nVidia. The true next-generation video cards will be the R600- and G80-based ones.

Originally posted by: Rollo
Originally posted by: wizboy11
Don't really like the performance of the X1900XT from the HardwareZone review. It often equaled the performance of the GTX 512 or beat it by a SLIGHT margin. So they beat the GTX 512 by a little bit, how the hell are they going to beat the G71?

They're not.

I hate to sound like a broken record repeating what everyone else has said, but let's wait till we test these with the Cat 6.2 drivers. I'm assuming these are the ones that will properly recognize a X1900 and not the hacked drivers we've been looking at. I'll wait for the "real" benchmarks to hit before making a final judgement on the real performance of these video cards.

There are many rumors on the G71 and not enough concrete information. We've heard fantastic claims, such as a 430MHz G71 being the equivalent of a 550MHz G70, and that the G71 will be clocked at 700MHz. That would make the G71 roughly 40% more powerful than the G70... I find that a bit hard to believe. Maybe in a few limited instances, but in general doubtful, especially considering that this is a refresh product.

I'm also curious as to whether the 7900s will have full HD video decode acceleration (WMV9, H.264, etc.) and whether they have any HD video encode acceleration. This is one area that will help me evaluate what's right for me. At least ATI has promised that their video cards will accelerate video encoding as well as decoding, even if they are late on that promise.

For the prices these video cards now cost, I want more than just more eye candy in the latest games. I want value-added features that help me with other things, such as video encoding/decoding and physics.

There was nothing in the last iteration of video cards (7800, X1800) that made me want to buy them. If ATI or nVidia wants my money, they better give me a darned good video card for my money.

My current plan is to upgrade to whichever is best, the 7900 or the X1900, then get 2GB of RAM as well as an X2 processor. That pretty much takes care of me for the next couple of years as far as computer upgrades go. Current CPU roadmaps don't show me any reason to upgrade in 2006 or 2007 past a dual-core processor and 2GB of RAM. And I've never been one to be pressured into buying a video card every time they release one, so I don't believe I'll be getting one until the refresh of the R600 and G80 comes out. That would probably be a good time to consider upgrading to Windows Vista as well, since SP1 should be out with all the bug fixes by then.
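Taking those rumored numbers at face value, the implied gap actually works out larger than 40% against the GTX 512. A quick sketch; every input here is rumor from the post above, not a confirmed spec:

```python
# Quick sketch combining the rumored G71 claims.
# Every input here is rumor, not a confirmed spec.

g71_base = 430.0   # rumored: a 430MHz G71 matches a 550MHz G70
g70_equiv = 550.0
per_clock_gain = g70_equiv / g71_base        # ~1.28x per clock

g71_ship = 700.0   # rumored shipping clock
gtx512 = 550.0     # GeForce 7800 GTX 512 core clock

implied = (g71_ship * per_clock_gain) / gtx512
print(f"implied speedup over GTX 512: {implied:.2f}x")  # -> 1.63x
```

So the rumors, combined, imply roughly 60% over the GTX 512, which makes them look even more optimistic than the "40%" figure being passed around.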
 
Wow, thanks for posting these, because it's all that's out there right now, but these are the worst reviews I've ever seen.
 
Originally posted by: GTaudiophile
When will the NDA be lifted?

In an hour or so, when people get to work in the USA. I am watching intently to see which review site gets up and goes to work first.
 
The general consensus of the reviews so far seems to be: not that great now, but as more shader-heavy games come out, it'll show itself to be a real beast.
 
The X1900 slaughters the 7800 in FEAR; look at [H]'s review. CrossFire drivers are hurting the X1900 because [H] can't enable some quality settings, but they did get both HDR and AA working in Serious Sam 2. "HardOCP Must Have Hardware"
 
As others have said, we really need to see it with at least vaguely mature drivers, not the hacked/premature drivers so far. HardOCP's numbers seem rather strange at times, with the X1900XT scoring far too close to the X1800XT - but if you check their test setup, they're using (hacked?) beta drivers.
 
I don't trust HardOCP much because of their "highest playable settings". However, Guru3D is respectable enough, IMO, and shows some nice scores in Q4: 1920x1080 with 6xAA/16xAF and STILL 61fps! Even 1 fps more in COD2...

IMHO, this is the card to buy right now... it delivers one helluva performance. It's certainly not a bad refresh, and with better drivers and optimizations it can only get better. There is also no doubt in my mind that the G71 will more than likely beat this card again, even with optimized drivers. But it's definitely not a bad card (available, and at a good price).

I figure they learned from their mistakes with the R520, so at least that drama wasn't a complete disaster 🙂

I know everybody's waiting for Anand's review 🙂 but I'll take this one for now.

edit: 1600x1200 with 6xAA/16xAF and still 97 FPS in HL2! Woohoo! I was seriously getting ticked off with the framerate of my Ti 4600 in CS:S. Guess that's over ^^
 
Looks okay to me, though its best feature will hopefully be decreased prices on previous-gen hardware... Here's hoping nVidia's new part is competitive with (or a little faster than!) this, forcing prices down more! Everyone wins!
 