Originally posted by: RussianSensation
And this is ATI's Official Girl -- look at the rendering on this one, lighting is spectacular....wow ATI X800pro truly has achieved realism beyond belief.
Wow, I really gotta get one of those cards now.
Originally posted by: Bucksnort
Well I think if the ATI has no outrageous PS requirements and performance is anywhere near the NV then we can assume NV screwed up again. Who's going to go out and buy a PS just for a video card when you can get an ATI?
And that's a pretty BIG "if".
WISE BIRDS tell us here in Vienna that ATI and at least some of their knowledgeable partners don't feel so bad about NV40 Ultra. Even though the NV40 Ultra, GeForce 6800 Ultra, is a very fast card that outperforms everything else on the planet for the time being, the ATI next-generation chip might end up even faster.
The canaries are singing that a 12-pipeline, 475MHz/950MHz card with 96-bit precision and the PS 2.0 shader model only will end up faster than a 16-pipeline, 400MHz/1100MHz card with 128-bit precision and the PS 3.0 shader model.
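For what it's worth, the raw rumoured numbers don't obviously favour ATI here: estimating theoretical peak fill rate as pipelines x core clock (which ignores per-clock efficiency, memory bandwidth and shader throughput), the 12-pipe part is behind on paper, so any win would have to come from elsewhere. A quick back-of-the-envelope sketch using the rumoured figures above:

/* Back-of-the-envelope fill-rate comparison using the rumoured
 * specs from this thread (not confirmed figures). Peak fill rate
 * is estimated here as pipelines x core clock. */
#include <stdio.h>

int main(void) {
    double ati  = 12 * 475e6;  /* rumoured ATI part */
    double nv40 = 16 * 400e6;  /* GeForce 6800 Ultra */
    printf("ATI  12 x 475MHz = %.1f Gpix/s\n", ati / 1e9);   /* 5.7 */
    printf("NV40 16 x 400MHz = %.1f Gpix/s\n", nv40 / 1e9);  /* 6.4 */
    return 0;
}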
THERE'S ONLY one thing about the new ATI marchitecture that we haven't reported on yet. ATI calls it 3Dc, and it's a new way of compression that this chip will include.
I guess ATI didn't want us in Canada because it didn't want us to learn about its new chips and their capabilities too soon.
The last piece of the puzzle is this 3Dc compression, which ATI is willing to supply as an open standard, and this marchitecture looks very impressive, at least on paper.
3Dc will be able to deliver a similar level of quality to normal, uncompressed textures, we learn, but it won't be able to compress all data. It might lose information about light and shadow.
It will certainly speed things up, and the R300 marchitecture tweaked up to R420 runs things faster with it than without. ATI claimed 8.8 to 1 compression for R300, if I remember correctly, and it was always rather good with these optimisations. However, I'd urge caution about those claims, as the R300 could compress data at an 8.8 to 1 ratio only if you were talking about a black screen with information about only the black shade.
It's another piece of BlueCrystalKit to make your 950/1000MHz memory look better, that's for sure.
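To make the "can't compress everything" point concrete: 3Dc, as ATI later documented it, is aimed at normal maps. It stores only the X and Y components of each unit normal and has the pixel shader rebuild Z from the unit-length constraint, which is where the bandwidth saving comes from and also why it isn't a general-purpose lossless codec. A minimal sketch of the reconstruction step, with plain C standing in for shader code and the block-compression details of the format omitted:

/* Sketch of the Z-reconstruction trick behind two-channel
 * normal-map compression such as 3Dc: only X and Y are stored,
 * and Z is rebuilt from x^2 + y^2 + z^2 = 1. */
#include <math.h>
#include <stdio.h>

/* Rebuild a unit normal from its stored X and Y components. */
static void reconstruct_normal(float x, float y, float n[3]) {
    float zz = 1.0f - x * x - y * y;
    n[0] = x;
    n[1] = y;
    n[2] = zz > 0.0f ? sqrtf(zz) : 0.0f; /* clamp against rounding error */
}

int main(void) {
    float n[3];
    reconstruct_normal(0.6f, 0.0f, n);   /* expect z = 0.8 */
    printf("n = (%g, %g, %g)\n", n[0], n[1], n[2]);
    return 0;
}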
Originally posted by: SilentRunning
Originally posted by: apoppin
Originally posted by: Bucksnort
Well I think if the ATI has no outrageous PS requirements and performance is anywhere near the NV then we can assume NV screwed up again. Who's going to go out and buy a PS just for a video card when you can get an ATI?
And that's a pretty BIG "if".
:roll:
Oh, yeah . . . I DID (coincidentally) upgrade (yesterday) my PS to 480W . . .
. . . just in case.
:roll:
This just in........All next generation graphics cards will require a 480.5W power supply.
:laugh:
Read my post above yours . . . the X800 Pro will be faster than the NV40 Ultra.
Originally posted by: GTaudiophile
Futuremark spills a few beans!
How are any beans being spilt by that?
Originally posted by: ViRGE
Originally posted by: GTaudiophile
Futuremark spills a few beans!
How are any beans being spilt by that?
ATI DECIDED TO INCREASE the frequency of its soon-to-be-released X800XT cards, which will be presented in just a few days, on May the fourth.
The card is now clocked at 525MHz. I am sure that ATI will claim that its yields are good and that's why it's done it, but actually it learned Nvidia's numbers from online reviews and decided to increase the frequency a little bit more just to match the GeForce 6800 Ultra threat.
Knowledgeable friends tell us that Nvidia wins 3DMark 2003 but loses in most of the other game tests, where ATI comes out ahead. Still, bear in mind that ATI is only marginally faster when it's faster, and we are not talking about quantum leaps here. Whoever wins in the game tests, whether it's Nvidia or ATI, wins by a small margin.
I am sure that ATI will promote how games are important now that Nvidia, ironically, not to say Sardiniacally, has embraced 3DMark03 as its most adorable friend.
ATI played a nasty game from the start, as it wanted to see how NV40 looks and feels before it decided to release the new card that will top the Radeon 9800XT, still the fastest card currently shipping. I guess that the Canadians knew all along how fast they could clock the card to match GeForce 6800 Ultra performance, and that's exactly what they did.
Isn't it an irony that the Nvidia card is clocked lower and the ATI card clocked faster? We all know that in the past the Radeon was at least matching Nvidia's performance at much lower speeds, if we are talking about the high-end market of course.
Nvidia still has some faster NV40 Ultra chips, at least 2,000 we heard, and they can be clocked up to 475MHz. I don't know for sure, but I would suspect that the NV45, Nvidia's PCI Express card, will end up with those chips. Otherwise we might end up with a GeForce 6800 Ultra 2, or a simple scenario where the reference cards end up lower clocked than the retail ones. That has never happened before, but there is a first time for everything.
We are sure that ATI cannot go much further than 525MHz, as 600MHz is the theoretical limit of 0.13µm marchitecture.
THE OTHER day we told you about the dual power connectors on the GeForce 6800 series cards, and then I went on to say that ATI would probably do the same. I may have been off here, mainly because I am told the new X800Pro and X800XT will only consume a maximum of 80W. This would put them on the cutting edge of a single molex connector in terms of power usage.
So, how can it pull this trick off? Easy: it will be on a .11µ process rather than the current .13µ, and that should take power down a notch. The only question is what that will do to yields; it seems everyone who ventures under .13µ has the proverbial problems.
Either way, this is really good news for gamers. It means these things will fit, almost comfortably, in an SFF case and not overtax the power supply. Happy day. The only question remaining is black.
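For a rough feel of why 80W sits right at the edge of a single molex: a 4-pin molex carries one 12V and one 5V line (power drawn through the AGP slot itself is ignored here). The per-line current limit below is an assumed figure for illustration, not something from the article:

/* Rough sanity check of the "80W on one molex" claim.
 * The ~5A comfortable limit per line is an assumption for
 * illustration; connector ratings and PSU wiring vary. */
#include <stdio.h>

int main(void) {
    const double amps = 5.0;                  /* assumed per-line limit */
    double budget = 12.0 * amps + 5.0 * amps; /* 12V + 5V lines = 85W */
    double card = 80.0;                       /* rumoured X800 max draw */
    printf("molex budget ~%.0fW, card %.0fW, headroom %.0fW\n",
           budget, card, budget - card);
    return 0;
}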
Originally posted by: Acanthus
Nalu was a lot more impressive in the water imho.
The hair floating in the water was definitely a first.
Originally posted by: Reliant
Originally posted by: Acanthus
Nalu was a lot more impressive in the water imho.
The hair floating in the water was definitely a first.
I kinda think Nalu isn't as impressive. First, she's in water. No other items in the environment to worry about or render. Second, the physics for the hair could be relaxed in a water setting. Let's wait till we see Ruby in full motion and see how she looks before saying one is better than the other.
Originally posted by: gsellis
Originally posted by: Reliant
Originally posted by: Acanthus
Nalu was a lot more impressive in the water imho.
The hair floating in the water was definitely a first.
I kinda think Nalu isn't as impressive. First, she's in water. No other items in the environment to worry about or render. Second, the physics for the hair could be relaxed in a water setting. Let's wait till we see Ruby in full motion and see how she looks before saying one is better than the other.
Not sure that would be true. With the water effect, you have the constant lighting source changes... That changes all the shades in the surrounding water too. It is definitely difficult.
Originally posted by: GTaudiophile
People at Rage3D were complaining about Ruby's lack of a backbone...
Of course, the big question is, where's the nude patch?