gauravsharma311
Note to readers: if you put the details on full whack, even on an FX 5200, you can basically get the same "special PS 3.0-only" image quality as in the 2nd screenshot in that link. Try it.
Originally posted by: NFS4
You guys really need to READ before you start beatin' your chests.
Originally posted by: Alkali
(a previous reply earlier in the thread)
The first screenshot is Pixel shader v1.1, and the second is pixel shader version 3.0 as confirmed by nVidia themselves after the presentation.
Originally posted by: gauravsharma311
Note to readers: if you put the details on full whack, even on an FX 5200, you can basically get the same "special PS 3.0-only" image quality as in the 2nd screenshot in that link. Try it.
Originally posted by: gauravsharma311
AIWGuru,
I agree about the general capabilities of PS 3.0 as you've described later,
but what I'm trying to make clear is that the picture quality obtainable (at decent rendering rates), including the actual visual impact when you see it (e.g. the first time you see HL2 graphics), is NOT much different at all between PS 2.0 and PS 3.0. Even PS 1.x and PS 2.0 didn't have absolutely major differences (maybe slightly noticeable in HL2, not much), and with PS 3.0 the visual leap isn't huge either; you won't get instantly noticeable differences. Far Cry's programmers (if they decided to spend the time) could brush up PS 2.0 to generate PS 3.0-quality graphics; it would just be more of a mission. That's been my point from the start.
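To make the "more of a mission" point concrete, here is a rough, hypothetical C analogy (not real shader code, and nothing from Far Cry): the same lighting sum computed once with a runtime loop, the way a PS 3.0-style shader could, and once with hand-unrolled fixed variants selected outside the routine, the way PS 2.0-style code without dynamic loops has to be authored. The output is identical; only the authoring effort differs. All function names here are made up for the sketch.

#include <stdio.h>

#define MAX_LIGHTS 4

/* "PS 3.0 style": one routine, the light count is decided at run time. */
static float shade_dynamic(const float *intensity, int num_lights)
{
    float sum = 0.0f;
    for (int i = 0; i < num_lights; ++i)
        sum += intensity[i];
    return sum;
}

/* "PS 2.0 style": no dynamic loop, so each light count gets its own
 * hand-unrolled variant, and the right one is picked outside the routine. */
static float shade_one(const float *in)   { return in[0]; }
static float shade_two(const float *in)   { return in[0] + in[1]; }
static float shade_three(const float *in) { return in[0] + in[1] + in[2]; }

int main(void)
{
    float lights[MAX_LIGHTS] = { 0.4f, 0.3f, 0.2f, 0.1f };
    int   n = 3;

    float a = shade_dynamic(lights, n);
    float b = (n == 1) ? shade_one(lights)
            : (n == 2) ? shade_two(lights)
            :            shade_three(lights);

    /* Both print 0.90: same result, different amount of authoring work. */
    printf("dynamic = %.2f, unrolled = %.2f\n", a, b);
    return 0;
}

Scaling that second approach to many combinations of lights and materials is exactly the kind of shader-permutation busywork being argued about here.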
Originally posted by: XBoxLPU
Originally posted by: NFS4
You guys really need to READ before you start beatin' your chests.
Originally posted by: Alkali
(a previous reply earlier in the thread)
The first screenshot is Pixel shader v1.1, and the second is pixel shader version 3.0 as confirmed by nVidia themselves after the presentation.
Now who needs to read?
Originally posted by: Alkali
A quote from The Inquirer... I don't know how reliable the specs below are, but it's a shock to me; I thought the X800 Pro was clocked a bit higher...
ATI to launch X800 PRO on 4th of May - X800XT to follow in two weeks
By Fuad Abazovic: Friday 16 April 2004, 12:56
EVEN THOUGH ATI doesn't want us in Canada without signing one of its famous NDAs, we always have our ways of getting information about its future plans. Back at CeBIT we heard that the new chip would be called X800 but simply could not confirm it at that time, and we also heard that this chip would be introduced at the System Builder Summit.
The date that is apparently being suggested is the 5th of May, and it could be that ATI will launch this chip during the SBS event.
Sources close to ATI claim that it can actually ship this card on the same day as the launch, and we know that there are quite a few samples ready as we speak.
As for the specs, the R420 PRO, or should we call it X800 PRO, will be clocked at 475 MHz, 25 MHz higher than we suggested since yields are better, while the memory will be clocked at 900MHz as we suggested before.
Everything is packed into 180 million transistors and the card has 12 pipelines, as we suggested. The R420XT, Radeon X800XT, will be a higher-clocked card with 16 pipelines, which came as a huge surprise to us.
This card is meant to fight the GeForce 6800 non-Ultra, the one with 12 pipelines, and should be in retail as soon as ATI launches it.
This means that you should be able to buy one on the 4th of May, but let's wait and see if ATI can deliver on that day.
The faster Radeon X800XT will come just after E3, soon after the 14th of May, we are told. I cannot remember if I said this before, but the R420 is a chip based on the R300 marchitecture, and you can clearly see this from its 96-bit precision support and lack of PS 3.0.
Performance-wise it will give the NV40 a good run for its money.
Originally posted by: gauravsharma311
AIWGuru,
I agree about the general capabilities of PS 3.0 as you've described later,
but what I'm trying to make clear is that the picture quality obtainable (at decent rendering rates), including the actual visual impact when you see it (e.g. the first time you see HL2 graphics), is NOT much different at all between PS 2.0 and PS 3.0. Even PS 1.x and PS 2.0 didn't have absolutely major differences (maybe slightly noticeable in HL2, not much), and with PS 3.0 the visual leap isn't huge either; you won't get instantly noticeable differences. Far Cry's programmers (if they decided to spend the time) could brush up PS 2.0 to generate PS 3.0-quality graphics; it would just be more of a mission. That's been my point from the start.
Originally posted by: UlricT
As I see it, PS 2.0 code can be optimised to run more efficiently on PS 3.0 hardware due to the branching & arbitrarily long instructions. This does NOT add functionality, but programmability. Therefore "better graphics" would not be the words to use here... it would be more efficient on the hardware to use PS 3.0, leading to longer shader routines by coders! Nothing else...
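A rough, hypothetical C analogy of that efficiency point (not real shader or driver code; expensive_lighting is a made-up stand-in for a long shader routine): with a PS 3.0-style dynamic branch, a pixel whose shadow mask is zero can skip the expensive path entirely, while PS 2.0-style branch-free code evaluates it for every pixel and then masks the result away. The final value is the same either way; only the work per pixel changes.

#include <math.h>
#include <stdio.h>

/* Stand-in for a long, costly shader routine. */
static float expensive_lighting(float x)
{
    float acc = 0.0f;
    for (int i = 1; i <= 64; ++i)
        acc += sinf(x * (float)i) / (float)i;
    return acc;
}

/* "PS 3.0 style": a real branch skips the work when the mask is zero. */
static float shade_branching(float x, float shadow_mask)
{
    if (shadow_mask == 0.0f)
        return 0.0f;                       /* expensive path never runs */
    return shadow_mask * expensive_lighting(x);
}

/* "PS 2.0 style": always pay for the expensive path, then mask the result. */
static float shade_branch_free(float x, float shadow_mask)
{
    return shadow_mask * expensive_lighting(x);
}

int main(void)
{
    float x = 0.5f, mask = 0.0f;           /* a pixel that is fully shadowed */

    /* Both print 0.000: identical picture, very different per-pixel cost. */
    printf("branching: %.3f  branch-free: %.3f\n",
           shade_branching(x, mask), shade_branch_free(x, mask));
    return 0;
}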
Originally posted by: UlricT
it would be more efficient on the hardware to use PS 3.0, leading to longer shader routines by coders! Nothing else...
I have been of this opinion since the nV30 debacle and I still say it: even the X800 XT will struggle against the 6800 Ultra as nVidia optimizes its drivers... probably ~+20% for this new core.
Originally posted by: PorBleemo
It's starting to look to me that nVidia might win this round.
-Por
Originally posted by: AIWGuru
The 12-pipe, slower-clocked chip will be released first, and the 16-pipe, faster chip later. That's probably why we're seeing conflicting information.
Originally posted by: NFS4
Originally posted by: XBoxLPU
Originally posted by: NFS4
You guys really need to READ before you start beatin' your chests.
Originally posted by: Alkali
(a previous reply earlier in the thread)
The first screenshot is Pixel shader v1.1, and the second is pixel shader version 3.0 as confirmed by nVidia themselves after the presentation.
Now who needs to read?
What's your point?
He was talking about PS 3.0 quality being superior to PS 2.0, then he provides a comparison involving PS 1.1 and PS 3.0. WTH does that have to do with PS 2.0?