Originally posted by: DfiDude
So the 6800 Ultra will run better than the X850XT next year (PCI Express)?
If you're replying to me, note the "SLI" in my post. Yes, 6800U SLI will run everything much better than an X850XT PE this year, and next.
Originally posted by: Rollo
LOL - Sweeney has always preferred nVidia, because they have always had more advanced GPUs.
Except for the "glory year", when 24-bit was considered "full precision".
Originally posted by: KruptosAngelos
Sweeney is an idiot. In an interview a while back, he said "most" gamers use Nvidia, so they are coding the U3 engine around that thought. I guess he missed the fact that ATI ships 50% more units per quarter than Nvidia... If he can't see that ATI has had the best video card for the last 2 gens and controls more of the market, I am afraid to see this engine no matter how good it may look in tech demos.
Originally posted by: Ackmed
Originally posted by: Rollo
LOL - Sweeney has always preferred nVidia, because they have always had more advanced GPUs.
Except for the "glory year", when 24-bit was considered "full precision".
You were wrong at Hard, and you're still wrong here.
PS 1.4 and 1.1. Which card had which, the 8500 or the GF3/4? Just one example of NV not having the more advanced card.
Originally posted by: BenSkywalker
PS 1.4 and 1.1. Which card had which, the 8500 or the GF3/4? Just one example of NV not having the more advanced card.
Fire up Splinter Cell with that 8500 next to a GF4, set both to the highest quality shadow setting, and see what you get. Hell, use an X850XT PE for that one next to a GF3 Ti200!
Originally posted by: Ackmed
You are not right. Now you've backtracked and say "for the most part". First it was "always", now it's down to "the most part", which is more accurate.
You can't argue the driver problem when comparing it to the GF4. By then the 8500 drivers were ironed out, and it was a great card. At launch, yes, it sucked and features didn't work properly, but a few months afterwards it was working great.
Perhaps you'll learn to stop making blanket statements.
Originally posted by: Ackmed
Wow, still can't admit you were wrong.
FACT: NV doesn't always have the "more advanced GPU". More often than not? Sure, but not always. You can either admit that's right, or continue with your ignorance. Either way, I'm done arguing it.
I'm in the mood for quibbling, so don't take this too personally. I don't see FP32 as any more innovative than FP24, FP16, or even FX16. GPUs will get more precision when manufacturing makes it economically feasible. I do believe that 3dfx was slated to be the first with higher precision, though, with either Rampage or Spectre (I'm beginning to forget the 3dfx fable, sorry).
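Since we're quibbling about precision anyway, here is a minimal sketch (Python, and the mantissa widths are my assumption based on the commonly cited layouts: FP16 = s10e5, ATI's FP24 = s16e7, FP32 = s23e8) of what each format's rounding step near 1.0 actually looks like:

# Minimal sketch: rounding step ("machine epsilon") of the shader formats
# being argued about. The mantissa widths below are the commonly cited
# layouts, assumed for illustration.
formats = {
    "FP16 (s10e5)": 10,   # 10 mantissa bits
    "FP24 (s16e7)": 16,   # 16 mantissa bits (DX9 "full precision" minimum)
    "FP32 (s23e8)": 23,   # 23 mantissa bits (IEEE single)
}

for name, mantissa_bits in formats.items():
    eps = 2.0 ** -mantissa_bits  # gap between 1.0 and the next representable value
    print(f"{name}: step near 1.0 ~ {eps:.2e}")

That prints roughly 9.8e-04 for FP16, 1.5e-05 for FP24, and 1.2e-07 for FP32, which is the whole "full precision" argument in three numbers.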
Originally posted by: Insomniak
My personal view is that I've seen more innovation on Nvidia's end - they were the first to introduce a dedicated hardware T&L platform, the first to do 32-bit precision (ATi still doesn't have it), the first to offer a highly programmable architecture with NV30 (despite it being a performance stinker), the first to introduce a developer relations program (Get in the Game followed TWIMTBP), and they came back to SLI before ATI followed with multi-rendering...
To me, Nvidia takes more risks, which I think is what we need in games and gaming hardware right now. We still need to flesh out those features and keep steaming toward photorealism, and I see Nvidia go out on a limb regularly to try to accomplish that - sometimes they succeed, sometimes they don't, but the point is they're willing to give it a go, which is something I rarely see ATi do first.
What does that matter?
The fact is, NV doesn't always have more advanced hardware.
There are still certain things that the NV20 can do in hardware that the X850XT PE can't.
Just out of curiosity, what features does the old NV20 have over the X850XT?
To me, it sounds like some BS features that never have and never will come into use.
