A while ago, nvidia released a video talking about how they messed up bad with the 5800 Ultra. It is really funny, but I can't find it. If any of you guys know where I can get it, I would really appreciate it if you could give me a link.

nVidia said the 5800 Ultra was a mistake, but there was no video of them talking about it that I remember.
Originally posted by: Evdawg
kinda sorta off topic... but didn't the 5800 get better after new drivers?
Originally posted by: SneakyStuff
it still has poor ps2.0 shader performance

Wow, that's tragic. My... oh yeah... I don't have any PS2.0 games, because there are only a couple out that no one cares about.
Originally posted by: SneakyStuff
the 5700u is actually QUITE similar to the 5800u, in a lot of ways, 128 bit, DDRII, architecture.

Of course, it only has half the fillrate due to half the pipelines, so it's really not very similar at all.
Originally posted by: SneakyStuff
yes, but who cares, it was already outdated by then

You should care Sneaky, because it's WAAAAAAAYYYYYY faster than your 5700Ultra. If you had a 5800Ultra, you'd actually have a pretty fast card.
Originally posted by: SneakyStuff
the 5700u is actually QUITE similar to the 5800u, in a lot of ways, 128 bit, DDRII, architecture.

Well, yes, they are similar in some respects. Both have 3 vertex shader units, so they should perform roughly the same there. Pixel fillrate is also about the same, but because of its 4x2 design the 5800U has more texel fillrate. The 5800U also has more memory bandwidth, courtesy of its 1GHz effective memory clock, so it should handle higher resolutions and AA/AF better. In a lot of ways it's a 5900 with less memory bandwidth.
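If you want to sanity-check that, the back-of-the-envelope math is easy. Here's a quick Python sketch using the commonly quoted specs for the two cards; NV3x pipeline layouts were debated at the time, so treat these as theoretical peaks, not measured numbers:

# Rough theoretical peaks, assuming the commonly quoted specs:
#   5800 Ultra: 500 MHz core, 4 pipes x 2 TMUs, 128-bit DDR2 @ 1 GHz effective
#   5700 Ultra: 475 MHz core, 4 pipes x 1 TMU,  128-bit DDR2 @ ~906 MHz effective

def pixel_fill_mpix(core_mhz, pipes):
    # pixels written per clock, times clock speed
    return core_mhz * pipes

def texel_fill_mtex(core_mhz, pipes, tmus_per_pipe):
    # texture samples per clock, times clock speed
    return core_mhz * pipes * tmus_per_pipe

def bandwidth_gbs(bus_bits, eff_mhz):
    # bytes per transfer, times effective transfer rate
    return (bus_bits / 8) * eff_mhz / 1000

for name, core, pipes, tmus, mem_mhz in [
    ("5800 Ultra", 500, 4, 2, 1000),
    ("5700 Ultra", 475, 4, 1, 906),
]:
    print(f"{name}: {pixel_fill_mpix(core, pipes)} Mpix/s, "
          f"{texel_fill_mtex(core, pipes, tmus)} Mtex/s, "
          f"{bandwidth_gbs(128, mem_mhz):.1f} GB/s")

That works out to roughly 2000 vs 1900 Mpix/s (about the same), 4000 vs 1900 Mtex/s (over double), and 16.0 vs 14.5 GB/s, which is why the 5800U should pull ahead at higher res with AA/AF.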
Originally posted by: Rollo
Check out the "outdated" 5800U stomping feeble "modern" 5700U flat at DX9 Halo
Ouch, I'd rather be outdated than have a 5700U, Sneaky. I could actually play my games at some decent resolutions and AA/AF settings then.

Mistake #1) Your link to THG was broken
#2) You're using a THG benchmark in an argument.
Originally posted by: Rollo
it still has poor ps2.0 shader performance

yes, but who cares, it was already outdated by then

You should care Sneaky, because it's WAAAAAAAYYYYYY faster than your 5700Ultra. If you had a 5800Ultra, you'd actually have a pretty fast card.
Check out the "outdated" 5800U stomping feeble "modern" 5700U flat at DX9 Halo
Ouch, I'd rather be outdated than have a 5700U, Sneaky. I could actually play my games at some decent resolutions and AA/AF settings then.
Mistake #1) Your link to THG was broken
#2) You're using a THG benchmark in an argument.
I can see why that matters to me, as I had a 5800 and have a 9800Pro, but why is that important to you? We're talking about you slamming a card that anyone with any sense would rather own than your feeble 5700U.

The 5800 would have been a revolutionary card if they could have made it cheaper, quieter, and given it a 256-bit memory bus. The Radeon 9800 mopped the floor with it.
it would be nice if those numbers came with some screenshot comparisons so we can see if there is anything funky going on with floating point precision on the 5800 in order to get it to benchmark as well as it does. is it possible that nvidia would do something as dirty and underhanded as that?
Nothing to argue about here. The 5800 Ultra barely beat out the Ti4800, let alone a 9700 Pro. I never said the 5700U was "better" than a 5800U, just cheaper and quieter, and they share MANY common traits.
Originally posted by: SneakyStuff
Whatever, I could seriously care less. I hope the 400 bucks you spent on the 5800u was well worth it....

I have a 5800U I bought slightly used (just a couple months old) and paid the same amount for it as the 5600U flip chip cost at the time; used 9700 Pros at that price were nigh impossible to get, because someone else always got there first. I've been using this card a good while now and it's just gotten better with age, like a fine :wine: Here's what it'll do with the UT2K3 demo bench: A64 @ 2.3GHz, 5800U @ 525/1080, Det. 53.03, 1024x768x32: flyby 284, botmatch 105. Not too shabby, I think. Combined with the A64 it plays Halo, Far Cry, Painkiller, etc. very smoothly, even with AA/AF on in some of the games. I usually use 4xAA/AF, but higher too, it just depends on the game. My LCD's native res is 10x7, so that is the res I tend to stick with.
Originally posted by: Rollo
Snowman:
it would be nice if those numbers came with some screenshot comparisons so we can see if there is anything funky going on with floating point precision on the 5800 in order to get it to benchmark as well as it does. is it possible that nvidia would do something as dirty and underhanded as that?
This isn't much of an argument, Snowman. Unless you have links to a review site saying the 5800s run at lower precision than 5700s, and why they think so, your argument is baseless speculation. It would be just as accurate for me to say, "ATI has been caught cheating on benchmarks before. Therefore, they probably cheated on the Shader Day HL2 benchmarks."
Originally posted by: SneakyStuff
Whatever, I could seriously care less. I hope the 400 bucks you spent on the 5800u was well worth it....

LOL, I point out all the incredibly off-base information in Sneaky's post, so he says "Whatever, I don't care," like an urban youth on a daytime talk show.
tests have shown that the 5800 falls behind the 5700 in many floating point shader operations. i know you don't want to hear it, but running the ps2.0 shaders at fx12 precision instead of full floating point would go a long way toward explaining why it benchmarks as well as it does.
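For reference, FX12 is usually described as a 12-bit fixed-point format (s1.10: a range of roughly [-2, 2) with 10 fractional bits), while PS2.0 calls for floating point. Here's a toy Python sketch, purely illustrative and not the actual driver behavior, of what forcing intermediate results through a format like that does to them:

# Toy illustration of fixed-point vs. float shader math. FX12 is usually
# described as s1.10 fixed point: ~[-2, 2) range, 10 fractional bits.
# This is NOT the actual driver behavior, just a sketch of the idea.
FRAC_BITS = 10
STEP = 1.0 / (1 << FRAC_BITS)          # smallest representable increment
LO, HI = -2.0, 2.0 - STEP              # representable range

def to_fx12(x: float) -> float:
    # round to the nearest representable value, clamped to the range
    return max(LO, min(HI, round(x / STEP) * STEP))

# A made-up "shader" op: repeated multiply-accumulate, once in full float,
# once with every intermediate result forced through the fixed-point format.
val_fp = val_fx = 0.1
for _ in range(8):
    val_fp = val_fp * 1.9 + 0.003
    val_fx = to_fx12(val_fx * 1.9 + 0.003)

print(f"full float : {val_fp:.4f}")    # keeps growing past 2.0
print(f"fx12-style : {val_fx:.4f}")    # pinned at the top of the range

The float version keeps growing while the fixed-point version pins at the top of its range and rounds everything to 1/1024 steps, which is exactly the sort of thing screenshot comparisons would catch.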
