SneakyStuff
Diamond Member
- Jan 13, 2004
So Rollo, when can we expect your next batch of Doom3 benchmarks as evidence of the 5800's superiority?

They may be far too slow Ben, but they are only HALF as slow as 5900s! THAT is the difference and superiority my friend! Would you rather ride a tortoise, or a cow, across the country?!?!?! A cow is TWICE as fast!
Rollo likes to inflate the 5800U at any opportunity and deflate ATi, often using exactly the same reasons to do both at the same time. His completely illogical and irrational behaviour is as predictable as the rising sun that comes up each morning.

You're acting too childish as well.
It's nice you think so, but since you're using a crippled, low power 9600, I'll take it with a grain of salt. Good choice there- believe the Shady Days hype, did we?

rollo, the 5800 Ultra IS a joke.
Really? Did you ever think the cost of the 12 layer PCBs, low yield chips that run at 500MHz, 1.8ns DDR2, and big dual heatpipe copper hsf fan might have something to do with it? Kind of like when ATI quit making a good, but costly card- the 9500Pro- and started making a cheap, at best "OK" card- the 9600Pro.

But NOT because of how fast it is. If a card's worth was solely dependent on how fast a card is, then nvidia would still make them.
Let's see:

The 5800 Ultra is a joke for the following reasons: it is very loud, at the time it was MORE expensive than the 9700 Pro, it came out 6 months late, and it couldn't live up to the hype.
Ever hear of advertising?

Yes it was fast, but not "Ultra Fast" like nvidia said it would be. The 9700 Pro set the "standard" for high-performance cards, and the 5800U was not in a "class" of its own performance-wise like nvidia claimed it to be.
I see. And resurrecting a dying thread to slam my video card is "mature"?

You're acting too childish as well.
Well, it is kind of like the guy driving a beat-up Hyundai saying "I've heard those Mercedes rust in 20,000 miles!" or the guy whose girlfriend looks like JarJar Binks saying, "They all look the same in the dark anyway".

Some people don't NEED such fast and expensive cards, rollo. You're saying someone with a slower card has no right to say that your faster one is slow?
When their opinions are wrong, and they have no basis in fact, I'll say it. It's a public forum; debate is expected. I won't shut up just because you don't like what I have to say.

Just WHO are you to say that other people's opinions don't matter?
Since you said it, and it's not true, why don't you post some quotes to back it up? I've never said the 5800 Ultra is anything but an interesting and rare piece of hardware I wanted to try on a whim. I've always said it's approximately as good as a 9700Pro, but not as good as a 9800/5900. If you want to tell lies about me, be prepared to back them.

They may not matter to you, as you are stuck in your own little world where everything you've got is the best.
LOL as the owner of a 9600Pro, you don't even know what the kind of speed the 5800 Ultra has feels like. Yet, here you are, calling it "slow" while you hobble along with your crippled card.

People's opinions do matter, to me at least. They can say your 5800U is slow because it is slower than a 9800XT or 5950 Ultra. They don't have to have it to know how fast it is.
As soon as the game is released. If it's in the box with a 6800, that will be very soon. Of course, as I don't think the 5800U is superior to anything with 9700/9800/5900 in its name, I'll probably just post "5800U runs Doom3 pretty good!"

Originally posted by: BFG10K
So Rollo, when can we expect your next batch of Doom3 benchmarks as evidence of the 5800's superiority?

They may be far too slow Ben, but they are only HALF as slow as 5900s! THAT is the difference and superiority my friend! Would you rather ride a tortoise, or a cow, across the country?!?!?! A cow is TWICE as fast!
Rollo likes to inflate the 5800U at any opportunity and deflate ATi, often using exactly the same reasons to do both at the same time. His completely illogical and irrational behaviour is as predictable as the rising sun that comes up each morning.

You're acting too childish as well.
The lack of the game's release hasn't stopped you from using it as evidence thus far, even while you were denouncing existing PS 2.0 as invalid.

As soon as the game is released.
Don't change the issue.

Well, it's nice you think so, but there are more than a couple people who think your whole "It only suxorz half as much, so it's superior!" argument is a little nuts too.
You know, for some people without a large disposable income, counting pennies is exactly how one can afford the nice toys one has. As the saying goes, "watch the pennies and the pounds mind themselves". I'm not sure, but I think the same principle applies even with American currency.

Originally posted by: Rollo
More expensive- who cares? You don't buy cards like this if you're counting pennies.
I see. And resurrecting a dying thread to slam my video card is "mature"?
Originally posted by: SneakyStuff
Well, it's funny how we have drifted so far off course, thanks to Rollo. Back to the ORIGINAL TOPIC: what are your thoughts on the r420? What's all this I hear about a second revision 2 months after its release?
Originally posted by: Rollo
But seriously, the 5800 is uhhh a joke compared to similarly priced 9700 and 9800s.
I've owned and used all three, Zephyr. Have you?
Since you don't seem to know much about video cards, let me help you:
5800U faster than 9800Pro 256MB at Doom3
5800U as fast as 9700Pro at UT2003, at the 2 usable resolutions
5800U faster at Q3 than 9700Pro
5800U faster than 9700 Pro 2/3 resolutions at Jedi Knight 2
5800U basically tied with 9700Pro at Comanche
So ...uhhhhhh...Zephyr...uhhhhhh......why ...does...uhhhhhh....Anand's testing.....uhhhhh......seem to say the 5800Ultra is equal to the 9700Pro .....and....uhhh....you say...it's..uhhhh....seriously....uhhh...a joke?
Damnit. Here's one of those damn conundrums college never prepped me for:
On one hand I've got the "most knowledgeable" trio of Zephyr, Sneaky Stuff, and Shminu saying my 5800Ultra is a "joke". They have probably never seen one.
On the other hand, I've got this Anand Lal Shimpi guy who runs this site showing it's about equal to its competition, the 9700 Pro.
On top of all this, I've got my own long-term experience with all three cards, which seems to agree with that guy Anand: the 5800U is about the same as my 9700Pro was, and a bit less than my 9800Pro was.
I don't know if we'll ever be able to know the answer to this mystery: who to believe, Anand and personal experience, or the 3 "wise" men.
Perhaps there is only one conclusion that can be drawn:
My 5800U has over double the pixel and texel fillrate, more memory bandwidth, and much higher performance than Sneaky Stuff's 5700U, and cost me about the same as his 5700U cost. Yet Sneaky for some reason thinks the far superior 5800U is no good.
Fear not Sneaky. My four-year-old will get the 5800U soon, and I'll buy an NV40 or R420; then you can respect my video card again.
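For reference, here's a minimal sketch of how peak fillrate and bandwidth figures like those are usually worked out. The clocks, pipeline layouts, and memory speeds below are the commonly cited retail specs for the two cards, assumed here purely for illustration rather than taken from anything posted in this thread:

```python
def pixel_fillrate_mpix(pipelines, core_mhz):
    # Peak pixel fillrate in Mpixels/s: pixel pipelines x core clock (MHz)
    return pipelines * core_mhz

def texel_fillrate_mtex(pipelines, tmus_per_pipe, core_mhz):
    # Peak texel fillrate in Mtexels/s: pipelines x TMUs per pipe x core clock (MHz)
    return pipelines * tmus_per_pipe * core_mhz

def bandwidth_gb_s(bus_width_bits, effective_mem_mhz):
    # Peak memory bandwidth in GB/s: (bus width in bytes) x effective memory clock (MHz) / 1000
    return bus_width_bits / 8 * effective_mem_mhz / 1000

# GeForce FX 5800 Ultra (NV30): assumed 500 MHz core, 4 pipes x 2 TMUs,
# 128-bit DDR2 at roughly 1000 MHz effective.
print("5800U:",
      pixel_fillrate_mpix(4, 500), "Mpix/s,",
      texel_fillrate_mtex(4, 2, 500), "Mtex/s,",
      bandwidth_gb_s(128, 1000), "GB/s")

# GeForce FX 5700 Ultra (NV36): assumed 475 MHz core, 4 pipes x 1 TMU,
# 128-bit DDR2 at roughly 900 MHz effective.
print("5700U:",
      pixel_fillrate_mpix(4, 475), "Mpix/s,",
      texel_fillrate_mtex(4, 1, 475), "Mtex/s,",
      bandwidth_gb_s(128, 900), "GB/s")
```

On those assumed numbers the texel fillrate roughly doubles (about 4000 vs 1900 Mtex/s), while the memory bandwidth gap is closer to 16 vs 14.4 GB/s.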
Originally posted by: SneakyStuff
Well, it's funny how we have drifted so far off course, thanks to Rollo. Back to the ORIGINAL TOPIC: what are your thoughts on the r420? What's all this I hear about a second revision 2 months after its release?
Originally posted by: RussianSensation
Isn't R420 going to ship with Half-Life 2? I am not sure, but I think I did hear that somewhere. If that is the case, the whole point about having Doom 3 with NV40 is not as important, because we know which game is gonna have more depth and not just pretty graphics and dark corridors. Of course I am not confident in R420 shipping with Half-Life 2, but just wanted to make a point.

And is there a rumour that the X800 Pro will have 12 pipes and that on May 31st the X800 XT will come out with 16, thus equalling NVidia's? And who says anything about nVidia being faster without any proof but a pathetic 3DMark03 score... no one said anything about 16 true pipelines either, knowing nvidia.

Finally, even if R420 is based on the R300, we already know R300 was better than anything nvidia had to offer. Thus, ATI always did have a superior GPU. So how can ppl make the assumption that because NV40 has a completely new core it will smoke ATI, when ATI always had better technology? Could it be that Nvidia might, just might, equal or slightly beat ATI this time? I highly doubt the difference will be more than 10% in performance. Of course, considering ATI has better image quality and less performance drop while having those features implemented... faster vertex and pixel shaders... hmm... but Nvidia has more stable drivers... this isn't getting anywhere until we see a review.

Maybe the reason they have a new core is because NV35 did not beat ATI as they had hoped, and since ATI already has good technology on their hands they did not feel they had to make drastic changes until R500 to be competitive. No one said they wanted to be the best... because remaining competitive while the competitor drains its resources spending far more money developing a new core can be beneficial when R500 strikes and makes Nvidia cry... by then they won't have as many resources, since they spent a lot more on NV40 than ATI did on R420... this is all just speculation and maybe even nonsense on my part, but just my 2 cents. Cheers.
I did get a laugh out of that. A possibility, maybe he had never used a computer prior to the introduction of the Radeon 9700?

Originally posted by: keysplayr2003
And as for your very entertaining comment about how ATI has always had better technology? What planet are you from?
Such comments are always ridiculous; companies with as much capital and liquid assets as NVidia do not simply disappear after one bad product launch. It took 3dfx a while to go under after a series of rather large mistakes (in hindsight anyway), and an outstanding product release by their biggest competitor (the GeForce DDR). NVidia and ATI are in an even better place than 3dfx was, since the bulk of their income comes from integrated chipset sales to major OEMs, and they are not relying on the comparatively small gaming market to support their business.

Oh, and ask AMD users if Nvidia's resources will ever be depleted.
Originally posted by: keysplayr2003
Originally posted by: RussianSensation
Isn't R420 going to ship with Half-Life 2? I am not sure, but I think I did hear that somewhere. If that is the case, the whole point about having Doom 3 with NV40 is not as important, because we know which game is gonna have more depth and not just pretty graphics and dark corridors. Of course I am not confident in R420 shipping with Half-Life 2, but just wanted to make a point.

And is there a rumour that the X800 Pro will have 12 pipes and that on May 31st the X800 XT will come out with 16, thus equalling NVidia's? And who says anything about nVidia being faster without any proof but a pathetic 3DMark03 score... no one said anything about 16 true pipelines either, knowing nvidia.

Finally, even if R420 is based on the R300, we already know R300 was better than anything nvidia had to offer. Thus, ATI always did have a superior GPU. So how can ppl make the assumption that because NV40 has a completely new core it will smoke ATI, when ATI always had better technology? Could it be that Nvidia might, just might, equal or slightly beat ATI this time? I highly doubt the difference will be more than 10% in performance. Of course, considering ATI has better image quality and less performance drop while having those features implemented... faster vertex and pixel shaders... hmm... but Nvidia has more stable drivers... this isn't getting anywhere until we see a review.

Maybe the reason they have a new core is because NV35 did not beat ATI as they had hoped, and since ATI already has good technology on their hands they did not feel they had to make drastic changes until R500 to be competitive. No one said they wanted to be the best... because remaining competitive while the competitor drains its resources spending far more money developing a new core can be beneficial when R500 strikes and makes Nvidia cry... by then they won't have as many resources, since they spent a lot more on NV40 than ATI did on R420... this is all just speculation and maybe even nonsense on my part, but just my 2 cents. Cheers.
OMG, you're not gonna cry when the reviews come out if nvidia comes out ahead, are you? I almost feel bad for ya. Almost.
And as for your very entertaining comment about how ATI has always had better technology? What planet are you from?
Nvidia ALWAYS owned ATI from the RIVA128 on up until after the Ti4xxx series. ATI's first REAL contender was the 9700, and it blew everyone away. Good for them. I bought one. Loved the performance, hated the compatibility problems and drivers. Although they have improved. So, I will light a candle for ya on the 31st of May.
Oh, and ask AMD users if Nvidia's resources will ever be depleted.
Note the date on this article reviewing the R9700Pro: August 19th, 2002. ATI has had a long run at the top, but don't fool yourself into thinking that they are going to get complacent and not bother until their competitor comes out with a killer card. If things are as they are shaping up to be, NV40 could take the performance crown by a decent margin (which explains why Anand mentioned the hasty release of the X800XT), and if it does, that has more business ramifications than simply losing a few dozen card sales to end users. ATI will have had far from a two-year reign at the top. Even over the previous two years, ATI only spent about three quarters of that time (three quarters of two years, not three literal fiscal quarters) at the top; April through August 2002 was still owned by the GF4 Ti4600.

Originally posted by: RussianSensation
When I said ATI always had a better GPU, a superior one, that was in comparison over the current era, as in: over the last 2 years they always had the better technology, so why change it when you are already on top? I should have specified, but don't think I am that ignorant. All I am saying is, let the competitor waste time and effort to catch up so you can beat them later when they least expect it.
Actually, it's to appeal to a wider base of consumers. Not everyone likes Coke; many people like orange-flavoured soft drinks, grape-flavoured soft drinks, etc., etc. It's to sell more, not to tie up Pepsi's R&D efforts. If you honestly believe that, I have to laugh.....

Oh yeah, and about resources: I never said they will run out of resources, I said they will use up more of them (as in draining their resources, not depleting them -- read what the words say, not what you want them to say). Listen, the reason Sprite, C-Plus and any other variation of Coke products exist is only to take Pepsi's attention away from the main product, Coca-Cola. That is a way for Coke to make Pepsi focus on these small product lines and make them waste more resources where it doesn't even count.
Not likely; the vast majority of the money either of these companies makes comes from integrated chipset sales -- read what the words say, not what you want them to say.

Similarly, if ATI has a chip that is able to compete with Nvidia's, let Nvidia spend on R&D now and ATI will save theirs for the next round, since they have less cash.
Except they know they will have competition, and you need to stay ahead of the curve. These people are working right now on cores we won't see in use for another year or two; it's not as if they are sitting around bored waiting to see the benchmarks.

It only makes sense for these companies to improve the technology when they are forced to.
No. Would the cards be as powerful as they are today? Not likely. Would they be far more powerful than the GF1? Damn sure they would. Game development houses would not allow graphics technology to stall, and it was too lucrative a market for any of the major graphics players to stay out of it.

If there was no competition you'd still have your GeForce 1.
A wait-and-see strategy has rarely worked in the computer industry, simply because things move pretty fast. I'd say of all the major technological parts of the industry, the graphics industry is one of the fastest-paced of the bunch. ATI and NVidia both stated in the past that they were going to an 18-month product cycle. For ATI, that means this next release should be an entirely new product. They are rumoured to have scrapped the original R400 core and opted to "slightly" redesign the R300 core. NVidia as well should be releasing a new product around now, and it just so happens that they are. Apparently, neither vendor has adhered to their 18-month new-product cycle -- NVidia released the FX series way late, basically at what would have been the 6-month mark of their original product cycle, and ATI appears set to rebadge a redesigned version of R300 as R400. Right now, while ATI and NVidia are preparing to release this coming line of products, they are also both simultaneously working on the upcoming successors to these products (likely the GF6900 and the RX900), and both companies will have another team working on the next core revision (R500/NV50). There is no resting on your laurels in the graphics acceleration business.

So if ATI does not see it as integral to introduce a completely new core, because a slight redesign keeps them competitive, why waste the money?
Originally posted by: keysplayr2003
Originally posted by: SneakyStuff
Well, it's funny how we have drifted so far off course, thanks to Rollo. Back to the ORIGINAL TOPIC: what are your thoughts on the r420? What's all this I hear about a second revision 2 months after its release?
Well, Anand's article seems to hint that ATI's seemingly hasty release of the "X800XT" only 1 month after the introduction of the "X800" is most likely due to ATI crapping its collective shorts after sneaking a peek at NV40's performance. They probably called every ATI technician at 3 in the morning on their red ATI BatPhones and shot the "ATI" logo into the cloudy sky for an emergency acceleration of the X800XT's debut. I can almost hear the "Enterprise Red Alert Sound" echoing through all the ATI facilities. LMAO......
Those last few words were my own of course, but they probably reflect well what Anand was trying to say.
Originally posted by: ZimZum
Originally posted by: Pete
Um, ATi's new CEO is ex-ArtX. I think that's a small clue that ArtX is now in ATi's blood. Seriously, ArtX engies probably merged with ATi, just like 3dfx engies merged with nVidia. Remember, ATi bought ArtX.
ArtX (ATI's West Coast team) are the ones who designed the R300 and the original R400, which is now the R500. One of the smartest moves ATI has made was acquiring ArtX.