ATi & nVidia


hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
SM2.0 can do MOST of what SM3.0 can do, but not very easily ;)

for an SM2 game to look like an SM3 game (one that is meant for SM3, like the Unreal Engine), it would have to put a lot more work onto the SM2 card than the SM3 code would put on an SM3 card

That's why I chose my 6600GT: it's the best mid-range card around WITH SM3.0

can't beat that...

and NO, ATI does not have better picture quality... where would you get that idea? They both have THE SAME quality; ATI cards just perform better in some things, Nvidia in others.

SM3.0 all the way
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: piromaneak
First off, NO, I'm not asking what's the best; what I'm asking is what makes people pick nVidia and what makes people pick ATi. I'm trying to get a feel for the two before I make a decision on either one. Is it the APIs they use, the fancy driver interface, cool looking cards or what? And more importantly, what is ATi's and nVidia's "philosophy" on rendering. I guess that means why one would think one is better than the other.

(Ok, so maybe this is a VERY carefully worded "Which card is best" thread but hey you have to admit, I did an ok job of trying to cover it up :)

Anywho, feedback from owners of both ATi and nVidia is welcome. Preferably ones that own either the X8xx line or the 6800 Ultras... Thanks in advance.

-Dave





Ok, so you're basically asking WHY we have our preferences.


Well to start, I prefer Nvidia graphics. I don't think ATi is "bad" in any sense of the word, and I have owned several of their cards in the past. Based on my experiences, I have come to prefer Nvidia graphics because:

1) They just seem to be the more innovative company. They've implemented so many firsts it's amazing. They were the first with hardware T&L. They were the first with 32-bit precision (ATi still isn't there yet). They implemented "The Way It's Meant To Be Played" before any other major graphics company came up with a similar program.

Nvidia takes chances - that's my main reason for preference. Sometimes it blows up in their face, as putting NV30 on the .13-micron process did. Sometimes it pays off big, like expanding into MCPs did. But the bottom line is this: good technology companies innovate, they don't wait-and-see. Google does it. Apple did it back in the 80s. AMD did it with 64-bit processors. If you look, you find a pattern.

2) Compatibility. I've found that every Nvidia card I've owned has had fewer issues than the ATi cards I've had. It may not be indicative of the market overall, but it's certainly been my experience. "It Just Works".

3) Performance on par. With the exception of the NV30 core, Nvidia's GPUs have either been the fastest in the industry or tied for it.


Those are my main three reasons - now, if I see a killer deal on an ATi card where bang for buck is great, I'll still buy it - but all other factors being equal, I usually go with Nvidia.

Just my $.02
 

mooncancook

Platinum Member
May 28, 2003
2,874
50
91
I picked nVidia for my new build because:
1. best performance for $200 (6600GT)
2. ability to go SLI

Then I moved to ATI because:
1. 6600GT will not be enough for gaming on Dell 2005FPW
2. decided I will not do SLI
3. best performance for $300 (X800 XL)
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Let's not make this an SM3 flamefest like the one I just came from. If you ask which one I prefer, I say ATI, because:

1) I've owned a 9000pro, and now use a 9800pro --> xt. I never had issues with the drivers, and I usually don't even bother updating them unless I feel like it.

2) My cousin has a gf4200, and I thought that was a pretty good card until he tried to play Allied Assault and all he saw was a bunch of yellow polygon edges on his screen.

3) Nvidia really got their butts handed to them last generation. Their over-hyped dustbuster cards got slaughtered by ATI's cards running at a much lower, cooler clockrate. That must have been embarrassing.

4) Nvidia has consistently been caught cheating on benchmarks. And instead of admitting to it, they claim the benchmarks were faulty. That kind of behavior is not excusable.

5) I still haven't heard if Nvidia got the PVP working on all of its cards. As far as I know, there's a bunch of 6800 cards that have faulty hardware, and that's even worse than having buggy drivers.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
4) Nvidia has consistently been caught cheating on benchmarks. And instead of admitting to it, they claim the benchmarks were faulty. That kind of behavior is not excusable.

They were caught optimizing once, and ATI has also been caught, so they're on even ground there.

Their over-hyped dustbuster cards

Last I checked, the only card to have a "dustbuster" or FlowFX cooler was the 5800 Ultra, which was replaced by the 5900 and its MUCH quieter fan. Additionally, the FlowFX was a very good design, and the reason there was so much noise was a bearing problem. All cards after that were fixed.

-Kevin
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
Extremely poor texture filtering. ATi uses the lowest possible blending accuracy they can get away with for the texture blending operations. This creates significantly more aliasing than with any nV board I've ever seen. The fans will call it 'more detail'- but 3DCenter wrote up a pretty good article detailing exactly what they are doing (it is quite clearly visible, although a lot of people claim it is a benefit or deny it exists).
I can attest to that. But I still believe that ATI has better default IQ than Nvidia. If you set Nvidia to High Quality, though, it will have less texture shimmering, to the point where it's unnoticeable.

Sub-par cooling solutions - not sure if this has been improved on the latest generation, but the HS/F that shipped on the R9800Pro was not up to the task. Crashing to the desktop due to the chip overheating isn't uncommon unless you replace the inadequate HS/F with a decent third-party solution. This, again, seems to be limited to ATi-branded parts or those that are duplicates of them.
Nvidia sports a really crappy heatsink design on the 6600: if you touch the heatsink, you can break the contact between the GPU and the heatsink, giving you cooling issues and crashes.

Wow, so it seems pretty neck and neck... Although I do have some fuel here to throw on you guys' fire and give you something to think about... Now this might be because of the benches used or not, but I have noticed that in games that use the DirectX API, ATi usually has a lead but nVidia is not far behind, while in games that run the OpenGL API, nVidia leads by a larger gap. So it seems to me that nVidia is best at running OpenGL (hence Doom 3) but can also render DirectX games almost as well as ATi can. Can any of you confirm or deny this?
Yes, but Nvidia leads at default IQ, which is lower than High Quality, and High Quality is known to decrease frame rates by a decent amount. Read more about it here.

This time around, it's SM3.0, OpenGL 2.0 support, SLI, HDR support, hardware MPEG/WMA acceleration, and most of these things are even on the junior models, like the 6600/6200 series. And say what you will about SM3.0, it's gonna make a difference eventually. If Chaos Theory runs like crap with SM3.0, then it's not being implemented well in that game. All SM3.0 does is allow larger amounts of shading work to be done in fewer passes, so that an effect that might have taken 2 passes with SM2.0 would only take 1 pass with 3.0. It will bring performance boosts, as is seen with Far Cry, even though it was just an add-on after the fact, and will allow game developers to take more liberties with shading effects. ATI clearly knows it's important, which is why R520 will have it.
GeForce 6 supports only OpenGL 1.5 at the moment. SLI is great if you think it is; in my opinion, it's a waste of money and time since it's too buggy to even be released. Default DX9 supports HDR, so all current ATI cards support HDR. Programmers, though, seem to be skipping over SM2.0 because it's lacking enough that it's not worth the effort. So ATI is capable, but programmers are not taking advantage of the technology. Of course ATI knows it's important. But ATI also realized that the performance advantage wouldn't be worth the bigger, more expensive chip. Now it's a different story.
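
To put rough numbers on the pass argument: a ps_2_0 pixel shader is capped at roughly 64 arithmetic instructions per pass, while ps_3_0 allows 512 or more (plus looping), so a long effect has to be split across extra passes on the older profile. Here's a quick back-of-the-envelope sketch; the limits are ballpark figures and the 100-instruction effect is made up, so treat it as an illustration rather than real driver behavior:

/* Rough illustration of multi-pass vs. single-pass shading.
 * The instruction limits are approximate profile caps (ps_2_0 ~64 ALU
 * slots, ps_3_0 >= 512); the 100-instruction "effect" is hypothetical. */
#include <stdio.h>

static int passes_needed(int effect_instructions, int per_pass_limit)
{
    /* Each render pass can run at most per_pass_limit instructions,
     * so a longer effect must be split across several passes. */
    return (effect_instructions + per_pass_limit - 1) / per_pass_limit;
}

int main(void)
{
    int effect_len = 100; /* hypothetical long shading effect */

    printf("SM2.0 (ps_2_0, ~64 slots):  %d passes\n",
           passes_needed(effect_len, 64));   /* prints 2 */
    printf("SM3.0 (ps_3_0, 512+ slots): %d passes\n",
           passes_needed(effect_len, 512));  /* prints 1 */
    return 0;
}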

And features... There are like two games that use SM3.0, and PureVideo took them almost a year to enable in their hardware. Some "marketing," huh?

1) They just seem to be the more innovative company. They've implemented so many firsts it's amazing. They were the first to hardware T&L. They were the first to 32-bit precision (ATi still isn't there yet). They implemented "The Way It's Meant To Be Played" before any other major graphics company came up with a similar program.
ATI was the first to implement DX8 features; although only partially, they were still there. ATI was the first out with DX9. Nvidia now uses anisotropic filtering and anti-aliasing techniques that ATI used first.





 

ChaosPhoenix

Member
Feb 21, 2005
40
0
0
My previous two video cards were both nVidia; it was an easy choice back then. I have no problem with either company at the moment, and wouldn't hesitate to go with either one were one to have a striking deal.
The reason that I'm drawn to nVidia at the moment is because of the feature set, but if the feature sets were the same, I would most likely choose ATI.
ATI's cards come across to me as being 'neater' and 'cleaner'. It's just a feeling though, not really based on any facts.

I'm surprised how this hasn't turned into a flame fest, although I'm sure it won't take long.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
4) Nvidia has consistently been caught cheating on benchmarks. And instead of admitting to it, they claim the benchmarks were faulty. That kind of behavior is not excusable.

ATi is openly "cheating" on Doom 3 benches now. Carmack has already come out and said that their results were not mathematically identical to what they should be rendering. In realistic terms I think this is a non-issue, as it always has been, as long as the visual quality is there. I would rather have a company 'cheating' and giving me solid IQ than one playing 'fair' and just having a bug that renders everything wrong.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: VIAN
ATI was the first to implement DX8 features; although only partially, they were still there. ATI was the first out with DX9. Nvidia now uses anisotropic filtering and anti-aliasing techniques that ATI used first.


Eh, but NV and ATi always had plans to implement DX8, DX9, etc. features. I remember when Nvidia was first out of the gate with 32-bit color back in the day, and everyone else, especially 3Dfx, was suddenly scrambling to follow suit. That was something others weren't planning on. At least, not at that point in time.

Same goes with TWIMTBP. Suddenly, several months later, there was "GITG".

And now with the rebirth of SLi, ATi's equivalent is coming.


I'm not saying Nvidia is first on everything - that'd be foolish - but I think they definitely take more risks than ATi. Even stuff that may not pan out in their favor: dual-slot cooling, for example. Everyone went ballistic when it first showed up on NV30, but how many people has it really caused a problem for since then? Few, at most. And now ATi is doing it. I wouldn't call this an innovation, but I think it shows that Nvidia is willing to stick its neck out further on the chopping block.

Nvidia plays it riskier, and that's something that needs to be supported in technology. Often many lackluster tries are punctuated with one great discovery, so I think it's necessary to give support to those companies that are looking in new areas instead of just playing it safe.

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: BenSkywalker
4) Nvidia has consistently been caught cheating on benchmarks. And instead of admitting to it, they claim the benchmarks were faulty. That kind of behavior is not excusable.

ATi is openly "cheating" on Doom 3 benches now. Carmack has already come out and said that their results were not mathematically identical to what they should be rendering. In realistic terms I think this is a non-issue, as it always has been, as long as the visual quality is there. I would rather have a company 'cheating' and giving me solid IQ than one playing 'fair' and just having a bug that renders everything wrong.

You're probably right about both sides cheating nowadays. As long as the IQ is the same, I don't care if they're cheating. If, however, the cheating goes way beyond simple optimizations - there's an article on techreport about it (I'm too lazy to post the link) - it's a different story, especially if IQ suffers, or if they adjust a driver to get higher scores in a known benchmark with no improvement in an actual game. And I know everybody has their own opinions, but for me Nvidia's case of cheating is significantly more prominent than ATI's.
 

Malak

Lifer
Dec 4, 2004
14,696
2
0
Originally posted by: piromaneak
Wow, so it seems pretty neck and neck... Although I do have some fuel here to throw on you guys' fire and give you something to think about... Now this might be because of the benches used or not, but I have noticed that in games that use the DirectX API, ATi usually has a lead but nVidia is not far behind, while in games that run the OpenGL API, nVidia leads by a larger gap. So it seems to me that nVidia is best at running OpenGL (hence Doom 3) but can also render DirectX games almost as well as ATi can. Can any of you confirm or deny this?

It comes down to what games you play. Most games are DirectX; I don't think I have any OpenGL games installed. So it's a moot point, I'll go ATI. Pretty much the only guy who still uses OpenGL is Carmack, and I don't like any of his games, so I'm OK as long as nobody uses his engine in any games I might want to play. So far there's just been one, so I should be safe.
 

Insomniak

Banned
Sep 11, 2003
4,836
0
0
Originally posted by: munky
You're probably right about both sides cheating nowadays. As long as the IQ is the same, I don't care if they're cheating.


I think the more pertinent question is "If the IQ remains the same, is it cheating?"

I would say no. 1000 ways to skin a cat and whatnot.


 

piromaneak

Senior member
Feb 27, 2005
225
0
0
because there's a lot more to it than just "playing the games you like," hatim :p

I like to research these sorts of things 1) because I have an inordinate amount of free time and 2) because I like to know that I will be making a sound purchase of a product from a manufacturer who actually gives a flip and supports their customers.
 

The Green Bean

Diamond Member
Jul 27, 2003
6,506
7
81
Originally posted by: piromaneak
because there's a lot more to it than just "playing the games you like," hatim :p

I like to research these sorts of things 1) because I have an inordinate amount of free time and 2) because I like to know that I will be making a sound purchase of a product from a manufacturer who actually gives a flip and supports their customers.


Fine then. :p I bought my 6800 because the X800 series wasn't available back then in Pakistan. Had to get rid of my GF4 Ti before it went even lower in value. :) Paid $450 shipped for it back in Nov... But if I were to buy now, I would get a 6600GT for $220-250...

Nice little upgrade over a GF4 Ti... Hope this one lasts as long as that :)
 

bersl2

Golden Member
Aug 2, 2004
1,617
0
0
Originally posted by: malak
Originally posted by: piromaneak
Wow, so it seems pretty neck and neck... Although I do have some fuel here to throw on you guys' fire and give you something to think about... Now this might be because of the benches used or not, but I have noticed that in games that use the DirectX API, ATi usually has a lead but nVidia is not far behind, while in games that run the OpenGL API, nVidia leads by a larger gap. So it seems to me that nVidia is best at running OpenGL (hence Doom 3) but can also render DirectX games almost as well as ATi can. Can any of you confirm or deny this?

It comes down to what games you play. Most games are DirectX; I don't think I have any OpenGL games installed. So it's a moot point, I'll go ATI. Pretty much the only guy who still uses OpenGL is Carmack, and I don't like any of his games, so I'm OK as long as nobody uses his engine in any games I might want to play. So far there's just been one, so I should be safe.

*** WARNING: THREADJACK IN PROGRESS ***

If more devs used OpenGL, those companies would make more money.

Why? Because if you program using OpenGL, your games become many times more portable. If your games are many times more portable, Mac and Linux ports economically make sense.

I assert that ports can be profitable, if you use portable APIs in the first place. I don't know of any sort of hard evidence that can support or detract from this, so I'm only making an assertion.

And if the graphics API alone isn't enough portability for you, you can also use SDL, which, when compiled for a Windows environment, makes the proper DirectX calls for input, 2D/3D graphics, images and movies, fonts, sound and mixing, the works.
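
For instance, a bare-bones SDL program is the same source on every platform: build it on Windows and SDL routes video, input, and sound through DirectX underneath; build it on Linux and it uses X11 and the native sound APIs. A minimal sketch against the SDL 1.2 API (the window size and caption here are arbitrary):

/* Same source compiles on Windows, Linux, and Mac; SDL picks the
 * platform backend (DirectX on Windows, X11 on Linux) underneath. */
#include <SDL/SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_WM_SetCaption("Portable window", NULL);

    /* Ask for an OpenGL-capable 640x480 window. */
    if (SDL_SetVideoMode(640, 480, 32, SDL_OPENGL) == NULL) {
        SDL_Quit();
        return 1;
    }

    SDL_Event ev;
    int running = 1;
    while (running) {
        while (SDL_PollEvent(&ev))
            if (ev.type == SDL_QUIT)
                running = 0;
        SDL_GL_SwapBuffers(); /* present the frame */
    }

    SDL_Quit();
    return 0;
}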

OK, I'm done. Back to your regularly scheduled thread.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: Insomniak
Originally posted by: munky
You're probably right about both sides cheating nowadays. As long as the IQ is the same, I don't care if they're cheating.


I think the more pertinent question is "If the IQ remains the same, is it cheating?"

I would say no. 1000 ways to skin a cat and whatnot.

Exactly. I hate the word "cheating" when referring to this type of thing. Cheating would be adding something to the drivers that causes software to display a frame rate higher than what you're actually seeing.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: malak
Originally posted by: piromaneak
Wow, so it seems pretty neck and neck... Although I do have some fuel here to throw on you guys' fire and give you something to think about... Now this might be because of the benches used or not, but I have noticed that in games that use the DirectX API, ATi usually has a lead but nVidia is not far behind, while in games that run the OpenGL API, nVidia leads by a larger gap. So it seems to me that nVidia is best at running OpenGL (hence Doom 3) but can also render DirectX games almost as well as ATi can. Can any of you confirm or deny this?

It comes down to what games you play. Most games are DirectX; I don't think I have any OpenGL games installed. So it's a moot point, I'll go ATI. Pretty much the only guy who still uses OpenGL is Carmack, and I don't like any of his games, so I'm OK as long as nobody uses his engine in any games I might want to play. So far there's just been one, so I should be safe.


I will guarantee you that MANY MANY more games will come from the DIII engine (OGL). While most games are DX right now, OGL is in transition; DIII just put OGL back up there. Trust me, if you are banking on DIII being the last OGL game, you are wrong.

-Kevin
 

Smeagol

Junior Member
Jan 15, 2005
8
0
0
Just look at how many games were based off of the aged Quake 3 engine. Tons of games used it. In its prime, it was arguably the best in graphics compared with other gaming engines.

That aside, for many years I have gone ATI, even back in the days of the ancient Rage128. I had an original Radeon back in the day as well. I liked them for image quality and DVD playback, and they played games decently. Recently I've been doing tons of research on video cards to see what fit my wants and needs. I needed something powerful that would run today's games, and also something with capabilities beyond gaming. I needed VIVO, and ATI's solutions are quite expensive on the AIW front. I decided to settle on the MSI 6600GT AGP VIVO card. I've yet to run it because I just ordered my new system, but in my opinion ATI is too expensive when compared to years past. You may get somewhat superior performance, but these days it comes at an absolute premium.
 

MisterChief

Banned
Dec 26, 2004
1,128
0
0
Originally posted by: Smeagol
Just look at how many games were based off of the aged Quake 3 engine. Tons of games used it. In its prime, it was arguably the best in graphics compared with other gaming engines.

That aside, for many years I have gone ATI, even back in the days of the ancient Rage128. I had an original Radeon back in the day as well. I liked them for image quality and DVD playback, and they played games decently. Recently I've been doing tons of research on video cards to see what fit my wants and needs. I needed something powerful that would run today's games, and also something with capabilities beyond gaming. I needed VIVO, and ATI's solutions are quite expensive on the AIW front. I decided to settle on the MSI 6600GT AGP VIVO card. I've yet to run it because I just ordered my new system, but in my opinion ATI is too expensive when compared to years past. You may get somewhat superior performance, but these days it comes at an absolute premium.

:thumbsup: