
why are some nv fans so hard on the competing ati products?

Yea from what i read, it's in the price range of the X600, so i don't know for sure, but the X600 is native PCI-E, has a 128-bit bus, 128/256MB RAM, 4 or 8 pipelines, 2 vertex shaders, can't remember the rest.

I'm still a little vague on whether it's supposed to be up against the X600, and I'm a bit vague on the specs too; anyone care to enlighten me?
 
Originally posted by: GeneralGrievous
shamrock: When I tried installing those it said they were for a Quadro card and PCI Express?!???

this is what guru3d had to say about them:

"The french site station drivers stumbled into a new set of ForceWare drivers. The files are dated on the 26 July 2004 and will work with the entire range of Nvidia graphics cards after an .ini alteration. We think these drivers originally where targeted at the porofessional Quadro series. We haven't had time to test these fully and bare in mind they are not WHQL so please be careful. We do not know where these drivers originate from or even know anything about their validity."

i don't really think these would qualify as an "official" release driver for the purpose of benchmarking...
 
ok, i went ahead and patched far cry to v1.2. all settings "very high", aa/af set to "application controlled" in cp; turned both aniso and trilinear opts to ON, and benchmarked on the 6177's:

Operating System: Windows XP Professional (5.1, Build 2600) Service Pack 2 (2600.xpsp_sp2_rtm.040803-2158)
Processor: Intel(R) Pentium(R) 4 CPU 3.20GHz (2 CPUs)
Memory: 1024MB RAM
DirectX Version: DirectX 9.0c (4.09.0000.0904)
Card name: NVIDIA GeForce 6800 GT
Driver Version: 6.14.0010.6177 (English)

1280x1024
DX9c / patch 2: Average FPS: 48.89
DX9c / patch 2: Average FPS: 42.22 (opts OFF)
DX9b / patch 1: Average FPS: 43.36

1600x1200
DX9c / patch 2: Average FPS: 34.69
DX9c / patch 2: Average FPS: 29.27 (opts OFF)
DX9b / patch 1: Average FPS: 33.12

i dunno; i don't see a substantial performance improvement using the sm3 path. x800pro perf tanked with patch 2, tho crytek pulled the patch due to issues with ati. i'll re-run with cat4.8 which will be released next week (these support sm2b, including full support for geometry instancing).
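
for what it's worth, here's a quick sketch of the relative deltas in those runs (plain Python; the FPS figures are just copied from the numbers above, nothing re-measured):

# Relative deltas from the Far Cry averages posted above (6800 GT, 61.77 drivers).
runs = {
    "1280x1024": {"dx9c_p2": 48.89, "dx9c_p2_opts_off": 42.22, "dx9b_p1": 43.36},
    "1600x1200": {"dx9c_p2": 34.69, "dx9c_p2_opts_off": 29.27, "dx9b_p1": 33.12},
}

def pct_gain(new, old):
    """Relative gain of new over old, in percent."""
    return (new - old) / old * 100.0

for res, fps in runs.items():
    sm3 = pct_gain(fps["dx9c_p2"], fps["dx9b_p1"])            # patch 2 (sm3 path) vs patch 1
    opts = pct_gain(fps["dx9c_p2"], fps["dx9c_p2_opts_off"])  # optimizations on vs off
    print(f"{res}: patch2 vs patch1 {sm3:+.1f}%, opts on vs off {opts:+.1f}%")

on these numbers, patch 2 is worth roughly +13% at 1280x1024 but under +5% at 1600x1200, which matches the "not substantial" impression (the optimizations themselves account for about +16-19%).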
 
Originally posted by: CaiNaM
Originally posted by: Shamrock
Why do you use NVIDIA "official" drivers and ATI beta drivers? Unless ATI has issued the 4.9b drivers in the last few days.

because those were the latest drivers avail from the respective manufacturers' sites...


uhh, no...I am on ATI's site right now, and I see Cat 4.7? I could not find 4.9b on ANY page ATI has

Cat 4.9b is beta too. If you want to bench fairly, bench the 61.77 vs Cat 4.7 please. If you want to bench beta drivers, bench BOTH cards with betas.

And General, they are beta drivers for Quadro, but they DO work with the modded .inf file; as you can see, they are working quite well. I suggest using the guru3d file, not the one from Dell.

Here is the thread at NVNews about it

http://www.nvnews.net/vbulletin/showthread.php?t=34396
 
Originally posted by: Shamrock
Originally posted by: CaiNaM
Originally posted by: Shamrock
Why do you use NVIDIA "official" drivers and ATI beta drivers? Unless ATI has issued the 4.9b drivers in the last few days.

because those were the latest drivers avail from the respective manufacturers' sites...


uhh, no...I am on ATI's site right now, and I see Cat 4.7? I could not find 4.9b on ANY page ATI has

Cat 4.9b is beta too. If you want to bench fairly, bench the 61.77 vs Cat 4.7 please. If you want to bench beta drivers, bench BOTH cards with betas.

And General, they are beta drivers for Quadro, but they DO work with the modded .inf file; as you can see, they are working quite well. I suggest using the guru3d file, not the one from Dell.

Here is the thread at NVNews about it

http://www.nvnews.net/vbulletin/showthread.php?t=34396

:roll:

http://www.ati.com/support/infobase/4547.html
 
I didn't read a single thread in here but I will answer the question asked in the topic. For me it is because of all the bad experiences I have had with ATI tech support and driver issues. I HATE ATI THEY HAVE CAUSED ME TOO MUCH STRESS & I WILL NEVER BUY THEIR PRODUCTS IF I CAN AVOID IT & THE SAME GOES FOR HP & THEIR BULLSHIT PAY FOR DRIVERS TO MAKE YOUR SCANNER WORK WITH WINDOWS 2000 OPERATING SYSTEM CRAP! Yes, they make me angry...
 
Originally posted by: Luthien
I didn't read a single thread in here but I will answer the question asked in the topic. For me it is because of all the bad experiences I have had with ATI tech support and driver issues. I HATE ATI THEY HAVE CAUSED ME TOO MUCH STRESS & I WILL NEVER BUY THEIR PRODUCTS IF I CAN AVOID IT & THE SAME GOES FOR HP & THEIR BULLSHIT PAY FOR DRIVERS TO MAKE YOUR SCANNER WORK WITH WINDOWS 2000 OPERATING SYSTEM CRAP! Yes, they make me angry...

when was the last time you used an ati product? ati's drivers have been pretty solid for a while now. I admit, during the 8500, early 9700 days, their drivers weren't flawless. But since then, they've become acceptable, if not excellent in my opinion.


also, i do notice the anti-ati sentiment in this forum. i guess everyone was just waiting for an acceptable nvidia card to come out so they can cheer about it.

Either way, both the x800 and 6800 are great cards. You really can't go wrong with either.
 
Originally posted by: Rollo
Anybody else think this "Kobra" is somebody that was banned and has returned with a new IP to enlighten us with his hatred of ATI?

And if even one of you points out "Rollo, you're the biggest ATI hater of all!" I'm going to use the fact that I've purchased and used way more ATI cards than them like Rick James holding a hot crack pipe on his catch of the day.

I agree...no matter what discussion is going on he tends to throw a "and my ATI card overheated..." in there...if you ask me I would say it was his own stupidity that his supposed ATI card overheated...I mean, face the facts, when something has so much hot air blowing in its direction...well, it's going to get toasty...
 
Some NV fans are hard on both ATI products AND ATI users. But to tell you the truth, the constant debates are fun. At least a lot more fun than Doom3 (which I happen to love by the way!).
 
Originally posted by: gururu
Some NV fans are hard on both ATI products AND ATI users. But to tell you the truth, the constant debates are fun. At least a lot more fun than Doom3 (which I happen to love by the way!).

Debates are fun up until the point they get nasty and personal.

To JHBBALL: Start yourself a poll and ask how many ATI users have to change drivers depending on which game they want to play. "I admit, during the 8500, early 9700 days, their drivers weren't flawless." Total understatement. While ATI drivers have improved substantially, they are still nowhere near acceptable. ATI apparently finds it extremely difficult to write a driver that would satisfy all games' requirements, as nvidia has for many years now. I was dumbfounded to find out how many ATI users had to remove their current driver that worked well in a particular game, only so they could play another title without crashing to their desktop. CoD is a perfect example. VPU recovery yada yada was fixed with 3.10, but returned with the very next driver update. What..... is...... the..... deal.....? The last ATI card I owned was a 9700. And that was the last one I will own. I have already made it perfectly clear in this forum that my preference is nvidia for my own reasons and made no bones about it. I am not anti-ATI either, as many, many folks swear by them. They are just not my cup of coffee. (I don't drink tea) 😛
 
Originally posted by: Kobra
Originally posted by: DefRef
The only difference between today's ATI fanchimps and the 3dfx zombies of the past is that ATI has much better business acumen. Their partisans are still angry sheep, but what else is new?

PLEASE don't bring up the 3DFX retards.. I was happy to see 3DFX go out of business, if for nothing else than to shut up those retards who worshipped it like some god and religion... Yea, the ATI turds are bordering on 3DFX fanboyism, but not quite yet.

I think a ton of people were burned on the 9800 series, with the dimestore heatsink and fan, overheating issues, compatibility troubles, and overall lack of manufacturing quality present in them. I mean, the 9800s only had about what, 8 months in the spotlight, and now they are absolute garbage compared to the 6800GT? It would all be very funny to me, except I lost around $150ish on my piece of crap 9800Pro that I had to RMA 3 times, and finally sold on eBay for half the price I paid. I'll never buy an ATI product again. Oh ya, and their drivers STILL suck beyond sucky.. Nothing like having to swap drivers every other day depending on the game I was playing.. Losers.


Don't even think of bashing 3dfx, retard. 3dfx MADE GAMING WHAT IT IS TODAY. On top of that, 3dfx has a lot to do with Nvidia's technology and bringing AA to the market. Nvidia bought out what was left of 3dfx tech, and things such as SLI were developed initially by 3dfx. Stop being such a noob; just because you got into gaming after the 3dfx revolution doesn't mean people who liked the cards are retards. And if you're one of the noobs that preferred the crappy early nv lines over the vast performance increase of the 3dfx glide cards, then you truly are a fanboy.
 
Originally posted by: Drayvn
Originally posted by: nitromullet
Originally posted by: ScrewFace
As soon as ATI rewrites their OpenGL code to make it match nVidia's, my X800 XT PE will wipe the floor with even the GeForce 6800 Ultra Extreme. Just look at the fill-rate difference: 8.32 Gigatexels for the X800 XT PE and a puny 6.4 Gigatexels for the GeForce 6800 Ultra. And you don't have to go out and blow hard-earned money on a 480 watt power supply. Shader Model 3.0 be damned!:evil:
Don't hold your breath.... If it was that trivial, I'm sure they would have done it already. It's not as if the release of Doom3 snuck up on anyone, yet ATi's OpenGL performance was not improved in time for its launch.

I think it was because they were so focused on improving the D3D code that they might have lost sight of OGL, or they might have thought these cards wouldn't have done so badly. But they have done amazing work with D3D, so I'm confident they can do the same with OGL also.

Like I said before, it's not exactly as if Doom3 snuck up on us. If ATi didn't get advance copies directly from id during development, they surely could have gotten hold of the alpha floating around (isn't the rumor that it was leaked through ATi? thought I read that somewhere...), so there is no way that they couldn't have known how well their cards would perform in Doom3.

nVidia has put much more resources into their OpenGL drivers than ATi because of their line of workstation cards and their commitment to offer Linux support for all their cards, and it would surprise me if ATi could make up for their lack of commitment in this area in a short amount of time. While I do think that ATi will be able to come up with better OpenGL drivers at some point, I don't think we'll be seeing any "wiping the floor with the 6800 Ultra Extreme" by ATi anytime soon. IMO, that is just wishful (fanboy) thinking.

Another thing to consider is that it may not just be a driver issue. The superior abilities of the nVidia cards in OpenGL may be due to fundamental design differences between the nVidia chip and the ATi chip. The thing I don't understand in this debate is that the argument is always that nVidia hardware simply doesn't run DirectX quite as fast as ATi, but that ATi hardware could run OpenGL just as well as nVidia if they had a better driver. I just don't understand the double standard... Has anyone ever considered that nVidia may just have an edge in OpenGL due to their architecture?
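
Side note on the fill-rate figures ScrewFace quoted above: they do line up with a simple core clock x pixel pipelines estimate, assuming the commonly cited 520MHz/16 pipes for the X800 XT PE and 400MHz/16 pipes for the 6800 Ultra (those clock/pipe numbers are my assumption, not something stated in this thread). A minimal sketch:

# Rough texel fill-rate estimate: core clock (MHz) * pixel pipelines.
# Clock/pipe figures below are commonly cited specs, not taken from this thread.
cards = {
    "X800 XT PE": {"core_mhz": 520, "pipes": 16},
    "6800 Ultra": {"core_mhz": 400, "pipes": 16},
}

for name, spec in cards.items():
    gtexels = spec["core_mhz"] * spec["pipes"] / 1000.0  # MTexels/s -> GTexels/s
    print(f"{name}: {gtexels:.2f} GTexels/s")  # prints 8.32 and 6.40, matching the quoted figures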
 
Originally posted by: nitromullet

Like I said before, it's not exactly as if Doom3 snuck up on us. If ATi didn't get advance copies directly from id during development, they surely could have gotten hold of the alpha floating around (isn't the rumor that it was leaked through ATi? thought I read that somewhere...), so there is no way that they couldn't have known how well their cards would perform in Doom3.

I think JC pretty much gave ATI the finger once he found out they leaked the alpha. ATI was probably pretty aware of the message too.

nVidia has put much more resources into their OpenGL drivers than ATi because of their line of workstation cards and their commitment to offer Linux support for all their cards, and it would surprise me if ATi could make up for their lack of commitment in this area in a short amount of time. While I do think that ATi will be able to come up with better OpenGL drivers at some point, I don't think we'll be seeing any "wiping the floor with the 6800 Ultra Extreme" by ATi anytime soon. IMO, that is just wishful (fanboy) thinking.

I wholeheartedly agree

Another thing to consider is that it may not just be a driver issue. The superior abilities of the nVidia cards in OpenGL may be due to fundamental design differences between the nVidia chip and the ATi chip. The thing I don't understand in this debate is that the argument is always that nVidia hardware simply doesn't run DirectX quite as fast as ATi, but that ATi hardware could run OpenGL just as well as nVidia if they had a better driver. I just don't understand the double standard... Has anyone ever considered that nVidia may just have an edge in OpenGL due to their architecture?

More annoying is that so many proclaim the 6800 core the king of the world based on its performance in one game.
 
hmm..

there's been some talk of having problems with ati as a reason to go with nvidia, and i can certainly understand that; as they are so close, those types of reasons can easily sway someone to one side or the other, or "tip the scales".

it's kinda strange, but i've had cards from both sides in each of the last several generations (with matrox, s3 tossed in the mix), and have really not had THAT many issues with any of them - it's generally been performance or IQ that's led me towards one or the other.. and overheating problems w/ a 9800? never heard that problem before..

at any rate, this GT is the only card that's given me serious concern, and that's due to the heat... runs 72c @idle and mid to upper 90s @load. it's not my system, as my Pro runs 30c less in the same conditions. after long periods of play, D3 will have gfx issues and start "pausing".. trying to exit once it starts doing this causes the system to lock. the last time it did that yesterday, it messed up some files and i had to spend a couple hours reinstalling the OS from scratch. looking into the danger den cooling block (system is liquid cooled), but damn.. that's another $130ish. nvidia needs a more robust stock cooling system.
 
it also runs incredibly hot even at default clocks compared to my pro: 72c @ idle compared to 46c,
My 6800U idles at only 50c-55c.

but as fast as it is in D3 compared to my x800p, the x800p is that much faster than my GT in farcry.
4.6 and 4.7 have IQ issues with FC. Maybe it's different on the X800 Pro but on the 9700P 4.7 doesn't even render any waves on the shoreline and instead produces a jagged sawblade.
 
My GT idles at 51C. Under load (FarCry & Doom3) it goes up to 59 to 61 at most. And the room I am in has no A/C and is kinda warm all the time. Go Figure.
 
Originally posted by: Bumrush99
Originally posted by: Kobra
Originally posted by: DefRef
The only difference between today's ATI fanchimps and the 3dfx zombies of the past is that ATI has much better business acumen. Their partisans are still angry sheep, but what else is new?

PLEASE don't bring up the 3DFX retards.. I was happy to see 3DFX go out of business, if for nothing else than to shut up those retards who worshipped it like some god and religion... Yea, the ATI turds are bordering on 3DFX fanboyism, but not quite yet.

I think a ton of people were burned on the 9800 series, with the dimestore heatsink and fan, overheating issues, compatibility troubles, and overall lack of manufacturing quality present in them. I mean, the 9800s only had about what, 8 months in the spotlight, and now they are absolute garbage compared to the 6800GT? It would all be very funny to me, except I lost around $150ish on my piece of crap 9800Pro that I had to RMA 3 times, and finally sold on eBay for half the price I paid. I'll never buy an ATI product again. Oh ya, and their drivers STILL suck beyond sucky.. Nothing like having to swap drivers every other day depending on the game I was playing.. Losers.


Don't even think of bashing 3dfx, retard. 3dfx MADE GAMING WHAT IT IS TODAY. On top of that, 3dfx has a lot to do with Nvidia's technology and bringing AA to the market. Nvidia bought out what was left of 3dfx tech, and things such as SLI were developed initially by 3dfx. Stop being such a noob; just because you got into gaming after the 3dfx revolution doesn't mean people who liked the cards are retards. And if you're one of the noobs that preferred the crappy early nv lines over the vast performance increase of the 3dfx glide cards, then you truly are a fanboy.

:thumbsup:
 
Originally posted by: BFG10K
it also runs incredibly hot even at default clocks compared to my pro: 72c @ idle compared to 46c,
My 6800U idles at only 50c-55c.

but as fast as it is in D3 compared to my x800p, the x800p is that much faster than my GT in farcry.
4.6 and 4.7 have IQ issues with FC. Maybe it's different on the X800 Pro but on the 9700P 4.7 doesn't even render any waves on the shoreline and instead produces a jagged sawblade.

which brand card did you get?

mine is bfg w/ the new dual fan cooler - which sounds like it's not as efficient as the ref. cooler.

i've seen posts by others (there's an entire thread on it over at nvnews) with temps as high or even higher than mine, and posts about gfx corruptions as well. the temp by itself doesn't mean much (nor my temps compared to yours), but what alarms me is that the pro in the very same rig under the same conditions runs 30c cooler.

as for the rendering issues in far cry, i didn't have those issues, so perhaps it's an r3xx thing?

one issue i am having w/ the 6177's is in daoc, where textures are missing for some character models.. certain chars are running around w/o faces and such, lol...

an upside to nv's software: i like their dual monitor controls over the ati "hydravision", and i find the auto game profiles a nice touch. and of course, the popup blocker 😉

Originally posted by: keysplayr2003
My GT idles at 51C. Under load (FarCry & Doom3) it goes up to 59 to 61 at most. And the room I am in has no A/C and is kinda warm all the time. Go Figure.

grrr..... 🙁

i even tried different compound (as5), but it didn't vary the temps much.

right now, my case temp is 46, cpu temp is 41, and gpu is 79 - and that's forcing the fan to 100% in 2d (if you d/l the gainward expertool utility, which works on all 6800s, you can override the dynamic speed settings & manually change the fan speeds).
 
Originally posted by: CaiNaM
Originally posted by: nitromullet
Yep, General is right. My GT gets up to 77-78C with rthdribl, but runs 50-55C at idle.

mine hits 100c


Either you have very poor case cooling or your card cooling is defective IMO. My BFG GT hits similar temps to nitromullet.
 