Whatever happened to the days of NVIDIA touting their "unified driver architecture"?


BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
2. Things other than NVIDIA drivers cause things Vista reports as "NVIDIA driver errors". (e.g. overheating, RAM failure from OCing, video card failure from OCing, PSU failure)
That's quite true, but it applies equally to ATi (and to Intel, to a lesser extent).

3. How many of the NVIDIA driver errors came from leaked/unfinished drivers?
People are often forced to use them in the absence of nVidia's official drivers. I know I did, because sometimes I couldn't play the games I wanted and nVidia often took months to release anything official.

With ATi we've had monthly official WHQL drivers since 2002. In fact some years they released more than 12 official drivers.

6. If you're MS, and being sued, you might well want to point out "it's other companies' fault too".
This seems to be an invalid argument. The fact is Microsoft were reporting nVidia's, ATi's and Intel's crashes, plus their own. Are you suggesting they inflated nVidia's figures?

4. There were NVIDIA driver errors because the unified arch was new, and Vista was new.
That's probably true, but look at the difference between the crashes: 28.8% vs 9.3%.

Even accounting for a unified architecture and possible nVidia market superiority (we don't count Vista issues since Microsoft counts their own bugs), that's still over three times the crashes with nVidia.

That figure can't totally be explained away by other factors, so the bottom line is that nVidia's drivers are generally inferior to ATi's, and now we have figures to prove it.

That, and don't forget the numerous driver problems on XP, which obviously had nothing to do with Vista or early adopters. Also, feedback on XP for the 2900 was generally positive online; certainly more positive than my experiences with the G80 on XP.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: SunnyD
Originally posted by: nerp
I used to be one of those people that said ATI had horrible drivers.

But now that I have both nvidia and ATI cards in a pair of Vista boxes, I can say with assuredness that ATI has much better, tighter, cleaner and more stable drivers than nvidia hands down.

Using the nvidia control panel feels like poking around some college student's java app for class.

I should mention I didn't intend to single out NVIDIA. ATI is guilty of this as well, but ATI didn't start shouting "UNIFIED DRIVER" from the get go. In fact ATI has done a better job of incorporating a more unified solution to date than NVIDIA, going so far as to include several of their mobile products... but the key with ATI is that when a new chip comes out, the new drivers to support it also include all the down-level products that ATI "currently" supports. I still don't find it "cool" that ATI only supports some OEMs directly and not others (Dell *cough*).

Originally posted by: BFG10K
I think nVidia are holding back the 17x.xx series intentionally on older cards and will only release them after the 9xxx line is no longer "hot" in the reviews.

I suspect the reason for this is the performance gains of the drivers make the 9xxx series look better compared to the 8xxx series.

And I agree completely as well. Marketing is fine, but if you're going to do it that way make it a beta driver, not a WHQL driver.

Sunny, what video card do you have that you need drivers for?
Unified drivers are great, but I don't see the "requirement" for them.
In other words, if you have a 9600GT, there are drivers for it. If you have that card, why on earth would you need a driver that includes all previous generations of nvidia card support?

I know I mentioned before that the current 9 series drivers need to go through QA testing with previous gens of cards, and will ultimately be a part of the unified driver, but I don't exactly see what the issue is. You need to explain it better. Like "why" you have to edit INF files. What is the reason you can't find a driver to support your current video card?

Let me know, maybe I can help.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: LOUISSSSS
Originally posted by: nRollo
My take on this:



2. Things other than NVIDIA drivers cause things Vista reports as "NVIDIA driver errors". (e.g. overheating, RAM failure from OCing, video card failure from OCing, PSU failure)
=
6. If you're MS, and being sued, you might well want to point out "it's other companies' fault too".
aren't you doing the same thing for nvid right now? blaming someone else for nvidia driver errors

3. How many of the NVIDIA driver errors came from leaked/unfinished drivers?
how many official drivers have nvidia given us for 8 series and earlier cards?
beta drivers = unfinished driver?

Just added in some comments there... can you explain a bit to me?

6. I'm not doing anything for NVIDIA in regard to blaming others, just telling you what I've seen on the nZone driver forum, my own personal experience, and one of the reasons I was given for why ESA is a handy thing to have.

For example, the first 1200W power supply I bought for 3-way SLI had 2 of the 6-pin PCIE connectors on one line, which I split between two 8800GTXs (so they each had one dedicated line and one split line). When I started gaming, it would work for a little while, then I'd either start getting the "nvlddmkm has stopped responding" TDR error, or sometimes even a BSOD that would reference the nvidia drivers. These would have been reported to MS as "NVIDIA driver errors", when I really just needed a new PSU. When I got one, the errors stopped entirely. (And if I'd had an ESA system at the time, I could have seen the PSU failure as my problem and known what to replace.)

3. Beta drivers are not unofficial drivers; they've gone through the same QA as WHQL. They just haven't been sent to MS with a big check for MS certification, and MS certification has never meant "these drivers are guaranteed to work with everything". The drivers I was referring to are the leaked drivers NVIDIA hasn't finished QA on, which people at hardware OEMs or dev studios leak to the Internet.

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
4. There were NVIDIA driver errors because the unified arch was new, and Vista was new.
That's probably true, but look at the difference between the crashes: 28.8% vs 9.3%.

Even accounting for a unified architecture and possible nVidia market superiority (we don't count Vista issues since Microsoft counts their own bugs), that's still over three times the crashes with nVidia.

That figure can't totally be explained away by other factors, so the bottom line is that nVidia's drivers are generally inferior to ATi's, and now we have figures to prove it.

That, and don't forget the numerous driver problems on XP, which obviously had nothing to do with Vista or early adopters. Also, feedback on XP for the 2900 was generally positive online; certainly more positive than my experiences with the G80 on XP.

I'm only going to address the unified arch point because I've discussed the monthly WHQL vs NVIDIA's release frequency in another current thread, BFG.

MS changed some of the Vista specs relatively close to the launch of Vista, so some of the development had to be re-done.

The other thing is that it wouldn't surprise me if unified vs. non-unified did have double or even triple the number of driver errors.

When you consider those drivers had to be rewritten from the first line of code, and the differences in how a unified arch works compared to the old fixed function designs, it wouldn't surprise me at all.

The core design of the "newest" ATi product at Vista launch had been in place since late 2005 and didn't have to use DX10. For all we know, one of the reasons the R600 was delayed till late Q2 2007 could have been ATi trying to get the drivers to work with Vista.

In any case, this is all sort of ancient history, so really "who cares" except MS in their lawsuit?

Not like any of us are going to be travelling back in time today and using early Vista or NVIDIA's early release drivers.

As far as the "generally inferior drivers" thing goes, I could have been 3% of those errors myself with that (IMO) unfortunately designed PSU: when you start getting TDR errors because of a hardware fault, you get a LOT.

People who believe you and think this is evidence of inferior drivers could probably buy an ATi card and get inferior hardware instead.
Drivers can be fixed, but the inefficient VLIW arch, lack of ROPs, lack of TMUs, and shader-resolve AA are here to stay. (And in the absence of games coded for VLIW [which there aren't, and won't be], I can't think of a situation in which any of these GPU design choices would be advantageous.)

Something for readers of threads like this to think about.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Evidently rather related, and very good news ...
- I didn't have a chance to check it, as I am running off to work

aloha

Nvidia's Unified Drivers are back! - Beta release for all cards!

http://www.nvidia.com/object/winxp_174.74.html
# Supports GeForce FX, 6, 7, 8, and 9 series GPUs including these newly released GPUs:

* GeForce 9800 GX2
* GeForce 9600 GT
* GeForce 8300
* GeForce 8200
* GeForce 8100/NVIDIA nForce 720a
* NVIDIA nForce 730a

# Supports single GPU and NVIDIA SLI™ technology on DirectX 9 and OpenGL.
# Adds new PureVideo HD features for GeForce 9800 GX2 and 9600 GT:

* Dynamic Contrast Enhancement
* Dynamic Blue, Green & Skin Tone Enhancements
* Dual-Stream Decode Acceleration*

# Improved performance on many DirectX 9 and OpenGL applications.
# Numerous game and application compatibility fixes.

GeForce 9800 GX2
GeForce 9600 GT
GeForce 8800 Ultra
GeForce 8800 GTX
GeForce 8800 GTS 512
GeForce 8800 GTS
GeForce 8800 GT
GeForce 8800 GS
GeForce 8600 GTS
GeForce 8600 GT
GeForce 8600 GS
GeForce 8500 GT
GeForce 8400 GS
GeForce 8400 SE
GeForce 8400
GeForce 8300 GS
GeForce 8300
GeForce 8200
GeForce 8100/NVIDIA nForce 720a
NVIDIA nForce 730a
GeForce 7950 GX2
GeForce 7950 GT
GeForce 7900 GTX
GeForce 7900 GT/GTO
GeForce 7900 GS
GeForce 7800 SLI
GeForce 7800 GTX
GeForce 7800 GT
GeForce 7800 GS
GeForce 7650 GS
GeForce 7600 GT
GeForce 7600 GS
GeForce 7600 LE
GeForce 7500 LE
GeForce 7350 LE
GeForce 7300 SE
GeForce 7300 LE
GeForce 7300 GT
GeForce 7300 GS
GeForce 7200 GS
GeForce 7100 GS
GeForce 7150 / NVIDIA nForce 630i
GeForce 7100 / NVIDIA nForce 630i
GeForce 7100 / NVIDIA nForce 620i
GeForce 7050 / NVIDIA nForce 630i
GeForce 7050 / NVIDIA nForce 610i
GeForce 7050 PV / NVIDIA nForce 630a
GeForce 7025 / NVIDIA nForce 630a
GeForce 6800 XT
GeForce 6800 XE
GeForce 6800 Ultra
GeForce 6800 Series GPU
GeForce 6800 LE
GeForce 6800 GT
GeForce 6800 GS/XT
GeForce 6800 GS
GeForce 6800
GeForce 6700 XL
GeForce 6610 XL
GeForce 6600 VE
GeForce 6600 LE
GeForce 6600 GT
GeForce 6600
GeForce 6500
GeForce 6250
GeForce 6200SE TurboCache™
GeForce 6200 TurboCache™
GeForce 6200 LE
GeForce 6200 A-LE
GeForce 6200
GeForce 6150SE nForce 430
GeForce 6150 LE
GeForce 6150
GeForce 6100 nForce 420
GeForce 6100 nForce 405
GeForce 6100 nForce 400
GeForce 6100
GeForce PCX 5900
GeForce PCX 5750
GeForce PCX 5300
GeForce FX 5950 Ultra
GeForce FX 5900ZT
GeForce FX 5900XT
GeForce FX 5900 Ultra
GeForce FX 5900
GeForce FX 5800 Ultra
GeForce FX 5800
GeForce FX 5700VE
GeForce FX 5700LE
GeForce FX 5700 Ultra
GeForce FX 5700
GeForce FX 5600XT
GeForce FX 5600SE
GeForce FX 5600 Ultra
GeForce FX 5600
GeForce FX 5500
GeForce FX 5200LE
GeForce FX 5200 Ultra
GeForce FX 5200
GeForce FX 5100
So now we know "what happened" .. it's back
:thumbsup:

. . . and was this really necessary?
People who believe you and think this is evidence of inferior drivers could probably buy an ATi card and get inferior hardware instead.
.. sigh :(

Your reasons "why" were clear and logical - solid, positive info from a Focus Group Member! - BUT I thought we were going to stick to the positives, as evidenced by what we each posted just last night ... please. NVIDIA is evidently unifying their drivers again - awesome news! There is no need to be derisive toward the competition, who are readying their own r700/770 hardware as a real competitor to the 9800 series.

[Addendum: if you want to edit out your negative, I will edit out my comment on it . . . and it will just disappear. I do not want to remain confrontational with you - at all!]
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: BFG10K
That's probably true, but look at the difference between the crashes: 28.8% vs 9.3%.
Crash data that ultimately resulted in numerous Hot Fixes for MS' buggy OS. At one point NV listed 6 Vista Hot Fixes that should probably have been resolved before RTM, most notably the virtual memory allocation bug that also adversely affected numerous game titles in Vista (but not XP). Honestly, this is the first time I can remember IHVs and ISVs directly linking to MS hot fixes on such a broad scale; it made things much easier for the end user, at least.


 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Amusing that the latest laptop review on Anandtech has a page called "More Information on NVIDIA Drivers", and there's a blog post with a section "NVIDIA GeForce 8200 update-" talking about... drivers! Or rather, both pages talk about the unsatisfactory state of Nvidia drivers at the moment.
 

hooflung

Golden Member
Dec 31, 2004
1,190
1
0
nRollo, I will back you up on Vista's stupid error reporting not being 'kind' to Nvidia. Last night I had my RAM out of spec (yeah, I put a 2 where a 5 should have been in my timings), and Vista 64 kept blowing up in games with a BSOD blaming the Nvidia driver IRQ. It was easily resolved by just letting my motherboard pick the timings (which it did properly), and voila, no crashes; memtest86 3.4 ran non-stop for 45 minutes without a single hiccup, just to make sure.

Ok, so I can cut you and Nvidia some slack here. Microsoft released numbers and let people interpret them negatively, to deflect the flak over Vista being somewhat of an abortion, as nice as it looks and as good as some of its features actually ARE.

However
People who believe you and think this is evidence of inferior drivers could probably buy an ATi card and get inferior hardware instead.

That is just plain fanboi crap. Utter crap. The design of the ATI cards is actually far more impressive than the Nvidia 8 series. Nvidia has their tried-and-true brute-force method that still works for them, but elegant it is not. Better 'cards' they are not. From a technology standpoint AMD is a cut above, and Nvidia ends this generation with a marketing disaster because they can't seem to improve past the standards their past cards set a year ago. Not unlike the FX series from Nvidia years ago, and the Xenos Xbox 360 GPU, the R6-based cores are VLIW, and that means 1) compiler tuning means everything, 2) drivers mean everything, and 3) developers need to account for them.

AMD has 1 and 2 down, and slowly they are getting 3. Nvidia had none of 1, 2 or 3 when they released the FX. The Xbox 360 is a major console, and console games that shine on the 360 and eventually make their way to Gaming on Windows might shift the limelight to AMD and force Nvidia to turn their unified shaders into a VLIW arch, which might very well be what they are doing for the 200. If you want to make blatant statements about inferiority, then do so with reason in your mouth instead of a fanboi sword. Say, "Games developed currently rely on old-school brute force, and more elegant solutions still run inefficiently today." You might run the risk of games in 2009-2010 running fine on today's AMD hardware while Nvidia suffers because development shifted. PS3 games are less likely to hit the PC than Xbox 360 games. And the way gaming on the PC is going for AAA titles, we might see more 'console'-friendly development close the gap and overtake the speed advantage the Nvidia arch has over AMD today.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
The nvidia drivers still use a unified architecture. They just don't tout it or allow you to install on anything out of the box, so that users won't complain and demand support for obsolete products. Also, they don't want to break existing capabilities or bother testing for compatibility.
By adding your card to the inf file you can get ANY nvidia driver to install on ANY nvidia hardware.
I suspect WHQL might also have something to do with it.

The link for the so-called "unified drivers beta" provided in this thread is only GeForce 5+...

With a modded inf you can install everything since the TNT2... INCLUDING laptop ones.

http://www.laptopvideo2go.com/ makes a custom inf that lists all nvidia cards and allows any driver to install on them. However, by installing a much, much newer driver on an old piece of untested hardware you risk introducing new bugs; that said, from my experience it fixes many more bugs than it creates.
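To make concrete what "adding your card to the inf" amounts to, here is a minimal Python sketch of the edit, assuming a Forceware-style INF with a devices section and a [Strings] section. The section name, install section, and example device ID below are illustrative, not taken from a real driver package; only VEN_10DE, NVIDIA's PCI vendor ID, is real.

```python
# Sketch of the modded-INF trick: append your GPU's PCI device ID to the
# driver's INF so setup will accept the card. Section/device names below
# are illustrative placeholders, not copied from a real Forceware package.

def add_device(inf_path: str, dev_id: str, name: str) -> None:
    """Add a hardware ID line and its display string to a driver INF."""
    with open(inf_path, "r", encoding="latin-1") as f:
        lines = f.readlines()

    out = []
    for line in lines:
        out.append(line)
        # After the (illustrative) devices section header, add our card.
        if line.strip() == "[NVIDIA.Mfg]":
            out.append(f"%NV_{dev_id}% = nv_Install, PCI\\VEN_10DE&DEV_{dev_id}\n")
        # In [Strings], add the name the installer will display.
        if line.strip() == "[Strings]":
            out.append(f'NV_{dev_id} = "{name}"\n')

    with open(inf_path, "w", encoding="latin-1") as f:
        f.writelines(out)

# e.g. add a hypothetical mobile part the stock INF omits:
# add_device("nv_disp.inf", "0407", "NVIDIA GeForce 8600M GT")
```

Note that any edit like this invalidates the package's signed catalog, which is presumably the WHQL angle mentioned above.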

PS. As for the inferior drivers issue: Nvidia drivers did account for 3 times more crashes in Vista in 2007. But:
1. Nvidia has more cards in the market.
2. If Nvidia had early issues with Vista, that doesn't mean their drivers are inferior, just that it took them longer to get proper Vista support. I want to see the total crashes in the LAST MONTH, not in "2007".
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: hooflung
nRollo, I will back you up on Vista's stupid error reporting not being 'kind' to Nvidia. Last night I had my RAM out of spec (yeah, I put a 2 where a 5 should have been in my timings), and Vista 64 kept blowing up in games with a BSOD blaming the Nvidia driver IRQ. It was easily resolved by just letting my motherboard pick the timings (which it did properly), and voila, no crashes; memtest86 3.4 ran non-stop for 45 minutes without a single hiccup, just to make sure.

Ok, so I can cut you and Nvidia some slack here. Microsoft released numbers and let people interpret them negatively, to deflect the flak over Vista being somewhat of an abortion, as nice as it looks and as good as some of its features actually ARE.

Thank you, I wish more people realized this.

Originally posted by: hooflung
However
People who believe you and think this is evidence of inferior drivers could probably buy an ATi card and get inferior hardware instead.



That is just plain fanboi crap. Utter crap. The design of the ATI cards is actually far more impressive than the Nvidia 8 series. Nvidia has their tried-and-true brute-force method that still works for them, but elegant it is not. Better 'cards' they are not. From a technology standpoint AMD is a cut above, and Nvidia ends this generation with a marketing disaster because they can't seem to improve past the standards their past cards set a year ago. Not unlike the FX series from Nvidia years ago, and the Xenos Xbox 360 GPU, the R6-based cores are VLIW, and that means 1) compiler tuning means everything, 2) drivers mean everything, and 3) developers need to account for them.

AMD has 1 and 2 down, and slowly they are getting 3. Nvidia had none of 1, 2 or 3 when they released the FX. The Xbox 360 is a major console, and console games that shine on the 360 and eventually make their way to Gaming on Windows might shift the limelight to AMD and force Nvidia to turn their unified shaders into a VLIW arch, which might very well be what they are doing for the 200. If you want to make blatant statements about inferiority, then do so with reason in your mouth instead of a fanboi sword. Say, "Games developed currently rely on old-school brute force, and more elegant solutions still run inefficiently today." You might run the risk of games in 2009-2010 running fine on today's AMD hardware while Nvidia suffers because development shifted. PS3 games are less likely to hit the PC than Xbox 360 games. And the way gaming on the PC is going for AAA titles, we might see more 'console'-friendly development close the gap and overtake the speed advantage the Nvidia arch has over AMD today.

The problem with all this is that it never really matters a whole lot what the "games of tomorrow" run better on, because by the time the "games of tomorrow" get here, the "cards of today" are the last thing you'd want to run them on.

Whether the design of the R6XX line is more "forward thinking" remains to be seen, but it seems totally unlikely to me that games in 2009 and 2010 are going to kick ass on today's $175 mid range AMD card because they predicted a future where all games will be written to not have dependent instructions to increase parallelism for R600/cards like it, and games will no longer need TMUs so that deficiency won't matter either.

There's a lot of "ifs" in your suppositions, and people really shouldn't buy products based on "ifs" and what the console market is up to. PS3 is a decent-size player in that market as well, and will likely become more so, as it's by far the cheapest way to get a Blu-ray player and console.

There was no "fanboi sword" in my post. It's just more than fair, if you're replying to a post that states "NVIDIA has inferior drivers", to note that drivers are in a constant state of development, and if the products using the "inferior" drivers are already the industry leaders by far at almost every price point, the "inferior" drivers become somewhat of a moot point.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: nRollo
The problem with all this is that it never really matters a whole lot what the "games of tomorrow" run better on, because by the time the "games of tomorrow" get here, the "cards of today" are the last thing you'd want to run them on.

Whether the design of the R6XX line is more "forward thinking" remains to be seen, but it seems totally unlikely to me that games in 2009 and 2010 are going to kick ass on today's $175 mid range AMD card because they predicted a future where all games will be written to not have dependent instructions to increase parallelism for R600/cards like it, and games will no longer need TMUs so that deficiency won't matter either.

I was able to play new games for 3-4 years with my Voodoo 5500 that wasn't even supported with new drivers because the company was bought out by nVidia. Just because some people get new VC's every 6 months, doesn't mean the rest of us will throw money away to play a game on occasion. Some of us would actually like to buy a game and not have to worry about whether it will run on our computer because we don't have the latest and greatest hardware.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Martimus
Originally posted by: nRollo
The problem with all this is that it never really matters a whole lot what the "games of tomorrow" run better on, because by the time the "games of tomorrow" get here, the "cards of today" are the last thing you'd want to run them on.

Whether the design of the R6XX line is more "forward thinking" remains to be seen, but it seems totally unlikely to me that games in 2009 and 2010 are going to kick ass on today's $175 mid range AMD card because they predicted a future where all games will be written to not have dependent instructions to increase parallelism for R600/cards like it, and games will no longer need TMUs so that deficiency won't matter either.

I was able to play new games for 3-4 years with my Voodoo 5500 that wasn't even supported with new drivers because the company was bought out by nVidia. Just because some people get new VC's every 6 months, doesn't mean the rest of us will throw money away to play a game on occasion. Some of us would actually like to buy a game and not have to worry about whether it will run on our computer because we don't have the latest and greatest hardware.

2010 is three years out from when R6XX cards were made.

What card from 2004 is providing acceptable performance at the current games?

This way of thinking about computer gaming is just alien to me. If you buy top end, you get 1-2 years max, unless you're talking about 10X7 no-AA stuff.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Martimus
Originally posted by: nRollo
The problem with all this is that it never really matters a whole lot what the "games of tomorrow" run better on, because by the time the "games of tomorrow" get here, the "cards of today" are the last thing you'd want to run them on.

Whether the design of the R6XX line is more "forward thinking" remains to be seen, but it seems totally unlikely to me that games in 2009 and 2010 are going to kick ass on today's $175 mid range AMD card because they predicted a future where all games will be written to not have dependent instructions to increase parallelism for R600/cards like it, and games will no longer need TMUs so that deficiency won't matter either.

I was able to play new games for 3-4 years with my Voodoo 5500 that wasn't even supported with new drivers because the company was bought out by nVidia. Just because some people get new VC's every 6 months, doesn't mean the rest of us will throw money away to play a game on occasion. Some of us would actually like to buy a game and not have to worry about whether it will run on our computer because we don't have the latest and greatest hardware.

This is true for some cards, like the x1900xt and the 9700pro, which aged much better than the competition and ran games decently even a year or more after the cards launched. HL2 ran fine on my 9800pro, and so did Oblivion and Bioshock on my x1900xt. But I don't credit this to a more "forward-looking" architecture, just to a more advanced and flexible one. For example, no matter how you cut it, the r300 had 8 pipes versus 4 on the nv30. The r580 had not only 48 shaders but also decoupled texture units, which allowed it to outperform cards with more texture units.

But in the case of the r600, I believe it really is bottlenecked by texturing ability. Modern games may not rely so much on pure multitexturing like old DX7 games, but they still stress texture units with things like AF, and various buffers that get used for advanced shader effects. This puts the r600 at a disadvantage against the g80/g92 in games, even if it wins in brute shading power.
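To put rough numbers on that texturing gap, here is a small sketch using approximate launch specs (treat the figures and the simple fillrate model, texture address units times core clock, as estimates, not official numbers):

```python
# Rough bilinear texel throughput, estimated as texture units * core clock.
# Specs are approximate launch figures for each chip; output is an estimate.
cards = {
    "R600 (HD 2900 XT)": (16, 742e6),  # ~16 texture units @ ~742 MHz
    "G80 (8800 GTX)":    (32, 575e6),  # ~32 address units @ ~575 MHz
}

for name, (units, clock) in cards.items():
    print(f"{name}: {units * clock / 1e9:.1f} Gtexels/s")
# R600 lands around 11.9 Gtexels/s vs roughly 18.4 for G80, despite
# R600's edge in raw shader ALUs, which is the disadvantage described above.
```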
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: nRollo
Originally posted by: Martimus
Originally posted by: nRollo
The problem with all this is that it never really matters a whole lot what the "games of tomorrow" run better on, because by the time the "games of tomorrow" get here, the "cards of today" are the last thing you'd want to run them on.

Whether the design of the R6XX line is more "forward thinking" remains to be seen, but it seems totally unlikely to me that games in 2009 and 2010 are going to kick ass on today's $175 mid range AMD card because they predicted a future where all games will be written to not have dependent instructions to increase parallelism for R600/cards like it, and games will no longer need TMUs so that deficiency won't matter either.

I was able to play new games for 3-4 years with my Voodoo 5500 that wasn't even supported with new drivers because the company was bought out by nVidia. Just because some people get new VC's every 6 months, doesn't mean the rest of us will throw money away to play a game on occasion. Some of us would actually like to buy a game and not have to worry about whether it will run on our computer because we don't have the latest and greatest hardware.

2010 is three years out from when R6XX cards were made.

What card from 2004 is providing acceptable performance at the current games?

This way of thinking about computer gaming is just alien to me. If you buy top end, you get 1-2 years max, unless you're talking about 10X7 no-AA stuff.

Maybe it is because I still use a CRT, but 1024x768 is still fine for me if I have to do it. I only bought the 3850 because I didn't think my x800 xl would play the games that I wanted to ask for for Christmas. I found out that it actually did play the Crysis demo, which amazed me; even when I put it on all Medium settings it was playable. I expected single-digit frame rates. I wanted an 8800GT, but they were $300+ at the time, and that was over my budget. My X800XL played Stalker without any issues at 1024x768, and played NWN2 (the only newer games I had bought with that card) along with the Crysis demo.

I don't know about the whole R600 architecture being forward looking or anything, but I do know that I don't look at a Video Card purchase as being something that will only last me a year.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: munky
This puts the r600 at a disadvantage against the g80/g92 in games, even if it wins in brute shading power.

Just curious how you came to that conclusion, as I think shader design is the gambit that won this generation for Nvidia. It's pretty well documented that ATI's superscalar 5x1 design results in 64 worst-case, 320 best-case shaders, but as with anything that relies on developer implementation and vendor-specific optimizations, this approach loses out to a solid-performing general-purpose design. NV's real win imo was running core and shader clocks asynchronously, which allowed them to match ATI's best-case shader performance and double/triple their worst-case shader performance.
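Those 64/320 figures fall straight out of the 64-processor by 5-slot layout; a quick sketch of how average slot occupancy sets effective throughput (the 2.5 middle case is an arbitrary illustrative mix):

```python
# R600-style VLIW5: 64 shader processors, each 5 scalar ALU slots wide.
# The compiler must pack independent instructions to fill the slots.
UNITS, WIDTH = 64, 5

def effective_alus(avg_slots_filled: float) -> float:
    """Effective scalar ALUs per clock at a given average slot occupancy."""
    return UNITS * avg_slots_filled

for filled in (1.0, 2.5, 5.0):  # worst case, an arbitrary mix, best case
    print(f"{filled}/{WIDTH} slots filled -> {effective_alus(filled):.0f} effective ALUs")
# 1.0/5 -> 64, 2.5/5 -> 160, 5.0/5 -> 320: streams of dependent
# instructions collapse the chip toward its worst case, which is why
# compiler packing (and per-game optimization) matters so much here.
```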
 

PingSpike

Lifer
Feb 25, 2004
21,765
615
126
Originally posted by: Martimus
Originally posted by: nRollo
Originally posted by: Martimus
Originally posted by: nRollo
The problem with all this is that it never really matters a whole lot what the "games of tomorrow" run better on, because by the time the "games of tomorrow" get here, the "cards of today" are the last thing you'd want to run them on.

Whether the design of the R6XX line is more "forward thinking" remains to be seen, but it seems totally unlikely to me that games in 2009 and 2010 are going to kick ass on today's $175 mid range AMD card because they predicted a future where all games will be written to not have dependent instructions to increase parallelism for R600/cards like it, and games will no longer need TMUs so that deficiency won't matter either.

I was able to play new games for 3-4 years with my Voodoo 5500 that wasn't even supported with new drivers because the company was bought out by nVidia. Just because some people get new VC's every 6 months, doesn't mean the rest of us will throw money away to play a game on occasion. Some of us would actually like to buy a game and not have to worry about whether it will run on our computer because we don't have the latest and greatest hardware.

2010 is three years out from when R6XX cards were made.

What card from 2004 is providing acceptable performance at the current games?

This way of thinking about computer gaming is just alien to me. If you buy top end, you get 1-2 years max, unless you're talking about 10X7 no-AA stuff.

Maybe it is because I still use a CRT, but 1024x768 is still fine for me if I have to do it. I only bought the 3850 because I didn't think my x800 xl would play the games that I wanted to ask for for Christmas. I found out that it actually did play the Crysis demo, which amazed me; even when I put it on all Medium settings it was playable. I expected single-digit frame rates. I wanted an 8800GT, but they were $300+ at the time, and that was over my budget. My X800XL played Stalker without any issues at 1024x768, and played NWN2 (the only newer games I had bought with that card) along with the Crysis demo.

I don't know about the whole R600 architecture being forward looking or anything, but I do know that I don't look at a Video Card purchase as being something that will only last me a year.

The x8xx series cards were kind of neat. It seems like when they first came out, their drivers were immature and of course they lacked SM3 or 2 or whatever. But, once the drivers matured I felt like they became a great value option during nvidia's 7 series era and I still have one as a secondary card...to circumvent horrible compatibility problems old games have with the 8800GT I have.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Martimus
I only bought the 3850 because I didn't think my x800 xl would play the games that I wanted to ask for for Christmas. I found out that it actually did play the Crysis demo, which amazed me, even when I put it on all Medium settings it was playable.
Heh, that is because most of the medium settings don't do anything without SM3. ;)
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: nRollo
There was no "fanboi sword" in my post. It's just more than fair, if you're replying to a post that states "NVIDIA has inferior drivers", to note that drivers are in a constant state of development, and if the products using the "inferior" drivers are already the industry leaders by far at almost every price point, the "inferior" drivers become somewhat of a moot point.

Can you explain what you mean there? To me the hardware relies on drivers so how can they ever be a "moot point"?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Ok, this is quite a derailment. Shouldn't those things be discussed in a different thread? This one is supposed to be about unified driver capability.
 

NYHoustonman

Platinum Member
Dec 8, 2002
2,642
0
0
Originally posted by: taltamir
Ok, this is quite a derailment. Shouldn't those things be discussed in a different thread? This one is supposed to be about unified driver capability.

Indeed. Apparently bashing one aspect of one company (rightfully so) means it's justified to bash a different company and not expect any backlash.

The more things change...

 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: chizow
Originally posted by: BFG10K
That's probably true, but look at the difference between the crashes: 28.8% vs 9.3%.
Crash data that ultimately resulted in numerous Hot Fixes for MS' buggy OS. At one point NV listed 6 Vista Hot Fixes that should probably have been resolved before RTM, most notably the virtual memory allocation bug that also adversely affected numerous game titles in Vista (but not XP). Honestly, this is the first time I can remember IHVs and ISVs directly linking to MS hot fixes on such a broad scale; it made things much easier for the end user, at least.

Another thing people seem to be forgetting about this 3:1 ratio of NVIDIA driver crashes is that there's about a 3:1 ratio of NVIDIA cards in people's computers, so they're going to have more Vista crashes as well.
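A small sketch of that base-rate point: the crash shares below are the thread's 28.8%/9.3% figures, but the install-base split is hypothetical, purely to show the arithmetic.

```python
# Crash share only means something relative to install base. The 28.8%/9.3%
# crash shares come from this thread; the install-base split is hypothetical.
crash_share  = {"NVIDIA": 0.288, "ATI": 0.093}  # share of reported Vista crashes
install_base = {"NVIDIA": 0.65,  "ATI": 0.30}   # hypothetical share of GPUs in use

for vendor in crash_share:
    rate = crash_share[vendor] / install_base[vendor]
    print(f"{vendor}: {rate:.2f} crash share per unit of install base")
# At a ~2:1 install base the per-card rates still differ (~0.44 vs ~0.31);
# only at about a 3.1:1 base would the two rates come out equal.
```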
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
I'm only going to address the unified arch point because I've discussed the monthly WHQL vs NVIDIA's release frequency in another current thread, BFG.
We have? Please remind me what the outcome of that was, because I can't remember. Thanks. :)

MS changed some of the Vista specs relatively close to the launch of Vista, so some of the development had to be re-done.

The other thing is that it wouldn't surprise me if unified vs. non-unified did have double or even triple the number of driver errors.
Okay, but that still doesn't account for the numerous reported problems with the 7xxx (and earlier) series compared to the X1xxx series. That, and these figures are from 2007, which left plenty of time to incorporate ATi's unified offerings.

In any case, this is all sort of ancient history, so really "who cares" except MS in their lawsuit?
Consumers care, because it sets the precedent for nVidia driver support. We still have numerous driver issues on nVidia, and they're still releasing drivers whenever they feel like it rather than sticking to a robust schedule.

An nVidia owner simply doesn't know when the next driver is coming, unlike an ATi owner.

As far as the "generally inferior drivers" thing goes, I could have been 3% of those errors myself with that (IMO) unfortunately designed PSU: when you start getting TDR errors because of a hardware fault, you get a LOT.
Again, this sort of thing applies equally to ATi's, Microsoft's and Intel's (to a smaller degree) figures and can just as easily inflate their scores.

People who believe you and think this is evidence of inferior drivers could probably buy an ATi card and get inferior hardware instead.
Drivers can be fixed, but the inefficient VLIW arch, lack of ROPs, lack of TMUs, and shader-resolve AA are here to stay. (And in the absence of games coded for VLIW [which there aren't, and won't be], I can't think of a situation in which any of these GPU design choices would be advantageous.)
This is true to a degree, but the question is how long will these nVidia fixes take? If they take long enough (or never happen), the hardware will become a non-factor, or even a paperweight for a particular game if it can't play it. 18 months after the G80's launch I still can't play Red Faction, while my Intel GMA runs it without issue.

Also, good software optimizations go a long way to addressing lacking hardware, and ATi have proven they can deliver robust and efficient optimizations to even aging hardware.

I found a good article on Anandtech comparing driver optimizations of older ATi and nVidia cards. Here's the conclusion:

Overall then, there is a trend worth noting, however counterintuitive it is. While we have said that it is NVIDIA that traditionally makes the most of its drivers, this was clearly not the case with the previous generation. Whether it's a testament to what the Catalyst team can do versus the ForceWare team, a hardware generational difference, or both, when both the normal and high quality tests are factored in, ATI is the victor for getting the most out of its drivers.
Click.

Another thing people seem to be forgetting about this 3:1 ratio of NVIDIA driver crashes is that there's about a 3:1 ratio of NVIDIA cards in people's computers, so they're going to have more Vista crashes as well.
3:1? Those figures don't look right according to these.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: TheSnowman
Originally posted by: Martimus
I only bought the 3850 because I didn't think my x800 xl would play the games that I wanted to ask for for Christmas. I found out that it actually did play the Crysis demo, which amazed me, even when I put it on all Medium settings it was playable.
Heh, that is because most of the medium settings don't do anything without SM3. ;)

I'm sure that you are right, because it is a completely different game with my 3850. Ok, not completely different, but the little effects - like the water splashing on the screen - didn't happen on the X800 but do on the 3850.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
BFG,
I think you and I will likely have to agree to disagree on your issues.

As you know, I think "Red Faction" (from 2001) not working is evidence of NVIDIA wisely allocating their time to games people can still remember, while you think it's evidence of "inferior drivers".

I don't think an article from early 2006 has much place in a thread in 2008.

Your own article showed a 2+:1 ratio of desktop NVIDIA to AMD, so in a way you proved my point, but I'm sure you consider that a victory because you found stats that weren't 3:1.

We just look at things differently.


P.S. I sincerely hope anyone buying a video card to play "Red Faction" these days will take your words to heart and not buy a G8X or G9X card. For "Red Faction" they should be getting a GF3 or Radeon 8500!
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: nRollo
Another thing people seem to be forgetting about this 3:1 ratio of NVIDIA driver crashes is that there's about a 3:1 ratio of NVIDIA cards in people's computers, so they're going to have more Vista crashes as well.
Certainly not. That stat pertains to add-in graphics cards: 70% NV, 30% AMD. No Intel, S3, or Matrox?

By far, most people use intel graphics in their comptuer. The overall stat is 80% or something ridiculous.