Originally posted by: SniperDaws
For fuck's sake, does every fucking thread need to turn into an "I'm better than you" argument?
Just download the drivers, install them, smile, and say thanks, you ungrateful bunch of wankers.
LOL!!!!
SniperDaws FTW
Originally posted by: buck
So much for the video forums getting better....
We will buck that trend, just give it some time. :beer:
Originally posted by: nRollo
Originally posted by: Janooo
Originally posted by: nRollo
...
Ackmed:
Might help to note that:
1. Just about everything under the sun (from PSU overheats to RAM OCing) gets reported to Vista as a display driver error.
2. Most people have NVIDIA-based graphics cards; NVIDIA has over 70% of the discrete market (so it makes sense they'll have the most "driver errors").
This article puts the ratio of NVIDIA to ATi market share at 90% to 10% before the 3800s, and 70% to 30% after.
If you have 90% of the market, you're going to have the most reported errors.
90/10 is pure BS if they don't specify more info about market share.
For example, in Q3 2007 NV had 37.8% and AMD 17.5% of desktop shipments. That's 70/30 before the 38xx came out, and it's 60/40 in Q4 2007 after the 38xx were released.
I don't see a link to that 60/40 split?
If anyone here cares about what happens with integrated graphics that would be included in the info you linked to, they should stop reading my posts now, because I could really care less what the "computers" with integrated graphics do. (with the exception of one interesting solution due on the market soon)
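For readers following the numbers: the 70/30 split quoted above is just the two desktop-shipment percentages renormalized to NVIDIA and AMD alone. Below is a minimal sketch of that arithmetic in Python, using only the Q3 2007 figures quoted in the post (not independently verified here); the variable names are illustrative.

# Renormalizing total desktop-GPU shipment shares to the two discrete vendors.
# 37.8% and 17.5% are the Q3 2007 figures quoted in the post above.
nv_share_of_all_desktop = 37.8    # NVIDIA, % of all desktop GPU shipments
amd_share_of_all_desktop = 17.5   # AMD, % of all desktop GPU shipments

# Drop Intel IGPs and everyone else, then renormalize.
two_vendor_total = nv_share_of_all_desktop + amd_share_of_all_desktop
nv_vs_amd = nv_share_of_all_desktop / two_vendor_total
amd_vs_nv = amd_share_of_all_desktop / two_vendor_total

print(f"NVIDIA:AMD = {nv_vs_amd:.0%}:{amd_vs_nv:.0%}")
# Prints roughly 68%:32%, i.e. the "70/30" split mentioned above. Whether a
# 90/10 or a 70/30 figure is meaningful therefore depends entirely on which
# base it uses: add-in cards only, discrete GPUs only, or all desktop GPUs.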
Originally posted by: KBTuning
This same driver set is available for Vista 64-bit as well, for those who are wondering... I'm in the middle of downloading it right now.
Originally posted by: SickBeast
Originally posted by: Pabster
Originally posted by: nRollo
If you have 90% of the market, you're going to have the most reported errors.
Exactly.
90% of Vista users have an intel IGP. :light:
That's 90% of the add-in GFX card market.
Originally posted by: nRollo
If you can back that statistic, I'll retract, but my thought is most of the early adopters of Vista were people who likely had more high end graphics than Intel integrated.
Also, just speculation here, but I doubt people with Intel integrated are doing anything that might cause a computer to crash. They have to disable Aero on some of them, and they're not doing any kind of 3D gaming or content creation. My guess is mom n' pop didn't generate a lot of TDRs looking at their retirement accounts with Windows Explorer, or writing a letter with MS Word. (They for sure weren't driving their PSUs to failure, OCing their RAM, running DX10 apps, or doing the myriad of other things that caused driver errors to be falsely reported.)
Originally posted by: DerekWilson
why does he have to back his statistic for you to retract something that is obvious FUD and based only on your assumption, opinion, and pro NVIDIA bias?
A year+ ago, Intel was clearly on top in terms of graphics in PCs. A year and a half ago x1950xtx was on top of the add-in world and AMD was doing well.
Right after the 8 series launch, it is ridiculous to put forward the assumption that most vista early adopters bought nvidia hardware because it supported DX10 as the reason for the elevated crash rate. to pass this off as fact is damaging to yourself, to NVIDIA, and to our forums.
how many people here overclock their graphics cards ... not most by a long shot.
now let's take that a step further ... how many people tried to overclock their graphics cards on Vista near launch, when we were lucky to get all of our drivers installed and have sound come out of our speakers?
people pushing their hardware happens at a relatively low rate compared to the adoption of that hardware (with the possible exception being midrange intel CPUs just because they are way too easy to OC and just beg for it). combine that with an unstable new OS and people will steer clear of pushing their systems in order to increase their chance at stability.
and i'm sure you could "care less" what computers with integrated graphics do. especially if they crash at a lower rate than NVIDIA cards. Initially with vista and early drivers, gaming wasn't the only problem for me. I'd get graphics drivers resetting while surfing the web. intel, amd, and nvidia. you can't factor out 2d performance, as people experienced crashes here as well.
And how many people try to run games on an intel chipset? i bet a lot of people bought their "vista ready" computers and assumed they could run a game ... i bet a lot of people had a bad experience with their graphics, but it doesn't seem like that made windows crash on them.
dismissing intel and amd and saying that the increase in driver crash rates for nv are due to a ludicrously inflated market share is not something i'm going to let slide under the radar.
watch yourself.
Surely you knew this was gonna happen when you let him back? He's paid by nvidia to do exactly this. Will you be vetting all his posts? Sounds like a lot of work to me.
Originally posted by: chizow
While I don't necessarily agree with Rollo's approach, there are certainly some indicators out there that back his claim and directly refute some of the claims you make above. I linked to this earlier, but here's the Steam Survey with ~1.5 million unique samples. It's not perfect by any means, but it's certainly solid evidence that would probably only be rivaled by Microsoft's own data. MS has released the total number of errors by vendor, but without detailing the % by vendor, the frequency per sample, or even the error type, that doesn't do us much good.
You said yourself you experienced errors in Vista with parts from all 3 vendors, yet you never mentioned whether one vendor was more or less susceptible based on your experiences. As a reviewer you have access to more parts than most consumers, so that feedback would've been valuable. There are plenty, including myself and Rollo, who have acknowledged TDR errors attributed to NV's drivers that were ultimately fixed external to the video card and its drivers. And again, that's before we even start talking about all of MS' hot fixes that largely undid the offending changes to Vista's video stack.
How is it irrelevant? The only thing that would make it irrelevant is that it emphasizes PCs used for gaming, which you could argue limits the scope of potential errors to gaming applications. However, the data in the survey is completely relevant; when it started matters not, as it is a snapshot of all hardware in that time frame, purchased past to present. Sure, there's going to be some migration as some PCs switch older parts for newer parts, but I can already tell you based on that Steam data that the assumptions you made about the X19XX parts enjoying anything close to the G80's success are simply UNTRUE based on:
Originally posted by: DerekWilson
the steam data is irrelevant. it was started nov 2007, which is well after most of the problems on all hardware came under control.
Thanks for the reply about which solution was more problematic. I wouldn't mind more feedback, no, because again, you guys will have more experience than your typical end-user simply because you have access to hardware from all vendors. I do recall some of Ryan's writings on the topic in his Messy Transition articles, and the memory allocation fix nearly eliminated all crashes in Vista. Now, if MS makes a change to their video stack that causes a video driver to crash, then turns around and hotfixes it and those crashes go away, who's ultimately at fault? I don't disagree with the possibility that NV's drivers are more susceptible to crashes, but again, the fact that ATI and Intel both had ~10% and that major fixes came from MS to solve these crashes tells me you can't assume much at all from that 30% number. The main thing that bothers me is that there are those using this as an opportunity to take shots at NV's driver quality from that % without even owning NV hardware or ever using Vista.....
Originally posted by: DerekWilson
since you asked, i did personally have more problems with nvidia solutions than with any other. i feel confident in saying that our other reviewers had more problems with nvidia as well, if the frequency of our internal emails on the subject is any indicator.
gary key and ryan smith would be the ones on the front lines with me with vista and i can ask them if you want a definitive answer.
yes, 3rd party apps and drivers did cause problems. yes ms hotfixes solved a lot of issues. but these things affected everyone. even if it might not have been nvidia's fault on a technicality, robustness in a system is still an important factor in driver design. if you are saying that these factors affected nvidia more heavily, you are also saying that nvidia's drivers are not as resilient as other companies' drivers, which is still an issue that should be taken into consideration.
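To make the statistical point both sides keep circling concrete: a raw count of crash reports per vendor only becomes comparable once it is divided by that vendor's share of the installed base. Below is a minimal sketch of that normalization; every number in it is a made-up placeholder, not Microsoft's, Valve's, or any vendor's actual data.

# Illustrative only: why "vendor X has the most reported crashes" says little
# by itself. All figures below are invented for the example.
reported_crashes = {"nvidia": 300_000, "amd": 100_000, "intel": 100_000}
installed_base = {"nvidia": 0.60, "amd": 0.25, "intel": 0.15}  # share of machines

total_crashes = sum(reported_crashes.values())
for vendor, crashes in reported_crashes.items():
    crash_share = crashes / total_crashes
    # Crash share divided by install-base share: 1.0 means crashes are exactly
    # proportional to market share; above 1.0 means over-represented.
    relative_rate = crash_share / installed_base[vendor]
    print(f"{vendor}: {crash_share:.0%} of crashes on {installed_base[vendor]:.0%} "
          f"of machines -> relative rate {relative_rate:.2f}")
# With these placeholder numbers the vendor with the most crashes is not the
# one with the worst per-machine rate, which is the crux of the argument above.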
Originally posted by: DerekWilson
first, thanks, but we don't need that whole thing quoted in every post ...
We do. Need to make sure it outweighs the FUD.
Originally posted by: chizow
The main thing that bothers me is that there are those using this as an opportunity to take shots at NV's driver quality from that % without even owning NV hardware or ever using Vista.....
Well, if my XP experiences were anything to go on, nVidia's majority share of Vista driver crashes cannot be totally explained by other factors.
Originally posted by: BFG10K
Even ignoring all of that, the fact remains we haven't had an official WHQL driver on XP for the 8xxx series since December 2007. Meanwhile ATi's entire supported line-up got Cat 7.12, 8.1, 8.2, and 8.3, with 8.4 coming next month. And they've been doing this since 2002.
No matter what spin someone puts on this, ATi's driver support is simply superior to nVidia's and has been for years.
Originally posted by: DerekWilson
To be fair, I don't like monthly driver releases.
I love them - it means rapid progress can be made and that's great for the consumer. In any case you're perfectly free to only update whenever you feel like it, but for those that need it the option's there.
Originally posted by: DerekWilson
For years, one month's catalyst would break something that was fixed the previous month (they had (not sure what's up now) two driver teams working on versions that would leapfrog each other).
That certainly happened, but rarely. What I find with nVidia is when things are broken they generally take months if not years to fix, and this is particularly bad when the problem's serious (e.g. the DEP issue on SP2, which took over six months to fix).
Originally posted by: DerekWilson
NVIDIA has almost always offered more customization options (until ATI finally listened and added clock adjustment and more quality sliders for things).
Well, since you're talking about the control panels, both vendors suck and I use third party utilities. At least with ATi's CCC you can access most 3D image quality settings right from the tray, unlike nVidia's tray, which doesn't appear to have been updated since the GF 6xxx series.
Originally posted by: DerekWilson
In my opinion, NV had better drivers on XP (though I wasn't testing graphics drivers when XP launched),
I would disagree with this. After ATi got their act together with the Catalyst program back in '02, I've found their drivers are more compatible and robust in general, especially in games and settings not actively benchmarked. nVidia appear to spend far too much time making drivers that produce pretty graphs for reviewers at the cost of everything else.
Originally posted by: DerekWilson
::EDIT:: oh yeah, and hooray for a move back toward unified drivers ... hopefully they'll be whql soon.
They were unified before and it still took months for a new driver. All that'll happen now is 9xxx users will have to wait months for drivers like the rest of us.
Originally posted by: DerekWilson
I'm willing to admit that my perspective also has to do with the fact that the capability and ease of use of these drivers directly affect my ability to do my job efficiently. This might not make all my views of the overall subject matter relevant to the average gamer.