nVidia Keeps Rehashing the 9xxx Series

Page 4

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: batmang
Originally posted by: Sirlaughalot87
But why is the market going the "multi" GPU way? I don't want to use bad-scaling tech like SLI or Crossfire; I want ONE card. I don't want two of those energy leechers in my case, I don't want two cards that can malfunction. One card is more than enough... as it should be.

Why twin turbo when you could use one large turbo.
(This makes no sense just like this thread.. but I'm posting it anyway.)

In this case, the analogy runs exactly the opposite way.

One large turbo takes longer to spin up (and spin down) than two small ones. This causes 'turbo lag', or the perception of unresponsiveness. Latency is critical in the stoplight grand prix, the quarter mile, and even track racing. In almost all cases you can anticipate need and try to build up a full head of steam in the turbo before you need the power, but this requires a lot more driver attention and skill than the guy with the great big chuffing V8 with cylinders the size of Folgers coffee cans, a Roots or Lysholm screw-type supercharger, and 600 ft-lb of axle-snapping torque off idle.

The other approach to multiple turbos is multi-stage turbo: the first turbine feeds the second one.

In any event, both multi-turbo approaches (two small ones in parallel, or two in stages) reduce 'latency' compared to one large turbo -- the time you need to wait between sticking your foot in it and feeling your face peel back. The second approach also allows for moar airflow and pressure! Since everyone knows air + fuel in = power out...

In the case of multi-GPU it's the other way around. Multiple slower GPUs in AFR mode will give you significantly more input lag than one fast GPU at the same framerate: each slow GPU still needs its full render time per frame, and the driver queues frames across the GPUs to keep them all busy, so every frame you see was rendered from older input.
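A toy model of that point (my own numbers and simplification, not from the thread): AFR multiplies frame *throughput* by the GPU count, but each individual frame still takes one slow GPU's full render time before it reaches the screen.

```python
def afr_timing_ms(render_time_ms, n_gpus=1):
    """Toy AFR model: each GPU needs render_time_ms per frame.
    Interleaving n_gpus shrinks the interval between displayed
    frames to render_time_ms / n_gpus, but any single frame still
    spends render_time_ms in flight, so input lag doesn't shrink.
    Returns (frame_interval_ms, input_lag_ms)."""
    frame_interval = render_time_ms / n_gpus
    input_lag = render_time_ms  # per-frame render time is unchanged
    return frame_interval, input_lag

# One fast GPU at 16.65 ms/frame: ~60 fps with ~16.65 ms of lag.
# Two slow GPUs at 33.3 ms/frame each in AFR: also ~60 fps,
# but ~33.3 ms of lag -- same framerate, double the input lag.
print(afr_timing_ms(16.65, 1))
print(afr_timing_ms(33.3, 2))
```

Real AFR is worse than this sketch suggests (driver frame queuing and inter-GPU transfers add more latency on top), but the model captures why "same fps" does not mean "same responsiveness."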
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
jared... where does all this pent up anger come from? Did nVidia kill your father and you are now out for revenge?
 

surfsatwerk

Lifer
Mar 6, 2008
10,110
5
81
Originally posted by: taltamir
jared... where does all this pent up anger come from? Did nVidia kill your father and you are now out for revenge?

Jared is not left-handed, fyi.
 

Tempered81

Diamond Member
Jan 29, 2007
6,374
1
81
yah, I tell Huang, "My name is Amigo Montoya, You keel my father. Prepare to die"

Originally posted by: taltamir
jared... where does all this pent up anger come from? Did nVidia kill your father and you are now out for revenge?

Simply put, I think they are sitting on new technology.
 

Continuity28

Golden Member
Jul 2, 2005
1,653
0
76
The only thing I've begun to dislike lately is nVidia's drivers.

I'm still using 93.71 for my 7900GT, because newer drivers break a bunch of things on this card. Video files don't display correctly over DVI - broken vsync, heck, even some older games do it too, where no matter how vsync is enabled, the sync breaks halfway down the screen. Some OpenGL games randomly flicker out to a full screen of diagonal bars - which you can alt-tab out of, but it reappears when you tab back in the game. 163.75 doesn't have the latter issue, but the drivers since 169.xx do for me. 174.74 is the same.

Maybe if I had an 8-series or 9-series card, I wouldn't be getting these issues with new drivers, but it just seems like so many of their drivers are hit-and-miss.

If I were to upgrade right now, I'd probably move to a 9800GTX because the price is right. However, I want to wait and see what happens this next generation, especially from ATI because of the nVidia driver issues.