Anybody know if NVIDIA fixed the 2560x1600 4xAA bug?

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
I've got two NVIDIA GTX 275s running in SLI and am thinking of picking up a Dell 3007 this weekend to run games at 2560x1600. However, I know that at least as late as mid-May, NVIDIA driver 182.81 was still suffering from the 2560x1600 4xAA bug that drastically drops frame rates. See a review of my video configuration here.

Of course NVIDIA's now up to GeForce 186.18 in WHQL drivers, so it's possible the problem has been fixed by now.

Can anybody confirm this?
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
There are no bugs; Crysis at 2560x1600 with 4xAA is the bug. The game won't be playable at 2560x1600 with 4xAA until next gen.

With GTX 275 SLI, you can expect to play at 1920x1200 with no AA. That's assuming everything on High with Very High Shaders, since High Shaders look bad. If you're fine with High Shaders, then your fps will certainly be higher.

I run 1920x1080 with no AA and I still cannot get 60 fps at all times with my setup. It's between 40 and 60, but on average closer to 60. At 2560x1600 I get between 30 and 60 with an average around 45.

Sure, if I turned Shaders to High I could probably run 2560x1600 at 60 fps, and maybe turn on some AA, but the game looks very bad with High Shaders imho. I'll take VH Shaders before AA.

Check this page for accurate numbers, and keep in mind your cards are basically GTX 280s:
http://www.anandtech.com/video/showdoc.aspx?i=3520&p=7


 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
Originally posted by: JAG87
There are no bugs; Crysis at 2560x1600 with 4xAA is the bug. The game won't be playable at 2560x1600 with 4xAA until next gen.

With GTX 275 SLI, you can expect to play at 1920x1200 with no AA. That's assuming everything on High with Very High Shaders, since High Shaders look bad. If you're fine with High Shaders, then your fps will certainly be higher.

I run 1920x1080 with no AA and I still cannot get 60 fps at all times with my setup. It's between 40 and 60, but on average closer to 60. At 2560x1600 I get between 30 and 60 with an average around 45.

Sure, if I turned Shaders to High I could probably run 2560x1600 at 60 fps, and maybe turn on some AA, but the game looks very bad with High Shaders imho. I'll take VH Shaders before AA.

Check this page for accurate numbers, and keep in mind your cards are basically GTX 280s:
http://www.anandtech.com/video/showdoc.aspx?i=3520&p=7

Tom's Hardware believes it is a bug rather than an actual hardware limitation.

Nvidia still has a problem at 2560x1600 with 4xAA enabled; we've seen this one over and over in a number of different games.

I'm inclined to agree.

Note the bug is even more obvious in STALKER Clear Sky.

The most striking result here is the drop from 1920x1200 to 2560x1600. The same bug seen in Crysis manifests itself here as well.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,998
126
I'm looking at the figures, but I don't see a problem anywhere. The fact that the other resolutions barely have a performance hit tells me they're CPU limited on that configuration.

Obviously 2560x1600 with 4xAA finally strains the GPUs so they suffer a performance loss. Not only that, but Crysis and Stalker are known to use voracious amounts of video memory, to the point of 2 GB cards showing a benefit over 1 GB cards.
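
Just as a rough back-of-envelope (my own assumed numbers, nothing measured from Crysis or Clear Sky, and ignoring textures, geometry and driver overhead), you can see how quickly the MSAA render targets alone grow with resolution:

Code:
# Rough render-target size with MSAA. Assumes 4 bytes of color plus 4 bytes of
# depth/stencil per sample and a single color target; real engines use several
# targets plus a resolve buffer, with textures and streaming on top of that.
def rt_mem_mb(width, height, samples, bytes_per_sample=8):
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for w, h in [(1680, 1050), (1920, 1200), (2560, 1600)]:
    for aa in (1, 4):
        print(f"{w}x{h} {aa}xAA: ~{rt_mem_mb(w, h, aa):.0f} MB")

That works out to roughly 125 MB for a single 4xAA target at 2560x1600 before the game's textures and other buffers even enter the picture, so 1 GB cards running out isn't far-fetched.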
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
As BFG10K said, we're not talking about going from 70 fps to 7 fps. 19x12 only gets around 31 fps in STALKER Clear Sky, so 25x16 getting 7 fps isn't that surprising.

Apart from Crysis and STALKER Clear Sky, I have no problems running 2560x1600 4xAA. The only other hurdle I've run into is games with PhysX, for example Mirror's Edge. That's not playable at 25x16 with 4xAA and PhysX on. You either turn PhysX off or lower the resolution.

PS: don't take anything Tom's says seriously; they have some really ignorant people writing articles over there. I like how, if there is more than a 30% delta in performance, "oh, it must automatically be an NVIDIA bug." No way, it couldn't possibly be that we've reached the limits of the hardware...

 

Woofmeister

Golden Member
Jul 18, 2004
1,385
1
76
Duly noted. Has anybody seen any reviews showing how the new 2 GB cards handle Crysis and STALKER at 2560x1600 with 4xAA?
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
But Mirror's Edge does have a bug. Even with the lowest settings and PhysX on, it runs great until you break something that triggers the PhysX effects; after that it never runs smoothly again. Even if you look straight at a wall, which usually makes the FPS skyrocket, the game stays in the low 20s no matter what, and it's never playable again unless you restart it. It doesn't even tax my CPU at all; it barely uses one core, and that core runs under 43%. The PhysX implementation cripples its performance on purpose.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
E8:
You can turn off PhysX.
Obviously, PhysX designed to run on GPUs will not, or cannot, run on your CPU. Your CPU's architecture is not CUDA-based; it is x86-based. I can only imagine why you would think any CPU would be expected to run code meant for massively parallel CUDA hardware.
 

error8

Diamond Member
Nov 28, 2007
3,204
0
76
Originally posted by: Woofmeister

Tom's Hardware believes it is a bug rather than an actual hardware limitation.

Nvidia still has a problem at 2560x1600 with 4xAA enabled; we've seen this one over and over in a number of different games.

I'm inclined to agree.

Note the bug is even more obvious in STALKER Clear Sky.

The most striking result here is the drop from 1920x1200 to 2560x1600. The same bug seen in Crysis manifests itself here as well.

Tom's Hardware is retarded! They lost credibility a long time ago, ever since they said that i7 CPUs are bad overclockers, just because they didn't know how to overclock them properly.

In my opinion, STALKER or Crysis uses over 1 GB of video memory at 2560x1600, especially when AA is used. That is no bug; it's just what the current graphics cards are capable of in this generation. So you can have like four GTX 275 cards, but it will still crawl at 2560x1600 4xAA, because you only have 896 MB of VRAM per card.
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
Originally posted by: Keysplayr
E8:
You can turn off PhysX.
Obviously, PhysX designed to run on GPUs will not, or cannot, run on your CPU. Your CPU's architecture is not CUDA-based; it is x86-based. I can only imagine why you would think any CPU would be expected to run code meant for massively parallel CUDA hardware.

Why are PhysX-based games single-threaded? Smells like a conspiracy. While it's true that GPUs are much more powerful, why do Havok physics or CryPhysics run fine on a CPU? Why do older PhysX games run fine on a CPU? Why do current PhysX implementations cripple performance? What you stated there is just your opinion, not facts. Why is it that in Mirror's Edge I can stand in front of a wall, which is supposed to make the FPS skyrocket because you are only rendering the wall on screen, and yet the FPS doesn't increase at all? I did turn PhysX off; it doesn't make a difference to the immersiveness of the game anyway.
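
If anyone wants to see the single-core behaviour for themselves, this is the kind of quick check I mean (just a sketch; it assumes you have Python and the third-party psutil package installed, and the exact figures will obviously vary per system):

Code:
# Print per-core CPU load once a second while the game runs in the background.
# Requires the psutil package (pip install psutil).
import psutil

for _ in range(30):  # sample for about 30 seconds
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{load:5.1f}%" for load in per_core))

That's the pattern I'm describing above: one core loaded, the rest close to idle.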

But back on topic: NVIDIA's memory management is known to be a bit more buggy and tends to run out of VRAM more easily. Still, STALKER and Crysis at such high resolutions will be more GPU bound than VRAM bound. I don't think a GTX 280 or HD 4890 with 2 GB of VRAM can play either game at such a high resolution with FSAA, except maybe in Tri-SLI or Quad CF.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Conspiracy. OK. Yeah, that would be a bit OT here. Since you felt the need, maybe a new thread would be a good outlet for your conspiracy theories.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
It's not a conspiracy.

PhysX is absolutely nothing like Havok and Crytek physics, not even in the same ballpark. Neither of them can do fluids and cloth properly, so don't bother comparing them and arguing that they run on the CPU while PhysX chokes.

Now, on the other hand, does it impress me? No, not really. Physics is all about how objects act in response to your actions, and Havok and Crytek have that covered pretty well. Pretty water and fancy cloth are nothing more than eye candy. That's what PhysX is all about, and that's why it's so easy to market.





 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
81
http://enthusiast.hardocp.com/...wxMCwsaGVudGh1c2lhc3Q=

CPU Physics

Ghostbusters proves to us that games can employ gameplay-physics and effects-physics all calculated by the CPU, leaving the GPU free to handle what it is designed for: Graphics. NVIDIA has a vested interest in PhysX, and they push it rather aggressively. A Quad-Core CPU is vastly underutilized in a lot of games, since the focus has been on graphics, and video cards have become so powerful. GPUs are so powerful, in fact, that designers are looking for something more to do with them. NVIDIA and AMD both want general-purpose computing to be done on the GPU, and that is great. But NVIDIA wants to push PhysX on developers and consumers to sell more NVIDIA-based video cards, not to improve gameplay based physics.

Ghostbusters is proof that the GPU-physics approach isn't the only valid one. The CPU is the natural place for physics calculations, and with quad-core CPUs available and virtual octo-core CPUs with i7, and higher core CPUs on the horizon, CPU-physics has the potential to be exploited much further.
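
Just to illustrate the multi-core point (a toy sketch of my own, nothing to do with how Ghostbusters or PhysX is actually written), spreading even a simple effects-physics step across a quad-core is straightforward:

Code:
# Toy "effects physics": integrate falling particles for one frame, split
# across four worker processes. Illustration only, not any real engine's code.
from concurrent.futures import ProcessPoolExecutor
import random

DT, GRAVITY = 1.0 / 60.0, -9.81

def step_chunk(chunk):
    out = []
    for y, vy in chunk:                  # (height, vertical velocity) pairs
        vy += GRAVITY * DT
        y += vy * DT
        if y < 0.0:                      # crude ground bounce
            y, vy = 0.0, -vy * 0.5
        out.append((y, vy))
    return out

if __name__ == "__main__":
    particles = [(random.uniform(0.0, 10.0), 0.0) for _ in range(40000)]
    chunks = [particles[i::4] for i in range(4)]   # one chunk per core
    with ProcessPoolExecutor(max_workers=4) as pool:
        particles = [p for part in pool.map(step_chunk, chunks) for p in part]
    print(len(particles), "particles updated this frame")

Real gameplay physics has collisions and dependencies that don't split this cleanly, but the point stands that a quad-core has cycles to spare for it.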
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: evolucion8
http://enthusiast.hardocp.com/...wxMCwsaGVudGh1c2lhc3Q=

CPU Physics

Ghostbusters proves to us that games can employ gameplay-physics and effects-physics all calculated by the CPU, leaving the GPU free to handle what it is designed for: Graphics. NVIDIA has a vested interest in PhysX, and they push it rather aggressively. A Quad-Core CPU is vastly underutilized in a lot of games, since the focus has been on graphics, and video cards have become so powerful. GPUs are so powerful, in fact, that designers are looking for something more to do with them. NVIDIA and AMD both want general-purpose computing to be done on the GPU, and that is great. But NVIDIA wants to push PhysX on developers and consumers to sell more NVIDIA-based video cards, not to improve gameplay based physics.

Ghostbusters is proof that the GPU-physics approach isn't the only valid one. The CPU is the natural place for physics calculations, and with quad-core CPUs available and virtual octo-core CPUs with i7, and higher core CPUs on the horizon, CPU-physics has the potential to be exploited much further.

You know this is all going to backfire on ya when Havok is finally run on a GPU, right?
Again, start a new thread. Please.
 

AmdInside

Golden Member
Jan 22, 2002
1,355
0
76
I've played with AA on and off at 2560x1600 and, quite honestly, as I'm playing I cannot tell a difference. It just seems to me that when you're playing at such a high screen resolution, you don't need AA.