GeForce 310.33 Beta Drivers Boost Performance By Up To 15%

Status
Not open for further replies.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
What the hell happened in this thread? Haha.

Also for the record, Hokies stated he works for Galaxy, which is an nVidia-only partner, so marketing thread? Sure, why not ;)

Why do people even get into it? After spending a month in the P&N section (goddamn election year) I've come to almost hate all of you flag-waving jack-offs. Sure, I can ignore you all, but then this place would be empty since I'd have to ignore myself at some point (LOL.)

But on the topic, are there any WoW benches? I saw Blackened (AMD shill, remember guys?) posting about performance improvements for his 680s. Since WoW is on my plate, I only care about that (right now.)

KTHXBAI!

There is no issue with someone affiliated with a company posting here. It's different when someone who is affiliated with a company trolls and flamebaits. I think they should be given a very short rope and hung with it if they start trouble. We need to be careful not to start a witch hunt, though. I don't look for troublemakers, but I don't recall Hokies being a troll.

In Railven's defense I think he's being tongue in cheek with both of his references here.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Since I just installed these last night, on my system (hers in the sig) we saw no difference in WoW: Panda Land.

The card ran the game fine anyway, so it wasn't a loss, though.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Imho,

If some feel that someone is trolling, report the post, practice tolerance, or ignore the poster. Even a post that raises a con is still a fair view -- otherwise discussions would be nothing but pros.

Sometimes the best posts, discussions, and debates come from two opposing points of view -- pros and cons -- informative and entertaining. What makes for tough reading is when it gets personal.
 

Schmide

Diamond Member
Mar 7, 2002
5,712
978
126
This is kind of annoying. AMD releases their drivers claiming improvements, and we get reviews out the wazoo. Not one site that I can find has reviewed/confirmed the improvements for nVidia. I've only seen one (H) that used the new GeForce 310.33 drivers.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
AMD would have gotten their media partners on board to get that much coverage. I'm sure nVidia could have done the same if they wanted to. They must not think that it's that big of a deal. :shrug:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Thanks for the Hardocp link:

http://hardocp.com/article/2012/10/31/galaxy_geforce_gtx_660_ti_gc_3gb_sli_review/11

hardocp said:
We've beaten this topic into the ground in the past, but it has been a while since we brought it up. There is a smoothness to SLI we just can't put in words. We know for a fact that NVIDIA uses an algorithm that smoothes "frametime" in SLI. We don't know what it’s called, or even how it works, but we know it exists, and we know NVIDIA employs some special sauce when it comes to SLI. It is something that can only be felt, as you play a game, it is not something that shows up in a framerate over time graph. So what you see is AMD CFX winning in framerate, but not winning in frametime or overall game smoothness.

I believe it's called frame rate metering. Kepler hardware supports hardware frame metering, and so far I believe it may only be implemented for the GTX 690:

http://www.geforce.com/whats-new/articles/article-keynote/

nVidia said:
Improved Frame rate Metering

Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering. In SLI mode, two GPUs share the workload by operating on successive frames; one GPU works on the current frame while the other GPU works on the next frame. But because the workload of each frame is different, the two GPUs will complete their frames at different times. Sending the frames to the monitor at varying intervals can result in perceived stuttering.

The GeForce GTX 690 features a metering mechanism (similar to a traffic meter for a freeway entrance) to regulate the flow of frames. By monitoring and smoothing out any disparities in how frames are issued to the monitor, frame rates feel smoother and more consistent.


Example:

http://www.tomshardware.de/Fastest-VGA-Card-2012,testberichte-241127-8.html
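For illustration, the "traffic meter" idea in nVidia's quote can be sketched in a few lines. This is only my guess at the shape of it -- nVidia hasn't disclosed the actual algorithm -- with made-up frame times:

```python
# Illustrative sketch of AFR frame metering -- NOT nVidia's actual
# (undisclosed) algorithm, just the "traffic meter" idea: hold early
# frames so presentation intervals come out even.

def meter(finish_times, interval):
    """Present each frame no earlier than `interval` after the previous one."""
    presented, last = [], None
    for t in finish_times:
        p = t if last is None else max(t, last + interval)
        presented.append(p)
        last = p
    return presented

# Two AFR GPUs alternate frames; one lags, giving a jittery 22/10 ms cadence.
finish = [0.0, 22.0, 32.0, 54.0, 64.0, 86.0]          # completion times, ms
raw = [b - a for a, b in zip(finish, finish[1:])]     # [22, 10, 22, 10, 22]

target = (finish[-1] - finish[0]) / (len(finish) - 1) # average pace, 17.2 ms
paced = meter(finish, target)
smoothed = [b - a for a, b in zip(paced, paced[1:])]  # ~17.2 ms after frame 1

print(raw)
print(smoothed)
```

Note the average framerate is unchanged; only the spacing of frames reaching the monitor evens out, which is exactly why a framerate-over-time graph can't show it.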
 
Last edited:

HurleyBird

Platinum Member
Apr 22, 2003
2,800
1,528
136
There's no such thing as a free lunch. Metering frames, essentially holding them back for a small time to order the frames correctly, increases latency. Until AMD and Nvidia can come up with something better than the current band-aid solutions for multi-GPU rendering, I'm steering clear of both of them.
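The cost is easy to see with toy numbers (illustrative, not measured): any frame the pacer holds back waits out the difference, and that wait lands directly on input-to-display latency.

```python
# Toy illustration of frame pacing's latency cost (made-up numbers).
# A frame held until its paced slot waits (slot time - finish time),
# and that wait is added to its display latency.

finish = [0.0, 22.0, 32.0, 54.0, 64.0]   # ms: uneven AFR completion times
slot = 17.0                               # ms: paced presentation interval

present, last = [], None
for t in finish:
    p = t if last is None else max(t, last + slot)
    present.append(p)
    last = p

added_latency = [p - t for p, t in zip(present, finish)]
print(added_latency)                      # [0.0, 0.0, 7.0, 2.0, 9.0]
```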
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
There's no such thing as a free lunch. Metering frames, essentially holding them back for a small time to order the frames correctly, increases latency. Until AMD and Nvidia can come up with something better than the current band-aid solutions for multi-GPU rendering, I'm steering clear of both of them.

Have you tried it? Just asking.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
This is kind of annoying. AMD releases their drivers claiming improvement, and we get reviews out the wazzo. Not one site that I can find has reviewed/confirmed, the improvements for nVidia. I've only seen one (H) that used the new GeForce 310.33 drivers.

The main reason why AMD pushed so hard to have the drivers reviewed is simply because their drivers completely changed the current GPU landscape. Nvidia's new drivers are nowhere near as game changing as 12.11 is.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
The main reason why AMD pushed so hard to have the drivers reviewed is simply because their drivers completely changed the current GPU landscape. Nvidia's new drivers are nowhere near as game changing as 12.11 is.


5% ? You are setting up drama that does not exist.

At 1920x1200 we’re seeing a roughly 5% across the board performance improvement for both the 7970 and the 7950. Everything except Starcraft II sees at least a marginal improvement here, with Starcraft II being the lone exception due to the previous issues we’ve run into with the 1.5 patch. The 7770 also sees some gains here but they aren’t quite as great as with AMD’s other cards; the average gain is just 4% at 1680x1050, with gains in individual games being shallower on the 7770 than they are on other cards.
Interestingly even on the 7970 the largest gains are at 1920x1200 and not 2560x1600.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
5% ? You are setting up drama that does not exist.

5% overall is actually a lot for a driver update. Of course the biggie was 28%+ in BF3. If you play BF3 it is a game changer. This isn't from the release drivers, either. It's since 12.7's, which had some sizable overall increases already.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Nvidia's frame metering really is excellent. I had used CrossFire for years, and at the beginning of 2012 I got a pair of 7970s, only to find the stutter unbearable. I eventually changed to 680s and the improvement was obvious.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Nvidia's frame metering really is excellent. I had used CrossFire for years, and at the beginning of 2012 I got a pair of 7970s, only to find the stutter unbearable. I eventually changed to 680s and the improvement was obvious.

I thought the frame metering was only part of the GTX 690? Unless I misunderstood what I read (likely.)

AMD is a joke with multi-cards right now. Glad I only use one! :D
 

TheUnk

Golden Member
Jun 24, 2005
1,810
0
71
I thought the frame metering was only part of the GTX 690? Unless I misunderstood what I read (likely.)

AMD is a joke with multi-cards right now. Glad I only use one! :D

The nVidia quote above states that Kepler introduced it and the 690 features it. That's not to say that other 600 series cards aren't featuring it. In any case, I get no feeling of microstutter with SLI 670s.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
SLI has/had software frame metering, but with Kepler it has improved, more hardware-based frame metering. I personally tried to get information from nVidia on Kepler's hardware frame metering during PC Perspective's live GTX 690 review. What I basically received was, "SirPauly, good to see you here, but sorry, we're not ready to officially discuss hardware frame metering on Kepler today." In other words, nothing, hehe. I tried to find anything elsewhere, and the sites that did ask basically said nVidia isn't talking.

The GTX 690 could be using hardware-based frame metering right now while the rest of the SLI lineup uses software-based frame metering -- there is no way of knowing, given how quiet nVidia is.

Maybe a reviewer or editor can ask nVidia again.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
There was a software program that does a similar thing for AMD that I read about lately. They claimed it was pretty decent. It was just some random dude's program.

The 690 seems OK; I didn't focus on thinking about the stuttering from SLI, so I didn't really notice it. Then again, I didn't really notice it or think about it with my older AMD CrossFire setup either.

E* http://translate.google.com/transla...test-VGA-Card-2012,testberichte-241127-8.html
The Radeon Pro program.
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
5% ? You are setting up drama that does not exist.

Drama that doesn't exist? The 12.11s completely changed the performance landscape. Also, that 5% is from one reviewer with a specific set of benchmarks and IQ selections. Other reviewers saw 7-10% across the board, depending on the specific AMD GPU. It may not seem like much, but when it puts you in a position to be faster than the competition at a specific price point, it is a huge deal.

Nvidia needs to lower prices to make their GPUs attractive again. I'm of the opinion that they needed to lower their prices on the GTX 680 before these drivers, but that's mainly because I think paying the premium for 1-2 PhysX titles a year is ridiculous.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Drama that doesn't exist? The 12.11s completely changed the performance landscape. Also, that 5% is from one reviewer with a specific set of benchmarks and IQ selections. Other reviewers saw 7-10% across the board, depending on the specific AMD GPU. It may not seem like much, but when it puts you in a position to be faster than the competition at a specific price point, it is a huge deal.

Nvidia needs to lower prices to make their GPUs attractive again. I'm of the opinion that they needed to lower their prices on the GTX 680 before these drivers, but that's mainly because I think paying the premium for 1-2 PhysX titles a year is ridiculous.

Nvidia GPUs are attractive enough without them having to lower prices. A driver update or two down the road and things will be back where they were, however small the difference might be. And you know who else thinks 1 or 2 PhysX titles per year are ridiculous? Just ask all those using Hybrid PhysX right now. They think it's worthless too. But they are second-guessing their decisions, because using two cards defeats the power-consumption arguments they used previously. Better off just using one card to do it all and only adding a second card if you really need it.
By the way, Borderlands 2 ALONE is worth the price of PhysX admission. But it isn't alone.
 
Last edited:

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
Nvidia GPUs are attractive enough without them having to lower prices. A driver update or two down the road and things will be back where they were, however small the difference might be.

You're trying really hard to play damage control. Try again when Nvidia is back in the lead :thumbsup:

If you have a problem with someone, please report the post. Instead you're thread crapping, and because of your long history of doing that you're now on a vacation
-ViRGE
 
Last edited by a moderator:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Man, some of you may want to argue somewhere else. You guys already wrecked the AMD performance driver thread with off-topic AMD-vs-NV chatter, and now you want to wreck this one too, with the same off-topic chatter? Ugh.

Here's something on topic: when will AnandTech buy or rent a high-speed camera so we can stop playing guessing games with frametimes (or, more accurately, ignoring frametimes altogether) and MEASURE the ACTUAL FPS?
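In the meantime, even software frametime logs show why an average-FPS number can hide stutter. A quick sketch with hypothetical numbers:

```python
# Why average FPS hides stutter: two hypothetical frametime logs (ms)
# with identical averages but very different smoothness.

def summarize(frametimes_ms):
    """Return (average FPS, worst frame time in ms)."""
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    return avg_fps, max(frametimes_ms)

smooth = [20.0] * 10          # steady 20 ms per frame
stutter = [10.0, 30.0] * 5    # alternating 10/30 ms -- classic microstutter

print(summarize(smooth))      # (50.0, 20.0)
print(summarize(stutter))     # (50.0, 30.0) -- same FPS, worse experience
```

Both logs average 50 FPS; only the worst-frame number reveals the difference, which is the whole case for frametime-based reviews.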
 
Last edited:

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
As an advocate of GPU physics, I have been disappointed by the number of titles; I was hoping for 6-12 AAA titles a year. I was always curious what nVidia's view was on the lack of GPU physics content.

Outstanding question by Zogrim:


PhysXInfo.com: Over last years, amount of GPU PhysX games is actually decreasing. There were five games in 2009, three in 2010 and so far only one in 2011. How can you explain that?

Tony Tamasi: It was a choice on our part. We had a large amount of resources we could otherwise dedicate to content, but we needed to advance the core technology. We needed to get PhysX 3 done, and we needed to get APEX done to the degree where it is usable by game developers. We had to put a lot of resources there, which meant that some of those resources weren’t directly working on games.

But in the long term, game developers can actually use PhysX and APEX, and make use of the GPU without significant amounts of effort, so that a year or two years from now more games will come out using GPU physics.



Rev Lebaredian: When we initially acquired Ageia, we made a big effort to move many games over to GPU PhysX. We learned a lot in that period of time: getting GPU physics into games, what are the problems, what works and what doesn’t. That gave us the opportunity to regroup, refocus, and figure out how to do it correctly.

We made a conscious decision. After we did a bunch of PhysX and APEX games in 2009 and early 2010, we said “Ok, we have learned enough, we need to sit down and focus on finishing APEX and changing it based on what we just learned, as well as PhysX 3”. Doing as many titles as we were doing before was just going to slow us down.

It made more sense to slow down the content pipeline but get the tools right, but that puts us in the position when once those are complete, it is actually less work for us to get PhysX in games.

This slowdown has not been because of any problems. It is something that we have decided to do.


Exclusive: NVIDIA talks present and future of PhysX Technology

http://physxinfo.com/news/6419/exclusive-nvidia-talks-present-and-future-of-physx-technology/

With nVidia finishing SDK 3.0, we will see whether there is more content moving forward into 2013 and 2014.
 

Granseth

Senior member
May 6, 2009
258
0
71
As an advocate of GPU physics, I have been disappointed by the number of titles; I was hoping for 6-12 AAA titles a year. I was always curious what nVidia's view was on the lack of GPU physics content.

(...)

I have been hoping for a long time that physics would get more involved in gameplay and not just be some glorified bling.

But sadly, I usually get disappointed: when I see a title that's been focusing on GPU physics, the physics just becomes distracting and makes it look unrealistic -- just the opposite of what I want physics to do in a game.
I think Mirror's Edge did a better job of PhysX than the new Batman games, for example, but I think it's in driving games that I like physics advancements best. Mostly you don't see it, but you feel it, and that is what I want. Improved immersion and improved gameplay.

So let's hope (at least I do) that PhysX manages to push immersion and gameplay forward too. And before you beat me down, I do get that PhysX improves the environment in games; it's just that they mostly overdo it, so it becomes unrealistic and distracting.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
I have been hoping for a long time that physics would get more involved in gameplay and not just be some glorified bling.

But sadly, I usually get disappointed: when I see a title that's been focusing on GPU physics, the physics just becomes distracting and makes it look unrealistic -- just the opposite of what I want physics to do in a game.
I think Mirror's Edge did a better job of PhysX than the new Batman games, for example, but I think it's in driving games that I like physics advancements best. Mostly you don't see it, but you feel it, and that is what I want. Improved immersion and improved gameplay.

So let's hope (at least I do) that PhysX manages to push immersion and gameplay forward too. And before you beat me down, I do get that PhysX improves the environment in games; it's just that they mostly overdo it, so it becomes unrealistic and distracting.

I agree, we don't need a bazillion papers or shards of glass or whatever just to show off hard particles. Overdoing it detracts from the experience. Other things like more realistic hair (instead of the totally unrealistic hair in almost every game, if not every game), better cloth effects, etc. add to the game. I like the fog effects they had in Batman, even if it was unnecessary to have fog there in the first place. But I can see a use for those fog effects in other games, like rowing a boat through the mists/fog to get to an enchanted island or something like that... for atmosphere.
 