(Youtube) Frame Rating and FCAT Explanation and Discussion with NVIDIA *PCPER*


BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
There was some bias, they ignored some of the minor issues Nvidia was having in a couple of the slides.

He also didn't want to admit that Titan was at times better than the 690 despite the lower frame rates. Though he did eventually, and you could tell he was conflicted by it. He also went so far as to say they're still working on it. It's not like they're going to say "we won" and pack up and go home; there is still work to do even with their cards.

However, it presents a compelling issue for me with my 7950s. I knew going into this, before AMD was exposed, that their drivers weren't on par with Nvidia's in MGPU. But when you consider that Nvidia has had hardware and software solutions for this going back to before Fermi, what chance does AMD have of being comparable with software alone anytime soon?

Nvidia has been working on making MGPU better for several years; AMD has been working on it for several months. I have little faith :(


I like the guy from Nvidia though, he's goofy and smart at the same time.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
There was some bias, they ignored some of the minor issues Nvidia was having in a couple of the slides.

He also didn't want to admit that Titan was at times better than the 690 despite the lower frame rates. Though he did eventually, and you could tell he was conflicted by it. He also went so far as to say they're still working on it. It's not like they're going to say "we won" and pack up and go home; there is still work to do even with their cards.

However, it presents a compelling issue for me with my 7950s. I knew going into this, before AMD was exposed, that their drivers weren't on par with Nvidia's in MGPU. But when you consider that Nvidia has had hardware and software solutions for this going back to before Fermi, what chance does AMD have of being comparable with software alone anytime soon?

Nvidia has been working on making MGPU better for several years; AMD has been working on it for several months. I have little faith :(


I like the guy from Nvidia though, he's goofy and smart at the same time.



If you are still somewhat happy with your 7950s, hold off until the new fix drivers are released.

Did you catch this graph in the vid? The 680 ran out of VRAM, and I would think the same would happen to the 690 in that scenario. Seems that 3GB is still perfectly adequate for triple-monitor setups.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
If you are still somewhat happy with your 7950s, hold off until the new fix drivers are released.

Did you catch this graph in the vid? The 680 ran out of VRAM, and I would think the same would happen to the 690 in that scenario. Seems that 3GB is still perfectly adequate for triple-monitor setups.


How do you know?
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
How do you know?


The large spike between 1 and 2, then all the large jaggies ahead of it. It's an indication, and really expected at that resolution. This just proves that Nvidia should have refrained from releasing a 2GB GTX 770, especially if priced at $450.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Could be anything at that point, since it's a 5-second blip. If the next 55 seconds looked like that, it would be more compelling for your position.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
Could be anything at that point, since it's a 5-second blip. If the next 55 seconds looked like that, it would be more compelling for your position.


Speculation at this point, but something is definitely wrong. I blamed it on VRAM due to the resolution.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
The large spike between 1 and 2, then all the large jaggies ahead of it. It's an indication, and really expected at that resolution. This just proves that Nvidia should have refrained from releasing a 2GB GTX 770, especially if priced at $450.

I take it you're trying to steer the thread in any direction other than the intended one? By accident, of course?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Of course, there is probably one setting you could lower to fix the issue. And let's be honest, those FPS are not acceptable on any of those cards, so you'd definitely have reason to lower a few settings.
 

Rvenger

Elite Member
Super Moderator
Video Cards
Apr 6, 2004
6,283
5
81
I take it you're trying to steer the thread in any direction other than the intended one? By accident, of course?



No, I am not. Why do you say that? I just found it interesting that they flew past that graph on the slide and didn't mention anything about it. Dude, Nvidia should be aware that 2GB is too little for a $450-500 card. You, as a focus group member, should be stressing this too, but it seems you just like to bash anyone who has an opinion. I purchased a GTX 780 for the fact that it has more than 2GB of RAM and around Titan performance. It was $650, but I personally don't have buyer's remorse: I was a Titan owner, and 6GB of VRAM is dumb if I game on a single monitor. I can't wait to get home, crack open a beer, and game tonight with my 780. :)


Let me be clear, Keys. I am not an AMD or Nvidia fanboy. I take things as I see them, so quit trying to make me out to be a shill. It's quite surprising how much garbage you spread around here; why don't you start backing up your responses with facts instead of feeding me one-liners? Thanks.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
No, I am not. Why do you say that? I just found it interesting that they flew past that graph on the slide and didn't mention anything about it. Dude, Nvidia should be aware that 2GB is too little for a $450-500 card. You, as a focus group member, should be stressing this too, but it seems you just like to bash anyone who has an opinion. I purchased a GTX 780 for the fact that it has more than 2GB of RAM and around Titan performance. It was $650, but I personally don't have buyer's remorse: I was a Titan owner, and 6GB of VRAM is dumb if I game on a single monitor. I can't wait to get home, crack open a beer, and game tonight with my 780. :)


Let me be clear, Keys. I am not an AMD or Nvidia fanboy. I take things as I see them, so quit trying to make me out to be a shill. It's quite surprising how much garbage you spread around here; why don't you start backing up your responses with facts instead of feeding me one-liners? Thanks.

Considering that benchmark showed every card giving too low an FPS at the settings used, it does not show that 2GB is not enough. You have to first pick a playable FPS, or even an ideal FPS, then test to see if 2GB is enough. And I mean playable on the 3GB-6GB cards, so you know whether 2GB was the limitation.

In almost every case (every one I've seen), the only times 2GB has not been enough have been when using settings that gave subpar FPS unless in 3-way SLI/CF.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Measuring the reasons... sounds legit

The first and last reason why all this FCAT stuff is BS was in the video (@0:40:30). People think they found a solution for stuttering in a frame limiter or vsync. And what do they say? "No, my tool measures that there is still stuttering. Don't let your eyes fool you!"

But I don't care, since I don't use a multi-GPU setup.

They made a very stupid statement. If your eyes fool you, so what? It's what you see that's important; if you don't notice it, you shouldn't be in the least bit concerned about frame-time variation. What really happens is not important as long as you don't notice it and you feel that the game is smooth. As for 30fps being smooth: a lot of console games are capped at 30fps and people enjoy them. While it will never feel as smooth as 60fps, you can get used to it, as long as it's a constant 30fps and you don't have drops from 60fps to 30fps, because those are always painfully apparent and distracting.
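To make the frame-time point in this exchange concrete, here is a small illustrative sketch (all numbers invented, not from the video's data): two captures with identical average FPS can still differ wildly in frame-to-frame consistency, which is exactly the gap an FPS counter hides and a frame-time tool exposes.

```python
def frame_stats(frame_times_ms):
    """Return (average FPS, worst frame-to-frame swing in ms) for a capture."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    avg_fps = 1000.0 / avg
    # Swing between consecutive frame times is what reads as stutter.
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return avg_fps, max(deltas)

smooth = [16.7] * 6                           # steady ~60 FPS pacing
stutter = [8.0, 25.4, 8.0, 25.4, 8.0, 25.4]   # same average, alternating times

print(frame_stats(smooth))    # ~60 FPS, no swing
print(frame_stats(stutter))   # ~60 FPS, large swing
```

Both runs report the same average FPS, so an ordinary counter calls them equal; only the swing column distinguishes them.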
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
They made a very stupid statement. If your eyes fool you, so what? It's what you see that's important; if you don't notice it, you shouldn't be in the least bit concerned about frame-time variation. What really happens is not important as long as you don't notice it and you feel that the game is smooth. As for 30fps being smooth: a lot of console games are capped at 30fps and people enjoy them. While it will never feel as smooth as 60fps, you can get used to it, as long as it's a constant 30fps and you don't have drops from 60fps to 30fps, because those are always painfully apparent and distracting.

I'm not convinced that if you don't see it, you should ignore it. The reason I say this is that it is still smoother, and ultimately the objective is to choose the smoothest setup possible for the budget. While it might not matter in many cases, what happens when future games start pushing your GPU harder and your FPS drops? Then those differences become exposed.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
I'm not convinced that if you don't see it, you should ignore it. The reason I say this is that it is still smoother, and ultimately the objective is to choose the smoothest setup possible for the budget. While it might not matter in many cases, what happens when future games start pushing your GPU harder and your FPS drops? Then those differences become exposed.

Our eyes fool us all the time; we don't perceive the world as it is. If I don't notice something, I'm not in the least bit concerned about it. As for future games, that is another matter altogether; I was only saying that if you don't notice something, you shouldn't care. When frame rates tank, most gamers will buy a new graphics card anyway, so for a lot of people it's a moot point. Last but not least, for the price of an NV card you can buy something from AMD that delivers vastly more FPS; think about a Titan versus 3x7950. I highly doubt that the Titan will feel smoother in future games.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Our eyes fool us all the time; we don't perceive the world as it is. If I don't notice something, I'm not in the least bit concerned about it. As for future games, that is another matter altogether; I was only saying that if you don't notice something, you shouldn't care. When frame rates tank, most gamers will buy a new graphics card anyway, so for a lot of people it's a moot point. Last but not least, for the price of an NV card you can buy something from AMD that delivers vastly more FPS; think about a Titan versus 3x7950. I highly doubt that the Titan will feel smoother in future games.

There obviously is a balance between FPS and frame variance. Though you may think the 7950 3-CF would always be smoother, you'd be wrong. In some of these tests, the 7970 CF gives lower actual FPS once you ignore runt and dropped frames. But you are also comparing against an extreme GPU, one that has an extreme price. I'd much rather use 3-way SLI with 670s or 660 Tis than 7950s.
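For readers unfamiliar with the term: in FCAT-style capture analysis, a "runt" is a frame that occupies only a sliver of the captured output, so it adds to the FPS counter without contributing visible animation. A rough, hypothetical sketch of how such frames get filtered out of the effective FPS (the 20-scanline cutoff is an invented threshold, not PCPer's actual one):

```python
# Assumed cutoff: frames shorter than this many scanlines count as runts.
RUNT_SCANLINES = 20

def effective_fps(scanlines_per_frame, capture_seconds):
    """Count only frames tall enough to be visible, then divide by time."""
    real = [h for h in scanlines_per_frame if h > RUNT_SCANLINES]
    return len(real) / capture_seconds

# One second of capture: 60 frames observed, but every other one is a runt.
heights = [500, 8] * 30
print(effective_fps(heights, 1.0))  # 60 observed frames, 30 effective FPS
```

This is why a multi-GPU setup can post a higher raw FPS yet a lower effective FPS than a single card.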
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
There obviously is a balance between FPS and frame variance. Though you may think the 7950 3-CF would always be smoother, you'd be wrong. In some of these tests, the 7970 CF gives lower actual FPS once you ignore runt and dropped frames. But you are also comparing against an extreme GPU, one that has an extreme price. I'd much rather use 3-way SLI with 670s or 660 Tis than 7950s.

I don't think that 3x7950 would always be smoother than a slower single card, hence the Titan in my rig, but I think it's more future-proof than a single Titan because those cards have a lot more raw performance. When a single card tanks to 20-30fps it doesn't feel smooth to me; I would certainly prefer a CF system that delivered 50-60fps. They have started to improve their CF, so it should be better in the future, but I never buy cards for future gaming, as I consider that stupid. It's better to just buy a card in the future when your current one is no longer adequate.
I compare it to the Titan because that's what I have and that was the choice I had to make. 3x7950 would be even cheaper than a single Titan. Before this I had 4-way CF, and the problems I had with that setup steered me away from multi-GPU.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I wish they would start developing and analysing some frame-time data for games played in the range of 90-150fps or so, for us 120Hz folks. Dropped frames and frame 'latency' are interesting and all, but as implied a few times in the video, they are more damaging at lower frame rates.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I wish they would start developing and analysing some frame-time data for games played in the range of 90-150fps or so, for us 120Hz folks. Dropped frames and frame 'latency' are interesting and all, but as implied a few times in the video, they are more damaging at lower frame rates.

It is more noticeable for sure, but the whole point of 90+ FPS is to get extremely smooth frame delivery. Doesn't frame variance kind of defeat the point of going for extreme smoothness and responsiveness? Or are we fooling ourselves about 90+ FPS helping at all?

I'm pretty confident that up to 80 FPS helps, as that is the point where 1st-person shooters stop making me nauseous, but after that, I'm not sure.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I like the technology and the discussion that ensued. I think more research and focus on multi-GPU to reduce stuttering will make it a more viable solution, in my eyes anyway.

However, I would like to see actual gameplay and a demonstration of the technology. The problem with this kind of testing is that you're adding another step, removing yourself further from the actual experience, and using a generalization of a generalization to imply fidelity, which many times isn't the case. I think side-by-side comparison of slow-motion video playback, or other more tangible demonstrations, should be the gold standard, as people fall prey to coercion by numbers.

Let me be clear, Keys. I am not an AMD or Nvidia fanboy. I take things as I see them, so quit trying to make me out to be a shill. It's quite surprising how much garbage you spread around here; why don't you start backing up your responses with facts instead of feeding me one-liners? Thanks.
He'll try to paint you as a fanboy/shill so he doesn't look as bad, but great work standing your ground against his nonsense.
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
I watched the full first hour, skipping the Q&A portion. Ironically, it left me with a question...

How do you know if your multi-GPU setup is metered? Metering is obviously different from v-sync, but are all multi-GPU setups metered now, or is this only an Nvidia thing? They didn't really go into detail about when it is active and when it is not.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I watched the full first hour, skipping the Q&A portion. Ironically, it left me with a question...

How do you know if your multi-GPU setup is metered? Metering is obviously different from v-sync, but are all multi-GPU setups metered now, or is this only an Nvidia thing? They didn't really go into detail about when it is active and when it is not.

Frame metering is only an Nvidia thing, and from the video, it appears it first started with the 8000 series.

AMD is currently working on a prototype driver that will offer metering in the 2nd half of the year. The last I heard, they are shooting for July, and early tests show it should work reasonably well.
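Nvidia hasn't published the algorithm, but the basic idea of frame metering can be sketched as follows: when AFR finishes frames in uneven bursts, hold each finished frame until at least a target interval has passed since the previous one was shown. This is a minimal illustration of that idea only, with invented timestamps, not the driver's actual implementation:

```python
def meter(completion_times_ms, interval_ms):
    """Present each frame no earlier than `interval_ms` after the previous one."""
    presented = []
    last = None
    for t in completion_times_ms:
        # A frame finished early gets delayed; a late frame shows immediately.
        show = t if last is None else max(t, last + interval_ms)
        presented.append(show)
        last = show
    return presented

# AFR frequently completes frames in pairs: two close together, then a gap.
raw = [0.0, 5.0, 33.0, 38.0, 66.0]
print(meter(raw, 16.5))  # displayed gaps are evened out toward ~16.5 ms
```

The trade-off the video discusses follows directly from the sketch: delaying presentation evens out the displayed cadence at the cost of a small amount of added latency on the early frames.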
 

cbrunny

Diamond Member
Oct 12, 2007
6,791
406
126
It is more noticeable for sure, but the whole point of 90+ FPS is to get extremely smooth frame delivery. Doesn't frame variance kind of defeat the point of going for extreme smoothness and responsiveness? Or are we fooling ourselves about 90+ FPS helping at all?

I'm pretty confident that up to 80 FPS helps, as that is the point where 1st-person shooters stop making me nauseous, but after that, I'm not sure.

My take was that high frame variance = jerky visuals.

High frame rate with high frame variance = really fast, possibly less noticeable but still actually jerky visuals.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
My take was that high frame variance = jerky visuals.

High frame rate with high frame variance = really fast, possibly less noticeable but still actually jerky visuals.

That is pretty much what I was trying to say. Higher FPS helps, and so does less variance, so I don't think you should ignore variance, even with high FPS. Both matter.