
Oh, The Irony

Ah! And Termie shines light on the situation 🙂

And yeah, that is really cool. When I read the "problem" in the OP I could only think "well, it would be awesome if this was intended behavior."

Well done nVidia.

First world GPU problems 😛
 
So basically, now, you're questioning his validity because it's an Nvidia card. Ok, got it.

No, apparently you didn't get it.

Really, had you made a post about CFX 7970s and microstuttering, or whatever, and said the same thing about having issues, I'd be questioning your validity to complain just as I am now, given the plethora of information that has recently come to light regarding CFX (and sometimes SLI) frame time issues.

As I VERY CLEARLY STATED, I would have questioned his complaining if he had gone another route and then complained about problems which were well documented already (before his purchase).
 
Anyhow, Crysis 3 is throttling for no good reason since you appear to have missed that part.

You said it was "very minor", so I chose not to say anything about it.

Other than that, you're free to complain. I'm also free to point out that much of what you're complaining about is either not a problem or is specific to one game, which Nvidia has acknowledged, apologized for, and is working on. Nvidia has also said they are looking into the throttling. Sounds like some hiccups were overlooked in driver QA, but it's nothing detrimental and will be fixed soon.
 
There are quite a few games where I've had to set maximum performance because adaptive would keep the card clocked too low, or the clock changes would introduce stutter (even when the fps is 60). I'm not a fan of that option.
 



I think the problem is that the longer he waits, the less value the card has (or at least the perception of it). I can bet this card was tested for months, if not a year, internally at Nvidia. For a $1000 card, it's unacceptable, especially from Nvidia.


I find it interesting that some of you are criticizing someone who has owned more video cards than probably 95% of AT. I'm sure he knows how to enable/disable vsync. The GK110 throttling issue is happening on a much larger scale... go over to the EVGA forums.
 

I realize the throttling issue is real, but in one of the two instances he was referring to, it sounded to me like it was throttling simply because it didn't need the extra horsepower - the built-in energy saving feature Termie mentioned.

Just because someone has owned more video cards does not make that person impervious to criticism. Posting opinions, experiences, and (for some people) wrong information on a web forum opens a person up to any and all points of criticism, comments, evaluations, etc.
 

Did I post up any wrong information?
 
Why can't I bitch about a card I own? Anyhow, Crysis 3 is throttling for no good reason since you appear to have missed that part.

Be careful. You'll be accused of being an AMD shill like Grooveriding was when he complained about his SLI 480's.

I can remember when software throttling was first introduced on the 580 with Furmark. The general consensus was, "As long as it's not causing throttling in games it's OK." It seems like we've now passed that point, but many still feel like it's OK. What would they have to do before the fan base complains? This is a $1000 flagship card and it seems like it's being protected like it's a budget card. People aren't blowing up 7970's or 7990's by running them wide open. You would think nVidia could design as robust a card as a 7970 for the kind of money they're getting for Titan.
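The "software throttling" being debated can be pictured as a simple feedback loop: whenever measured board power exceeds the card's power target, the clock steps down, and it steps back up once there's headroom. Here's a minimal sketch with hypothetical clock steps and power numbers (the real GPU Boost algorithm is proprietary, so this only illustrates the idea):

```python
# Simplified, hypothetical model of power-target throttling (GPU Boost-style).
# Clock steps down while board power is over the target, and steps back up
# when there is headroom. All numbers here are made up for illustration.

def throttle_step(clock_mhz, board_power_w, power_target_w,
                  step_mhz=13, min_clock_mhz=836):
    """Return the next clock given the current clock and a power sample."""
    if board_power_w > power_target_w:
        return max(min_clock_mhz, clock_mhz - step_mhz)  # over target: back off
    return clock_mhz + step_mhz                          # headroom: ramp up

clock = 993  # hypothetical boost clock in MHz
for power in [260, 265, 270, 240, 230]:  # sampled board power in watts
    clock = throttle_step(clock, power, power_target_w=250)
    print(clock)
```

The key point of contention in the thread is what the power target should be on a flagship card: set it low and the card throttles in ordinary games, not just in power viruses like Furmark.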
 
No, that would happen regardless of Vsync. The GTX600/Titan series has the ability to downclock when presented with very low loads. This is completely separate from Adaptive Vsync, and it's a feature not a bug. In my opinion it's one of the smartest things to happen in video card development in a long time. It means you can game at extremely low power levels.

When I play Age of Empires, my clocks are at about 500MHz, and the load is below 100w. That's CPU, GPU, the whole shebang.

The only issue I have is when a low clock speed occasionally doesn't ramp up fast enough after something intensive happens in a game that was doing nothing intensive just before. I would like to be able to control this a bit. I know KBoost exists and would do this by locking your clocks to whatever you set, but I can't see my GPU utilization with that function enabled (sometimes I like to see what's going on when performance suffers in a new game). Can't have it both ways, I guess.
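The ramp-up lag described above is easy to model: if the clock can only climb a limited amount per frame, the first few heavy frames after an idle stretch run well below full clock, which is exactly where the stutter shows up. A toy simulation with made-up numbers, not NVIDIA's actual algorithm:

```python
# Toy model of load-based downclocking with a limited ramp rate.
# Not NVIDIA's actual algorithm -- it only illustrates why a clock that
# climbs slowly after an idle stretch can starve a sudden heavy scene.

IDLE_CLOCK, FULL_CLOCK = 324, 993  # hypothetical MHz states
RAMP_PER_FRAME = 200               # max MHz gained per frame (assumed)

def next_clock(clock, load):
    target = FULL_CLOCK if load > 0.5 else IDLE_CLOCK
    if target > clock:
        return min(target, clock + RAMP_PER_FRAME)  # ramp up gradually
    return target                                   # drop immediately

clock = IDLE_CLOCK
loads = [0.1, 0.1, 0.9, 0.9, 0.9]  # sudden spike after an idle stretch
history = []
for load in loads:
    clock = next_clock(clock, load)
    history.append(clock)
print(history)  # the first heavy frames run well below FULL_CLOCK
```

Locking the clock (what KBoost does) removes the ramp entirely, at the cost of the power savings and, apparently, the utilization readout.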
 
Just because you didn't kill it today, doesn't mean it won't die tomorrow.

Electromigration is funny like that.

Titan isn't really a flagship card in the sense that we're used to from Nvidia. It's more like a gimped piece of the Titan supercomputer than a desktop gaming card.

Also, until we see otherwise, there is no reason to believe the Titan isn't robust; its capability is simply defined by Nvidia.


Boost is stupid either way: 1.0 was stupid, AMD's version is stupider, and 2.0 is just as dumb. Boost is cool for people who don't overclock, I guess, or it would be if it worked better at hitting clocks and TDP across more games, or if it weren't gimping the top end... But I've yet to see anything that suggests boost is anything other than garbage for overclockers.

It's like boost on a i5 or i7, who gives a hoot?
 
Really? Nice worthless post.

You've been on the rag a lot lately; between your love affair with FX1 and Lonberg foaming at the mouth over anything related to PhysX, this place has been a whine-and-moan fest.

Keep at it bro, your posts are the embodiment of "worth" 😉
 
You guys have to remember that quite a few posters here will defend anything Nvidia until their fingers bleed. They prove it everyday, in fact they're proving it right now in other threads.
 

And just as many will exaggerate, like they are doing now!
 
Eh?

Titan isn't like a GTX 580, or a GTX 285, or an 8800 Ultra.


Its premium can be tied directly to its CUDA DP, something none of the previous Nvidia flagship cards offered.

In no way does its gaming performance warrant its price tag; however, compare a 690 to a Titan in DP and you'll figure something out, I'm sure.


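That 690-vs-Titan DP comparison can be done on the back of an envelope using the commonly quoted specs (core counts and base clocks; treat the figures as approximate, not measurements):

```python
# Back-of-envelope FP64 comparison using approximate public specs.
# Peak TFLOPS = cores * clock (GHz) * 2 FLOPs per core per clock (FMA).

def tflops(cores, clock_ghz, flops_per_core_per_clock=2):
    return cores * clock_ghz * flops_per_core_per_clock / 1000.0

# GTX Titan (GK110): FP64 runs at 1/3 of the FP32 rate when enabled
titan_sp = tflops(2688, 0.837)        # ~4.5 TFLOPS single precision
titan_dp = titan_sp / 3               # ~1.5 TFLOPS double precision

# GTX 690 (2x GK104): FP64 runs at 1/24 of the FP32 rate
gtx690_sp = tflops(2 * 1536, 0.915)   # ~5.6 TFLOPS single precision
gtx690_dp = gtx690_sp / 24            # ~0.23 TFLOPS double precision

print(round(titan_dp, 2), round(gtx690_dp, 2),
      round(titan_dp / gtx690_dp, 1))
```

By these rough numbers a single Titan offers roughly six times the 690's peak double-precision throughput, which is the point being made about where the premium comes from.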
 

So what you're saying is that it's a $1000 midrange card, right? Gotcha.
 

I don't typically defend Balla (he's on my ignore list, but when he's quoted...), yet he's not saying that at all in the post you've quoted. What he's saying is that the reason it's a $1000 card, instead of a $___ card (pick a number between $500 and $1000), is that it has 1/3 DP. That makes it worth $1k to some, justifying the asking price. From that point of view, he's correct.

That said, I think he's putting the cart before the horse. I believe the 6GB of VRAM and 1/3 DP are there to help justify the price. Clearly the average Titan buyer doesn't need or use those features. They were going to charge a grand for this thing and added bullet points until they achieved the relative value to justify it.
 