***Official Reviews Thread*** Nvidia Geforce GTX Titan - Launched Feb. 21, 2013


MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
You sound pretty sure of yourself here. And I don't have any idea what you're talking about. Hope you know what you're doing, wouldn't want to damage that credibility of yours. It's precious.
The irony is you talking about credibility.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Titan did push me towards buying a 7950. Seriously. After seeing what a poor value it is, and the 7970 not far behind for $600 less, I was convinced to buy a 7-series card. A little research and I went with a voltage-unlocked 7950 and free games. :thumbup:

I'm leaning heavily towards the 7950 too, but after watching some Madden 13 online videos and seeing some of what next gen consoles (PS4) are offering I might just get a console and wait another 4-5 years for PC's to completely outpace them again.

I had more fun playing with my hardware than I did playing a lot of the games that came out over the past three years. Nvidia is not only removing the prospect of value for your dollar, but they're also cutting out enthusiasts and skyrocketing prices...
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I suppose you think you have some? Please.
Plenty. As much as you try to draw lines and force users into sides, most of us here are enthusiasts who just enjoy talking about hardware. Despite your every attempt to label people who disagree with you or don't cheer nvidia as "AMD fanboys" or "other shills" to validate your partisan stance, they simply aren't here.

In either case, you already have your scarlet letter in your signature so it isn't worth discussing further.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Plenty. As much as you try to draw lines and force users into sides, most of us here are enthusiasts who just enjoy talking about hardware. Despite your every attempt to label people who disagree with you or don't cheer nvidia as "AMD fanboys" or "other shills" to validate your partisan stance, they simply aren't here.

In either case, you already have your scarlet letter in your signature so it isn't worth discussing further.

That is funny. You just described exactly what you do on these boards. Weird. Freud anyone?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Tremie, excuse me if I missed some other posts of yours.

So cfx and sli take less time to draw each frame, but there is more variation between the time it takes to draw each successive frame. Single cards take longer to draw each frame, but there is much less variation.

I guess my question would be where the cutoff point is. How low would the frame times have to be compared to a single card for multiple gpu with its higher variation to be a better experience? I guess that's different for each person.

This is assuming there are no spikes and settings don't get reduced.
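
To make the tradeoff concrete, here's a toy sketch in Python with made-up numbers: a steady single card against a dual-GPU setup that is faster on average but alternates frame times. None of these values come from real benchmarks; they're illustration only.

```python
# Toy comparison of the tradeoff above: a dual-GPU setup that is faster
# on average but uneven, vs. a steady single card. Numbers are made up.
import statistics

single = [25.0] * 60            # steady 25 ms per frame (40 fps)
dual   = [10.0, 26.0] * 30      # averages 18 ms (~56 fps), but alternates

for name, log in (("single", single), ("dual", dual)):
    avg = statistics.mean(log)      # lower = faster on average
    dev = statistics.pstdev(log)    # higher = more frame-to-frame variation
    print(f"{name}: avg {avg:.1f} ms ({1000 / avg:.0f} fps), stdev {dev:.1f} ms")
```

The dual setup wins on average frame time but shows far more variation, which is exactly the "where is the cutoff" question.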
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Tremie, excuse me if I missed some other posts of yours.

So cfx and sli take less time to draw each frame, but there is more variation between the time it takes to draw each successive frame. Single cards take longer to draw each frame, but there is much less variation.

I guess my question would be where the cutoff point is. How low would the frame times have to be compared to a single card for multiple gpu with its higher variation to be a better experience? I guess that's different for each person.

This is assuming there are no spikes and settings don't get reduced.

We have to be careful what we claim as fact. It's possible/reasonable to assume microstutter is caused by more variation in frame times with multiple GPU's compared to single GPU's. That's never been proven though. It might be PCPer's "phantom frames"? It might be screen tearing being called microstuttering by people who don't know the difference? It might be driver optimization being far more difficult and hit&miss with multi GPU? It might be all or none of the above?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
We have to be careful what we claim as fact. It's possible/reasonable to assume microstutter is caused by more variation in frame times with multiple GPU's compared to single GPU's. That's never been proven though. It might be PCPer's "phantom frames"? It might be screen tearing being called microstuttering by people who don't know the difference? It might be driver optimization being far more difficult and hit&miss with multi GPU? It might be all or none of the above?

Yeah, I'm still trying to wrap my head around this new method and what those numbers mean to ME.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
We have to be careful what we claim as fact. It's possible/reasonable to assume microstutter is caused by more variation in frame times with multiple GPU's compared to single GPU's. That's never been proven though. It might be PCPer's "phantom frames"? It might be screen tearing being called microstuttering by people who don't know the difference? It might be driver optimization being far more difficult and hit&miss with multi GPU? It might be all or none of the above?

Still trying to ignore the facts eh? ^^
 
May 13, 2009
12,333
612
126
I'm leaning heavily towards the 7950 too, but after watching some Madden 13 online videos and seeing some of what next gen consoles (PS4) are offering I might just get a console and wait another 4-5 years for PC's to completely outpace them again.

I had more fun playing with my hardware than I did playing a lot of the games that came out over the past three years. Nvidia is not only removing the prospect of value for your dollar, but they're also cutting out enthusiasts and skyrocketing prices...

I bought a 7950 for a lot of the reasons you listed. Nvidia trying to gouge us really doesn't sit well with me. I specifically found a 7950 that was voltage unlocked. And value for the dollar is off the charts for the 7950, and they are including 2 of the biggest games of the year for free. At $280 AR it was as big a no-brainer as they get.
 

kukreknecmi

Junior Member
Dec 11, 2012
7
0
0
For the compute tests, why did Anand take the lowest numbers for the DGEMM and SGEMM tests on the 7970?

The PDF they point to says:

SGEMM: 2646 GFLOPS
DGEMM: 848 GFLOPS

Yet the Anand review used the lower numbers:

SGEMM: 2382 GFLOPS
DGEMM: 689 GFLOPS

Any ideas?

Anand review
http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/3

Aizu Univ. fast matrix multiplication paper:
ftp://ftp.u-aizu.ac.jp/u-aizu/doc/Tech-Report/2012/2012-002.pdf

I know that the numbers I wrote differ because different kernels were used. Yet the Anand review never mentioned that, so I wonder.
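
For reference, GEMM GFLOPS figures like these are typically derived from the matrix dimensions and the measured runtime. A minimal sketch in Python; the 4096-cubed / 52 ms example is hypothetical, chosen only to land near the paper's SGEMM figure:

```python
# A matrix multiply C = A*B with A (M x K) and B (K x N) costs roughly
# 2*M*N*K floating-point operations (one multiply + one add per
# inner-product term). Dividing by runtime gives the GFLOPS rate.
def gemm_gflops(m, n, k, seconds):
    return (2.0 * m * n * k) / seconds / 1e9

# e.g. a hypothetical 4096^3 SGEMM finishing in 52 ms:
print(gemm_gflops(4096, 4096, 4096, 0.052))   # roughly 2643 GFLOPS
```

Different kernels hit very different runtimes on the same hardware, which is exactly why the reported numbers can vary so much.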
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I bought a 7950 for a lot of the reasons you listed. Nvidia trying to gouge us really doesn't sit well with me. I specifically found a 7950 that was voltage unlocked. And value for the dollar is off the charts for the 7950, and they are including 2 of the biggest games of the year for free. At $280 AR it was as big a no-brainer as they get.

You would never have bought the Titan in the first place...so?
 
May 13, 2009
12,333
612
126
You would never have bought the Titan in the first place...so?

At $1000 not a chance. Fact is that I've owned most of the high-end cards from both camps for several years now. I've had two different 670's and now a 7950. So if the Titan doesn't appeal to me in the slightest, is that my fault or Nvidia's? I mean, there are only so many computer geeks out there, and even fewer willing to go into the several-hundred-dollar territory for a video card.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Yeah still trying to wrap my head around this new method and what those numbers mean to ME.

I agree, it's confusing, but I think TechReport has done a lot to clear things up.

http://en.wikipedia.org/wiki/Micro_stuttering
Micro stuttering is inherent to multi-GPU configurations using alternate frame rendering (AFR), such as nVidia SLi and AMD CrossFireX but can also exist in certain cases in single-gpu systems.[1][2][3][4]

It's a term used in Nvidia's unofficial SLI guide, which has now been lost in the new forums, at least in its original form, and in many other articles over the last 5 years or so.

That Wikipedia article is technically correct, but a little out of date.

While microstutter could be described as the variation in frame times, that itself is not what a user would experience as stuttering. In other words, a variation from the average frame time will not be perceived if it is too small for our eye to detect.

What I believe TechReport has documented to be perceptible "latency" is a frame time that is significantly longer than the average, where the average is very low. Their threshold is 50ms. They have been able to match up the spikes over 50ms to perceptible pauses in the game video output.

Now, it still helps to have a fast card or set of cards. A card where every frame time is over 50ms could be very consistent and not suffer from microstutter or "latency", but it would simply be slow, as in average frames per second.

So think of it this way - video cards have three main performance parameters:

(1) average speed (most accurately reported as milliseconds/frame rather than frames/second)
(2) ability to maintain average speed, or essentially standard deviation of frame times
(3) longest frame time (analogous to what we've traditionally looked at as minimum fps, but at a much more granular level)

I posted about this 18 months ago when I perceived stutter on my single HD5850 in BF3 despite a frame rate that would typically have been considered smooth (45 fps average, 30 fps minimum), where the image would simply appear to stop moving. It was then that I suggested that microstutter was not a phenomenon limited to dual card setups.

That is why the Wikipedia article is out of date, and also why using the term microstutter could be misleading, due to its association with dual card setups. There is nothing unique about a dual card setup that limits a frame time anomaly to such systems. Any card can suffer from peak frame times significantly and perceptibly out of line with its average frame time.
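
The three parameters above can be pulled straight out of a raw frame-time log (e.g. a FRAPS-style per-frame dump). A minimal sketch in Python; the 50 ms threshold follows TechReport's approach, and the example log values are illustrative:

```python
# Compute the three frame-time metrics described above from a raw
# frame-time log (times in milliseconds).
import statistics

def frame_time_metrics(frame_times_ms, spike_threshold_ms=50.0):
    avg = statistics.mean(frame_times_ms)       # (1) average speed, in ms/frame
    stdev = statistics.pstdev(frame_times_ms)   # (2) consistency around the average
    worst = max(frame_times_ms)                 # (3) longest single frame
    # TechReport-style "time spent beyond 50 ms": total excess time in spikes
    beyond = sum(t - spike_threshold_ms
                 for t in frame_times_ms if t > spike_threshold_ms)
    return {
        "avg_ms": avg,
        "avg_fps": 1000.0 / avg,
        "stdev_ms": stdev,
        "worst_ms": worst,
        "ms_beyond_50": beyond,
    }

# Example: a mostly-smooth ~60 fps run with one 70 ms spike
log = [16.7] * 59 + [70.0]
m = frame_time_metrics(log)
print(m)
```

A log like that averages close to 60 fps, yet the single 70 ms frame is exactly the kind of perceptible pause the averages hide.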
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I agree, it's confusing, but I think TechReport has done a lot to clear things up.



That Wikipedia article is technically correct, but a little out of date.

While microstutter could be described as the variation in frame times, that itself is not what a user would experience as stuttering. In other words, a variation from the average frame time will not be perceived if it is too small for our eye to detect.

What I believe TechReport has documented to be perceptible "latency" is a frame time that is significantly longer than the average, where the average is very low. Their threshold is 50ms. They have been able to match up the spikes over 50ms to perceptible pauses in the game video output.

Now, it still helps to have a fast card or set of cards. A card where every frame time is over 50ms could be very consistent and not suffer from microstutter or "latency", but it would simply be slow, as in average frames per second.

So think of it this way - video cards have three main performance parameters:

(1) average speed (most accurately reported as milliseconds/frame rather than frames/second)
(2) ability to maintain average speed, or essentially standard deviation of frame times
(3) longest frame time (analogous to what we've traditionally looked at as minimum fps, but at a much more granular level)

I posted about this 18 months ago when I perceived stutter on my single HD5850 in BF3 despite a frame rate that would typically have been considered smooth (45 fps average, 30 fps minimum), where the image would simply appear to stop moving. It was then that I suggested that microstutter was not a phenomenon limited to dual card setups.

That is why the Wikipedia article is out of date, and also why using the term microstutter could be misleading, due to its association with dual card setups. There is nothing unique about a dual card setup that limits a frame time anomaly to such systems. Any card can suffer from peak frame times significantly and perceptibly out of line with its average frame time.

The opening paragraph of the linked definition encompasses your point.
Micro stuttering is inherent to multi-GPU configurations using alternate frame rendering (AFR), such as nVidia SLi and AMD CrossFireX but can also exist in certain cases in single-gpu systems.[1][2][3][4]
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
The opening paragraph of the linked definition encompasses your point.

I disagree. The statement that microstutter is "inherent" to dual card setups is simply wrong. As we've seen from multiple tests of the Titan, in some cases the Titan has more "microstutter" than a 690, and this finding has made people on this board very dismissive of the new testing method.

It is not inherent to dual card setups. It's inherent to any card that exhibits peak frame times that are significantly longer than the average frame times. Yes, due to the nature of AFR, it's likely that dual card setups are more susceptible to exhibiting long frame times, but as is made evident from the testing, that may be a trend but it is not a necessary outcome of using dual cards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
I disagree. The statement that microstutter is "inherent" to dual card setups is simply wrong. As we've seen from multiple tests of the Titan, in some cases the Titan has more "microstutter" than a 690, and this finding has made people on this board very dismissive of the new testing method.

It is not inherent to dual card setups. It's inherent to any card that exhibits peak frame times that are significantly longer than the average frame times. Yes, due to the nature of AFR, it's likely that dual card setups are more susceptible to exhibiting long frame times, but as is made evident from the testing, that may be a trend but it is not a necessary outcome of using dual cards.

Not for anything Termie, but those folks were dismissive of the new testing method well before Titan. I do agree that the testing method, or rather the control of it, does need refinements.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I disagree. The statement that microstutter is "inherent" to dual card setups is simply wrong. As we've seen from multiple tests of the Titan, in some cases the Titan has more "microstutter" than a 690, and this finding has made people on this board very dismissive of the new testing method.

It is not inherent to dual card setups. It's inherent to any card that exhibits peak frame times that are significantly longer than the average frame times. Yes, due to the nature of AFR, it's likely that dual card setups are more susceptible to exhibiting long frame times, but as is made evident from the testing, that may be a trend but it is not a necessary outcome of using dual cards.
I disagree with you; your personal weighing of words amounts to semantics. You can't change history and the knowledge base already out there that has identified frame time issues. Most articles point out that the issue can happen with single cards. Unsurprisingly, this isn't a recent problem. It almost comes down to a matter of quality control and/or driver design error.
That's my opinion.