
***Official Reviews Thread*** Nvidia Geforce GTX Titan - Launched Feb. 21, 2013

That's called cheating and it's a pretty big accusation. Can they back that up?

They measure both with FRAPS and on the DVI ports...there is a...."difference" between what is reported...and what is displayed...read the article...this will be fun ^^
 
I think most people also forget that the gtx580 3gb was selling for $600 when the 7970 was released.

Clearly there were no better performance per dollar cards out there at that time.

Certainly there weren't any $300 GTX 570s. Nor were there any $350 GTX 570 2.5GB cards.

Certainly the only suitable comparison was the $600+ GTX 580 cards.
 
Oh oh...do you know what you have done now?
Prepare for the red flames...facts about multi-GPU and its downfalls always get people's caps to go off 😉

Read the link, you idiot. They used the GTX 690 and GTX 680 SLI but excluded CrossFire:
One question I know will be asked about this review is that in our benchmarks today you will not see results from AMD CrossFire configurations in 2-Way or 3-Way combinations. AMD is only represented by a single Radeon HD 7970 GHz Edition card while we are using both the GTX 690 dual-GPU card, GTX 680s in SLI and GTX TITANs in SLI.
So you were saying about the downfalls of multi-GPU?
 
Clearly there were no better performance per dollar cards out there at that time.

Certainly there weren't any $300 GTX 570s. Nor were there any $350 GTX 570 2.5GB cards.

Certainly the only suitable comparison was the $600+ GTX 580 cards.

Why wouldn't you compare the top-of-the-line card for both manufacturers? I didn't say that Nvidia didn't have far better values available at the time. I said that their best single GPU was $600 and that AMD's was $550.
 
Read the link, you idiot. They used the GTX 690 and GTX 680 SLI but excluded CrossFire:

So you were saying about the downfalls of multi-GPU?

AFR, profiles, microstutter, bad/negative scaling...if I am paying $999 for a card...I wouldn't pay for that ^^
 
But SLI is awesome. Why get a Titan when you can tri-SLI used GTX 470s for less than a single $300 GPU?

/rehashing from 7970 introduction
 
[image: benchmark chart]


http://www.computerbase.de/artikel/grafikkarten/2013/test-nvidia-geforce-gtx-titan/7/

What a joke. Look at the $400 card vs the $1000 one.

Can't wait for the big juicy price drops on the Titanic, will be worth buying after $300 is lopped off the price.
 
Can you please read the part about their reason?
It is a little bit suspect that you ignore it...
Actually, I did:
The basic premise of Frame Rating is that the performance metrics that the industry is gathering using FRAPS are inaccurate in many cases and do not properly reflect the real-world gaming experience the user has
I agree, so let's see what tests you're going to conduct.
Because of that, we are working on another method that uses high-end dual-link DVI capture equipment to directly record the raw output from the graphics card with an overlay technology that allows us to measure frame rates as they are presented on the screen, not as they are presented to the FRAPS software sub-system. With these tools we can measure average frame rates, frame times and stutter, all in a way that reflects exactly what the viewer sees from the game.
Oh, sweet! Let's get to it then!
We aren't ready to show our full sets of results yet (soon!) but the problems lie in that AMD's CrossFire technology shows severe performance degradations when viewed under the Frame Rating microscope that do not show up nearly as dramatically under FRAPS. As such, I decided that it was simply irresponsible of me to present data to readers that I would then immediately refute on the final pages of this review - it would be a waste of time for the reader and people that skip only to the performance graphs wouldn't know our theory on why the results displayed were invalid.
:colbert:
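The Frame Rating passage quoted above hinges on one idea: average FPS can look identical for two cards while the delivered frame cadence is completely different. Here is a minimal sketch of that point in Python — the timestamp data is made up for illustration, and this is not PC Perspective's actual capture tooling, just the underlying arithmetic:

```python
# Illustrative sketch: same average FPS, very different smoothness.
# Timestamps are frame-presentation times in milliseconds (made-up data).

def frame_times(timestamps):
    """Per-frame durations (ms) between consecutive presentation timestamps."""
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

def avg_fps(timestamps):
    """Average frames per second over the whole capture window."""
    total_s = (timestamps[-1] - timestamps[0]) / 1000.0
    return (len(timestamps) - 1) / total_s

def stutter(times):
    """Mean absolute frame-to-frame variation in ms; 0 means perfectly even pacing."""
    return sum(abs(b - a) for a, b in zip(times, times[1:])) / (len(times) - 1)

smooth = [0, 20, 40, 60, 80, 100]   # steady 20 ms cadence -> 50 FPS average
jerky  = [0, 5, 40, 45, 80, 100]    # alternating short/long frames -> also 50 FPS average

print(avg_fps(smooth), avg_fps(jerky))                      # identical averages
print(stutter(frame_times(smooth)), stutter(frame_times(jerky)))  # 0 vs large
```

Both streams report 50 FPS on average, but the second one alternates 5 ms and 35 ms frames — the kind of pacing problem a FRAPS-style average hides and direct DVI capture exposes.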
 
But SLI is awesome. Why get a Titan when you can tri-SLI used GTX 470s for less than a single $300 GPU?

/rehashing from 7970 introduction

1.28GB frame buffer, sure seems to have carried me past the woeful first mid-range 28nm products, at least now I can buy them for 50% less than release price!
 
:thumbsup: This is basically the analogy for what they did with the 680.

The self-loathing is amusing... ^_^

GK110 seems to be a decent gaming GPU, wonder why it took them so long to finally get it released as a Geforce card. Way too expensive for my tastes, but people will buy it at that price.
 
The self-loathing is amusing... ^_^

GK110 seems to be a decent gaming GPU, wonder why it took them so long to finally get it released as a Geforce card. Way too expensive for my tastes, but people will buy it at that price.

I believe they had to fill other orders before they could launch the consumer product.
 
The self-loathing is amusing... ^_^

GK110 seems to be a decent gaming GPU, wonder why it took them so long to finally get it released as a Geforce card. Way too expensive for my tastes, but people will buy it at that price.

They earn more money on a Tesla K20 than on a GeForce Titan...is it really that hard to understand?
 
So GTX Titan?

I wish, amazing card but it costs too much for me.

Probably a 7950 for under $250 once AMD fixes their 1.3 year old drivers.

Hoping AMD can bring some competition on 20nm, Maxwell is a much bigger jump than fermi > kepler, I hope Nvidia doesn't run away with it and release mid-range for 1k and high end for 5k.
 
I wish, amazing card but it costs too much for me.

Probably a 7950 for under $250 once AMD fixes their 1.3 year old drivers.

Hoping AMD can bring some competition on 20nm, Maxwell is a much bigger jump than fermi > kepler, I hope Nvidia doesn't run away with it and release mid-range for 1k and high end for 5k.

Aren't their 1.3 year old drivers better in terms of smoothness than your gtx470's?

I thought they were. So why wait? Upgrade now.
 
How do the Tri- 470's hold up in games? I'm sure they are still pretty beastly.

GPU performance-wise they'd be around the 690, but the frame buffer is holding them back at this point. Plus we're talking overclocked 900+W vs 250W; tech moves fast, but they were fun while it lasted.

Aren't their 1.3 year old drivers better in terms of smoothness than your gtx470's?

I thought they were. So why wait? Upgrade now.

Not even close.
 
I wish, amazing card but it costs too much for me.

Probably a 7950 for under $250 once AMD fixes their 1.3 year old drivers.

Hoping AMD can bring some competition on 20nm, Maxwell is a much bigger jump than fermi > kepler, I hope Nvidia doesn't run away with it and release mid-range for 1k and high end for 5k.

Just 1? Wouldn't that be a downgrade?
 
GPU performance-wise they'd be around the 690, but the frame buffer is holding them back at this point. Plus we're talking overclocked 900+W vs 250W; tech moves fast, but they were fun while it lasted.

Best to wait for the 20nm GTX 670/680 replacement; the 7950 I game on all the time would keep you too much on edge worrying about the drivers.

http://www.guru3d.com/articles_pages/geforce_gtx_470_2_and_3_way_sli_review,5.html

http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan/19.html
 