GeForce Titan coming end of February

Status
Not open for further replies.

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
The people who purchase 7990s, 690s, Titans, etc., are a very small percentage of the world. The people who seem outraged at the performance or price far outnumber the people who can actually get one without mommy's CC or missing a car payment.

Some of us can 'afford' a $900 item; it doesn't mean we lack the rationality to consider whether the card offers something compelling. I buy something when I consider it reasonable. Relabeling something or trying to create a new product from an old segment doesn't fool most of us.

Why you keep whining about people with signatures dating back a while is beyond me, when you yourself have an old i7 from what, 2008? If money were no object as you claim, why wouldn't you have a 6-core i7, which is better at gaming by quite a bit vs. your old processor? You talk big, but according to your signature you either have a small budget and like to brag/pretend when it comes to NV, or are just another nvidiot. (Or alternatively you rationalize that the 3930K, for example, doesn't offer enough performance to upgrade for the price, which is precisely what's occurring in this thread :p )

Either way it's easy to see through your message.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
True, but that would rarely be the case. Just look at how the HD 5870 really "sucked", failing to fully utilize all of its 1600 SPs as one would have expected on paper. The HD 6870, with only 1120 SPs, was only about 9% slower, while the 5870 still had about 15% more bandwidth and gobs more texturing power.
I found this correlation with pretty much all other applicable video cards in my research, comparing the specs against where the actual performance lands. Speed usually > number of shaders.

That's why overclocking it would pretty much give a rather linear gain (assuming the bandwidth is also increased linearly).
On the other hand, doubling the number of shaders (along with bandwidth) might not give as linear a gain while keeping the core at the same clock.

Indeed, but how do you know the SPs on the 5870 and 6870 are the same? They must have tweaked the individual SPs. NV increased the number of CUDA cores in Kepler but made them weaker compared to Fermi. They chose this path as they hope future applications will benefit more from the extra shaders. Your observation is valid, as very few consumer apps stress all the shaders.
 

Ibra

Member
Oct 17, 2012
184
0
0
Meanwhile, in the real world:

 

Tweak155

Lifer
Sep 23, 2003
11,449
264
126
These benchies are fake. I took the fake graph http://i.imgur.com/1ds2B5J.jpg and typed the numbers out, oh my! :biggrin: Hope real benchies come soon so we can find out if it's really a lot faster, or just a little like nvidia's own benchies.

Not to mention it looks like the graph is already in %, with Titan held at the 100% level (meaning Titan = 100%, and the other cards are shown relative to that 100%). Thus, you only need the difference between the two percentages, not some fancy math to come up with a different number. So the numbers are actually lower.
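To make the normalization point concrete, here is a tiny sketch with a made-up number (the 80% figure is hypothetical, not from the chart): reading the raw percentage-point gap off a Titan-normalized chart gives a smaller number than converting it into a speedup ratio.

```python
# Hypothetical card shown at 80% of Titan on a chart where Titan = 100%.
card_pct = 80.0

# Reading the gap straight off the normalized chart (the post's point):
gap_points = 100.0 - card_pct        # 20 percentage points

# The "fancy math" alternative: converting to a relative speedup ratio:
speedup = 100.0 / card_pct - 1.0     # 0.25, i.e. 25%

print(f"gap read off the chart: {gap_points:.0f} points")
print(f"ratio-based speedup:    {speedup:.0%}")
```

Whenever the other card sits below 100%, the percentage-point gap is smaller than the ratio-based figure, which is why re-deriving "X% faster" numbers from an already-normalized chart inflates them.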
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,275
3,159
146
Please keep the baiting and trolling to a minimum guys.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I think you should get 3 Titans and give away 2 of your 670s to people who can't afford GPU upgrades because you are a very kind/generous person. :awe:



Nerd fact of the day: GTX Titan is like 2 GTX660Tis with slightly lower clocks.

GTX660Ti -> Titan

1344 CUDA cores x 2
24 ROPs x 2
112 TMUs x 2
Memory bandwidth 144GB/sec x 2

From this chart: 2x GTX660Ti * (837 / 915) = 1.83, i.e. 83% faster than a GTX660Ti
http://www.computerbase.de/artikel/grafikkarten/2013/test-17-grafikkarten-im-vergleich/3/

This is what we'd get with Titan at 837mhz:

Rating - 1.920 × 1.080 4xAA/16xAF
GTX690 = 204%
Titan = 183%
HD7970GE = 132%
GTX680 = 128%
GTX660Ti = 100%

Rating - 2.560 × 1.600 4xAA/16xAF
GTX690 = 217%
Titan = 183%
HD7970GE = 143%
GTX680 = 130%
GTX660Ti = 100%

Average performance compared to GTX690 ~ 87% of GTX690
Average performance increase over HD7970 GE ~ 33%
Average performance increase over GTX680 ~ 42%

This is what we'd get with Titan at 915 Base / 975mhz Boost:

Rating - 1.920 × 1.080 4xAA/16xAF
GTX690 = 204%
Titan = 200%
HD7970GE = 132%
GTX680 = 128%
GTX660Ti = 100%

Rating - 2.560 × 1.600 4xAA/16xAF
GTX690 = 217%
Titan = 200%
HD7970GE = 143%
GTX680 = 130%
GTX660Ti = 100%

Average performance compared to GTX690 ~ 95% of GTX690
Average performance increase over HD7970 GE ~ 46%
Average performance increase over GTX680 ~ 55%

No it's not.
Comparing multi-GPU to single GPU is...retarded.
Microstutter, AFR issues, profiles...are you being deliberately dishonest now?
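For what it's worth, the quoted back-of-the-envelope estimate reduces to one line of arithmetic. This sketch just restates it; it's a paper estimate scaled off spec sheets, not a benchmark:

```python
# The quoted estimate: Titan ~= two GTX 660 Tis' worth of functional units,
# derated by the clock ratio (837 MHz reference vs the 660 Ti's 915 MHz).
TITAN_CLOCK_MHZ = 837.0
GTX660TI_CLOCK_MHZ = 915.0

scaling = 2.0 * (TITAN_CLOCK_MHZ / GTX660TI_CLOCK_MHZ)
print(f"estimated Titan vs GTX 660 Ti: {scaling:.2f}x (~{scaling - 1.0:.0%} faster)")
```

This reproduces the 83% figure in the quote; the objection above is that a perfectly linear unit-count scaling like this rarely holds in practice.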
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,605
6
81
In the first batch, only ASUS and EVGA will launch the GeForce GTX Titan. They will be followed by Colorful, Gainward, Galaxy, Gigabyte, Inno3D, MSI and Palit with more models. NVIDIA did not forbid modifying the cards, meaning manufacturers are free to introduce custom models, maybe even with custom cooling. The MSRP is not yet officially confirmed, but it seems the $900/€900 mark is the final price.
http://videocardz.com/39721/nvidia-geforce-gtx-titan-released
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
http://www.nvidia.com/titan-graphics-card

First previews go live.

http://www.anandtech.com/show/6760/nvidias-geforce-gtx-titan-part-1

http://www.tomshardware.com/reviews/geforce-gtx-titan-gk110-review,3438.html

http://www.guru3d.com/articles_pages/geforce_gtx_titan_preview_reference,1.html

http://hexus.net/tech/reviews/graphics/51857-nvidia-geforce-gtx-titan-6gb-graphics-card-overview/

http://www.bit-tech.net/hardware/2013/02/19/nvidia-geforce-gtx-titan-first-look/1

http://www.kitguru.net/components/graphic-cards/zardon/nvidia-launch-gtx-titan-powerhouse-gpu/

GeForce GTX Titan is designed to be overclocked. NVIDIA received a lot of heat when they started limiting voltages. Obviously they did so in order to prevent high RMA rates. For the GeForce GTX Titan this changes. At default your card will be locked at a maximum core voltage of 1.162 V.

Now read this very carefully: the board partners like MSI, EVGA and others get to decide whether or not you may unlock voltage control. Inside the NVIDIA driver you can opt to unlock the voltage by agreeing to an EULA. That EULA will try to make you understand that applying higher voltages will decrease the lifespan of the product. So if the GeForce GTX Titan has been built for a theoretical 5 years of use at 1.162 V, then tweaking the voltage towards 1.250 V could (in theory) halve that lifespan. Now here's the good news: unlocking the voltage will not result in losing your warranty, let me be very clear about that. However, if you have a 2-year warranty and after 3 years the card dies as a result of voltage tweaking and thus the reduced lifespan... that would be the consequence, and at your risk.

....

And that's where this article ends, sorry! Now we'll share the manufacturer suggested retail prices.

EUR 800 Ex VAT
GBP 827 Inc. VAT
USD 999 Inc. VAT

The Titan isn’t worth $600 more than a Radeon HD 7970 GHz Edition.




http://www.maximumpc.com/article/news/nvidia_unleashes_titan2013

It's hitting 80C, 5C hotter than the 690!!

This turd just keeps smelling worse.
The MSRP is even higher, and it runs hot from the factory. How much OCing can you do if it's already hitting 80C?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You didn't understand his post; always read it twice to avoid retarded posting.

And you suffer from the delusion that you can compare FPS across single and multi GPU solutions just like that...point your finger at yourself.

I can expand the list now.

- Uses invented GPUs as "arguments"
- Uses false numbers
- Compares FPS between single and multi GPU directly.

If you look beyond his walls of red herrings... his logic is flawed or false, or he is lying.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
They got it to 1176mhz. Voltage unlocked. I think we are in for a real treat in a couple days!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I mention this specifically as the Geforce GTX Titan has been designed to overclock. The AIB partners will be allowed to offer voltage unlocked SKUs. And combined with GPU Boost 2.0 you will see this product boosting towards the 1100~1150 MHz range once you tweak it.


Nice boosts considering its size!
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91

To me these benchmarks look highly suspect. It looks more like the 690s are running into the VRAM wall in Crysis 3, MP3 and FC3 (hence double the performance gain in certain games and virtually none in others). Furthermore, quad SLI adds almost nothing for the fourth card, so there may not even be a game that runs appreciably better (more than 30%) on three Titans than on three 680s (1.5 690s).

Does anyone else think the Skyrim result is funny? Would Skyrim be CPU-bottlenecked at 5760 x 1080?
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,275
3,159
146
Good news on the OC/OV side of things! This may be my first nvidia card in a long time, assuming I can afford it :O Still hoping AMD brings a counter of course, or at least major price drops on 7970s would also work for me :D
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I mention this specifically as the Geforce GTX Titan has been designed to overclock. The AIB partners will be allowed to offer voltage unlocked SKUs. And combined with GPU Boost 2.0 you will see this product boosting towards the 1100~1150 MHz range once you tweak it. The reference clock however is 836 MHz with a boost clock of 876 MHz.

About time for that.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
And you suffer from the delusion that you can compare FPS across single and multi GPU solutions just like that...point your finger at yourself.

I can expand the list now.

- Uses invented GPUs as "arguments"
- Uses false numbers
- Compares FPS between single and multi GPU directly.

If you look beyond his walls of red herrings... his logic is flawed or false, or he is lying.

He took the 660 Ti's performance and doubled it.
He did NOT take the SLI performance of the 660 Ti.
The Titan specs out at exactly double the functional units of the 660 Ti.
Hope that helps.
 