UPDATE: TITAN Z Base 705MHz, Boost 876MHz

csbin

Senior member
Feb 4, 2013
899
600
136
http://videocardz.com/50349/nvidia-geforce-gtx-titan-z-launch-postponed


[leaked Titan Z specification slides]

780 Ti SLI vs 295X2 :hmm:
http://www.computerbase.de/2014-04/amd-radeon-r9-295x2-benchmark-test/drucken/


4K: [benchmark chart]

2K: [benchmark chart]




UPDATE:
[updated benchmark chart]
 
Last edited:

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Seems like 780 Ti SLI is still the multi-GPU winner for price/performance/efficiency. No idea why the premium is so huge for the single cards this time.

Why wouldn't you want the 6GB of VRAM at 4K for less money with the new Ti? 4GB×2 on a $1500 card seems anemic assuming you are getting the card for 4K.
 
Last edited:

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
I call bulls***.

If Nvidia could match Hawaii performance at these much lower clocks, then the card would be out selling like hotcakes instead of being delayed.

Looking at the graphs and your comments makes me think... What have you been smoking?

I didn't see Titan Z in the graphs shown. Going by the clocks shown in the graph, I don't think Titan Z will fare so well.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Drop DP, rename it to GTX 790, drop the price to $1000, and then I'm in even at those clocks.
 

Meekers

Member
Aug 4, 2012
156
1
76
If those clocks are true, that would explain why it has been delayed. If they released that at $3000, they would get killed by reviewers.
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
Looking at the graphs and your comments makes me think... What have you been smoking?

I didn't see Titan Z in the graphs shown. Going by the clocks shown in the graph, I don't think Titan Z will fare so well.

Thanks.
I was on my phone.
Quickly scrolling through the pictures. As they posted two scores for the Ti SLI, I thought the top one was Titan Z. I'm not even sure how I missed that. :eek:
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Thanks.
I was on my phone.
Quickly scrolling through the pictures. As they posted two scores for the Ti SLI, I thought the top one was Titan Z. I'm not even sure how I missed that. :eek:

That makes it worse for both single cards, since Ti SLI is a much better solution.
 

Saylick

Diamond Member
Sep 10, 2012
3,888
9,030
136
Drop DP, rename it to GTX 790, drop the price to $1000, and then I'm in even at those clocks.

As much as I wish that would happen, nVidia would never do that. GTX780 Ti is already $700, so if they offered TWO fully enabled GK110 dies at $1000 AND it was a single card solution, you betcha that 780 Ti, 780 non-Ti, and even Titan Black sales would be immediately cannibalized. Everyone and their mothers would be in on it if they reduced the price to $1000.

I'm expecting higher clocks with a revised minimum price of $2500.

EDIT: Scratch the $2000 price tag. That would cannibalize Titan Black sales as well... but then again, why the heck are people even buying GPUs at this price? Have GPU prices ever been higher in the past?
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
As much as I wish that would happen, nVidia would never do that. GTX780 Ti is already $700, so if they offered TWO fully enabled GK110 dies at $1000 AND it was a single card solution, you betcha that 780 Ti, 780 non-Ti, and even Titan Black sales would be immediately cannibalized. Everyone and their mothers would be in on it if they reduced the price to $1000.

I'm expecting higher clocks with a revised minimum price of $2500.

EDIT: Scratch the $2000 price tag. That would cannibalize Titan Black sales as well... but then again, why the heck are people even buying GPUs at this price? Have GPU prices ever been higher in the past?

Considering that the 295x2 isn't selling out and it's bound to be a limited supply card, I'm hoping that people aren't buying them and prices will drop.

Prices right now are stupid high across the board. Consider that we could get the 7950 for under $200 last year, while the 280 (non-X) is $260 and up and it's the same card.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Seems like 780 Ti SLI is still the multi-GPU winner for price/performance/efficiency. No idea why the premium is so huge for the single cards this time.

Why wouldn't you want the 6GB of VRAM at 4K for less money with the new Ti? 4GB×2 on a $1500 card seems anemic assuming you are getting the card for 4K.

It should be noted that two custom-cooled 290Xs would likely overclock much better than the 295X2, making up ground in those benchmarks, all while being much cheaper than 780 Ti SLI.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
As much as I wish that would happen, nVidia would never do that. GTX780 Ti is already $700, so if they offered TWO fully enabled GK110 dies at $1000 AND it was a single card solution, you betcha that 780 Ti, 780 non-Ti, and even Titan Black sales would be immediately cannibalized. Everyone and their mothers would be in on it if they reduced the price to $1000.

I'm expecting higher clocks with a revised minimum price of $2500.

EDIT: Scratch the $2000 price tag. That would cannibalize Titan Black sales as well... but then again, why the heck are people even buying GPUs at this price? Have GPU prices ever been higher in the past?

Back in the 5970 days, dual-GPU cards were actually a good deal, just like the card I described. Titans are such enormous rip-offs that they aren't bought by people who are price-conscious in any way, so I don't think it would cannibalize Titans. NV would just need to release a dual-GPU Titan with full DP, twice the RAM, and higher clocks, and people would get in line to get fleeced. I think NV would still prefer to sell such a card for $1000 than the 780 Ti/780.
UPDATE: Enormous rip-offs when used as gaming cards. They are better deals than Quadros if someone can use them instead.
 
Last edited:

Bubbleawsome

Diamond Member
Apr 14, 2013
4,834
1,204
146
No wonder they delayed. Triple-slot cooling + not being able to claim higher clocks = lower sales.
 
Feb 19, 2009
10,457
10
76
It should be noted that two custom-cooled 290Xs would likely overclock much better than the 295X2, making up ground in those benchmarks, all while being much cheaper than 780 Ti SLI.

Yeah, but two custom open-air-cooled R9 290Xs are a bitch to keep purring in a closed case. That is a LOT of heat dumped into a small, confined area. Not to mention the top card is going to recycle very warm air, which would lead to throttling or extremely noisy fans.

I've got mining rigs with multiple R9 290s, so I know firsthand how bad an idea it is to cram two open-air 300W cards into a case. :p

As for the computerbase.de chart:
Their MAX for NV = maxed power limit and 100% fan speed = massive boost clocks. MAX for AMD? Power limit at +50% and 100% fan speed = no throttling, but it's still running at a max of 1018MHz, essentially stock clocks. They also didn't include BF4 multiplayer with Mantle in the chart, only DX11 single-player results.

This gen AMD is extremely competitive; it's the first time AMD's smaller die has been able to match NV's big die. All the way back to the 4800 series, NV's large die held a 10-15% performance advantage.
 

caswow

Senior member
Sep 18, 2013
525
136
116
Not sure it'll catch up at that speed, but only 375W TDP is pretty incredible.

Not that incredible. Power consumption scales exponentially, not linearly, as you go to higher clocks. That means if you want the same performance as the 295X2, your consumption would be as high as the 295X2's.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Not that incredible. Power consumption scales exponentially, not linearly, as you go to higher clocks. That means if you want the same performance as the 295X2, your consumption would be as high as the 295X2's.

Nope, the exact opposite is true.
Power consumption goes pretty much linearly with GPU clock.

It's voltage that skyrockets the power.
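For what it's worth, the standard first-order CMOS dynamic-power relation backs this up: P ≈ C·V²·f, linear in clock but quadratic in voltage. A minimal sketch, where the capacitance constant and the clock/voltage figures are purely illustrative, not measured GPU values:

```python
# Illustrative CMOS dynamic-power model: P = C * V^2 * f.
# C_EFF is an arbitrary effective-capacitance constant chosen
# for demonstration, not a real GPU parameter.

def dynamic_power(c_eff, voltage, clock_mhz):
    """Approximate dynamic power for a switching load: C * V^2 * f."""
    return c_eff * voltage ** 2 * clock_mhz

C_EFF = 0.25
base = dynamic_power(C_EFF, 1.0, 876)

# Raising only the clock by 20% raises power by ~20% (linear in f)...
clock_oc = dynamic_power(C_EFF, 1.0, 876 * 1.2)

# ...but a 20% voltage bump alone raises power by ~44% (quadratic in V).
volt_oc = dynamic_power(C_EFF, 1.2, 876)

print(round(clock_oc / base, 2))  # 1.2
print(round(volt_oc / base, 2))   # 1.44
```

In practice higher clocks usually require higher voltage too, which is why overclocking power draw looks superlinear overall even though the clock term itself is linear.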
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
No wonder Nvidia delayed the card.
It will need to run at at least 1200MHz to keep up with the R9 295X2.
 
Last edited:

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Not that incredible. Power consumption scales exponentially, not linearly, as you go to higher clocks. That means if you want the same performance as the 295X2, your consumption would be as high as the 295X2's.

Nope, flat wrong.

Nope, the exact opposite is true.
Power consumption goes pretty much linearly with GPU clock.

It's voltage that skyrockets the power.

Yep.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Not sure it'll catch up at that speed, but only 375W TDP is pretty incredible.

I personally expected it. The GTX 690 was way more efficient than SLI'd GTX 680s, at only a 3% loss in performance. Various reviews around the web show the 295X2 drawing 125+ more watts than two GTX 780 Ti cards. Nvidia will bin heavily for the Titan Z, and coupled with the efficiencies available in the power delivery and PCB of a dual card, we arrive at an unsurprisingly efficient result.

Anyway, everyone here knows Nvidia's stated boost clocks are conservative. I imagine this thing will boost to 950MHz or more out of the box without modifications. Still not worth $3000.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
If it boosts to 950 it'll get close to the 295X2 for sure.

Hmm, I'm starting to think it may end up competitive. Still needs a price reduction.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
Nope, flat wrong.



Yep.

While technically he was wrong, I think it was a mental shortcut, the point being that you don't need the same voltage at 850MHz that you do at 1100MHz. You could keep the voltage the same, but it wouldn't be ideal.
If it boosts to 950 it'll get close to the 295X2 for sure.

Hmm, I'm starting to think it may end up competitive. Still needs a price reduction.

It has the same boost clock as the Titan, which does boost to 950MHz, so it surely will; keeping that boost clock is another matter entirely. My Titan at default doesn't maintain its boost clock very well: it holds it long enough for most benchmarks, so it's useful for marketing but completely useless for actual gaming. The Titan Z has a suspiciously low base clock compared to the Titan, though, so I think it will mostly hold the boost only during short benchmark runs, exploiting the fact that reviewers are too lazy to benchmark warmed-up cards. Cold benchmark runs are purely academic unless someone actually plays in three-minute bursts. I think the 375W board power will severely limit how long the boost clock can be sustained; it surely has such a low base clock for a reason.
 
Last edited: