
First NVIDIA GeForce GTX TITAN Z review leaks out

csbin

Senior member
http://videocardz.com/50491/first-nvidia-geforce-gtx-titan-z-review-leaks



What happens when you change the launch day of a new card, but forget there are also print magazines that already have the review printed (or digitally generated)?

[image: HFK.jpg]

NVIDIA GeForce GTX TITAN Z delayed due to low performance

On May 8th everyone was expecting NVIDIA to launch its new dual-GPU graphics card, the GeForce GTX TITAN Z. Unfortunately, due to driver issues, the card was delayed (but not yet canceled).
Thanks to a member of the LinusTechTips forums (Dipper315) we might finally understand why. The Chinese magazine E-Zone published its review on May 8th, and fortunately for us, it was too late to take it down.

[image: KFK.png]


The E-Zone sample has the same clocks we saw in the ASUS press release: a base clock of 706 MHz and a boost clock of 876 MHz. However, it does not really matter; these numbers have nothing to do with the clocks you would be getting in a real-world scenario. In fact, the clock speed ramps up to 1058 MHz! And as crazy as it may sound, that was partially the reason why the card was delayed. If the card boosts more than 300 MHz above base and still offers similar or worse performance than the Radeon R9 295X2, then NVIDIA has a serious problem (it's not something you expect from a $3000 graphics card).
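For reference, the boost delta described above works out as follows (a quick sketch using only the clocks quoted from the E-Zone sample):

```python
# Clocks quoted in the E-Zone review (MHz)
base_clock = 706       # base clock, matching the ASUS press release
rated_boost = 876      # official boost clock
observed_boost = 1058  # clock actually observed under load

# How far the card boosts past its base clock
delta = observed_boost - base_clock
print(delta)  # 352 MHz -- the "more than 300 MHz" boost in the text

# As a percentage of the base clock
print(round(delta / base_clock * 100))  # 50 (percent above base)
```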
The driver issues we were told about are not only related to noise, temperature, or any hardware malfunction that would cause the delay. The real issue is performance: the TITAN Z is simply not fast enough to beat the Radeon R9 295X2.
Have a look at the synthetic and gaming performance. The TITAN Z lost the battle in 3DMark Fire Strike, but won in Batman: Arkham Origins, Stone Giant, and Tomb Raider. The game list is very short, so it's hard to draw any conclusions. However, it does show one thing: GTX 780 Ti SLI is definitely a faster and cheaper option.

[image: IFK.png]


Moving on to thermal and power characteristics, we see lower power consumption, but also higher temperatures, than the Radeon R9 295X2. The GTX TITAN Z pulls just 618W, which is 33W less than 780 Ti SLI.
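From the two figures quoted here (618 W for the TITAN Z, stated as 33 W less than the SLI setup), the implied 780 Ti SLI draw can be backed out directly:

```python
titan_z_watts = 618   # GTX TITAN Z system draw from the review
delta_vs_sli = 33     # stated advantage over GTX 780 Ti SLI

# Implied draw of the GTX 780 Ti SLI configuration
sli_watts = titan_z_watts + delta_vs_sli
print(sli_watts)  # 651 W
```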

[image: JFK.png]


Conclusions?

The Chinese magazine draws a simple conclusion: this card is not worth buying, as you can buy four GTX 780 Ti cards for the same price. To be precise, it's not only the price tag to blame for the failure of this launch, but also the driver. If NVIDIA could somehow make the TITAN Z keep higher clock speeds, it would probably beat the R9 295X2, at the expense of noise and power levels.
Source: LinusTechTips
 
So basically 780 Ti in SLI are faster, and cheaper.

No wonder they pulled it back. Hopefully for good.

It's normal for a dual-GPU card to be slightly slower than SLI in general, because the GPUs are clocked down a lot. But here the clocks are almost the same as the reference card, and it's still way behind in a few benchmarks.
 
Nvidia did AMD a favor by announcing the Z at $3000. Any price AMD sells the 295X2 at seems like a good deal as long as it's under $3000. AMD got a double whammy on this one: holding the "single card" performance crown (though I don't consider a dual-GPU card a single card) while getting good press at the expense of Nvidia.
 
Huh. So it actually is faster than the 295X2 then?

EDIT: lol. I just looked at the clocks and made a bad assumption. It might be faster after the driver fix though.
 
Huh. So it actually is faster than the 295X2 then?

EDIT: lol. I just looked at the clocks and made a bad assumption. It might be faster after the driver fix though.

Look at the Valley benchmark... For the most part AMD doesn't do well in Valley. Shows how weak the TITAN Z was/is.
 
But the Titan Z wins in Tomb Raider, which is an AMD game. Really interesting.

This is the problem today: everybody expects that if a game was backed by company X, it has to be crippled on company Y's products. This is [redacted].

Infraction issued for inappropriate language.
-- stahlhart
 
Why is the Titan Black running hotter AND using more power than a 290X? I call shens on these benchmarks.
 
This is the problem today: everybody expects that if a game was backed by company X, it has to be crippled on company Y's products. This is [redacted].

Not saying that, but it seems to favor AMD. Not to the absurd degree that the Batman Arkham games favor Nvidia, though. That just gets into silly territory.

Why is the Titan Black running hotter AND using more power than a 290X? I call shens on these benchmarks.

If this were biased, the 295X2 wouldn't be the loser in the actual games, I think.

The Tomb Raider canned benchmark is pretty short... Most likely the TITAN Z was able to maintain a higher boost clock.

That's a good point.
 
Why can't dual-GPU cards use a better interlink than an onboard SLI/CrossFire solution? I had a 590 and it was silly to see some games using one GPU for lack of support.
 
Not saying that, but it seems to favor AMD. Not to the absurd degree that the Batman Arkham games favor Nvidia, though. That just gets into silly territory.

Now that you bring it up, my 290X ran the Arkham games like shit with Vsync on (regardless of setting), whereas my old GTX 660 SLI setup had a smooth and steady frame rate with Vsync on.
 
Not saying that, but it seems to favor AMD. Not to the absurd degree that the Batman Arkham games favor Nvidia, though. That just gets into silly territory.

If this were biased, the 295X2 wouldn't be the loser in the actual games, I think.

That's a good point.

I don't think it's biased, just poorly done. So I can't take any of the results with much confidence.
 
I don't think it's biased, just poorly done. So I can't take any of the results with much confidence.

Maybe, but the card was delayed for a reason. I would imagine it's something more than a driver issue; likely firmware, or possibly hardware. The boost clocks seem strange: if they are spiking that high early in a load, wouldn't that also increase heat and power, resulting in throttling?

Edit: And why is there such a wide gap between the base and boost clocks compared to the 780 Ti SLI? Maybe a problem with heat; perhaps they should have gone with a liquid solution like AMD did.
 
Nvidia did AMD a favor by announcing the Z at $3000. Any price AMD sells the 295X2 at seems like a good deal as long as it's under $3000. AMD got a double whammy on this one: holding the "single card" performance crown (though I don't consider a dual-GPU card a single card) while getting good press at the expense of Nvidia.

Hopefully you are wrong about people's perception of the 295X2 as a good deal. That would mean we can no longer count on AMD to bring pricing down.
 
Hopefully you are wrong about people's perception of the 295X2 as a good deal. That would mean we can no longer count on AMD to bring pricing down.

No one should be paying more than $550 for a video card. PERIOD. But thanks to Nvidia, anything under $3000 for a video card is considered a "bargain".
 
No one should be paying more than $550 for a video card. PERIOD. But thanks to Nvidia, anything under $3000 for a video card is considered a "bargain".

Pretty bad when paying nearly 3x as much for the dual-GPU card as for the single-GPU model appears to be a good deal. In reality, it should cost less than 2x the single card.
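A quick sketch of that pricing argument, assuming a roughly $1,000 single-GPU price (approximately the GTX TITAN Black's launch price; an assumption, not stated in the thread) against the announced $3,000 TITAN Z:

```python
single_gpu_price = 1000  # approx. GTX TITAN Black launch price (assumption)
dual_gpu_price = 3000    # announced GTX TITAN Z price

# Actual price ratio of the dual card to the single card
ratio = dual_gpu_price / single_gpu_price
print(ratio)  # 3.0 -- versus the "less than 2x" a dual-GPU card should cost
```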
 
The results are not within the 375W TDP, I guarantee it, even without having the card.

The 6xx-watt readings are surely measured at the wall.

At the PSU the readings are lower.
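The wall-versus-PSU distinction matters because conversion losses inflate readings taken at the outlet. A rough sketch, assuming a hypothetical 90% PSU efficiency (the review does not state the PSU's efficiency):

```python
wall_watts = 618       # system draw measured at the wall (from the review)
psu_efficiency = 0.90  # hypothetical efficiency figure (assumption)

# DC power actually delivered to the components
dc_watts = wall_watts * psu_efficiency
print(round(dc_watts))  # 556 W delivered, under this assumed efficiency
```

With any realistic efficiency figure, the DC-side load is noticeably lower than the wall reading, which is the poster's point.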

Huh. So it actually is faster than the 295X2 then?

EDIT: lol. I just looked at the clocks and made a bad assumption. It might be faster after the driver fix though.

IMO the "driver fix" is a tweak to up the fan speed and frequencies...
 