UPDATE: TITAN Z Base 705MHz, Boost 876MHz


Bubbleawsome

Diamond Member
Apr 14, 2013
4,834
1,204
146
Wait, are we saying Nvidia will be power- or thermally limited on a card that is triple-slot, no less? :eek: No way!
 
Feb 19, 2009
10,457
10
76
If it boosts to 950MHz it'll get close to the 295X2 for sure.

Hmm, I'm starting to think it may end up competitive. Still needs a price reduction.

Nope. Computerbase.de tested a lot of games (without Mantle BF4 MP) in their R9 295X2 review, and compared to 780 Ti SLI at stock, which boosts to 1GHz, the 295X2 was clearly faster. The 780 Ti SLI needed its power limit raised and 100% fan ("max boost") to beat a 295X2 that was also running at its "max" of 1018MHz.

There's no doubting it: at 4K and with multiple cards, Hawaii just scales better.

To claim it's faster, they are going to need around 1.2GHz boost clocks, or whatever a 780 Ti with the power limit raised and 100% fan would reach. They didn't specify in that review, but judging by the power consumption, it looks to be a pretty big boost.
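
As a rough sanity check, here is a back-of-the-envelope throughput comparison. It assumes the advertised Titan Z boost of 876MHz, the published 2880-core GK110 configuration, and the roughly 1GHz 780 Ti SLI boost cited above; these are spec-sheet numbers, not measurements.

```python
# Hedged, theoretical-only comparison: FP32 throughput = cores * clock * 2 (FMA).
# Clocks are advertised/assumed boost values, not sustained in-game clocks.

def fp32_tflops(cores, clock_mhz):
    """Theoretical single-precision throughput in TFLOPS."""
    return cores * clock_mhz * 1e6 * 2 / 1e12

titan_z   = 2 * fp32_tflops(2880, 876)    # 2x GK110 at the advertised 876MHz boost
sli_780ti = 2 * fp32_tflops(2880, 1000)   # 2x 780 Ti at the ~1GHz stock boost cited above

print(f"Titan Z   @ 876MHz boost: {titan_z:.1f} TFLOPS")
print(f"780Ti SLI @ ~1GHz boost:  {sli_780ti:.1f} TFLOPS")
print(f"Titan Z as a share of the SLI setup: {titan_z / sli_780ti:.0%}")
# ~88% of the SLI setup's theoretical throughput, which is why a much higher
# sustained boost would be needed before "faster than a 295X2" is plausible.
```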

 

caswow

Senior member
Sep 18, 2013
525
136
116
Nope, exactly the opposite is true.
Power consumption goes pretty much linearly with GPU clock.

It's voltage that skyrockets the power.

I actually meant voltage. Voltage won't stay the same at 700MHz, 876MHz, or 1100MHz. :$
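
For anyone wondering why voltage dominates: dynamic power in CMOS logic scales roughly with frequency times voltage squared, so the voltage bump needed for higher clocks is what really drives power up. A minimal sketch, with made-up example voltages rather than actual Titan Z values:

```python
# Toy illustration of P ~ f * V^2. The voltage figures are assumptions
# chosen for illustration only; they are not real Titan Z operating points.

def relative_power(freq_mhz, voltage, ref_freq=876.0, ref_voltage=1.00):
    """Dynamic power relative to an assumed 876MHz / 1.00V operating point."""
    return (freq_mhz / ref_freq) * (voltage / ref_voltage) ** 2

for freq, volt in [(705, 0.95), (876, 1.00), (1100, 1.15)]:
    print(f"{freq}MHz @ {volt:.2f}V -> {relative_power(freq, volt):.2f}x power")

# At a fixed voltage, power grows only linearly with clock; once the voltage
# has to rise with the clock, the combined effect grows much faster.
```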
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Not that incredible. Power consumption scales exponentially, not linearly, as you go to higher clocks. That means if you want the same performance as the 295X2, your consumption would be as high as the 295X2's.

But it won't be as high as the 295X2's. Like I already said, see the GTX 690.

[Chart: 46202.png]

[Chart: 46163.png]


97% of the performance of the SLI'd cards while shaving 30% off the power draw vs. SLI.

I fully expect dual GK110s on one card to exhibit the same power savings.
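
To show where that kind of saving can come from, here is a hedged back-of-the-envelope estimate. The per-GPU wattage and the clock/voltage deltas are assumptions picked for illustration; they are not GTX 690 or Titan Z figures.

```python
# Sketch of the usual dual-GPU recipe: binned chips run at slightly lower
# clock and voltage than the single-GPU flagship, and power drops as f * V^2.
# All inputs below are illustrative assumptions, not measurements.

def scaled_power(base_watts, freq_ratio, volt_ratio):
    """Scale a per-GPU power figure by the P ~ f * V^2 rule of thumb."""
    return base_watts * freq_ratio * volt_ratio ** 2

single_gpu_w = 250.0                      # assumed GPU power of one single-GPU card
sli_w = 2 * single_gpu_w                  # two separate cards in SLI

# Dual-GPU card: assume ~5% lower clock and ~10% lower voltage on binned chips.
dual_gpu_w = 2 * scaled_power(single_gpu_w, 0.95, 0.90)

print(f"SLI (two cards): {sli_w:.0f} W")
print(f"Dual-GPU card:   {dual_gpu_w:.0f} W ({1 - dual_gpu_w / sli_w:.0%} lower)")
# Performance only loses the ~5% clock deficit, so something close to the
# GTX 690's "most of SLI's performance for noticeably less power" falls out.
```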
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136

That is one of the most misleading graphs I have seen as of late. The noise level for the AMD card is at 100%, which is fine if it's the highest. But the noise level for the Nvidia cards is shown as 16%? Yes, dB is a logarithmic scale, but the graph doesn't even represent that.
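
To see how badly a percent-of-dB bar chart can mislead, here is a quick sketch. The dB(A) readings below are placeholder values, not the numbers from the graph being discussed.

```python
# dB is logarithmic: a fixed dB gap corresponds to a *ratio* of sound power.
# The readings below are assumed placeholder values for illustration.

def sound_power_ratio(db_a, db_b):
    """Ratio of sound power implied by two dB readings (10 dB = 10x power)."""
    return 10 ** ((db_a - db_b) / 10)

quiet_card, loud_card = 42.0, 58.0   # assumed noise readings in dB(A)

print(f"dB difference:        {loud_card - quiet_card:.0f} dB")
print(f"Sound power ratio:    {sound_power_ratio(loud_card, quiet_card):.0f}x")
print(f"Naive percent-of-dB:  {quiet_card / loud_card:.0%}")
# A bar chart scaled as dB / max(dB) would show the quiet card at ~72% of the
# loud one, even though the underlying sound power differs by roughly 40x.
# That is why percent-of-dB bars say very little about actual loudness.
```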
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
http://www.techpowerup.com/200339/asus-announces-the-geforce-gtx-titan-z-dual-gpu-graphics-card.html
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Is Nvidia going to allow aftermarket versions? That cooler needs to go.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
While technically he was wrong, I think it was a mental shortcut, the point being that you don't need the same voltage at 850MHz that you do at 1100MHz. You could keep the voltage the same, but it wouldn't be ideal.


It has the same boost clock as the Titan, which does boost to 950MHz, so it surely will; keeping the boost clock is another matter entirely. My Titan at default doesn't maintain the boost clock very well. It keeps it long enough for most benchmarks, so it's useful for marketing but completely useless for actual gaming. The Titan Z has a suspiciously low base clock compared to the Titan, though, so I think it will mostly keep the boost during short benchmark runs, exploiting the fact that reviewers are too lazy to benchmark warmed-up cards. Cold benchmark runs are purely academic unless someone actually plays in 3-minute bursts. I think the 375W board power will limit the sustainability of the boost clock severely; it surely has such a low base clock for a reason.

This^^^

Most sites don't run the cards long enough for them to settle into their stable clocks. I also want to see people's reactions when the VRMs are well north of 100°C.
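
To make the warm-up point concrete, here is a deliberately crude toy model of a power-limited card throttling over time. Every number in it (the 3-minute window, the MHz-per-minute decay) is invented for illustration and is not measured Titan Z behaviour.

```python
# Toy throttle model: hold the boost clock while the card is still cold,
# then step down toward the base clock as heat builds under a fixed power cap.
# All constants are illustrative assumptions, not measurements.

POWER_LIMIT_W = 375          # advertised Titan Z board power
BOOST_MHZ, BASE_MHZ = 876, 705

def sustained_clock(minutes_under_load):
    """Crude estimate of the clock a power-limited card settles at."""
    if minutes_under_load < 3:                       # assumed cold-card window
        return BOOST_MHZ
    # Assume ~15MHz shed per additional minute until the base clock is reached.
    return max(BASE_MHZ, BOOST_MHZ - 15 * (minutes_under_load - 3))

for t in (1, 3, 5, 10, 20):
    print(f"{t:>2} min under load -> ~{sustained_clock(t)} MHz (cap {POWER_LIMIT_W} W)")

# A short benchmark pass records the boost clock; a long gaming session
# settles much closer to the base clock, which is the behaviour described above.
```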
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
This^^^

Most sites don't run the cards long enough for them to settle into their stable clocks. I also want to see people's reactions when the VRMs are well north of 100°C.

People act as if the 295X2 has adequate VRM cooling.

Hint: it doesn't come close to being adequate.

But keep talking about both of these cards as if sensible people would buy either of them.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
People act as if the 295X2 has adequate VRM cooling.

Hint: it doesn't come close to being adequate.

But keep talking about both of these cards as if sensible people would buy either of them.

71°C is a bad VRM temp?

[Thermal image: HEfhQau.png]


Source: Guru3D
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
People act as if the 295X2 has adequate VRM cooling.

Hint: it doesn't come close to being adequate.

But keep talking about both of these cards as if sensible people would buy either of them.

Source? Everything I have seen (including the above post) has shown the VRMs are fine.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
You've never seen a card running Furmark before? :|

They explained clearly that they had to pay off their 10,000 EUR investment on their FLIR.

Apparently they traded 35C for a down-payment on the camera.

I'll be sad when TechPowerUp does the same.

Hopefully their financial situation will allow them to be honest for at least as long as Guru3D was until recently.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
They explained clearly that they had to pay off their 10,000 EUR investment on their FLIR.

Apparently they traded 35C for a down-payment on the camera.

I'll be sad when TechPowerUp does the same.

Hopefully their financial situation will allow them to be honest for at least as long as Guru3D was until recently.

Are Hardware Canucks paid too?

http://www.hardwarecanucks.com/foru...md-radeon-r9-295x2-performance-review-17.html

Legit Reviews?

http://www.legitreviews.com/amd-radeon-r9-295x2-8gb-video-card-review-at-4k-ultra-hd_138950/11

I thought AMD was on the verge of collapse; where are they getting all the money to buy Mantle developers and review sites? :hmm:
 

Hitman928

Diamond Member
Apr 15, 2012
6,753
12,492
136
They explained clearly that they had to pay off their 10,000 EUR investment on their FLIR.

Apparently they traded 35C for a down-payment on the camera.

I'll be sad when TechPowerUp does the same.

Hopefully their financial situation will allow them to be honest for at least as long as Guru3D was until recently.


So, because Guru3D paid a lot of money for their camera, that automatically means they've sold out to AMD and can no longer be trusted...

It couldn't be that Guru3D and TPU put completely different loads on the GPU while testing, or that TPU uses a power-virus type of application that causes far more heat than any gamer will ever see. It couldn't be a difference in test bench, airflow, ambient temperatures, etc. I don't follow you at all.
 

24601

Golden Member
Jun 10, 2007
1,683
40
86
Yeah, because I have serious reason to doubt the creator of GPU-Z.

He's a total noob who can't compare to the likes of Legit Reviews.

Totally.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
They explained clearly that they had to pay off their 10,000 EUR investment on their FLIR.

Apparently they traded 35C for a down-payment on the camera.

I'll be sad when TechPowerUp does the same.

Hopefully their financial situation will allow them to be honest for at least as long as Guru3D was until recently.

No site that is run by someone with something even remotely resembling brain tissue between their ears would outright lie like that.
 

Abwx

Lifer
Apr 2, 2011
11,997
4,954
136
The temps seem par for the course as far as dual-GPU cards are concerned. If someone wants to see what hot looks like, go look at a thermal picture of a GTX 480 under a strenuous load. That's hot; VRM temperatures under 100°C don't seem worrying at all.

Not as hot as Nvidia's VRMs, which according to hardware.fr are substantially hotter than on AMD's cards, with temperatures reaching up to 120°C. In their 290X review they explicitly say that Nvidia's 780 Ti power delivery runs hotter, as it was not really designed for extreme overclocking.

http://www.hardware.fr/articles/912-4/bruit-temperatures.html