R9 300 cards listed in new driver - R9 370 is a rebrand


xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
There is no doubt that the Titan X is less efficient than the GTX 980.

A lot of people have suggested it is the RAM. Well, you can easily figure out about how much power 7 GHz GDDR5 uses by doing the math.

The GTX 980 has 4 GB vs. the Titan's 12 GB. The loss in performance per watt is due to an extra 8 GB of RAM.

Is the power draw of RAM in use the same as that of RAM that isn't? The Titan tends to underutilize its RAM compared to most other cards.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Is the power draw of RAM in use the same as that of RAM that isn't? The Titan tends to underutilize its RAM compared to most other cards.

Quantity and voltage will drive up system power regardless of activity, so it makes sense. Maintaining zeroes takes as much power as maintaining ones.
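The claim above — that capacity drives memory power whether or not the data is touched — can be sketched with a toy calculation. The per-chip standby/refresh figure below is an assumed ballpark for illustration, not a datasheet value:

```python
# Sketch: DRAM background power scales with chip count, not with the
# data stored. Every row gets refreshed on the same schedule whether
# it holds zeroes or ones.
STANDBY_W_PER_CHIP = 0.3  # assumed idle+refresh watts per GDDR5 chip (illustrative)

def background_power(num_chips: int) -> float:
    """Total standby/refresh power for a board with num_chips DRAM chips."""
    return num_chips * STANDBY_W_PER_CHIP

# A 4 GB card with 8 chips vs. a 12 GB card with 24 chips:
print(f"8 chips:  ~{background_power(8):.1f} W")
print(f"24 chips: ~{background_power(24):.1f} W")
```

Whatever the real per-chip number is, the point stands: tripling the chip count triples this baseline draw even if the extra capacity sits unused.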
 
Last edited:

goa604

Junior Member
Apr 7, 2015
24
0
0
I can't believe you actually argue with this "ShintaiDK" Nvidia troll/apologist instead of ignoring him.

Warning issued for member callout.
-- stahlhart
 
Last edited by a moderator:

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Is the power draw of RAM in use the same as that of RAM that isn't? The Titan tends to underutilize its RAM compared to most other cards.

As another poster said, the GDDR5 will draw power regardless.

It is a pretty well-known fact that the Titan X has worse performance per watt than the 980.

With Kepler, Nvidia was able to match the GK104's performance per watt on the GK110. They actually exceeded the GK104's performance per watt in some of the published reviews.

But with the Titan X, there is a notable drop in performance per watt. The layout of the SM is exactly the same, so the performance per watt should be pretty close. The only major difference is the RAM: the Titan X is 150% of a 980 everywhere except the GDDR5, where it has exactly 3 times as much.

I was just saying that anyone can do the math and see about how much power 4 GB of 7 GHz GDDR5 uses.

I am not wanting to get into the debate that's been going on. Just saying...

But you know, from there a person could then predict roughly how much power AMD can save by going with HBM. It would be a rough guess, but a more meaningful one.
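The back-of-envelope math the posters keep alluding to might look like the sketch below. The per-chip active power and the HBM efficiency ratio are assumed ballpark figures, not measurements; the chip counts follow from the cards' actual capacities (8× and 24× 4 Gbit chips):

```python
# Back-of-envelope VRAM power estimate. All per-chip figures are rough
# assumptions for illustration, not datasheet or measured values.
GDDR5_W_PER_CHIP = 2.0   # assumed active watts per 7 Gbps GDDR5 chip
CHIP_CAPACITY_GB = 0.5   # 4 Gbit chips = 0.5 GB each

def vram_power(total_gb: float, w_per_chip: float = GDDR5_W_PER_CHIP) -> float:
    """Estimate total VRAM power from capacity and assumed per-chip draw."""
    chips = total_gb / CHIP_CAPACITY_GB
    return chips * w_per_chip

gtx980 = vram_power(4)    # 8 chips
titanx = vram_power(12)   # 24 chips
print(f"GTX 980 VRAM: ~{gtx980:.0f} W")
print(f"Titan X VRAM: ~{titanx:.0f} W")
print(f"Delta:        ~{titanx - gtx980:.0f} W")

# Crude HBM comparison: assume HBM delivers comparable bandwidth at
# roughly a third of the power (a commonly cited ballpark, not a spec).
hbm_4gb = vram_power(4) / 3
print(f"Hypothetical 4 GB of HBM: ~{hbm_4gb:.1f} W")
```

Under these assumptions the extra 8 GB costs around 32 W, which would account for a good chunk of the Titan X's efficiency deficit — and a 4 GB HBM card would claw back a similar amount. Swap in better per-chip numbers and the same two lines of arithmetic give a tighter estimate.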