Why still no GDDR5 GeForce GT 640?

Red Hawk

Diamond Member
Jan 1, 2011
It's been a couple of months since the GT 640 launched, and a quick check on Newegg reveals that there are still no GDDR5 models available. What could the reason be? It sells for ~$100, and GDDR5 has been standard on $100+ graphics cards since early 2010, when the Radeon HD 5670 launched. The GT 640 loses to AMD's last-gen GDDR5 6670 as often as it wins, the 7700 series cards laugh at it, and even the aging Radeon HD 5700/6700 series and Nvidia's own GeForce GTS 450 breeze past it. It just makes no sense at this price.

Nvidia could easily have a contender though, if they gave it GDDR5 memory! On paper, the 640's fill rates are actually greater than the 7750's. It has the same number of texture units and raster operators. It's just crippled by having only ~35% of the 7750's memory bandwidth.
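For reference, the rough math on that bandwidth gap (a quick sketch using the commonly published specs; actual clocks vary by board, and the DDR3 rate here assumes the 1600 MT/s many boards ship with):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Specs below are the commonly published ones; individual boards vary.
def bandwidth_gb_s(bus_bits: int, data_rate_mt_s: float) -> float:
    return bus_bits / 8 * data_rate_mt_s / 1000  # bytes/transfer * MT/s -> GB/s

gt640_ddr3 = bandwidth_gb_s(128, 1600)    # GT 640: 128-bit DDR3 @ 1600 MT/s
hd7750_gddr5 = bandwidth_gb_s(128, 4500)  # HD 7750: 128-bit GDDR5 @ 4.5 GT/s

print(f"GT 640 (DDR3):   {gt640_ddr3:.1f} GB/s")    # 25.6 GB/s
print(f"HD 7750 (GDDR5): {hd7750_gddr5:.1f} GB/s")  # 72.0 GB/s
print(f"Ratio: {gt640_ddr3 / hd7750_gddr5:.0%}")    # ~36%, right around the ~35% figure above
```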

Why is Nvidia letting AMD walk all over the 640? Why don't they release a GDDR5 version and start competing? Any ideas?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
My guess is NV still wants to clear stock of sub-$200 Fermi parts.
Plus NV is selling everything they can make, so presumably the bulk of their GK107 shipments are going into higher-margin mobile parts. The desktop GT 640 is practically a dumping ground for chips that draw too much power: they just need to get rid of chips here, they don't necessarily need (or want) a high-demand desktop video card competing with their mobile GK107 allocation.
 

ShintaiDK

Lifer
Apr 22, 2012
The TDP of the GT 640 is 65W, and the PCIe slot only supplies 75W. Adding GDDR5 would most likely push it past that and require an extra power connector. That's probably the reason.

Why AIBs haven't done it yet, though, is a good question.

Plus we're still missing the GK106 in between.
 

coffeejunkee

Golden Member
Jul 31, 2010
ViRGE said:
My guess is NV still wants to clear stock of sub-$200 Fermi parts.

But the GTX 560 or 550 are faster anyway, even if the GT 640 had GDDR5... and low-end Fermi is already rebranded as low-end Kepler (GT 610/620/630)...

ViRGE said:
Plus NV is selling everything they can make, so presumably the bulk of their GK107 shipments are going into higher-margin mobile parts. The desktop GT 640 is practically a dumping ground for chips that draw too much power: they just need to get rid of chips here, they don't necessarily need (or want) a high-demand desktop video card competing with their mobile GK107 allocation.

This makes sense.

ShintaiDK said:
The TDP of the GT 640 is 65W, and the PCIe slot only supplies 75W. Adding GDDR5 would most likely push it past that and require an extra power connector. That's probably the reason.

Why AIBs haven't done it yet, though, is a good question.

Plus we're still missing the GK106 in between.

Does GDDR5 really use that much more power than DDR3? I have no idea about graphics cards, but desktop RAM uses very little power, maybe 10W to run 8GB.

My guess: GDDR5 is expensive compared to the big pile of DDR3 that's still lying around. They hope the cards will sell on the name alone, and they are probably right (CUDA cores! PhysX! The Way It's Meant to Be Played, yada yada).
 

ShintaiDK

Lifer
Apr 22, 2012
GDDR5 does use a lot of power. You can't compare it to desktop memory.

The memory on the HD 48xx couldn't power down, and underclocking the GDDR5 manually could easily save 20W at idle.

[chart: memory power consumption by memory type, per Gbit]

Note it's Gbit, not Gbyte.
 

coffeejunkee

Golden Member
Jul 31, 2010
OK, you might be right. But then the question is why the HD 7750 can use 1GB of GDDR5 without requiring external power, while in general Kepler uses fewer watts than GCN for the same performance.
 

ShintaiDK

Lifer
Apr 22, 2012
coffeejunkee said:
OK, you might be right. But then the question is why the HD 7750 can use 1GB of GDDR5 without requiring external power, while in general Kepler uses fewer watts than GCN for the same performance.

The HD 7750 is a 75W card and uses PowerTune to stay below 75W. It's just a different (better) approach: better memory, but throttle down when needed.
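In pseudocode, that kind of limiter boils down to something like this (a simplified sketch of the general idea only, not AMD's actual algorithm; the clock bins and thresholds are made up for illustration):

```python
# PowerTune-style cap: estimate board power every tick and step the
# core clock down (or back up) to stay under the 75W slot budget.
# All names and numbers here are illustrative, not AMD's implementation.
POWER_CAP_W = 75.0
CLOCK_BINS_MHZ = [800, 750, 700, 650, 600]  # stock clock first

def next_bin(estimated_power_w: float, bin_idx: int) -> int:
    if estimated_power_w > POWER_CAP_W and bin_idx < len(CLOCK_BINS_MHZ) - 1:
        return bin_idx + 1   # over budget: drop one clock bin
    if estimated_power_w < POWER_CAP_W * 0.95 and bin_idx > 0:
        return bin_idx - 1   # comfortable headroom: climb back toward stock
    return bin_idx

# A Furmark-like load pushes the power estimate to, say, 82W:
idx = next_bin(82.0, 0)
print(CLOCK_BINS_MHZ[idx])   # 750 -> clock drops so the card stays under 75W
```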
 

The Alias

Senior member
Aug 22, 2012
coffeejunkee said:
OK, you might be right. But then the question is why the HD 7750 can use 1GB of GDDR5 without requiring external power, while in general Kepler uses fewer watts than GCN for the same performance.

See the 660 Ti vs. the 7950: only five watts more on the GCN side, yet it completely outdoes the 660 Ti.
 

KompuKare

Golden Member
Jul 28, 2009
Yes, but AT don't actually measure what the card itself consumes:

[chart: card-only peak gaming power draw, 660 Ti vs. 7950]

[chart: card-only maximum (Furmark) power draw, 660 Ti vs. 7950]

So that is the 660 Ti at 145W vs. the 7950 at 144W (effectively a tie) using "Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw."

But running Furmark, they get the 660 Ti at 135W vs. the 7950 at 179W. That's a 44W gap, so it seems Nvidia are throttling whereas AMD are not.

What I'd like to see investigated is the power usage differences between AIB versions of the same card. For instance, the various 660 Tis TechPowerUp reviewed seemed to vary a fair bit in power consumption. OK, they do run at different speeds, but I've seen similar things in some 7950 and other reviews where the cards were running at the same speed but still pulled different amounts of power. No reviewer really wants to go there, though: for a scientific test you'd really need one or two of each card with the same ASIC quality, run them at the same voltage and speed, and see what happens.

I wonder if the cheap PCBs (relative to price) which Nvidia use for the 680/670/660 draw less power than the over-engineered PCBs used in the 7950/7970?
 

RussianSensation

Elite Member
Sep 5, 2003
ShintaiDK said:
660 Ti = 150W
HD 7950 = 200W
HD 7950B = 225W

A little more than 5W.

200W for the 7950, 225W for the 7950B, how did you even calculate that? That's 100% wrong. The Metro 2033 load numbers are for the entire system, which means they include the added power draw from the CPU workload too, and I'd bet they're measured at the wall as well. You simply took their load and idle numbers and subtracted them. So you made two mistakes:

1) You included the loaded CPU's power consumption, and the power-hungry X79 chipset's, along with the 7950's; the CPU they are using is an Intel Core i7-3960X @ 4.3GHz. That CPU alone draws a ridiculous amount of power.

[chart: i7-3960X @ 4.3GHz power consumption]

and then an X79 motherboard draws more than LGA1155 motherboards:

[chart: X79 vs. LGA1155 motherboard power consumption]


You didn't account for any of that, and when you add it all up, an overclocked i7-3930K system draws an insane amount vs. a 2600K:

[chart: overclocked i7-3930K system vs. 2600K system power draw]


So what you actually did was add all of that X79/3930K power overhead to the 7950's number...

2) You didn't account for power supply inefficiency. They are using an Antec Quattro 1200 rated at ~85% efficiency (Silver), so you'd need to multiply the whole at-the-wall delta by 0.85 to begin with.

Just think for a second about what you are saying. A stock HD 7970 draws less than 200W, so how can a stock HD 7950 draw 200-225W? It's obvious the way you used those numbers is wrong. My HD 7970 draws around 215-220W at 1150MHz, and I measured it with a P3 Kill A Watt. How can a stock HD 7950 draw 200W? :rolleyes:
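To put numbers on the correction you'd have to make, here's a rough sketch (every figure below is a placeholder for illustration, not a measured value):

```python
# Estimating card-only power from at-the-wall system measurements.
# All numbers below are illustrative placeholders, not measurements.
PSU_EFFICIENCY = 0.85    # e.g. a Silver-rated unit at ~85%

wall_idle_w = 150.0      # whole system at the wall, GPU idle
wall_load_w = 430.0      # whole system at the wall, gaming load
cpu_extra_w = 60.0       # extra CPU/chipset draw under that same load
gpu_idle_w = 15.0        # the card's own idle draw (removed by the subtraction)

# Naive (wrong) method: attribute the whole wall delta to the GPU.
naive_gpu_w = wall_load_w - wall_idle_w                    # 280W

# Corrected: convert wall power to DC, subtract the CPU/board delta,
# then add back the GPU's idle draw that the subtraction removed.
dc_delta_w = (wall_load_w - wall_idle_w) * PSU_EFFICIENCY  # 238W
corrected_gpu_w = dc_delta_w - cpu_extra_w + gpu_idle_w    # 193W

print(f"naive: {naive_gpu_w:.0f}W, corrected: {corrected_gpu_w:.0f}W")
# naive: 280W, corrected: 193W -- the naive method badly overstates GPU draw
```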

MSI TF3 7950 = 142W

[chart: MSI TF3 7950 power draw]

TechPowerUp Sapphire FleX 7950 = 145W

[chart: TechPowerUp average gaming power draw]

Tom's Hardware = 145-157W; not a single one exceeds 160W

[chart: Tom's Hardware gaming peak power draw]

MSI GTX 660 Ti PE = 134W

[chart: MSI GTX 660 Ti PE power draw]

Plus, the HD 7950 Boost's reference-card power consumption is pretty much irrelevant, since that's not what consumers will actually buy. All after-market 7950s use much lower voltages and draw less power. There is only one HD 7950 GPU Boost reference card on all of Newegg vs. 18 other 7950 cards. See Tom's chart above: after-market 7950s, even at 880-900MHz, still draw less than 160W.

==========================

I don't think even GDDR5 would save the GT 640. The card is just slow, very slow. But since NV sells them anyway, they have no incentive to spend any money adding GDDR5.

[chart: performance per dollar]

Even with GDDR5, it would still lose badly to an HD 7750.
 

ShintaiDK

Lifer
Apr 22, 2012
RussianSensation said:
200W for the 7950, 225W for the 7950B, how did you even calculate that? That's 100% wrong. Those numbers are for sure from the wall, which has nothing to do with the GPU's power consumption.

There is no way those numbers are right. My HD 7970 draws around 215-220W. I used a P3 Kill A Watt to measure it. How can an HD 7950 draw 200W? :rolleyes:

You should check the card's TDP before going bananas again. And just because YOUR card draws that doesn't mean others can't draw more.

:thumbsdown::thumbsdown::thumbsdown:

http://www.anandtech.com/show/6152/amd-announces-new-radeon-hd-7950-with-boost
 

RussianSensation

Elite Member
Sep 5, 2003
ShintaiDK said:
You should check the card's TDP before going bananas again. And just because YOUR card draws that doesn't mean others can't draw more.

:thumbsdown::thumbsdown::thumbsdown:

TDP has little to do with real-world power consumption, so I couldn't care less what the card's TDP is. It could be 250W; it doesn't really mean anything. TDP is just guidance for manufacturers/OEMs designing a heatsink to cool the card: "The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate." What makes it even more useless is that companies define TDP differently (e.g., AMD vs. Intel CPUs). So really, it's mostly a guidance number and has little to do with real-world power consumption for AMD cards.

I linked at least three professional reviews that all prove you wrong, and I showed that your calculations based on AT's review are incorrect. Also, it's been shown on our boards that an HD 7970 @ 1150-1165MHz draws 235-238W.

That should have given you a hint how far off your numbers are. Not only do they fail to make mathematical sense on two counts, three reviews show different results. So instead of wasting time defending your post, maybe you should actually consider what we are telling you.
 

tviceman

Diamond Member
Mar 25, 2008
Back to the original discussion...

Obviously Nvidia has lots of GTX 550 Ti stock they are trying to get rid of, or we would have seen the GDDR5 GK107 part by now. Still, the price for the GT 640 is way too high.
 

ShintaiDK

Lifer
Apr 22, 2012
tviceman said:
Back to the original discussion...

Obviously Nvidia has lots of GTX 550 Ti stock they are trying to get rid of, or we would have seen the GDDR5 GK107 part by now. Still, the price for the GT 640 is way too high.

Or their plan for the next step is GK106. It's the GPU we have yet to see, and it's going to land somewhere between the GT 640 and the GTX 660 Ti.
 

RussianSensation

Elite Member
Sep 5, 2003
I have a feeling NV will just release a faster GK107 card with GDDR5 later and drop prices on this 640 to the $60-70 level.
 

Red Hawk

Diamond Member
Jan 1, 2011
ShintaiDK said:
Or their plan for the next step is GK106. It's the GPU we have yet to see, and it's going to land somewhere between the GT 640 and the GTX 660 Ti.

Maybe. Or Nvidia could end up with a hole in its lineup, like the one AMD had between the 5770 and the 5850.
 

coffeejunkee

Golden Member
Jul 31, 2010
ShintaiDK said:
GDDR5 does use a lot of power. You can't compare it to desktop memory.

The memory on the HD 48xx couldn't power down, and underclocking the GDDR5 manually could easily save 20W at idle.

[chart: memory power consumption by memory type, per Gbit]

Note it's Gbit, not Gbyte.

OK, so you're saying the GT 640 uses GDDR3 instead of GDDR5 so it can stay under the 75W slot limit, but then you show us a graph in which GDDR5 uses a lot less power than GDDR3.
 

hyrule4927

Senior member
Feb 9, 2012
coffeejunkee said:
OK, you might be right. But then the question is why the HD 7750 can use 1GB of GDDR5 without requiring external power, while in general Kepler uses fewer watts than GCN for the same performance.

You're using the performance of top-end parts to make assumptions about the lower-end GPUs. The 7970 and 7950 have higher power consumption because they have full compute functionality. The 78x0 and 77x0, much like current Kepler parts, are cut down in this department and thus have similar (and sometimes better) perf/watt compared to Kepler.
 

coffeejunkee

Golden Member
Jul 31, 2010
hyrule4927 said:
You're using the performance of top-end parts to make assumptions about the lower-end GPUs. The 7970 and 7950 have higher power consumption because they have full compute functionality. The 78x0 and 77x0, much like current Kepler parts, are cut down in this department and thus have similar (and sometimes better) perf/watt compared to Kepler.

Are you sure? AFAIK GCN is just better at compute than Kepler; half a Tahiti GPU will still be better at GPGPU than half a Kepler.

But it's not really that relevant. I think the best evidence that GDDR5 doesn't use more power than GDDR3 is that many cards come in versions with either GDDR3 or GDDR5 while having the same TDP. That, and Shintai's graph.
 

Smoblikat

Diamond Member
Nov 19, 2011
coffeejunkee said:
OK, so you're saying the GT 640 uses GDDR3 instead of GDDR5 so it can stay under the 75W slot limit, but then you show us a graph in which GDDR5 uses a lot less power than GDDR3.

Lol, not only that. The GT 240 (based on the 9600 GSO) uses GDDR5 and stays within a 75W TDP, and that's a MUCH less efficient card than the new 640s.
 

Red Hawk

Diamond Member
Jan 1, 2011
hyrule4927 said:
You're using the performance of top-end parts to make assumptions about the lower-end GPUs. The 7970 and 7950 have higher power consumption because they have full compute functionality. The 78x0 and 77x0, much like current Kepler parts, are cut down in this department and thus have similar (and sometimes better) perf/watt compared to Kepler.

IIRC, the Radeon HD 7800 series actually has much the same front end as the 7900 series; it's just the stream processor count and memory bus that have been cut down. That's why you actually see the 7870 match or exceed the 7950 at stock in some benchmarks.