
GHz edition 7970 coming very soon! (Softpedia)

AMD fanboy in denial... Here's a quote from the review...

The PowerColor HD 7970 when overclocked and overvolted draws 102W more than overclocked on the stock fan profile and with the stock voltage. It also uses over 100W more than the GTX 680 in the same situation and in the same system!

http://alienbabeltech.com/main/?p=29157&page=3

Here's a "max overclocking" GTX 680 in BF3 drawing 355W total system power consumption... Even when OC'd, the GTX 680 uses less power than a stock 7970....

http://www.hardocp.com/article/2012/04/04/nvidia_kepler_geforce_gtx_680_overclocking_review/6

Here's the total system power draw for the HD 7970, showing 117W more when overclocked... Stock @ 490W vs 607W overclocked..

http://www.hardocp.com/article/2012/01/25/asus_radeon_hd_7970_video_card_review/8
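The delta in those [H] numbers checks out; here's a quick sketch of the arithmetic, using only the two wattages quoted above:

```python
# Sanity check of the [H]ardOCP figures cited above: total system power
# for the HD 7970, stock vs. overclocked.
stock_w = 490   # total system draw at stock (W)
oc_w = 607      # total system draw overclocked (W)

delta_w = oc_w - stock_w
pct = delta_w / stock_w * 100

print(delta_w)          # 117 W extra at the wall
print(round(pct, 1))    # 23.9 (% increase in total system draw)
```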

Even with facts and numbers you're still in denial...😉

Forget power draw (and seriously, including overvolted and oc'd results is iffy, esp. if better binning is coming via the GHz edition). It's not THAT big advantage for the GTX 680, especially since 7970 has lower idle power draw which offsets load power somewhat for 24/7 users. NV's initiative in other areas are far more interesting, like with shrinking PCBs and introducing Adaptive VSYNC, center-weighted Surround, GPU Boost, etc.

You have an interesting post history by the way. :whiste:
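The idle-offset point can be made concrete with a back-of-the-envelope daily energy tally. All wattages below are illustrative assumptions (not figures from the reviews in this thread), chosen just to show how a lower idle draw can offset a higher load draw on an always-on machine:

```python
# Daily energy for a card in a 24/7 machine: a few hours of gaming,
# idle the rest. Wattages are assumed for illustration only.
def daily_wh(idle_w, load_w, load_hours, total_hours=24.0):
    """Watt-hours consumed per day."""
    return load_w * load_hours + idle_w * (total_hours - load_hours)

# Hypothetical: card A idles lower but draws more under load.
card_a = daily_wh(idle_w=15, load_w=250, load_hours=3)  # 7970-like (assumed)
card_b = daily_wh(idle_w=30, load_w=220, load_hours=3)  # 680-like (assumed)

print(card_a, card_b)  # 1065.0 1290.0 -- the low-idle card wins over 24h
```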
 

TBH, I don't really care about power consumption, but facts are facts... Man, I'm making a lot of friends already....:$:$:$
 
Which I don't really understand, because Fermi was a pretty successful architecture on both the desktop side, as well as the professional side.

It was and still is. The constructive nit-pick was performance per watt, but there were many compelling reasons why it was a success overall, especially after some maturity.
 
Actually, comparing GCN to Fermi is a compliment, given that the architecture did garner share away from AMD and had strong advantages on the professional side.

IF AMD can garner nice professional revenue growth with GCN, that's an important key.
 

Sure, but NV is miles ahead of AMD. I don't see that changing, since NV is about to launch GK110 in Tesla form tomorrow.
 
Probably, but considering that field is projected for sizable growth, you have to design products and architectures to capture some slices of the pie, one may imagine.
 


Really, how is NV miles ahead of AMD in one release? The 7970 is no slouch, and clock for clock they are about equal to GK104 (without the compute). We don't know what to expect from Big K yet. It could have awesome compute and be crap for gaming with terrible power consumption.


If AMD were to strip all of the compute out of their cards, then Bitcoin sales would diminish and performance per watt would increase for gamers. It just sounds like give and take, if you don't count the bad driver situation.
 

I don't think you read my comment correctly. NV is *already* miles ahead of AMD in terms of HPC and pro graphics market share. (NV pro graphics market share has been around 85-90% going back years now.) Releasing GK110 looks like a *continuation* of that dominance. You are also factually wrong about BTC mining; you can mine just as well with 5xxx series parts that aren't gussied up for compute. The differential has less to do with HPC silicon and more to do with AMD's 32-bit integer performance.

Please read comments carefully and get your facts straight; doing otherwise could lead others to be confused as well.
 


Ahhh, gotcha! Sorry about that.

GCN has potential, just needs to be executed properly.
 
I really don't think BigK will benefit gaming much.

This is probably true to a large extent. BigK is supposedly built for HPC according to VR-Zone's article, not for gaming. (And I fully expect this to be true, given that Kepler has no good HPC part right now; obviously GK104 is NOT the HPC part.) Expect BigK to be significantly faster than the GTX 680, but at a high power cost and, of course, a high $price. If they even make it a gaming card at all.

Why bother, when you can use the same chip to sell Tesla cards for MUCH more than you can get for a gaming card? The non-gaming cards make up only ~1/3 of discrete card sales but 2/3 of NV's profit in the segment. To give you an idea of how much an older Quadro (pro graphics) or Tesla (HPC) card sells for, here's an older-generation Quadro pro graphics card: http://www.amazon.com/PNY-DisplayPor.../dp/B0044XUD1U

Once TSMC's 28nm is more abundant, we may see gaming cards built on BigK. Until then, I fully expect NV to use BigK chips on their professional cards only.
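The sales/profit split mentioned above implies a large per-unit profit gap; a quick sketch (the fractions are the post's estimates, not official figures):

```python
# If pro cards (Quadro/Tesla) are ~1/3 of discrete units but ~2/3 of
# segment profit, the implied average profit per pro card vs. per
# gaming card follows directly from the ratios.
pro_units, pro_profit = 1 / 3, 2 / 3
gaming_units, gaming_profit = 2 / 3, 1 / 3

ratio = (pro_profit / pro_units) / (gaming_profit / gaming_units)
print(ratio)  # 4.0 -- each pro card earns ~4x the profit of a gaming card
```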
 
For gaming, Big K will be like 20-25% faster than a reference 680, so like 5-10% faster than an OC'd 680 when Big K is at stock. Like going from a 480 to a 580 or so.
 

I think you are basing that off old rumors. NV did at least one more respin from what I've heard. Still places it out of my budget though so I don't care. 🙂
 


I think it would take some performance away, IMO. Kind of like how more cores = worse single-threaded performance. So BigK could actually perform worse in gaming than GK104, for all we know.
 
Expected pricing? Expected performance?

Too difficult to say without knowing 28nm yields for BigK, which are no doubt improving as TSMC gets its act together, but I would not be surprised if BigK never made it down to GeForce and stayed at Tesla/Quadro. A GeForce BigK would almost certainly cost more than a GTX 680, which is at the maximum limit of my budget. 🙂 If you have not yet read the new VR-Zone article about BigK, you may be entertained by it despite the typos: http://vr-zone.com/articles/how-the...-prime-example-of-nvidia-reshaped-/15786.html
 
If VR-Zone is right and GK104 really was supposed to be the GTX 660, then GK110 will eventually end up in GeForce, and it won't be a slouch for gaming. Only 3.5B transistors and a 256-bit MC is a strong argument for this theory.

On the other hand, 7B transistors vs 3.5B is a little too much of a difference between a 660 part and a 680. Hence GK104 really was meant to be the 670/680, and BigK was designed as an HPC/workstation part from the beginning. In that case GK110 shouldn't offer a huge advantage over GK104, but even so, there is no reason not to offer it in GeForce once 28nm sorts out, appropriately DP-castrated of course.
 
If AMD is seeing 1250MHz reliably, why release a new card at only 1000MHz? Cut it down the middle at 1100MHz and you're tied with or better than a 680 in most benches. Keep the price at $480 and you're back in business until NV responds.
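Assuming perfectly linear scaling with core clock (an optimistic upper bound; real games scale sub-linearly due to memory bandwidth and CPU limits), the headroom being discussed looks like this:

```python
# Naive linear-with-clock upper bound for a higher-clocked 7970 bin.
# Real-world gains will be smaller than these percentages.
base_mhz = 1000
for mhz in (1050, 1100, 1150, 1250):
    gain_pct = (mhz / base_mhz - 1) * 100
    print(f"{mhz} MHz: up to ~{gain_pct:.0f}% over stock")
```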
 