
GTX780 will not be based on GK110? (OBR Rumor)

Based on everything I said in my previous comment, is that calculator mostly accurate and am I figuring correctly or not?

It will always depend on how a person has their setup configured. For example, if someone undervolts and still gets a max OC, they'll use a lot less power while still producing high hash rates.

What did you use for power consumption?
 
Based on everything I said in my previous comment, is that calculator mostly accurate and am I figuring correctly or not?
I find that the calculator is way too conservative.

You can make about $100/month with a 7970, minus electricity costs which are about $25 depending on where you live.

The 7850 will make about $50/month because it has 1/2 the shaders.
 
And? I mean why should anybody care? I look at DX11 games and see that the GTX680 is winning 17 out of 24 against the 7970.

We've been over this before and you've already been proven wrong on it many times. The 7970 GE is the competitor to the 680, not the 7970. If you want to discuss games that specifically use DirectCompute, namely Sleeping Dogs, Dirt Showdown, and Sniper Elite V2, the 7970 series has a huge advantage over the 680 in those titles. However, thus far the majority of games do not use DirectCompute for HDAO, post-processed AA, or a global lighting model. Please, let's stay on the topic of the GTX780 and not derail this thread into another 7970 vs. 680 debate.

No problem: the GTX690 wins 20 of 24 DX11 games.

$450 card vs. $1000 card.
Single-GPU vs. Dual-GPU.

🙄

Like the GeForce Ti 4800 was an overclocked 4200, the 5950U was an overclocked 5900U, the 6800U was an overclocked 6800GT, etc. You are arguing semantics of naming. The HD7970 GE could have been called the 7980, the 7970 XTX PE, or the 7970 Ultra Extreme (just as the 6800 Ultra Extreme was nothing but an overclocked 6800 Ultra).

What's the fastest single GPU in this chart?

1100mhz 7970 $450 vs. $568 1267mhz Galaxy GTX680.
Vapor-X 7970 Ghz $450 vs. $588 MSI Lightning GTX680

~$120-140 price premium for no advantage in performance. NV knows its target customer base well. NV's management FTW, that they can successfully keep charging such large premiums and get away with it.

You deliberately ignore the 7970 GHz edition even though it's widely available and still significantly cheaper than the 680. This rubbish of yours has been disproven several times already; stop spreading FUD.

Exactly. The 1GHz 7970 costs $380 on Newegg; they don't even compete on price. Regardless, seriously guys, this thread has been very civil. We already have many 680 vs. 7970 discussions, so let's not derail this one into another "7970 GE uses as much power as a 690" nonsense thread.
 
That's just to break even? And what of the 800 bucks he said he made on top of that? Is that even possible?

Ya, it is possible. You can't use that calculator today to retroactively figure out the profits a 7970 made starting in January 2012, for 2 major reasons: (a) difficulty has increased, which means in all those months prior to today the 7970 was making a lot more bitcoins, and (b) the value of bitcoins has changed, which means those earlier-mined bitcoins are worth roughly double today.

Right now a 1350mhz 7970 makes about 8.5 BTC/month. The difficulty back in January was probably half of today's (or less), implying the card likely made 17 BTC per month at the beginning of the year. As difficulty rises, the value of BTC tends to increase, but not always. In early 2012 the value was about $6 per coin, which is why 17 coins x $6 = $102 in February vs. 8.5 coins x $12 = $102 today gives you roughly the same profitability per month.

Let's say someone started in February: that would have been 16-17 BTC in February, 15 in March, 14 in April, 13 in May, etc., all the way down to slightly less than 9 this month. If that person gamed 3 hours a day out of 24, by now he/she would have accumulated roughly 116 BTC x (21/24) ≈ 102 BTC. At about $12.30 each, that's ~$1,250; subtract electricity costs and you're easily netting $800.
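For anyone who wants to redo that arithmetic, here is a minimal Python sketch of it. The monthly BTC production figures are rough assumptions interpolated from the post, not measured data:

```python
# Back-of-the-envelope sketch of the accumulation estimate above.
# Monthly BTC figures are assumptions, declining as difficulty rises.
monthly_btc = [17, 16, 15, 14, 13, 12, 11, 10, 9]  # Feb through Oct
mining_fraction = 21 / 24   # mining 21 of every 24 hours (3 hours of gaming)
btc_price = 12.30           # assumed USD/BTC at the time of the post

total_btc = sum(monthly_btc) * mining_fraction
gross_usd = total_btc * btc_price
print(round(total_btc, 1), round(gross_usd))
```

With these assumed inputs the sketch lands near the post's ~102 BTC and ~$1,250 figures; real results depend on the actual difficulty curve and uptime.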

The second point is actually a glaring omission by most people who don't quite get the bitcoin-mining GPU strategy. A lot of miners made money using 4800/5800/6900 cards too. That means after selling the old AMD cards and reinvesting the bitcoins from those generations, the $550 7970 was likely free to begin with, from mining with a 6950 for instance. Back in the day a single 4890 made more than 1 BTC per day. Those original coins from the first half of the year can still be in a person's wallet, since they didn't need to be used to actually buy the 7970; it's not necessary to assume they were all sold back then at $5-6. Almost hard to believe, but if anyone took the risk of buying 3 7970s in January 2012, all of those are now paid for and have been making extra profit since mid-summer. By now, 3 7970s would have earned nearly $3,000 USD before electricity costs. You can run each at 170-180W without even trying, since you don't need memory speed. That's less than 550W of power from 3 overclocked cards, each making > $100 a month.

==========================================

Back to topic.
 
Based on everything I said in my previous comment, is that calculator mostly accurate and am I figuring correctly or not?

Power for custom 7970s ghz ed, not reference with 1.25v: http://www.techpowerup.com/reviews/Gigabyte/HD_7970_SOC/26.html

180w avg, 203w peak.

Put into the calculator:

$400 for the GPU
$0.12/kWh for power
200W
650 MH/s

Hardware break even: 204 days


No idea how bitcoin mining affects total system power, i.e. CPU/RAM etc., but just in case: up it to 250W system-wide.

Hardware break even: 221 days

So essentially, if you have cheap electricity, leave your rig mining when you're not gaming and it pays for itself, then later turns a profit.
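The break-even figures above can be sanity-checked with a short Python sketch. The daily revenue number is a hypothetical placeholder reverse-engineered from the calculator's output, not a quoted rate:

```python
def break_even_days(hw_cost, watts, usd_per_kwh, daily_revenue_usd):
    """Days of 24/7 mining needed to recoup the hardware cost."""
    daily_power_cost = (watts / 1000) * 24 * usd_per_kwh  # kWh/day * rate
    daily_net = daily_revenue_usd - daily_power_cost
    if daily_net <= 0:
        return float("inf")  # mining at a loss: never breaks even
    return hw_cost / daily_net

# Post's inputs: $400 GPU, 200W, $0.12/kWh.
# ~$2.54/day of revenue at 650 MH/s is an assumption.
days = round(break_even_days(400, 200, 0.12, 2.54))
print(days)
```

That reproduces the ~204-day figure; raising the draw to 250W system-wide pushes it to roughly 220 days, in line with the 221 quoted.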
 

In the case I am referring to, the user did not buy an HD7970 today. He bought it in January when it was $550, so the break-even point even with your flawed specs is ~285 days. But in order to get 650 MH/s or more it has to be significantly overclocked, and significantly overclocked means significantly higher power draw. Again, he is saying he has made $800 profit on 1 HD7970 since mid-January, so he's claiming $1,350 from bitcoin mining. According to my searches, $0.12/kWh is on the low end of the cost spectrum. Even with free electricity, at 650 MH/s you're still looking at 208 days of nonstop 24/7 mining to break even.

Nothing, none of it, adds up.
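The free-electricity floor in that argument is easy to reproduce. The daily revenue here is a hypothetical rate reverse-engineered from the 208-day claim, not a quoted figure:

```python
# Even with free electricity, recouping a launch-priced 7970 takes months.
# The daily revenue is an assumed placeholder rate at 650 MH/s.
hw_cost = 550.0        # January 2012 price of the 7970
daily_revenue = 2.64   # assumed USD/day of mining revenue
days_to_recoup = round(hw_cost / daily_revenue)
print(days_to_recoup)
```

That gives the ~208 days of nonstop mining mentioned above; with paid electricity the break-even point stretches further still.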
 
1.1GHz = 650 MH/s. The link I showed you is a highly overclocked custom 7970 that uses 180-203W.

A custom GHz edition is very different in terms of power use than a reference card, since its vcore is nowhere near the default reference 1.25V.
 

So $800 in eight months, under the best conditions with the best possible overclock, on 1 HD7970, is impossible. Thanks for clarifying that.
 
Bitcoin mining uses integer operations and a special format; AMD has much better integer performance at that. But that is not the "compute" we're talking about.
Sure it is, it's a form of computational task that isn't strictly related to gaming. You asked:
And? I mean why should anybody care? I look at DX11 games and see that the GTX680 is winning 17 out of 24 against the 7970.

What can you do with all these "compute tasks"? And what doesn't run on Kepler? Because for Luxmark, all I need is time for it to run.
And I answered. My AMD cards have made me a large chunk of change and pretty much made all my upgrades for the last several years free (or the next several years, depending on how you look at it).
In the case I am refering to, the user did not buy an hd7970 today. He bought it in January when it was $550, so the break even day even with your flawed specs is ~285 days. But in order to get 650mh or more it has to be significantly overclocked. And significantly overclocked = significantly higher power draw. Again, he is saying he has made $800 profit on 1 hd7970 since mid January, so he's saying he's made $1350 in bitcoin mining. According to my searches, $.12 kwh is on the low end of the cost spectrum. Getting free electricity, at 650mh you're still looking at 24/7 bitcoin mining for 208 days STRAIGHT nonstop mining to break even.

Nothing, none of it, adds up.
You made incorrect assumptions and then based faulty logic on them, I can only guess at your confusion 😛. First, this is what I said:
Well right now Bitcoin mining has paid for my 7970 and netted me another $800 profit and counting. I'd say compute performance is pretty important.
I started mining in May 2011 with my 6950, and have done it continuously since. Over my time mining I have sold 197 bitcoins at an average of $9.68 apiece, which totals $1906.96. That is by no means optimal, but certainly decent for an amateur who doesn't sit around playing the market all day.

On the flipside, my rig with my 7970 uses 275W (from the wall) when mining, and my rig with my 6950 used ~220W when mining. Since I've had my 7970 mining a bit longer, call the average power consumption 250W. If I mine 24 hours a day (which I often do since I'm rarely home), that's 250W x 24h = 6 kWh per day. Electricity averages 14.5 cents/kWh here in MA, so that comes to 87 cents/day, or $317.55/year. Since I've been mining for 17 months, that comes out to just about $500 ($503 technically, but this is all an estimate) in electricity costs over the period.

Then we do $1900 - $500 = $1400 profit. You could even say that my 7970 was only $350 since I sold my 6950 for $200, but I digress 🙂.

In the end, what your video card can do for you besides game can be very important. 😎
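MrK6's electricity arithmetic above can be re-run as a short script; all figures are taken from the post, with an assumed 30.44-day average month:

```python
# Re-running the cost arithmetic from the post above.
coins_sold, avg_price = 197, 9.68   # coins sold, average USD per coin
watts, usd_per_kwh = 250, 0.145     # average rig draw, MA electricity rate
months_mining = 17

revenue = coins_sold * avg_price                    # ~$1906.96 gross
daily_cost = (watts / 1000) * 24 * usd_per_kwh      # 6 kWh/day of power
total_cost = daily_cost * months_mining * 30.44     # ~17 months of days
print(round(revenue, 2), round(daily_cost, 2), round(revenue - total_cost))
```

This lands around $1,450 net, the same ballpark as the ~$1,400 quoted; the small gap comes down to how many days you count in 17 months.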
 
In the case I am refering to, the user did not buy an hd7970 today. He bought it in January when it was $550, so the break even day even with your flawed specs is ~285 days. But in order to get 650mh or more it has to be significantly overclocked. And significantly overclocked = significantly higher power draw. Again, he is saying he has made $800 profit on 1 hd7970 since mid January, so he's saying he's made $1350 in bitcoin mining. According to my searches, $.12 kwh is on the low end of the cost spectrum. Getting free electricity, at 650mh you're still looking at 24/7 bitcoin mining for 208 days STRAIGHT nonstop mining to break even.

Nothing, none of it, adds up.

My first 7970 mines at 1170/685 @ 1.081V and the total system draw from the wall is ~270W (it would probably drop 10-15W at 1170/170). At those speeds my mining rate is ~700 MHash/s. I pay $0.11/kWh for electricity, so some of your assumptions are wrong. An overclocked 7970 doesn't necessarily use a lot more electricity than a stock card.

Also, unless I'm interpreting it wrong, MrK6 said $1400 from mining, not necessarily since January. I've been mining off and on through the last year (probably averaging about 10 days/month until July, when I went up to 25 days/month) and I've made ~$520 after electricity costs.
 
Good point about the difficulty, I forgot that difficulty goes up with time. When I was mining I was making almost 2 coins a week, I'm sure that same setup would barely bring in 0.5 coins today.

I should peep into the bitmining thread, I wonder if it's still profitable nowadays, for me.
 
What's your setup like?

I can give you a rough idea if it's profitable or not.
 
An overclocked 7970 doesn't necessarily use a lot more electricity than a stock card.

No matter how many times 7970 owners state this, it will be ignored. It will also be ignored that you cannot buy a reference 7970 GE card, and that the reference card draws more power than pretty much any manually overclocked HD7970 @ 1050MHz. Even a manually overclocked 1125-1150MHz after-market 7970 will draw less power than a reference 7970 GE card.

Even at 1070-1100mhz, an after-market 7970 still uses less power than a GTX580.

[attached power consumption charts]


For bitcoin mining you can drop that to 170-180W even at 1150mhz since plenty of 7970s can hit those clocks at 1.08-1.15V. Yours can do it at 1.081V, mine fluctuates between 1.089-1.092V @ 1150mhz. The reference 7970 GE cards with 1.25V bios tested at AT or other sites have nothing in common with after-market 7970/7970 GE cards in terms of voltages or power consumption and certainly enthusiasts who are overclocking don't need to put up with 1.20V+ if they don't want to.

Also, unless I'm interpreting it wrong, MrK6 said $1400 from mining not necessarily since January. I've been mining off and on through the last year (probably averaged about 10 days/month until July when I went up to 25 days/month) and I've made ~$520 after electricity costs.

Don't forget all the $ that is still to be made every month before HD8900 series launches, while GTX680 sits there and depreciates.

Back on topic: VR-Zone reported at the end of August that the GTX780 isn't expected before the end of March 2013, and the expected performance increase could be 25-30%.
http://vr-zone.com/articles/no-nvid...e-march-2013-maxwell-only-in-2014-/17073.html
 
1150mhz at 1.08v? Seriously? Most cards will need about 1.2v to reach that speed with 100% stability.

You guys really need to stop this nonsense with the super low volts and the uber high clockspeeds. They do not reflect reality. I don't know what kind of golden samples you guys are getting. People are going to get unrealistic expectations of what this hardware is capable of.
 
Your data is confusing. Afterburner says 1.2v, then you have a "min" VRM voltage of 1.08v but it maxes out a 1.167v. Is that 1.08v your idle voltage?

In any event, with both of my cards I need 1.225v to hit 1180mhz. My ASIC quality is 87% on one of the cards as well.
 
Your data is confusing. Afterburner says 1.2v, then you have a "min" VRM voltage of 1.08v but it maxes out a 1.167v. Is that 1.08v your idle voltage?

In any event, with both of my cards I need 1.225v to hit 1180mhz. My ASIC quality is 87% on one of the cards as well.

I never tried undervolting mine to see what I can get, but I can hit 1150MHz stable with only 1.174V; my ASIC is 69.5%.

I found my max stable OC was 1265MHz @ 1.225V, but at that heat (65C) I wasn't too happy, plus not many games I play require those clocks. I wonder what ASIC quality means if a crappier card can hit higher clocks with the same volts?
 
Your data is confusing. Afterburner says 1.2v, then you have a "min" VRM voltage of 1.08v but it maxes out a 1.167v. Is that 1.08v your idle voltage?

In any event, with both of my cards I need 1.225v to hit 1180mhz. My ASIC quality is 87% on one of the cards as well.

Tahiti <> Pitcairn though. Some of the higher-ASIC 7970s can hit 1150MHz @ 1.08V, so why do you think we're trying to mislead people? Besides, from my experience, mining is more forgiving of clocks than some of the more intense games. I never experimented with it at 1170MHz, but I'd probably have to up my voltage to 1.1V to be game-stable.

You have to look at the context of what users are posting rather than your preconceived notion of what they're saying.
 
I never tried undervolting mine to see what I can get, but I can hit 1150mhz on mine stable with only 1.174v, my ASIC is 69.5%.

I found my max stable OC was 1265mhz @ 1.225v, but at that heat (65C) I wasn't too happy, plus not many games I play require them clocks. I wonder what ASIC means if a crappier card can hit higher clocks with the same volts?

but that is still a lot closer to something like this:

Most cards will need about 1.2v to reach that speed with 100% stability.

than this:

1150mhz at 1.08v? Seriously?

You guys really need to stop this nonsense with the super low volts and the uber high clockspeeds. They do not reflect reality. I don't know what kind of golden samples you guys are getting. People are going to get unrealistic expectations of what this hardware is capable of.

Actually, at 1.174V you're only a mere 0.026V away from the 1.2V sickB says. You are "around" 1.2 volts.
 
Actually at 1.174 volts your only a mere 0.026v away from the 1.2 sickB says. You are "around" 1.2volts.

Sickbeast said 1.225V, which is actually a 0.051V difference, not 0.026V; in terms of mV that is a huge difference.

My card is a reference Sapphire, not a cherry-picked one.

EDIT: Wait, Sickbeast was referring only to his Pitcairn; I thought he bought a 7950 too. Why are we even discussing this? As Elfear pointed out, if he is basing his clocks on Pitcairn, well, that solves the question of the voltage-to-clock ratio.
 
If anything Pitcairn should do higher clocks on lower volts seeing as it's a much smaller chip. There are a lot of similarities.
 
Your data is confusing. Afterburner says 1.2v, then you have a "min" VRM voltage of 1.08v but it maxes out a 1.167v. Is that 1.08v your idle voltage?

No, you can see my GPU is 99% loaded when I took that screenshot, and it had been running at 99% for an extended period to show you I am not faking it. This is why the minimum and current are the same: I opened it to take a screenshot for you while still running at 99%. My card averages 1.088-1.092V at full load and peaks at 1.174V at most (which is what I have in my sig). The idle voltage is 0.804V.

Here are the idle clocks of 300MHz, with the voltage dropping to 0.804V:
http://imageshack.us/photo/my-images/849/7970idle.jpg/

MSI Afterburner or Sapphire Trixx only set your target voltage, not your actual voltage.

The point is if you drop memory clocks even lower, you can come in at 170-180W at 99% GPU load and even an average 7970 doesn't need 1.25V to reach 1150mhz. So you can have a rig with a bunch of 7970 cards mining and it will use a reasonable amount of power.

Actually at 1.174 volts your only a mere 0.026v away from the 1.2 sickB says. You are "around" 1.2volts.

No, 1.174V is max at random peaks. That is not even remotely close to 1.175V average or 1.25V of 7970 GE reference cards used in reviews. The average for me is 1.088-1.092V over 24/7, and I am guessing Elfear's is around 1.08V. These are not golden samples. A golden sample 7970 will hit 1280-1310mhz on air at under 1.3V in MSI AB.


http://www.legitreviews.com/article/1962/15/

To get an average of 1.175V, you would likely need to put in 1.25V into MSI AB or flash your card with the 7970 GE June 2012 BIOS AMD sent out where it forces the voltage to stay above 1.21V at all times, with a peak of 1.25V. They did this to ensure every single reference 7970 hits 1050mhz, even the ones with ASIC of 50-55%. This is why the BIOS was sent out. Any HD7970 reference board will run 1050mhz with the GE BIOS if you care to flash it. Of course, hardly any enthusiast will bother with that BIOS since with manual overclocking you have more control over the voltage and that's exactly what allows you to narrow down 1150mhz @ 170-180W power consumption @ 99% GPU load.

In any event, with both of my cards I need 1.225v to hit 1180mhz. My ASIC quality is 87% on one of the cards as well.

Keep in mind you are comparing ASIC of 87% among Pitcairn chips. You can't compare ASICs across 2 different families of chips: Pitcairn XT vs. Tahiti XT. The data applies to a specific chip family and comparing it to a database of these chips. Pitcairn XT is not compared to Tahiti Pro (7950) or Tahiti XT chips. 87% on Pitcairn is not necessarily better than 70% on Tahiti XT. Also, other factors come into play such as your GPU and VRM temperatures.

EDIT: Wait, Sickbeast referring only to his Pitcarin, I thought he bought a 7950 too. Why are we even discussing this, as Elfear pointed out - if he is basing his clocks on Pitcarin, well then that solves the question of the voltage to clock ratio.

Ya. Pitcairn has its own parameters and other chips have their own. Just because a Pitcairn needs 1.225V to hit 1180mhz, doesn't mean that can be extrapolated to a completely different chip made on 28nm, even if it's made by AMD. Comparing how much voltage Pitcairn needs for a certain clock vs. Tahiti would be no different than comparing Pitcairn to GK104.

Regarding your question of why my memory is stuck at 1000mhz, ya you don't need it. I just did a full reinstall and for some reason unofficial overclocking mode 2 isn't working on these latest Cats 12.9s and AB 2.2.4s. Any ideas? I might need 2 missing .dll files from Cats 12.2 and earlier.

==============

If anyone finds more info on GTX780, please link it!
 

Just so we're all on the same page, if I set 1.081V in AB I only get 0.994-0.996V actual according to GPU-Z.



Are you setting 1.225V in AB or is that what GPU-Z tells you?
 
My sweet spot for heat/noise/performance ended up being 1075MHz at 1.075V. I think my ASIC quality was 84% when I checked.
 