Bitcoin mining is ruining 280x prices for us


KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
This is actually not true. Nvidia GPUs are still far outselling their AMD counterparts. It could be due to supply constraints or whatever, but at the biggest US e-tailer, Nvidia has dominant sales figures over AMD.

...

I think this is because miners are taking away all of the supply from the actual intended demographic - PC gamers.

How do you reconcile these two claims? If the miners are taking away all the supply, aren't they *BUYING* that supply? Yet how can Nvidia have dominant sales figures?

The best explanation I can think of is that your example falls apart when demand is so high that it causes out-of-stock situations. The Amazon list would show zero sales for something out of stock, right? It fails to account for the current anomaly caused by such high demand.

In fact, I would say that a more accurate gauge of popularity is not a sales list (which fails to account for out-of-stock items), but the inflated prices that reflect huge market demand and the automatic dynamic-pricing adjustments that online vendors use.

Another explanation is that AMD simply cannot make enough cards to satisfy demand, so they only sell a small trickle of cards because they only make a small trickle of cards, while Nvidia is a huge gorilla pumping out enough cards to keep up with demand. I have no idea if this is the case.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
AMD sells chips, not cards. Get that into your skull. I guess no one is stopping AIBs from making a mining card, like ASRock did not long ago with a mining mobo.

True, but an AIB can't remove features inherent to the GPU; AMD would have to do that.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Funny, I was under the impression that AMD sold full reference boards. They could sell a mining-specific reference board with gimped gaming features, like I mentioned.

In any case, they don't "just" sell chips; they sell cards. Reference cards are always manufactured by AMD, yes, full cards, and the AIB slaps their sticker on them. Not just chips.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
I believe it's a third party that actually makes the reference cards under contract to AMD or Nvidia. AFAIK Sapphire for AMD and Foxconn for Nvidia? That could be wrong. I believe the GPUs are sold as a package of GPU & memory.

The AIBs pay a set cost for a reference board, GPU, memory, and cooler. This is why you see reference 780, 780 Ti, and 770 cards with the Titan cooler costing more than non-reference cards with custom cooling: they had to pay the hefty price for the Nvidia reference cooler on the board.

I also don't think Amazon's top product rankings are the best way to gauge demand. There is no way to know without the real sales data.
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Funny, I was under the impression that AMD sold full reference boards. They could sell a mining-specific reference board with gimped gaming features, like I mentioned.

In any case, they don't "just" sell chips; they sell cards. Reference cards are always manufactured by AMD, yes, full cards, and the AIB slaps their sticker on them. Not just chips.

Then why not bypass the AIBs and sell them yourself? Because you need a proxy to offload your production onto and to stay safe from market fluctuations. AMD makes ONLY the reference cards for launch, for obvious reasons like keeping their IP out of competitors' hands as long as possible. Once the initial deployment is done they no longer have cards or boards in production.

It's way better to take orders from AIBs and play it safe than to do it all yourself.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Just join a pool that shows you expanded statistics. Givemecoins does a pretty accurate average for me.

I'd rather not run it that long :|

There is kH/s reporting on the side of the cmd window; how accurate it is, I don't know.

Alright, so before you guys waylay me: this is CUDA mining, and I have basically no idea what I'm doing. Last time I ran it just on auto; this time I'm messing with some settings and clocks, trying to max out that money-per-decade statistic.

Previous attempt:

[screenshot of miner output]

~615 kH/s @ low-500 W draw

Second attempt:

[screenshot of miner output]

575 kH/s @ 370 W


I think I can increase this, though of course it would take a custom BIOS. I don't believe it needs that much voltage to run the miner; I'm limited on the stock BIOS by Boost 2.0... Which actually got me testing vram, and I was amazed that lowering the vram clock increased core clocks (due to lessened TDP constraints), which also increased kH/s.

So I believe I've discovered, at least with my sample/settings, that Kepler has basically no dependency on vram speed. This is in direct contrast to my 7950, which is why I wasn't expecting these results.

Assuming profitability remains the same (and I'd bet on a spike before summer), it would take around four months to pay off a 780 at my current electricity rates. Which is before summer, I believe.
 

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
So I believe I've discovered, at least with my sample/settings, that Kepler has basically no dependency on vram speed. This is in direct contrast to my 7950, which is why I wasn't expecting these results.

Pure speculation this, but it may be that the reworked CUDA miner is doing something different (storing some values in registers, for example), which means fewer trips to main vram? Or indeed the architecture of Kepler (or at least big Kepler, GK110) is able to cache the needed values.

It's interesting that after all these years someone was able to get the Nvidia architecture to gain so much versus what the older miners were able to do. I thought that architecturally AMD's advantage was due to shift operators, so unless something was changed with Kepler, does the CUDA rework use another technique? Or does scrypt hashing not use the shift operators, and hence this would not work for SHA-256 (where GPUs are no longer able to keep up with ASICs anyhow)?

And is whatever technique enables scrypt to work so well on GK110 also relevant to Radeon scrypting (in other words, a 30-40% speedup, or the option to clock vram down a fair bit to save a few watts)?
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Pure speculation this, but it may be that the reworked CUDA miner is doing something different (storing some values in registers, for example), which means fewer trips to main vram? Or indeed the architecture of Kepler (or at least big Kepler, GK110) is able to cache the needed values.

It's interesting that after all these years someone was able to get the Nvidia architecture to gain so much versus what the older miners were able to do. I thought that architecturally AMD's advantage was due to shift operators, so unless something was changed with Kepler, does the CUDA rework use another technique? Or does scrypt hashing not use the shift operators, and hence this would not work for SHA-256 (where GPUs are no longer able to keep up with ASICs anyhow)?

And is whatever technique enables scrypt to work so well on GK110 also relevant to Radeon scrypting (in other words, a 30-40% speedup, or the option to clock vram down a fair bit to save a few watts)?

Here is an article by the developer of the improvements for CUDA Scrypt mining, explaining what particular optimizations were made.

It would be interesting to find out if some of these could be applied to AMD cards as well, to improve their performance at least somewhat, even if not to the same degree.

I'd also be interested to see how a parallel, optimized assembly implementation of Scrypt would do on a Xeon Phi. Probably not good enough to be worth it compared to 7970/280X, but the raw numbers might be impressive. The Xeon Phi has ~60 x86 cores which are basically original Pentium architecture, except they also have wide vector units (up to 512 bits). And the on-board RAM is suitably fast and plentiful as well.
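
For a sense of why those wide vector units are tempting: scrypt's inner mixing function is Salsa20/8, whose core operation is nothing but 32-bit adds, XORs, and rotates. A minimal Python sketch of one Salsa20 quarter-round (straight from the Salsa20 spec, just as an illustration, not a miner):

Code:
MASK = 0xFFFFFFFF

def rotl32(x, n):
    # 32-bit left rotate
    return ((x << n) | (x >> (32 - n))) & MASK

def quarter_round(y0, y1, y2, y3):
    # One Salsa20 quarter-round: nothing but add, rotate, xor.
    # A 512-bit vector unit could run 16 such 32-bit lanes at once.
    z1 = y1 ^ rotl32((y0 + y3) & MASK, 7)
    z2 = y2 ^ rotl32((z1 + y0) & MASK, 9)
    z3 = y3 ^ rotl32((z2 + z1) & MASK, 13)
    z0 = y0 ^ rotl32((z3 + z2) & MASK, 18)
    return z0, z1, z2, z3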
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Assuming profitability remains the same (and I'd bet on a spike before summer), it would take around four months to pay off a 780 at my current electricity rates. Which is before summer, I believe.

GTX 780: 575 kH/s @ 370 W, $0.06/kWh = ~$115/month net; $500 / $115 = 4.34 months
R9 290: 825 kH/s @ 330 W, $0.06/kWh = ~$175/month net; $500 / $175 = 2.85 months

Not to mention that with higher electricity rates you run into profitability problems much sooner with the GTX 780.
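
As a rough sketch of that arithmetic in Python (the ~$0.23 gross per kH/s per month is back-calculated from the $115/m and $175/m figures above, an assumption rather than a quoted rate):

Code:
REVENUE_PER_KHS_MONTH = 0.23   # USD gross per kH/s per month (assumption)
HOURS_PER_MONTH = 24 * 30

def net_per_month(khs, watts, usd_per_kwh):
    gross = khs * REVENUE_PER_KHS_MONTH
    power_cost = (watts / 1000.0) * HOURS_PER_MONTH * usd_per_kwh
    return gross - power_cost

# The post's $115 and 4.34 months come from rounding the net income first.
for name, khs, watts, price in [("GTX 780", 575, 370, 500),
                                ("R9 290", 825, 330, 500)]:
    net = net_per_month(khs, watts, 0.06)
    print(f"{name}: ${net:.0f}/m, payback {price / net:.2f} months")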
 

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
@JDG1980
Thanks. Guess I should have googled it :)

Now, most of that goes over my head but two quotes jumped out:

Don't use too much memory bandwidth. The GPU has a lot -- from 80 to 300 GB/sec -- but it's not infinite.
Use your memory bandwidth well: If each thread reads a totally random location at a time, your code will be slow. If, instead, most threads read adjacent locations so that the overall read is a big sequential one to memory, you will get a lot of bandwidth.

In these terms, the prior work, Cudaminer, ran the PBKDF2 code on the host CPU, and ran only the scrypt core loops on the GPU. As a result, it had to copy 1024 blocks (128 KB) per key in and out of the GPU. My code moves the entire search process into the GPU, returning only a single integer of whether or not a scan for several thousand nonces succeeded.

So between them there might be an explanation as to why the new CUDA miner is not as vram-bandwidth dependent as might be expected. And, staying a bit closer to the original topic, there's the possibility that something similar may eventually be done with an ASIC, thereby leaving GPUs in stock for those who want to game...
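
To make the bandwidth point concrete, here's a stripped-down Python sketch of scrypt's ROMix loop (structure per Percival's scrypt paper; BlockMix is swapped for a generic hash and the index step simplified, purely to show the access pattern). Phase 1 is a big sequential fill, phase 2 is data-dependent random reads, which is exactly what the quoted advice is about:

Code:
import hashlib

def H(block: bytes) -> bytes:
    # Stand-in for scrypt's real BlockMix (which is Salsa20/8-based);
    # a simplification purely to show the memory access pattern.
    return hashlib.sha256(block).digest()

def romix(block: bytes, N: int = 1024) -> bytes:
    V = []
    X = block
    for _ in range(N):      # phase 1: sequential fill -- streams nicely
        V.append(X)
        X = H(X)
    for _ in range(N):      # phase 2: data-dependent random reads --
        j = int.from_bytes(X[:4], "little") % N   # hard to coalesce
        X = H(bytes(a ^ b for a, b in zip(X, V[j])))
    return X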
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yes, I covered that already; how long break-even takes is going to depend on your kWh rates.

A $500 290 is rare; the non-GHz AM R9 290 is $580.

The discussion was about a gamer paying off the card, basically a free card, not about which card would make more money, and comparing a reference $530 290 to an aftermarket 780 for gaming purposes leaves a lot to be desired on the red side.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
Titan at over 600 kH/s using less than 300W. :biggrin:

But... water cooling = lower temperatures and no fan, which is helping me out a little bit with the power consumption. Unfortunately MSI AB doesn't let me underclock my memory, so I can't get that figure a few percent lower. So my earlier estimates with the GTX 780 were definitely skewed.

[screenshot of miner output]
 

Imouto

Golden Member
Jul 6, 2011
1,241
2
81
Yes, I covered that already; how long break-even takes is going to depend on your kWh rates.

A $500 290 is rare; the non-GHz AM R9 290 is $580.

No, it's not. If you're not going to take the $500 price tag (there are 2 cards at that price on both the egg and Amazon), at least pick the $530 that most of Newegg's offers show.

The discussion was about a gamer paying off the card, basically a free card, not about which card would make more money, and comparing a reference $530 290 to an aftermarket 780 for gaming purposes leaves a lot to be desired on the red side.

With higher rates the R9 290's efficiency starts to pay off. At $0.15/kWh the R9 290 sits at $150/m and the GTX 780 at $90/m. If we go by Australian or EU rates of $0.28, the R9 290 is still netting an impressive $120/m while the GTX 780 is down to $55/m.

The R9 290 endures higher electricity rates and coin devaluation far better than the GTX 780.

When you take that into account, the price-tag difference isn't that important if you're set on mining.
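
Sweeping the electricity rate with the same back-calculated ~$0.23/kH/s/month gross assumption reproduces those numbers, give or take rounding:

Code:
REVENUE_PER_KHS_MONTH = 0.23   # same back-calculated assumption as before

def net_per_month(khs, watts, usd_per_kwh):
    return khs * REVENUE_PER_KHS_MONTH - (watts / 1000.0) * 720 * usd_per_kwh

# R9 290: 825 kH/s @ 330 W; GTX 780: 575 kH/s @ 370 W
for rate in (0.06, 0.15, 0.28):
    print(f"${rate:.2f}/kWh: R9 290 ${net_per_month(825, 330, rate):.0f}/m, "
          f"GTX 780 ${net_per_month(575, 370, rate):.0f}/m")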
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
If you're set on mining you shouldn't be looking at the 780.

If you're set on gaming with a free video card, as RS put forth, Nvidia is an option assuming your kWh rates are within reason.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
If you're set on mining you shouldn't be looking at the 780.

If you're set on gaming with a free video card, as RS put forth, Nvidia is an option assuming your kWh rates are within reason.

+1, totally agree. nVidia IS an option now for mining, whereas it wasn't even close before.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Miners want AMD, bottom line. Attempting to talk up NV's mining abilities just makes NV look bad. The double amputee (NV) now has one leg, but AMD has had two legs all along. Most people who want to gamble on crypto should just buy the cryptocurrency outright; it's been 10-20 times more profitable to do so each year for the last four years in a row, and with no electricity costs or mining headaches.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
Yes, miners do, no question. However, people are pushing AMD video cards for gaming, not mining, based on the idea of a free card to do it with.
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,341
264
126
The OP is not a miner. The point simply being made is that Nvidia cards will also pay for themselves. This was never the case in the past. They'll just take twice as long. What was once a strong argument to buy an AMD gaming card (even at an inflated price) has been weakened a fair amount by the advances in CudaMiner.

The whole thing stemmed from the idea that AMD should create mining cards (ASICs). I agreed, because an AMD card once came with the option of having the GPU pay for itself (whereas Nvidia just couldn't mine, period). That was a big reason to get an AMD card. Now that Nvidia is catching up, some sort of AMD ASIC could keep them well ahead of Nvidia in mining applications and, more importantly, prevent their gaming GPUs (like the 280X the OP wants) from inflating in price during crypto bubbles. There's no talking down of either side.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Some people, who decline to identify their financial conflicts of interest, are pushing video cards by promising free cards and whatnot. What they don't tell you is that electricity is not free, and that difficulty only has to rise a mere 4.4% per difficulty readjustment for cards to never make enough money to pay for themselves, even if you pay just 10 cents/kWh (low compared to the average USA rate of roughly 13 cents/kWh). (Prior to the recent big snowstorms, litecoin difficulty had been rising over 10% per readjustment since October.) And if cryptocurrency goes to zero, you have only a pile of electricity bills to show for it.

If people are so bullish on that kind of stuff, it's better to buy directly: you don't pay inflated prices for cards, and it's wildly more profitable to buy cryptocurrencies outright since you don't have monthly electric bills eating at you or big upfront costs (cards, PSUs, etc.). http://bitcoinwisdom.com/litecoin/calculator
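
That claim is easy to sanity-check numerically. A rough Python sketch where every input is an illustrative assumption (a 780-class card earning ~$130/m gross, paying ~$27/m for power at 10 cents/kWh, against a $500 card, with litecoin retargeting roughly every 3.5 days, i.e. ~8.6 times a month): if income shrinks by a fixed percentage each retarget while the power bill stays flat, lifetime profit is capped, and at 4.4% per retarget it never comes close to paying off the card.

Code:
def lifetime_profit(gross_m, power_m, growth_per_retarget,
                    retargets_per_month=8.6):
    gross = gross_m / retargets_per_month   # income per retarget period
    power = power_m / retargets_per_month   # power bill per period (flat)
    total = 0.0
    for _ in range(2000):                   # ~19 years of retargets
        if gross <= power:                  # mining now loses money: stop
            break
        total += gross - power
        gross /= 1.0 + growth_per_retarget  # difficulty eats the income
    return total

# Illustrative 780-class numbers: $130/m gross, ~$27/m power
# (370 W at $0.10/kWh), against a $500 card.
for g in (0.00, 0.022, 0.044, 0.10):
    print(f"{g:.1%}/retarget -> lifetime profit ${lifetime_profit(130, 27, g):.0f}")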
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
If people are so bullish on that kind of stuff, it's better to buy directly

You've only said this about 500 times already in this and the other threads.

Now that you've had your say (over and over and over and over and over again), please sit down and shut up. We don't need to see the same post every page.

Warning issued for personal attack.
-- stahlhart
 