
Cryptocoin Mining?

I realize this. Still, Litecoin will be a level playing field for GPUs. If the price goes up to, say, $15-20, it could be worth it for a lot of people to keep mining.

Litecoin has actually been appreciating in value lately; what's most interesting is the rise in the LTC/BTC exchange rate. Just a week ago my holdings were worth around 15 BTC, and now they're worth 21 BTC. The value used to be pegged directly to BTC for the most part, and now it's starting to break away.

Look at the last six months of LTC prices vs. BTC prices. They fell quite a lot. In the short term there will be fluctuations, of course.
 
Won't work. There are a zillion people chomping at the bit to do that exact same thing.

First of all, nobody has any Litecoin FPGAs ready yet. Second, only a few companies are working on one. Third, most people are heavily involved in the Bitcoin ASIC race and would rather concentrate on that.

Litecoin will be really valuable for those who venture into it first with an FPGA. Litecoin today is Bitcoin two years ago.
 
First of all, nobody has any Litecoin FPGAs ready yet. Second, only a few companies are working on one. Third, most people are heavily involved in the Bitcoin ASIC race and would rather concentrate on that.

Litecoin will be really valuable for those who venture into it first with an FPGA. Litecoin today is Bitcoin two years ago.


The litecoin market will saturate exponentially faster than bitcoin's did. Why? Because bitcoin paved the way, and people already know what this is now. Litecoin is just the next thing everybody is ready for, unlike when bitcoin started. It will be blinding how fast it saturates, and the difficulty will climb in a fraction of the time it took bitcoin.
 
As long as it's easy to exchange BTC for litecoins and vice versa, all these coins will be valued relative to BTC based on mining difficulty. Outside of temporary fluctuations (which traders can quickly exploit), it should even out.
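A rough sketch of that equilibrium in Python, using the standard expected-coins formula; the hashrates, difficulties, and rewards below are made-up illustrative numbers, not live values:

```python
# Sketch: if coins trade freely, miners chase whichever chain pays more,
# so revenue per rig should roughly equalize across chains.
SECONDS_PER_DAY = 86_400

def coins_per_day(hashrate_hs, difficulty, block_reward):
    # Expected hashes per block is difficulty * 2**32.
    return hashrate_hs * SECONDS_PER_DAY * block_reward / (difficulty * 2**32)

# One hypothetical GPU mining either chain (illustrative numbers only).
btc = coins_per_day(700e6, 10_000_000, 25)  # SHA-256 at 700 MH/s
ltc = coins_per_day(700e3, 1_500, 50)       # scrypt at 700 kH/s

print(f"BTC/day: {btc:.5f}, LTC/day: {ltc:.3f}")
print(f"implied LTC price in BTC: {btc / ltc:.5f}")
```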
 
First of all, nobody has any Litecoin FPGAs ready yet. Second, only a few companies are working on one. Third, most people are heavily involved in the Bitcoin ASIC race and would rather concentrate on that.

Litecoin will be really valuable for those who venture into it first with an FPGA. Litecoin today is Bitcoin two years ago.

And will LTC FPGA miners be able to out-develop AMD? Don't forget GPU companies have product cycles of 12-18 months, sometimes less.

From what I've read, the HD 8000 series will at first be a tweaked HD 7000 series, so no increase in shaders. But let's say in late 2014 a new, smaller-process HD 9000 series arrives with twice the shaders of the HD 7000 series and lower power consumption.

It could work out better on a $/kH/s/watt basis than a newly developed FPGA still on the waiting list. Not the same story as BTC ASIC vs. FPGA/GPU.

M
 
Look at the last six months of LTC prices vs. BTC prices. They fell quite a lot. In the short term there will be fluctuations, of course.

Doing better than ever, though! LTC/BTC has broken 0.033 today. The old peg of around 40 LTC = 1 BTC is gone for now.

And this is what matters to me, rather than the $/LTC price, since it's easier and cheaper to convert LTC into BTC into fiat than LTC directly into fiat.

Still, it's lovely to see LTC trading over $4 when it costs me $0.80 in electricity (16 hours at $0.18/kWh) to mine one.
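For anyone checking the math, a minimal sketch of that electricity cost using the figures above:

```python
# 16 hours at $0.18/kWh costing ~$0.80 implies a rig drawing ~280 W.
hours_per_ltc = 16
usd_per_kwh = 0.18
cost_usd = 0.80

rig_kw = cost_usd / (hours_per_ltc * usd_per_kwh)
print(f"implied rig draw: {rig_kw * 1000:.0f} W")      # ~278 W

ltc_price = 4.00
print(f"margin per LTC: ${ltc_price - cost_usd:.2f}")  # $3.20
```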

M
 
And will LTC FPGA miners be able to out-develop AMD? Don't forget GPU companies have product cycles of 12-18 months, sometimes less.

AMD isn't developing mining GPUs; the FPGA/ASIC guys are building with only that specific task in mind... big difference.
 
The litecoin market will saturate exponentially faster than bitcoin's did. Why? Because bitcoin paved the way, and people already know what this is now. Litecoin is just the next thing everybody is ready for, unlike when bitcoin started. It will be blinding how fast it saturates, and the difficulty will climb in a fraction of the time it took bitcoin.

Except its algorithm is different from Bitcoin's, and it is much harder to design an FPGA/ASIC to mine litecoins than bitcoins.
 
The litecoin market will saturate exponentially faster than bitcoin's did. Why? Because bitcoin paved the way, and people already know what this is now. Litecoin is just the next thing everybody is ready for, unlike when bitcoin started. It will be blinding how fast it saturates, and the difficulty will climb in a fraction of the time it took bitcoin.

There won't be any problems with GPU miners switching from Bitcoin to Litecoin. The difficulty problem only starts when FPGAs/ASICs arrive that put down a ridiculous number of gigahashes. And you can't use the same ASICs on Bitcoin and Litecoin.

And will LTC FPGA miners be able to out-develop AMD? Don't forget GPU companies have product cycles of 12-18 months, sometimes less.

From what I've read, the HD 8000 series will at first be a tweaked HD 7000 series, so no increase in shaders. But let's say in late 2014 a new, smaller-process HD 9000 series arrives with twice the shaders of the HD 7000 series and lower power consumption.

It could work out better on a $/kH/s/watt basis than a newly developed FPGA still on the waiting list. Not the same story as BTC ASIC vs. FPGA/GPU.

M

GPUs won't even come near the performance/watt of ASICs. Not a chance in hell. A 5.5 GH/s ASIC uses 30 W. To match that hash performance you'd need 8 × 7970s, which would use 1350 W.
Believe it or not, hash performance hasn't increased a whole lot with AMD's newer series of graphics cards either. After all, GCN (7000 series) is still a tweaked VLIW architecture (5000/6000 series). A 5970, for example, can do 700 MH/s @ 294 W; a 7970 can do 700 MH/s @ 250 W. That's three generations but still only a 50 W difference.
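A quick sketch of that performance-per-watt gap, using only the figures quoted above:

```python
# MH/s per watt for the hardware quoted above (SHA-256 mining).
hardware = {
    "5.5 GH/s ASIC": (5500.0, 30.0),  # (MH/s, watts)
    "HD 7970":       (700.0, 250.0),
    "HD 5970":       (700.0, 294.0),
}
for name, (mhs, watts) in hardware.items():
    print(f"{name}: {mhs / watts:6.1f} MH/s per watt")
# The ASIC lands around 183 MH/s/W vs. ~2.8 for the 7970:
# roughly two orders of magnitude, which no node shrink closes.
```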
 
If you transfer bitcoins from a PPS service like btcguild to an account like mtgox, does anyone know off-hand how long it takes to show up in your mtgox account?

Minutes? Hours?

Thanks.
 
Except its algorithm is different from Bitcoin's, and it is much harder to design an FPGA/ASIC to mine litecoins than bitcoins.

I'm sure it's different. But are you sure it's much harder? Or does it just require a different ASIC or FPGA? My point is that bitcoin was supposedly the first venture into this "mining" thing. It was unknown and took a while to gain steam. But now everybody is looking out for the next big thing, especially with BTC difficulty rising so quickly. Rest assured there are people, companies, and industries working on Litecoin FPGA/ASIC/Avalon-type systems as we speak. Right now. I don't know how some of you cannot, or simply will not, see this. Makes no difference to me; the lack of foresight is just sometimes astonishing.
 
Please do even the most basic research first if you're going to post this stuff rather than be "astonished." FPGAs/ASICs are doable for litecoins, but the RAM usage is much more intensive, so they'd be more expensive. You still save a ton of wattage on the hashing chip, but the price of all those RAM chips makes litecoin FPGAs/ASICs very costly up front, and in that sense it's vastly different from bitcoin FPGAs/ASICs. They'd be more for those in high-electricity-cost areas where operating costs dwarf equipment costs.
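A minimal sketch of the algorithmic difference using Python's hashlib; Litecoin's proof of work is scrypt with N=1024, r=1, p=1, which needs roughly 128·N·r bytes of scratchpad per attempt, while SHA-256 keeps only a few hundred bytes of state:

```python
import hashlib

header = b"\x00" * 80  # stand-in for an 80-byte block header

# SHA-256 (bitcoin): tiny fixed state, cheap to replicate on an ASIC.
sha = hashlib.sha256(header).hexdigest()

# scrypt (litecoin): every attempt fills a RAM scratchpad first.
scr = hashlib.scrypt(header, salt=header, n=1024, r=1, p=1, dklen=32).hex()

print("SHA-256:", sha)
print("scrypt :", scr)
print("scrypt scratchpad per attempt:", 128 * 1024 * 1, "bytes")  # 128 KiB
```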
 
Please do even the most basic research first if you're going to post this stuff rather than be "astonished." FPGAs/ASICs are doable for litecoins, but the RAM usage is much more intensive, so they'd be more expensive. You still save a ton of wattage on the hashing chip, but the price of all those RAM chips makes litecoin FPGAs/ASICs very costly up front, and in that sense it's vastly different from bitcoin FPGAs/ASICs. They'd be more for those in high-electricity-cost areas where operating costs dwarf equipment costs.

Agreed. Hence my previous comments: if it takes 2 years to get a litecoin FPGA with memory attached to market, they could find that the top-end AMD HD 9000 series has twice as many shaders as the HD 7970, uses 30% less power, and is cheaper than a scrypt-specific FPGA.

In theory a single GPU like this, with some tweaking, could hash around 1500+ kH/s.
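Back-of-envelope on that, taking the 7970's ~750 kH/s as the baseline (the HD 9000 figures are pure speculation, as noted):

```python
# Hypothetical HD 9000: twice the shaders of a 7970 at 30% less power.
hd7970_khs, hd7970_w = 750.0, 250.0
hd9000_khs = 2 * hd7970_khs   # ~1500 kH/s, if hashrate scales with shaders
hd9000_w = 0.7 * hd7970_w     # ~175 W
print(f"{hd9000_khs:.0f} kH/s at {hd9000_w:.0f} W "
      f"= {hd9000_khs / hd9000_w:.1f} kH/s per watt")
```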

M
 
Agreed. Hence my previous comments: if it takes 2 years to get a litecoin FPGA with memory attached to market, they could find that the top-end AMD HD 9000 series has twice as many shaders as the HD 7970, uses 30% less power, and is cheaper than a scrypt-specific FPGA.

In theory a single GPU like this, with some tweaking, could hash around 1500+ kH/s.

M

Then explain to me why a 5970 can do the same hashrate as a 7970: 750 kH/s.

The 5970 is 40 nm, the 7970 is 28 nm. The next node shrink will be to 20 nm (including the 9000 series), a much smaller jump. I don't see why they should suddenly gain so much more.

I'm sure it's different. But are you sure it's much harder? Or does it just require a different ASIC or FPGA? My point is that bitcoin was supposedly the first venture into this "mining" thing. It was unknown and took a while to gain steam. But now everybody is looking out for the next big thing, especially with BTC difficulty rising so quickly. Rest assured there are people, companies, and industries working on Litecoin FPGA/ASIC/Avalon-type systems as we speak. Right now. I don't know how some of you cannot, or simply will not, see this. Makes no difference to me; the lack of foresight is just sometimes astonishing.

Read about the difference between SHA-256 and scrypt.
 
Then explain to me why a 5970 can do the same hashrate as a 7970: 750 kH/s.

The 5970 is 40 nm, the 7970 is 28 nm. The next node shrink will be to 20 nm (including the 9000 series), a much smaller jump. I don't see why they should suddenly gain so much more.

I don't get it; 20 nm from 28 nm is roughly the same jump as 40 nm to 28 nm. The only problem is that GPU prices are going up.
 
My hash rate with two 7950s at 1100/1600 is pretty dynamic.

 