Looks like my last post about the basics of cryptocurrencies sparked some attention; as I am writing, 1 (one!) Bitcoin is trading at around 14K USD and the percentage increase since the beginning of the year is astronomical, so that might be a good part of the reason.
However, there are many different cryptocurrencies and, while they all share the basic concepts, they are designed in slightly different ways which, by design, affects their markets.
They all have a blockchain as described in my previous article, they have wallets, miners etc.
Some were created with slightly different goals in mind, but the main differentiation is how the mining is done.
To understand how we got here, we need to see what happened with Bitcoin.
At the beginning, the software for the mining process was pretty much standard: it would involve some (most likely C) routine running on the most common operating systems (Windows, Linux, macOS etc.).
This routine would use the CPU to run the loop and calculate the hashes.
It worked, but we quickly realized that common CPUs have 4 cores, meaning they can run 4 routines in parallel, so miners moved to a multi-threaded approach: the routine ran at the same speed, but 4 copies of it at a time.
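To make the idea concrete, here is a minimal sketch of that brute-force loop. The hash below is a toy stand-in (an FNV-style mixer), not Bitcoin's real double SHA-256, and the block header and target are made up for illustration, but the structure is the same: keep trying nonces until the hash falls below a target.

```c
/* Toy mining loop: NOT Bitcoin's real double SHA-256, just an FNV-style
   mixer standing in for the hash so the sketch stays self-contained.
   The structure is the real one: hash the block header with candidate
   nonces until the result falls below the target. */
#include <stdint.h>
#include <stdio.h>

/* Toy stand-in for the real hash function. */
static uint64_t toy_hash(const char *data, uint64_t nonce)
{
    uint64_t h = 1469598103934665603ULL;            /* FNV offset basis */
    for (const char *p = data; *p; ++p)
        h = (h ^ (uint64_t)*p) * 1099511628211ULL;  /* FNV prime */
    return (h ^ nonce) * 1099511628211ULL;          /* mix in the nonce */
}

/* Scan a range of nonces; each core (or thread) gets its own range. */
int mine(const char *block_header, uint64_t target,
         uint64_t start, uint64_t end, uint64_t *found_nonce)
{
    for (uint64_t nonce = start; nonce < end; ++nonce) {
        if (toy_hash(block_header, nonce) < target) {
            *found_nonce = nonce;
            return 1;               /* puzzle solved */
        }
    }
    return 0;                       /* try another range */
}

int main(void)
{
    uint64_t nonce;
    /* Header and target are made up for illustration. */
    if (mine("previous-hash|transactions|timestamp", 1ULL << 40,
             0, 100000000ULL, &nonce))
        printf("found nonce: %llu\n", (unsigned long long)nonce);
    return 0;
}
```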
By design, all these algorithms must be relatively simple to compute, because we need many iterations to find the solution to the puzzle (finding the correct nonce), but it should be quick to verify the correctness of such a nonce, otherwise it would generate a lot of work for non-miners.
Let me expand on this.
Say that on average you need to run a loop 10.000 times before you can find the solution.
Verifying such a solution requires one single iteration, therefore 1/10.000 of the time of the mining process.
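In code the asymmetry is easy to see. Continuing the toy sketch above (this function would sit in the same file, next to mine()), verification is just one hash and one comparison:

```c
/* Verification: recompute the hash once with the claimed nonce and check
   it against the target. One iteration instead of ~10.000. */
int verify(const char *block_header, uint64_t target, uint64_t nonce)
{
    return toy_hash(block_header, nonce) < target;
}
```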
Typically, when you mine, you are fed a number of tasks by "the network" (normally a mining pool) and then you send back the solutions.
The pool must verify all your solutions, and also all the solutions of the other peers connected to the pool, therefore the verification process must be fast.
Finally, this boils down to the fact that mining needs a computation which is relatively simple, but has to be executed a huge number of times.
This is the best scenario for parallel computing: simple tasks executed many times.
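As an illustration of "many simple tasks in parallel", here is a sketch (assuming POSIX threads) that reuses the mine() routine from the earlier toy sketch and splits the nonce space across 4 workers, one per core; its main would replace the single-threaded one from that sketch. The same idea scales to thousands of GPU cores.

```c
/* Split the nonce space across N_WORKERS threads; each worker scans its
   own slice. Reuses mine() from the earlier toy sketch (link the two
   files together, using this main). Error handling is omitted. */
#include <pthread.h>
#include <stdint.h>
#include <stdio.h>

#define N_WORKERS 4

int mine(const char *block_header, uint64_t target,
         uint64_t start, uint64_t end, uint64_t *found_nonce);

typedef struct {
    uint64_t start, end, nonce;
    int solved;
} job_t;

static void *worker(void *arg)
{
    job_t *job = (job_t *)arg;
    job->solved = mine("previous-hash|transactions|timestamp", 1ULL << 40,
                       job->start, job->end, &job->nonce);
    return NULL;
}

int main(void)
{
    pthread_t threads[N_WORKERS];
    job_t jobs[N_WORKERS];
    uint64_t slice = 100000000ULL / N_WORKERS;

    for (int i = 0; i < N_WORKERS; ++i) {
        jobs[i] = (job_t){ i * slice, (i + 1) * slice, 0, 0 };
        pthread_create(&threads[i], NULL, worker, &jobs[i]);
    }
    for (int i = 0; i < N_WORKERS; ++i) {
        pthread_join(threads[i], NULL);
        if (jobs[i].solved)
            printf("worker %d found nonce %llu\n", i,
                   (unsigned long long)jobs[i].nonce);
    }
    return 0;
}
```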
It turns out that modern VGA cards have processors (GPUs) very well equipped for that: their cores are way less sophisticated than those in our main CPU, but they are good enough to run those basic calculations.
Probably the most desired card for mining at the moment is the NVIDIA GTX 1080 Ti, and to put things in perspective, this card has 3.584 cores (@1.48 GHz) while a normal PC CPU usually has 4 cores (@2 to 4 GHz).
So, the GPU cores are still slower than the main CPU cores, but you get PLENTY of them, plus it is possible (with dedicated motherboards) to install 8 of these cards in a single PC, which makes 28K+ cores!
So, GPU mining was immediately a huge leap forward, but it did not disrupt the market too much because GPUs are, after all, consumer products. Sure, buying 8 1080 Tis is quite an investment, but it is still feasible, and you could start with one and then expand (don't do it, not for Bitcoin at least, it would not pay off; keep reading).
But GPUs are not the only hardware that deals well with parallel computing: FPGAs (Field Programmable Gate Arrays) are extremely interesting devices that allow the designer to create computing logic (including simple CPU cores) tailored to a specific task.
Now, GPUs were not designed to mine currencies, they were repurposed for that, while FPGAs are a sort of blank canvas that you can arrange pretty much the way you like (I played with them, you can see how I created, ironically, a VGA interface).
The problem with FPGAs is that their price tends to increase steeply when you need many logic cells (to accommodate many computing units) and high performance.
However, FPGAs were invented to allow prototyping of solutions: whatever you can do with them can be transferred to ASICs.
ASICs (Application Specific Integrated Circuits) are designed to perform a single task; they are not as versatile as CPUs.
While you can purpose an FPGA to do a task (technically, you synthesize a circuit) and then erase it and repurpose it for something else, an ASIC is the implementation of one single circuit.
ASICs are not a good solution unless you want to run a mass production: typically they have a very high production setup cost and a low cost per item afterwards.
When mining and Bitcoin became mainstream, the market was big enough to justify mass production, and that's when ASIC miners changed everything.
Since they are designed for that specific role only, they are "cheap" and power efficient (that's why you should not mine bitcoins with your GPU), allowing massive parallel computing.
Technically, this changed the market because these are not typically consumer products: while you probably have at least one GPU in your house, chances are that you don't have an ASIC miner (but you can buy one if you like, even though its profitability is going to drop quite quickly; usually only the latest model is profitable, so get ready to change them often).
You can actually buy one, but again, unless your electricity is really cheap, you may waste money.
This means that specialized companies were created; they usually have access to cheap electricity (e.g. solar or geothermal) and always keep up to date with the latest ASIC miner model.
The reward of the mining process is linked to the average time and effort of mining a coin, so those companies blew all the home miners out of the water.
At this point the cryptocurrencies, which were designed to be controlled by an extremely wide segment of the population, started to look like common valuable resources: controlled by big investors.
To (try to) avoid that, new currencies and markets were created, such as Ethereum or Monero.
They have many different peculiarities but, from the mining point of view, they tried to make the calculation less efficient on ASICs.
How to achieve that?
We saw that ASICs are great for simple calculations run massively in parallel, so the approach was to make the calculation more efficient on single-threaded architectures.
One path was to increase the complexity of the hashing algorithm, or at least to make it less efficient by design.
This favors cores with a high frequency (such as CPUs), but running many computations simultaneously would still be more efficient.
So the trick was to artificially create a slow hashing algorithm that needs to use a lot of memory.
Monero implemented the CryptoNight algorithm (for those who like technical documentation, check here) which requires about 2MB of temporary buffer (the "scratchpad") to compute the hash.
Why is this usually not a good fit for parallel computing?
For every instance of the process you are running, you need to allocate a separate buffer, so if you have a 10.000-core machine you need enough memory for 10.000 buffers, i.e. 10.000 × 2MB, which is roughly 20GB.
Memory is still relatively expensive and typically ASICs cannot have all that RAM available per core if we want them to have an acceptable price.
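Here is a minimal sketch of the idea (not the real CryptoNight, just the shape of the trick): every hash instance fills its own 2MB scratchpad and then reads it back in a data-dependent order, so you cannot run many instances in parallel without paying for that memory over and over.

```c
/* A toy memory-hard hash: NOT CryptoNight, just an illustration of the
   "scratchpad" idea. Every instance needs its own 2MB buffer, so 10.000
   parallel instances would need roughly 20GB of fast memory. */
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define SCRATCHPAD_BYTES (2 * 1024 * 1024)   /* 2MB per hash instance */

uint64_t memory_hard_hash(uint64_t seed)
{
    size_t words = SCRATCHPAD_BYTES / sizeof(uint64_t);
    uint64_t *pad = malloc(SCRATCHPAD_BYTES);
    uint64_t state = seed;

    if (!pad)
        return 0;

    /* Phase 1: fill the scratchpad from the seed. */
    for (size_t i = 0; i < words; ++i) {
        state = state * 6364136223846793005ULL + 1442695040888963407ULL;
        pad[i] = state;
    }
    /* Phase 2: read it back in a data-dependent (cache-unfriendly) order,
       so the whole buffer really has to stay available. */
    for (size_t i = 0; i < words; ++i) {
        state ^= pad[state % words];
        state = state * 6364136223846793005ULL + 1442695040888963407ULL;
    }
    free(pad);
    return state;
}

int main(void)
{
    printf("toy hash: %llu\n", (unsigned long long)memory_hard_hash(42));
    return 0;
}
```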
A PC normally has a lot of RAM because we use it for many different tasks; plus, PC RAM modules are widely used, which brings their average cost pretty low.
This worked when a top-of-the-line VGA card had 2GB of memory: it was still quite good for mining, but not much better than a CPU.
Just to give you a feeling, this is a benchmark I just ran on my PC mining Monero (XMR):
(my GPU is an old trusty Radeon HD 7870 with 2GB and my CPU is an Intel Core i5 4670K, both quite outdated now. The CPU is mining with 3 cores at about 1/3 of the GPU's rate)
Now, the 1080 Ti sports 11GB of RAM, which makes it pretty good at memory-intensive hashing.
In my example, I would currently be mining about 400 hashes / second of Monero, which would yield a whopping 1.21$ / day of reward (you can check it here, it varies a lot with exchange rates, and it's pretty high now), which is probably less than the cost of the power consumption of my PC.
Many mining companies currently have mining rigs based on GPUs alongside their ASIC miners, so they can mine these currencies too.
Still, GPUs are a consumer product, so those companies don't have a huge advantage over home miners, as their hash / watt rate is comparable.
Is it going to last?
Probably not: the cost of the memory is merely a matter of scale. If those currencies become attractive enough to justify bigger investments, then large-scale production of dedicated hardware with enough memory will become financially viable.
However, this memory-hungry algorithm is still a good solution: currencies are usually profitable to mine when they start (the profit is balanced by the risk that the currency itself will not be successful and will eventually fade into oblivion), then they usually pay less over time, once they are established.
The Bitcoin millionaires are those who had the coins early on; they gambled, they won, fair enough.
So, if this beginning phase does not favor big companies, but spreads the mining rewards over a wider population, in my book it's already a success.