Section 6: Latest advancements in DLT technology
CONSENSUS CONVERGENCE MODE, AND BY EXTENSION, GOVERNANCE
One use case for DLTs is to use them as governance tools through a set of smart contracts.
In the governance of blockchains themselves, the issues have not evolved much over the last three months. In a nutshell: on the one hand, PoW has proven highly reliable but faces opposition due to the very large electrical energy consumed by miners competing to find blocks; on the other hand, PoS would solve the energy issue but struggles to offer comparable levels of safety, and designing a clever and fair system on this principle is proving difficult. So, the main constraints that have to be respected are:
• Keeping the overall amount of energy consumption at a sustainable level
• Avoiding hard forks
• Keeping security levels uncompromised
Here are the latest developments in the field of DLT governance (in terms of technical decisions to be implemented in the protocols).
First, note that the reluctance of Bitcoin's developer community to evolve the protocol has already led to many forks, its governance principle being ultimately to stick to Satoshi's original design.
A remark on common criticisms of PoS:
• "The rich will get richer, and the poor will remain poor": this can just as well be said of PoW mining, where money is needed to buy computing capacity. Business models can also be imagined where one takes out a loan to buy ETH to stake.
• "The approach blurs the frontier with private blockchains, where a reduced number of people retain power": again, this is equally true of PoW, where some mining pools hold a huge fraction of total hash power.
Cardano has just claimed to have successfully cracked PoS with a protocol of its own, Ouroboros Genesis.
Delegated Proof of Stake is gaining momentum, notably with EOS, in which one coin = one vote is used to elect the addresses that then produce the blocks.
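The one-coin-one-vote election can be sketched as follows (a minimal illustration of the DPoS idea; the function name, addresses, and stake figures are invented for this example, not EOS's actual parameters):

```python
# Toy DPoS election: stake-weighted votes elect the top-N block producers,
# who then take turns producing blocks in round-robin order.
from collections import Counter

def elect_producers(votes: dict, n: int) -> list:
    """votes maps candidate address -> total coins voted for it."""
    tally = Counter(votes)
    return [addr for addr, _ in tally.most_common(n)]

votes = {"addr_a": 500, "addr_b": 1200, "addr_c": 300, "addr_d": 800}
producers = elect_producers(votes, n=3)
print(producers)  # highest-staked candidates first

# Round-robin block-production schedule for the next 6 slots:
schedule = [producers[i % len(producers)] for i in range(6)]
```

The key property is that block producers are known in advance for a whole round, which removes the block-finding race of PoW.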
Various smaller chain initiatives have focused on governance, among them the Tezos and Decred projects. They include mechanisms to converge on the majority's view of what needs to be done. For the moment, this is still a work in progress or limited to minor blockchains.
Proof of Authority is applicable in private blockchains.
Last, "off-chain governance" is debated as well: consensus on the evolution of the protocol has to be reached in the "real world" instead of being built into the on-chain incentives of the actors. However, it does not look very satisfactory, and no truly effective initiative of this kind exists.
To conclude, while Bitcoin and some older minor blockchains may remain on PoW, we should expect some clever PoS or DPoS to ultimately be the mechanism retained in most blockchains. In the short term, Ethereum and Cardano will be carefully scrutinized, together with the EOS main-net launch later this year. Succeeding at a PoS implementation will be a huge positive differentiator for the blockchains that achieve it, while PoW may remain the most secure system on the market.
SCALABILITY IN TERMS OF TRANSACTIONS PER UNIT OF TIME
Here is the latest news in the field of raising the throughput of the main blockchains.
BTC's Lightning Network
A main-net and wallet applications were launched in April. LN promises to reduce fees to an insignificant level while allowing thousands of transactions per second, settled instantly.
To recall the principle: it starts with pairs of individuals opening "payment channels," each committing an amount of BTC of their choosing to the channel. This creation is recorded on the main Bitcoin network, and the BTC concerned are locked on the main-net. Then, when a pair of LN users exchange BTC payments, the transactions take place on the side network, no longer on the main BTC blockchain. To become truly powerful, the system requires many individuals to open channels, so that the network can route payments between two individuals who do not otherwise know each other (using onion encryption within the network). When a channel is closed, the remaining BTC balances return to the main BTC net. All of this is enforced by contracts (hash time-locked Bitcoin scripts) whose settlement happens on the main chain.
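The channel lifecycle described above can be sketched as a toy state machine (a minimal illustration of the open/pay/close pattern, not the actual Lightning protocol; the class and method names are invented for this example):

```python
class PaymentChannel:
    """Toy two-party payment channel: funds are locked at open time,
    balances are updated off-chain, and only the final balances are
    settled on-chain when the channel closes."""

    def __init__(self, funding_a: int, funding_b: int):
        # Opening tx recorded on-chain: both parties lock BTC (in satoshis).
        self.balances = {"A": funding_a, "B": funding_b}
        self.open = True
        self.offchain_updates = 0

    def pay(self, sender: str, receiver: str, amount: int):
        # Off-chain update: no main-chain transaction is broadcast.
        assert self.open and self.balances[sender] >= amount
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.offchain_updates += 1

    def close(self) -> dict:
        # Closing tx on-chain returns the remaining balances to each party.
        self.open = False
        return dict(self.balances)

channel = PaymentChannel(funding_a=50_000, funding_b=30_000)
channel.pay("A", "B", 10_000)
channel.pay("B", "A", 2_000)
print(channel.close())  # only open + close ever touch the main chain
```

However many off-chain updates occur, the main chain sees exactly two transactions (open and close), which is where the scalability gain comes from.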
In the case of wide adoption, this solution has the potential to be a game changer. In my view, it is far more credible at addressing scalability than increasing the block size, freeing up bytes within blocks, or reducing the time between blocks.
Note: the Bitcoin network as it stands is limited to roughly 200 million transactions per year, so if 1 billion people each want to open or close one channel per year, there will still be a bottleneck on the main net.
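The ~200 million figure can be checked with a back-of-the-envelope calculation (the 250-byte average transaction size is an assumption for illustration):

```python
# Rough Bitcoin main-net capacity from block interval and block size.
SECONDS_PER_YEAR = 365 * 24 * 3600
block_interval_s = 600        # Bitcoin targets one block every 10 minutes
block_size_bytes = 1_000_000  # 1 MB block size limit
avg_tx_bytes = 250            # assumed average transaction size

tx_per_block = block_size_bytes // avg_tx_bytes
tx_per_second = tx_per_block / block_interval_s
tx_per_year = tx_per_second * SECONDS_PER_YEAR
print(f"{tx_per_second:.1f} tx/s, ~{tx_per_year / 1e6:.0f} M tx/year")
```

With these assumptions the network handles about 7 tx/s, i.e. on the order of 200 million transactions per year, consistent with the bottleneck noted above.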
Meanwhile, other initiatives on the BTC main-net are progressing too, with growing acceptance of SegWit.
Choosing PoS over PoW is far better for scalability purposes, as highlighted by Cardano's researchers, with throughput in the several thousands of transactions per second already proven on NEM. Cardano's innovation in scalability appears to be linked to a specific implication of PoS; it remains to be quantified, as too little information is available so far.
Ethereum's work on sharding appears to be progressing, with V. Buterin announcing in April that "it is coming."
The principle of sharding is that addresses and the state database are split into shards, and some nodes become collators, meaning they receive the transactions for a given shard. Each shard thus has its own chain of the transactions that concern it, while the main chain collects the headers of all the shards. The challenges to solve are numerous: keeping each shard safe from attacks, especially in PoW mode, and allowing cross-shard communication as soon as a transaction triggers contracts that send effects to addresses in different shards. This may be detailed in the coming weeks or months, but if the Ethereum Foundation is confident about sharding, they must have made satisfactory progress on this, most probably through a mechanism in which one shard emits a receipt that the other shard concerned consumes when the transaction completes (thereby increasing the overall traffic of information).
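The emit-and-consume receipt mechanism can be sketched as follows (a simplified illustration of the idea; the data structures and names are invented for this example, not Ethereum's actual design):

```python
# Toy cross-shard transfer via receipts: shard A debits the sender and
# emits a receipt; shard B later consumes the receipt and credits the
# receiver. The extra receipt messages increase overall traffic.
from dataclasses import dataclass, field

@dataclass
class Shard:
    balances: dict = field(default_factory=dict)
    outbox: list = field(default_factory=list)   # emitted receipts
    consumed: set = field(default_factory=set)   # receipt ids already applied

    def send_cross_shard(self, sender, receiver, amount, receipt_id):
        assert self.balances.get(sender, 0) >= amount
        self.balances[sender] -= amount
        self.outbox.append((receipt_id, receiver, amount))

    def consume_receipt(self, receipt_id, receiver, amount):
        assert receipt_id not in self.consumed   # prevent double-crediting
        self.consumed.add(receipt_id)
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

shard_a = Shard(balances={"alice": 100})
shard_b = Shard(balances={"bob": 10})

shard_a.send_cross_shard("alice", "bob", 40, receipt_id=1)
for rid, receiver, amount in shard_a.outbox:   # relayed via the main chain
    shard_b.consume_receipt(rid, receiver, amount)

print(shard_a.balances, shard_b.balances)
```

Note the `consumed` set: a receipt must be applied exactly once, which is one of the consistency problems cross-shard designs have to solve.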
Ethereum is even considering shards of shards (quadratic sharding) if it ever becomes necessary to scale up further.
Ethereum has also had a project linked with the Raiden Network, a sort of Lightning Network solution for Ethereum ("multi-hop transfers off the chain"). It looks like a second-tier development compared to sharding right now. Research on Cardano was also considering having several chains running in parallel, synchronized in epochs; however, no further disclosure is available to comment on.
To understand what options are being studied, let us look at the choices the high-throughput blockchains have made to already reach thousands of transactions per second.
To start with the basics, Bitcoin and Ethereum as they are today scale differently because Bitcoin sets the average time between blocks at 10 min versus 15 s on Ethereum, and the block size is fixed at 1 MByte on Bitcoin, whereas miners can adapt it by voting on Ethereum. These choices, probably over-conservative on Bitcoin, are explained by the propagation time on the underlying peer-to-peer networks: when the block time decreases, there is less time for nodes to learn the canonical status of the chain, and more competing forks will appear; and when the block size increases, it takes participants more time to download the data before mining the next block, which in practice may disqualify the nodes that are not the best connected or are located far away. By choosing more aggressive values, Ethereum accepts many more uncles and wasted hash power, due to the orphaned blocks that appear more often and are ultimately not included in the blockchain. Ethereum's block size, which has been increasing since last year, may reach the limits of acceptability soon enough, making new solutions necessary.
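One simple model of this trade-off treats block discovery as a Poisson process: a competing block appears while a fresh block propagates with probability roughly 1 - exp(-t_prop / t_block). The numbers below (a 2-second propagation time in particular) are illustrative assumptions, not measurements:

```python
import math

def stale_rate(propagation_s: float, block_interval_s: float) -> float:
    """Approximate fraction of blocks orphaned/uncled: the chance that a
    competing block is found during the propagation window, modelling
    block arrivals as a Poisson process."""
    return 1.0 - math.exp(-propagation_s / block_interval_s)

# Assumed ~2 s median block propagation time for both networks.
btc = stale_rate(2.0, 600.0)   # Bitcoin: 10-minute blocks
eth = stale_rate(2.0, 15.0)    # Ethereum: 15-second blocks
print(f"Bitcoin stale rate ~{btc:.1%}, Ethereum uncle rate ~{eth:.1%}")
```

The model makes the qualitative point in the paragraph above quantitative: shrinking the block interval by 40x raises the wasted-block rate far more than proportionally.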
Waves: time is divided into "epochs." For each epoch, a single epoch leader is in charge of serializing state-machine transitions. Said differently, the miner is selected in advance, which removes the lower bound on block time that on Bitcoin cannot be reduced without participants becoming unable to download blocks promptly due to network latency. This shows that a proof-of-stake algorithm can easily lift throughput to thousands of transactions per second. Steem, touted as particularly fast, works on Delegated Proof of Stake, which has allowed it to present itself as superior to others in scalability; however, ~1000 tx/s is again not sufficient, and the chain is being oriented toward sidechain networks. Stratis has also shown high throughput rates, again thanks to adopting PoS early on.
Ripple: first, Ripple has just a few seconds between blocks; second, it relies on a form of Proof of Authority, as a "supermajority" of well-identified Validating Nodes is sufficient to publish the next block. Hence the very good efficiency of the project as far as scalability is concerned, at the expense of disintermediation/decentralization.
Monero: PoW, with adaptive block sizes; the limiting factor is still the bandwidth of the nodes' internet connections.
Zilliqa: a project aimed at scaling linearly with the addition of users, based on sharding and PoW.
Raiblocks: claims to be almost instantaneous, with no fees. It promotes the use of a block-lattice instead of a blockchain: every account has its own blockchain (account-chain) that registers all transactions regarding that user, while the network itself only keeps the list of balances per account. When two accounts transact, a "send" transaction is created on the sender's chain, then a "receive" transaction on the receiver's chain, sequentially. A very small PoW has to be performed, just to protect the network from attacks. Whenever a conflict has to be resolved, a Delegated Proof of Stake vote is used. From a scaling point of view, since transactions are recorded from each participant's point of view, it can be thought to scale almost without limit. The question is: if this works, why do we not hear more about it? Probably the resolution of conflicts remains a heavy point to solve.
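The send/receive pairing can be sketched as follows (a minimal model of the block-lattice idea; the structures and names are invented for illustration, not Raiblocks' actual data format):

```python
# Toy block-lattice: each account owns its own chain; a transfer is a
# 'send' block on the sender's chain followed by a matching 'receive'
# block on the receiver's chain. The network only tracks balances.
class Account:
    def __init__(self, name: str, balance: int = 0):
        self.name = name
        self.balance = balance
        self.chain = []  # this account's personal chain of blocks

    def send(self, receiver: "Account", amount: int):
        assert self.balance >= amount
        self.balance -= amount
        self.chain.append(("send", receiver.name, amount))
        receiver._receive(self.name, amount)

    def _receive(self, sender_name: str, amount: int):
        self.balance += amount
        self.chain.append(("receive", sender_name, amount))

alice = Account("alice", balance=100)
bob = Account("bob")
alice.send(bob, 30)

print(alice.chain)  # the transfer as seen from alice's chain
print(bob.chain)    # the same transfer as seen from bob's chain
```

Because each transfer only appends to the two chains involved, unrelated transfers never contend for space in a shared block, which is where the scaling claim comes from.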
IOTA: this DLT does not have the blockchains' issues, as agreement is obtained by gossiping about gossips, ultimately converging on consensus. The probability that a transaction is ultimately declared invalid never quite reaches zero, but in practice, as the network grows, the consensus principle holds and does not slow the execution of transactions.
As a conclusion, we can say that apart from cosmetic improvements to block time and block size, which help marginally, and intelligent PoS mechanisms, which help greatly, the real breakthrough toward potentially unlimited scale is being researched around sharding and sidechain networks. There are also radically different approaches on alternative DLTs, such as acyclic-graph methods and block-lattices, that likewise scale up very well.
Overall, scalability issues look about to be overcome, even if some more work remains to be done, and this is quite encouraging for blockchain as a whole.
SCALABILITY IN TERMS OF LEDGER SIZES
In the list of technical issues to be solved for DLTs, the ever-growing size of the distributed ledger is a big problem, all the more so if throughput increases drastically. However, it has not yet been a top research priority.
Consequently, no real progress has been made, and no positions taken, in the last months regarding pruning, compression, or checkpoints.
Note: sharding also partly addresses the ledger size, by fractioning the ledger across shards.
SCALABILITY IN TERMS OF COMPUTING CAPABILITIES (FOR VM)
Once scalability is solved on the transaction side, the computing capacity of the deterministic state machines is logically solved as well.
Cardano’s RINA is addressing this issue, to be explored in a later study.
INTEROPERABILITY BETWEEN DLTS
The need for the various DLTs to dialogue with one another is a growing concern; a "Blockchain Interoperability Alliance" was created in November 2017.
The purpose would be to standardize some smart-contract features to allow data exchange across DLTs. Ultimately, interaction between blockchains would occur when the two chains that are to communicate share a common hash function, following sidechain principles: some tokens are frozen on the first chain, which sends a transaction to the second chain.
It remains to be seen whether the developers of a given blockchain will be enthusiastic about letting data flow out of their ecosystem. However, provided the standards are available in the protocol of a given blockchain, it is technically feasible for it to dialogue with counterpart DLTs.
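The freeze-and-transfer pattern described above can be sketched as a two-way peg (a simplified illustration of the sidechain principle; the classes and names are invented for this example):

```python
# Toy two-way peg: tokens frozen on chain A back wrapped tokens minted
# on chain B; burning the wrapped tokens releases the originals on A.
class ChainA:
    def __init__(self):
        self.balances = {"alice": 100}
        self.frozen = 0

    def freeze(self, user: str, amount: int) -> dict:
        assert self.balances.get(user, 0) >= amount
        self.balances[user] -= amount
        self.frozen += amount
        return {"user": user, "amount": amount}  # proof relayed to chain B

    def release(self, user: str, amount: int):
        assert self.frozen >= amount
        self.frozen -= amount
        self.balances[user] = self.balances.get(user, 0) + amount

class ChainB:
    def __init__(self):
        self.wrapped = {}

    def mint(self, proof: dict):
        # A real peg would verify the proof using the shared hash function.
        u = proof["user"]
        self.wrapped[u] = self.wrapped.get(u, 0) + proof["amount"]

    def burn(self, user: str, amount: int) -> dict:
        assert self.wrapped.get(user, 0) >= amount
        self.wrapped[user] -= amount
        return {"user": user, "amount": amount}

a, b = ChainA(), ChainB()
b.mint(a.freeze("alice", 40))     # 40 frozen on A, 40 wrapped minted on B
a.release(**b.burn("alice", 15))  # 15 burned on B, 15 released back on A
print(a.balances, a.frozen, b.wrapped)
```

The invariant to preserve is that frozen tokens on the first chain always equal wrapped tokens in circulation on the second, so no value is created or destroyed by crossing chains.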
ROBUSTNESS OF CRYPTOGRAPHY
Another issue to be documented regularly from now on is the reliability of the cryptography used in the various DLTs to authenticate users, and the robustness of the hash functions. Indeed, if the processing power of devices continues to increase exponentially, as Moore's law has long predicted, then the safety of DLTs altogether is at risk. Quantum computing is another concern.
This cryptographic robustness is not a burning issue, with experts declaring Bitcoin at risk somewhere between the late 2020s and 2040, but we should explore it in more detail in subsequent quarterly analyses.
VOLATILITY AND STABLECOINS
The volatility of crypto assets is not a technical issue but a practical economic one. We can do nothing but acknowledge that high price volatility is an obstacle to the mass adoption of DLT coins as currencies. While this volatility may ease in the mid-term, some initiatives have been undertaken to try to solve it: the so-called stablecoins.
The objective is to create a DLT token whose value tracks assets perceived as stable, be they fiat currencies like the Euro, or directly consumer-price indexes.
A centralized entity can offer tokens that are collateralized, be it by fiat holdings, precious metals (gold), or anything else. This is the easy way of doing it, and anything can serve as backing for such tokenization, the tokens then being swapped just as backed claims were in antiquity.
Alternatively, the issuing entity can be a smart contract that observes the price relative to its index and automatically adjusts the number of coins in circulation so that the price matches the objective again. This becomes somewhat artificial, and raises the questions of where value is stored when the issuing smart contract sells coins, and of what happens if this reserve is too small when it comes to buying tokens back during periods of monetary contraction. This looks a lot like reinventing a central bank, decentralized and with no control whatsoever over it. The idea may be philosophically interesting, but it remains to be seen whether the general public would engage with it.
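The supply-adjustment rule can be sketched as a simple rebase (a toy model; the proportional rule and parameter names are assumptions for illustration, not any specific project's algorithm):

```python
# Toy algorithmic stablecoin: when the observed price deviates from the
# target, total supply is adjusted proportionally (quantity-theory
# intuition: price ~ demand / supply, so supply_new = supply * price / target).
def rebase(supply: float, observed_price: float, target: float = 1.0) -> float:
    return supply * (observed_price / target)

supply = 1_000_000.0
supply = rebase(supply, observed_price=1.25)  # price too high: expand supply
supply = rebase(supply, observed_price=0.75)  # price too low: contract supply
print(supply)  # 937500.0
```

The contraction step is where the problem noted above bites: the contract must somehow buy tokens back, which requires a reserve that may not exist.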
What is sure, in any case, is that the entities behind all this will make sure to be paid a fee for this stabilization service.
CONCLUSION OF THE TECHNICAL DEVELOPMENTS
DLT pain points are being actively researched. On most of the issues, especially scalability, good progress has already been made in just the last three months. In general, we can be optimistic.
However, we are clearly not there yet. All the fresh cash poured into the sector in 2017 is likely to foster further quick progress, jointly with the development of user-friendly application interfaces.
Real mainstream adoption by the general public could happen only then.