Section 6: Latest advancements in DLT technology
As usual, and in this section in particular, it is assumed that the reader is up to speed with the studies covered in previous reports. Past issues of Blockchain Quarterly can be accessed through our website.
INVESTMENT IN DLT TECHNOLOGIES
Investment in pure distributed ledger technology occurred mainly in 2016-2017. The main projects with creative ideas were generously funded by their ICOs and by the appreciation of their tokens over the past few years; Cardano is a typical example. Furthermore, communities of developers remain active in refining the protocols, working mainly to make their “product” the best it can possibly be, now that there are sufficient proposals to confirm fundamental progress.
Investment in recent times has gone mainly to application projects. After the ICO funding phase, private equity and wealthy individual investors have regained some ground in this respect. And of course, large companies and consortiums back their own ventures and pilots, which they then try to roll out.
CONSENSUS MODE – AND GOVERNANCE
The consensus method on DLTs largely determines the scalability of the infrastructure, and in general, scalability features need to be governed carefully. Therefore, we discuss these topics mostly within the paragraph on scalability. In this issue of Blockchain Quarterly, please also refer to the comparison between execution platform infrastructures.
Technical developments on consensus modes
The EOS consensus mode has experienced minor difficulties. Some black-listed addresses were able to perform transactions because one of the block producers had not updated its black-list. This cast light on the delicate governance of EOS or, more generally, of blockchains that implement a limited level of decentralization.
Tezos completed its first blockchain vote in March, which was a historic achievement for them. It was between two alternatives, “Athens A” and “Athens B”, and concerned some amendments to the protocol that would impose a hard fork if not coordinated. Only half of the community (validators = bakers) actually voted. Athens A won, increasing the computation limit per block, and decreasing the staking amount to be selected as a baker. Overall, the test was successful.
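The mechanics of such a ballot can be sketched as a stake-weighted, two-option vote with a participation quorum. The sketch below is an illustrative toy model only, not the actual Tezos amendment procedure: the `tally` function, the roll counts and the 50% quorum are our own assumptions.

```python
# Toy model of a stake-weighted two-option vote with a participation quorum.
# Illustrative only: real Tezos voting proceeds in multiple periods with
# protocol-defined quorum rules, none of which is reproduced here.

def tally(ballots, total_rolls, quorum=0.5):
    """ballots: {baker: (option, rolls)}; returns the winner, or None if
    participation falls below the quorum."""
    counts = {}
    cast = 0
    for option, rolls in ballots.values():
        counts[option] = counts.get(option, 0) + rolls
        cast += rolls
    if cast / total_rolls < quorum:          # participation too low
        return None
    return max(counts, key=counts.get)       # plurality of stake wins

votes = {"baker1": ("Athens A", 60),
         "baker2": ("Athens B", 30),
         "baker3": ("Athens A", 10)}
print(tally(votes, total_rolls=200))         # exactly half the rolls voted
```

With these illustrative numbers, half of the stake participates, which just meets the default quorum; raising the quorum requirement to 60% would void the vote.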
Debates about hard forks of the main blockchains have decreased in the past year. This would tend to indicate that communities of developers and miners have, at least partially, converged in their visions. Or it could be that fewer incompatible evolutions have come to light. Either way, fewer forks are good news for the sustainability of blockchain implementations.
Scalability in terms of transactions per unit of time
On Bitcoin, the Lightning Network continues to gain momentum:
An app, Tippin, launched on Twitter, enables users to send small amounts of Bitcoin as tips to selected Twitter accounts. This introduces the use of Bitcoin in a very concrete way on an operational and widely-used platform – quite a milestone!
The capacity surpassed 3 million euros' worth of Bitcoin. The network is reported to have more nodes than XRP, LTC and EOS combined.
A “Lightning Torch” has been passed from celebrity to celebrity, each one adding a fraction of a Bitcoin; the torch quickly gathered over 100 euros' worth, and its size has now reached its limit: the current liquidity of the network is not sufficient for it to move further easily.
Some criticisms have emerged; one observer qualified it as “semi-custodial banking”, with the “degree of custodianship [being] proportional to the fee required to escape from a non-cooperative channel." The point seems perfectly valid. It raises the question of how likely a channel is to misbehave, and of how a channel that refuses KYC-AML constraints will be able to resist regulatory pressure and retaliatory action.
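The essential idea behind a payment channel can be sketched in a few lines. This is a deliberately minimal model under our own assumptions (the `Channel` class and satoshi amounts are illustrative); real Lightning channels rest on signed Bitcoin commitment transactions, HTLCs and revocation keys, none of which is modeled here.

```python
# Minimal sketch of off-chain payment-channel state, in the spirit of the
# Lightning Network. Illustrative only: no signatures, HTLCs or penalties.

class Channel:
    """Two-party balance sheet; the latest agreed state settles on-chain."""
    def __init__(self, alice_sats, bob_sats):
        self.balances = {"alice": alice_sats, "bob": bob_sats}
        self.seq = 0                          # newest state wins on close

    def pay(self, sender, receiver, amount):
        if amount <= 0 or self.balances[sender] < amount:
            raise ValueError("insufficient channel liquidity")
        self.balances[sender] -= amount       # purely off-chain update:
        self.balances[receiver] += amount     # no fee, no block to wait for
        self.seq += 1

ch = Channel(alice_sats=100_000, bob_sats=0)
ch.pay("alice", "bob", 10_000)                # a small "tip", instant
```

The torch's liquidity ceiling is visible even in this toy: a payment larger than the sender's side of the channel simply cannot be made, which is why the torch could not easily move further.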
Joseph Lubin, founder of ConsenSys, has explained why, in his view, the solutions competing against Ethereum are not viable. With reason, his criticisms mostly focus on the reduced level of freedom and decentralization that Corda, Fabric, EOS, etc. offer compared to Ethereum. Of course, one could argue that not all applications will require the trustless environment that Ethereum proposes. The debate is philosophical more than anything else; still, we reckon that, for non-permissioned networks, Ethereum remains the project of choice, with no equivalent.
The Constantinople hard fork has been implemented, with limited practical effect; it is merely a necessary step on the way to the PoS implementation to come, which will be the real milestone.
Input Output Hong Kong, the technical organization behind Cardano, provides updates on scalability and interoperability and announces overall progress.
Quite disappointingly, the Cardano Foundation council has repeatedly proven its ineffectiveness on various occasions, such as the Crypto Valley Summit or informal meetups. Insiders confirm this public perception.
EOS continues to gain momentum relative to Ethereum, mainly owing to its higher scalability.
OmiseGo has released the alpha version of its network, Ari, implementing a version of Plasma, usable in particular for ERC20 tokens. Applying the Plasma principle, the load is transferred from the main chain to a child chain, yielding interesting performance: a test achieved almost 3,000 transactions per second.
Scalability in terms of ledger sizes
At the moment, the issue of ledger size has not yet surfaced, mainly because throughput remains low. Only when throughput rates increase will we find ourselves having to deal with high volumes of data within the ledger.
It might be necessary, at some point in time, to implement some of the solutions that we described in earlier editions of Blockchain Quarterly. Ultimately they assume increased confidence in the nodes that are going to store all of the data, and remuneration for those who will take on the burden of maintaining the entire ledger, which will be costly. Maybe this will turn out to be the real cost of decentralization.
A decentralized data storage system may be relied upon, in some cases, to enable blockchain to work.
CONFIRMATION TIME: FINALIZATION MECHANISMS
Also hidden by the scalability issue is the confirmation time required by commercial applications. For those who trade the financial markets, for instance, confirmation time is a key parameter. It would not be acceptable for an order to take more than half a second to be processed and for feedback to be provided. A slower response time would require a change in the behavior of traders, and would be regarded as an unacceptable regression.
The statistics of block discovery in PoW add to the issue of slow block validation: a long confirmation time is one thing, but it is commercially unacceptable for applications to risk a situation where a block takes much longer than usual to confirm, just because “it happens”. These considerations add weight to the requirement for alternative consensus modes in enterprise-grade blockchains.
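The “it happens” risk can be quantified. In PoW, block discovery is memoryless, so inter-block times are approximately exponentially distributed; under that standard model (constant hashrate, Bitcoin's 10-minute target), the probability of an unusually slow block is easy to compute:

```python
# Tail probability of a slow block under the exponential inter-block model.
# Assumes constant hashrate between difficulty adjustments.
import math

def prob_block_slower_than(minutes, target=10.0):
    """P(next block takes more than `minutes`), exponential model."""
    return math.exp(-minutes / target)

print(f"> 30 min: {prob_block_slower_than(30):.1%}")   # 5.0%, about 1 block in 20
print(f"> 60 min: {prob_block_slower_than(60):.1%}")   # 0.2%
```

So roughly one block in twenty takes over half an hour: exactly the kind of unpredictable latency that trading applications cannot tolerate.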
Cosmos has released its live software, with the first block mined on March 13th. Let’s take the occasion to describe in more detail what the Cosmos project aims at. The project presents itself as the “most scalable, powerful, customizable, interoperable ecosystem of connected blockchains”.
PoW is discarded in favor of PoS for its higher throughput rate, achieved through the Tendermint Byzantine Fault Tolerant consensus protocol. Without diving too deeply into it, Tendermint is designed to be the simplest possible, never-forking PoS consensus engine; because blocks are final, there is no need to wait for further confirmations, which results in high-speed transaction execution.
Cosmos is architected as a network of many independent blockchains called “zones”; each zone is powered by Tendermint Core. The first zone is called the “Cosmos Hub”. All inter-zone transfers pass through the Cosmos Hub, which keeps track of the total amount of tokens held by each zone. In this way, each zone is isolated from the potential failure of another zone. Anyone can connect a new zone to the system. One would note similarities with the intended Ethereum sharding architecture.
On the Cosmos Hub, multiple types of assets can be owned by individual users or by zones. These tokenized assets can be moved from one zone to another, thanks to a special packet relying on Inter-Blockchain Communication (IBC), which is touted to be a messaging protocol between blockchains, similar to what TCP/IP is to the Internet. This IBC packet is called a “coin packet”. The Hub is responsible for preserving the global invariance of the total amount of each type of token across all the zones. The packet needs to be committed by the sender, the hub and the receiving blockchain.
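The Hub's bookkeeping role can be sketched as follows. This is a toy model of the principle only: the `Hub` class, the zone names and the `photon` denomination are illustrative assumptions, and real IBC transfers additionally involve light-client proofs and the three-party commitment described above.

```python
# Toy model of the Cosmos Hub's per-zone token accounting: a "coin packet"
# debits the source zone and credits the destination zone, so the global
# supply of each denomination is invariant. Illustrative names only.

class Hub:
    def __init__(self):
        self.ledger = {}                          # (zone, denom) -> amount

    def credit(self, zone, denom, amount):
        key = (zone, denom)
        self.ledger[key] = self.ledger.get(key, 0) + amount

    def transfer(self, src, dst, denom, amount):
        # the "coin packet": hub-mediated, supply-preserving
        if self.ledger.get((src, denom), 0) < amount:
            raise ValueError("zone does not hold enough tokens")
        self.ledger[(src, denom)] -= amount
        self.credit(dst, denom, amount)

    def total(self, denom):
        # the global invariant the Hub must preserve
        return sum(v for (z, d), v in self.ledger.items() if d == denom)

hub = Hub()
hub.credit("zoneA", "photon", 100)
hub.transfer("zoneA", "zoneB", "photon", 40)
assert hub.total("photon") == 100     # supply unchanged by the transfer
```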
An interesting feature is that a given zone can act as a bridge to another DLT / cryptocurrency.
Cosmos insists on “usability”, which is the promotion of a developer-friendly modular framework to build blockchains.
On-chain governance is built-in.
ATOM, the native cryptocurrency of the new blockchain Hub, cannot be transferred at the moment; an on-chain vote may one day authorize the transfers when the network is deemed to be stable.
It is also worth highlighting in this section that Accenture has patented two solutions focused on interoperability. One involves a smart contract that uses a pre-commit authorization enabling the sender to lock the token data on the blockchain while waiting for confirmation from the receiver.
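The lock-then-confirm pattern can be sketched as below. This is our reading of the general idea, not the patent's actual design; the `LockBox` class and all names are hypothetical.

```python
# Sketch of a pre-commit lock on the sending chain: tokens are frozen
# pending the receiver's confirmation, then released or refunded.
# Hypothetical names; not Accenture's actual patented mechanism.

class LockBox:
    """Escrow ledger on the sending chain (illustrative)."""
    def __init__(self):
        self.locked = {}                  # transfer_id -> (owner, amount)

    def pre_commit(self, transfer_id, owner, amount):
        # sender's tokens are frozen pending confirmation
        self.locked[transfer_id] = (owner, amount)

    def on_receiver_confirm(self, transfer_id):
        # receiving chain confirmed: release the locked tokens
        return self.locked.pop(transfer_id)

    def on_timeout(self, transfer_id):
        # no confirmation arrived: refund the sender
        return self.locked.pop(transfer_id)

box = LockBox()
box.pre_commit("tx-1", "sender_a", 50)
owner, amount = box.on_receiver_confirm("tx-1")
```

The point of the pattern is that, at any moment, the tokens are either locked, delivered or refunded: they can never exist on both chains at once.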
PRIVACY – CONFIDENTIALITY
The two implementations of MW (MimbleWimble), Grin and Beam, have some history worth mentioning: Grin came first, followed by Beam, which initially raised suspicion as it was not open source. Beam has a corporate structure and commercial interests pushing it, while Grin has retained a research focus, with a community in charge of it. As such, Grin can still be regarded as belonging mainly to the realm of amateurs, with developers working only part-time. It seems that Grin intends to be a currency, whereas Beam aims at being a store of value, akin to BTC.
In summary, some news from the small, but loyal MW village:
• Beam has announced a plan to create a MimbleWimble-compatible Lightning Network.
• Grin has voted to contract a third full-time developer.
• Litecoin is reported to be considering implementing MimbleWimble, possibly through extension blocks.
• So far, MW is not quantum resistant.
Zero Knowledge Proof
ZKP is becoming increasingly better understood by development teams – at least as far as its implementation is concerned. As a result, we are seeing an increasing number of projects that claim to have implemented it, or plan to implement it in their solution.
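To make the idea concrete, here is the smallest classical example of a zero-knowledge proof of knowledge: Schnorr's identification protocol, in which the prover convinces the verifier that she knows x with y = g^x mod p without revealing x. The parameters below are deliberately tiny toy values for illustration; no real system would use such a small group.

```python
# Schnorr identification protocol with toy parameters (educational only).
# The prover knows x; the verifier learns nothing about x, yet is convinced.
import random

p, q, g = 23, 11, 2        # g = 2 has prime order q = 11 in Z_23*
x = 7                      # prover's secret
y = pow(g, x, p)           # public key

r = random.randrange(q)    # prover: random nonce
t = pow(g, r, p)           # prover -> verifier: commitment
c = random.randrange(q)    # verifier -> prover: random challenge
s = (r + c * x) % q        # prover -> verifier: response

# verifier's check: g^s == t * y^c (mod p) holds iff the response is
# consistent with knowledge of x, while s alone reveals nothing about x
assert pow(g, s, p) == (t * pow(y, c, p)) % p
```

Projects claiming ZKP support generally implement far heavier non-interactive variants (zk-SNARKs and the like), but the commit-challenge-response structure above is the common ancestor of all of them.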
Analysis of volatility
Volatility has been at very low levels this winter, with many analysts thinking it would lead to a price surge, as it did in 2015. They were right.
The main topic to discuss here is the trend of large, traditional investment banks issuing their own crypto-fiat. JP Morgan has gone big on this. We highlighted earlier that this is potentially impactful in terms of macroeconomics and of fiat money creation and circulation, but it goes beyond that. Many people are talking about it, including many critics from the pure crypto-sphere, activists and enthusiasts; much concern is also being expressed within the financial industry. JP Morgan’s competitors are working hard to keep pace, Citigroup especially. So, what makes it especially interesting to comment on?
This token is of a permissioned nature: JP Morgan’s DLT-operated ecosystem manages the blockchain and handles the unit of value. They have decided to match each token to one USD; that is their choice, but it is, of course, a convention: these coins are merely accounting tools. For the big banks, however, this opens the door to something as important as replacing their books! How they switch their accounting system, interface it and transition from the legacy systems is their business. It may take quite some time, but it looks like they have triggered exactly that.
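The one-token-one-dollar convention amounts to a simple accounting invariant, which can be sketched as follows. This is our own illustration of the principle, not JPM Coin's actual implementation; the `BankCoin` class and client names are hypothetical.

```python
# Sketch of 1:1 tokenized fiat: tokens are minted against USD deposits and
# burned on redemption, so circulating tokens always equal USD reserves.
# Illustrative only; not any bank's actual system.

class BankCoin:
    def __init__(self):
        self.usd_reserve = 0
        self.balances = {}                      # client -> tokens

    def mint(self, client, usd):
        # client deposits USD; an equal number of tokens is created
        self.usd_reserve += usd
        self.balances[client] = self.balances.get(client, 0) + usd

    def redeem(self, client, tokens):
        # tokens are burned and the matching USD is paid out
        if self.balances.get(client, 0) < tokens:
            raise ValueError("insufficient tokens")
        self.balances[client] -= tokens
        self.usd_reserve -= tokens

bank = BankCoin()
bank.mint("client_a", 1_000)
bank.redeem("client_a", 400)
assert bank.usd_reserve == sum(bank.balances.values())   # the 1:1 invariant
```

The “replacing its books” remark follows directly: once this invariant is maintained on a ledger, the ledger itself becomes the book of record.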
Furthermore, once the relevant money-management processes run on DLT, the world of smart-contracted automation will open up. This is, of course, a huge development for banking institutions, because streamlining several processes comes within reach – not to mention the potential removal of errors from business processes, and the full benefit of the distributed ledger’s “auditability by design”. Therein probably lies most of the banks’ motivation to issue “stablecoins” in the form of tokenized fiat.
As these systems are permissioned and deployed unilaterally by individual banks, without any coordination, the financial sector is in effect entering a competitive landscape where each institution will have an in-house solution. This is understandable: controlling the IT system at the core of these companies’ operations is strategic and needs to be tightly managed.
Here we get a first impression of what “banking supported by DLT” might look like, as derived from the legacy system. JP Morgan and others have decided to engage and devise a solution. In the future, we may also see a new kind of decentralized bank, with a group of people providing services on it independently. We shall wait and see which system wins. Interesting times ahead in banking.
Regarding some other news, in summary:
Tether has started to issue its stablecoin on the Tron infrastructure. It is still unable to provide any audit, and has recently admitted that it is not fully backed by USD.
OrmeusCoin, a so-called stablecoin, has risen sharply in price; it is supposedly backed by crypto mining proceeds, with 40% reinvested in mining hardware, and 40% secured in a vault.
Dai is managing its parameters through an on-chain poll: a vote is being held to decide whether the interest rate should be increased to 4%.
IBM’s system on the Stellar network involves a USD-tokenized stablecoin.
The Stablecoin.Foundation is active in Geneva, with the following mission statement: “Stablecoin Foundation represents the collective interests of Stablecoin issuers to unify the industry to promote the global mass adoption of cryptocurrency with a stable store of value through collaboration, advocacy, education, research and independent fact-based news using a hybrid centralized / decentralized framework and governance model.”
Whether it be in terms of consensus modes, scaling solutions or other DLT features, we have the feeling that only a reduced number of alternatives will ultimately remain as the basis of decentralized currencies, and for information management applications.
In the case of a given sector’s approach to the use of blockchain to support redesigned business processes, we can only observe that many different actors, consortiums of incumbents or start-ups, ICOed or not, are striving independently to implement their infrastructure and applications to solve issues related to their part of the value chain. Indeed, it is rare for a single development initiative to cover all of the required end-to-end processes, to integrate all parties in the solution. Rather, the norm is for projects to focus on a specific use-case on which to run the business.
If we take a step back, it appears evident that numerous actors are trying to suggest that their approach and thinking is superior. If this continues, we are going to have a lot of interfaces and inefficiencies between many blockchains, even if they manage to communicate (interoperate) with each other. We are of the view that the best approach is for a given sector to adopt a common base infrastructure or protocol that can be adopted by all actors to conduct their business and communicate efficiently with other participants: clients, providers, partners, developers, operators, insurers, investors, etc.
Many products and services can then be built on this base layer or protocol; this is the application layer, where competition is likely to be fierce, and standardization, if it happens, would likely be in the form of Google’s or Binance’s monopolistic, de facto dominance.
There are likely to be several DLT standards, in which case we should have a single protocol definition per industry sector, using one distributed ledger technology to support the standard processes for the relevant sector/ecosystem; only then should applications be built to offer and provide actual services.
However, before we define robust and specific standards for the base layer, a lot of the technical features of present DLTs still require optimization.
SYNERGIES WITH OTHER TECHNOLOGIES
There is not much to add to what has been said in previous editions of this Quarterly. We will just state the obvious: IoT is needed by blockchain applications, in the form of devices with an on-chain identity that provide information from the real world (connected sensors) and act on instructions from the chain (connected actuators).
On this topic as well, most has already been said. Let us just mention an interesting development that we bet is here to stay: “Decentralized Artificial Intelligence”!
MOMENTUM GAINED BY DLT ALTERNATIVES TO BLOCKCHAIN
Directed Acyclic Graph (= Tangle)
A new child in the DAG family has been born, named Obyte, from Russia. It defines itself as an “open cryptocurrency platform providing users with unbreachable contracts, controlled identities, ability to develop applications, and send payments over any network”.
Corda, Hyperledger and other permissioned non-blockchain DLTs
R3, the company that produced Corda, has been working on expansion plans to diversify its outreach from the financial sector to other industries. At this time, there appears to be disagreement among the founders, which has caused one of them to leave the company.
CRYPTOGRAPHIC KEYS MANAGEMENT ON PERSONAL WALLETS
The CEO of a Canadian exchange, QuadrigaCX, has died. He was the only person in possession of the private keys for most of the crypto assets held in cold wallets. Consequently, the exchange is bankrupt. Ernst & Young is in the process of trying to recover any assets that can still be found.
If anything, this once again emphasizes the difficulty of safely and properly managing crypto assets for oneself and for others. The QuadrigaCX story received much exposure, and validates the view that everyone should hold their own private keys, rather than giving control of their assets to a third party.
Double spending – the 51% attack
The re-writing of the Ethereum Classic chain in early January is a concern for the sector, with good reason. No less than 400,000 euros were stolen in real double-spends – shocking by all standards.
One very important comment on this 51% threat is to highlight that the consequences of such an attack on, say, Bitcoin would be pretty difficult to deal with.
On the one hand, the offenders would be identified fairly easily. It would be difficult even to hide the preparations for mobilizing such a huge mining effort. If it were a collusion between mining pools, it would be immediately evident to observers. So, concealing the action would be practically impossible, and far too costly for the expected benefits – not to mention that the underlying asset would collapse in value.
On the other hand, it does not seem that any laws can be easily enforced against such an act: a double-spend on a blockchain is not something that is clearly defined in the law, let alone, punished! DLT protocols are what they are, and purists will argue that a 51% attack is, after all, playing by the rules of the protocol.
Further, which jurisdiction would apply? The offender may be in one country, and the double-spend victim in another, with incompatible laws – not to mention a scenario where cryptocurrencies are banned in the victim’s country: what sort of recourse would there be then?
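The double-spend risk can also be quantified. Nakamoto's whitepaper models the race between an attacker holding a hashrate share q below one half and the honest chain: the probability that the attacker ever catches up from z blocks behind is (q/(1-q))^z, and at q ≥ 0.5, as in the ETC incident, catch-up is certain. A minimal sketch:

```python
# Nakamoto's catch-up probability for a double-spending attacker with
# hashrate share q trying to overtake a deficit of z blocks.

def catch_up_probability(q, z):
    """q: attacker's hashrate fraction; z: blocks of deficit."""
    p = 1.0 - q
    return 1.0 if q >= 0.5 else (q / p) ** z

print(catch_up_probability(0.30, 6))   # deep deficits become hopeless
print(catch_up_probability(0.51, 6))   # a majority attacker always wins
```

This is why exchanges raise confirmation requirements on chains with cheap hashrate: waiting more blocks drives a minority attacker's odds toward zero, but no number of confirmations protects against a true majority.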
Another double-spend theft occurred across Canada. In seven cities, Bitcoin ATMs were subject to thefts worth close to 200,000 euros. Most of the ATMs relied on zero confirmations from the blockchain, making it possible for knowledgeable individuals to withdraw fiat cash in exchange for Bitcoin simultaneously in several locations – effectively spending the same cryptocurrency twice. The loophole exploited here is the “Replace by Fee” (RBF) mechanism in the BTC protocol, which instructs miners to replace a given transaction with a similar one paying a higher fee. The mechanism exists so that a user can speed up the processing of an unconfirmed transaction by increasing the mining fee. In this case, the offenders used it to steal from unaware ATM operators. It is rather surprising that this had not happened before.
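A minimal defence existed: under BIP 125, a transaction opts in to replace-by-fee by setting the nSequence of at least one input below 0xfffffffe, so a zero-confirmation acceptor can at least refuse transactions that signal replaceability. The sketch below checks only this direct signal; in practice an unconfirmed ancestor signaling RBF also makes a transaction replaceable, which this simplification ignores.

```python
# BIP 125 opt-in RBF detection: a transaction signals replaceability when
# any of its inputs carries an nSequence value below 0xfffffffe.
# Simplified: inputs are given directly as a list of nSequence values.

RBF_THRESHOLD = 0xFFFFFFFE

def signals_rbf(input_sequences):
    """True if the transaction opts in to replace-by-fee (BIP 125)."""
    return any(seq < RBF_THRESHOLD for seq in input_sequences)

def safe_for_zero_conf(input_sequences):
    # a zero-conf acceptor should, at minimum, reject replaceable txs
    return not signals_rbf(input_sequences)

print(safe_for_zero_conf([0xFFFFFFFD]))   # replaceable: reject
print(safe_for_zero_conf([0xFFFFFFFF]))   # not RBF-signaling (still risky)
```

Note that a non-signaling transaction remains vulnerable to other zero-confirmation tricks, which is why confirmations, not signaling checks, are the real protection.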
A class of attacks we have not previously discussed is network partitioning. The idea is to separate the nodes participating in a blockchain into two partitions, making it impossible for a participant in one partition to transact with a participant in the other.
It gets quite technical quickly, but in short, it is not blockchain that has the vulnerability, but the internet itself, on which it is built. The so-called Border Gateway Protocol, sometimes referred to as the internet’s biggest security hole, can, if exploited, allow an attacker to split the blockchain (Bitcoin’s or any other) in two.
While this is likely to remain a minor concern overall compared to other threats, some researchers are working on solutions. In particular, a Swiss team is developing SABRE as a dedicated small network of nodes strategically located to always be interconnected. Other researchers have recently performed a Bitcoin transaction communicating via radio waves, thereby showing that, even if the internet, as we know it, is down, participants can still interact.
Let’s recall the BitWise analysis of fake volumes reported by exchanges; please refer to the paragraph on Volumes for details.
Bitcoin and cryptocurrencies are, more than ever, a wild west, as was shown again by the obvious simultaneous large buy orders on five exchanges on April 2nd.
And with more participants, bigger markets and higher prices, whales are playing everywhere and making bigger splashes than before.
Cases of violent physical assault on crypto asset holders are constantly being reported. A recent example occurred in the Netherlands and another in Thailand. This is not entirely surprising, but is, of course, a reminder that the weak point in the chain simply moves when the transfer of value itself is made secure, and holding access to your asset makes you vulnerable to direct physical threats from “classic” thieves…
Therefore, crypto assets custody services are going to be an ongoing concern.
With reasonable crypto prices, mining malware is also far less profitable, hence crypto-jacking has almost disappeared – for the moment.
CONCLUSION OF TECHNICAL DEVELOPMENTS
Overall, this quarter, teams continued working and produced definite progress, delivering on some of the promises made in 2017 whitepapers: the release of operating versions of major projects such as Cosmos and OmiseGO confirms this.
However, this progress has been incremental, and there is no disruptive breakthrough to report. Therefore, the indications are that technological development still has a long way to go. In particular, scalability remains an issue, and some sort of sharding solution will ultimately emerge, as current communication protocols continue to be a limitation; the careful architecture of shards might be the real challenge for industrial applications.