Section 6: Latest advancements in DLT technology

As far as scalability is concerned, it is still not clear which of the current DLT alternatives will ultimately win widespread adoption. Some refinements appear to be gaining momentum, such as the MimbleWimble protocol and zk-STARKs, which are poised to replace zk-SNARKs. DAGs also seem to have received insufficient exposure up to now; only time will tell which of these technical solutions will earn dominant implementation. As the example of Facebook shows, it is not always the best technical solution that achieves a monopoly.


To ensure further adoption of the various technical solutions, user-friendly packaging is very much needed to allow the public to enter the ecosystems seamlessly.


In the following pages, we present a systematic review of developments in the technical aspects of DLTs. You will find updates on technological research and improvements in DLT infrastructures that build on previous in-depth descriptions and analyses of the critical issues. We will also explore topics that have not been thoroughly covered up to now in this Quarterly, such as the implications of quantum computing.


In this section on technical updates, we assume that the reader is up to speed with studies covered in previous Quarterly reviews.




The consensus methods of DLTs largely determine the scalability of the infrastructure, and scalability features generally need to be implemented on top of them; the two topics are therefore discussed together.

Technical developments on consensus modes

Various teams are currently working on their implementation of PoS, DPoS, and PBFT, or a combination of the above, to improve scalability, rapidity and other features. We will cover some scalability aspects in this section. However, there is no breakthrough to highlight at this time. (Also refer to section “I—Infrastructure functionality” on page 23.)


This may be the time to cover a blind spot, even if it is not new: as yet, we haven't taken the opportunity to dive into the Federated Byzantine Agreement (FBA) consensus mode (used in particular by Stellar Lumens). Its specifics are interesting, and its impressive characteristics are worth a look.

The FBA model (used in the Stellar Consensus Protocol) is an approach to consensus unrelated to PoW or PoS. The network agrees on a statement once enough of its participants agree on it; in classical BFT settings the threshold is usually 2/3 loyal participants, to counterbalance a maximum of 1/3 malicious nodes.


With FBA, as opposed to classical Byzantine Fault Tolerant mechanisms, the list of validating nodes is not established in advance by a central authority. In FBA, anyone can join, and every node decides which other nodes it trusts, forming a "quorum slice." Quorum slices, by design, must overlap; together they build the network-wide consensus, and gossip about a node's position can thereby spread across quorum slices.
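To make the quorum-slice idea concrete, here is a minimal sketch (our own toy model, not the Stellar Consensus Protocol itself): each node declares the slices of nodes it trusts, and a set of nodes forms a quorum only if it contains at least one slice of each of its members.

```python
# Toy model of FBA quorum slices (illustration only; node names and
# slice choices are hypothetical). A set of nodes Q is a quorum if
# every member of Q has at least one of its slices fully inside Q.

slices = {
    "A": [{"A", "B", "C"}],
    "B": [{"B", "C", "D"}],
    "C": [{"A", "C", "D"}],
    "D": [{"B", "C", "D"}],
}

def is_quorum(q):
    # Each member must find one of its trusted slices contained in q.
    return all(any(s <= q for s in slices[n]) for n in q)

print(is_quorum({"A", "B", "C", "D"}))  # True: every slice is covered
print(is_quorum({"A", "B"}))            # False: A's slice also needs C
```

Because slices overlap (all of them pass through C here), agreement in one slice constrains what the others can agree on, which is how local trust decisions add up to network-wide consensus.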


According to the FLP impossibility result, in an asynchronous distributed system a consensus method can achieve at most two of the three following properties: fault tolerance, safety, and liveness.


Fault tolerance is the guarantee that the system will survive if a validator on the network fails at any time. Most consensus protocols choose fault tolerance as one of their preferred properties. The big tradeoff remains the choice between safety and liveness as the second favored property.


Safety is the guarantee that nothing wrong, such as a fork, will happen. Liveness is the guarantee that transactions will always be processed. FBA favors safety over liveness, whereas blockchains favor liveness; hence, forks are possible in blockchains.


Periodically, the whole network goes through a cycle of validation. First, at the level of the Quorum slices, all participating nodes essentially go through continuous rounds of voting until they can agree on a statement. Once a consensus is reached in the quorum slice, voting then occurs at the higher level. Of course, every node maintains the same view of the ledger.


Consequently, transactions merely need to be passed to the network, and safety means that no confirmation time is required; this allows fast transaction processing, in the range of 3-5 seconds. Another consequence is the removal of the risk of attack by sheer computing power. The fee is reduced to the bare minimum needed to prevent spamming the network; as a result, few full nodes are running, due to the lack of incentive.



All evidence points to the governance of blockchains becoming a growing concern. One indication of this is that Ethereum is delaying its Constantinople update, not because the fix to the problem is not ready, but to allow sufficient time to convince the community and to coordinate miners so as to avoid a hard fork.

But, as of now, the tedious business of steering blockchain protocol evolution remains well below the radar, with scalability taking the focus. Article writers and panelists in various discussions are identifying the issue, but only limited research efforts have been made to date.

Hard forks

In November we witnessed the fork of Bitcoin Cash, which gave birth to Bitcoin Satoshi Vision (Bitcoin SV). Bitcoin SV claims to strive to address stability, scalability, and security, but for now little has changed since the fork. Furthermore, the project is somewhat self-contradictory in wanting to retain the original concept of bitcoin while also wanting it to evolve.


Scalability in terms of transactions per unit of time



  • Currently, 40% of transactions use SegWit, helping to decrease transaction fees. The limited uptake of such an easily implementable feature is a failure of mass adoption, but overall it is a minor consideration.

  • The Lightning Network (LN) is gaining momentum: new records have been set in capacity over the network, namely over 500 BTC and over 5,000 nodes, both increasing steadily, with even more channels in the process of creation. To increase the adoption of LN further, the community should concentrate on making the set-up phase easier. A few features are still under development to enhance the attractiveness of the Lightning Network, including splicing, which enables the size of channels to be adjusted after opening.

  • The sidechain concept is competing with the Lightning Network. Its most advanced implementation, called "Liquid," has been in development for three years at San Francisco's Blockstream. As expected, users convert their BTC to LBTC, so the new currency is fully pegged. The sidechain is federated: its partners (the participants are big exchanges) act as third parties for each other. This is the fastest option, but bitcoin purists are not satisfied with it. It looks like we are witnessing the construction of efficient channels for, and within, a nascent industry about to replace banks.

  • Meanwhile, as a side remark, Bitcoin Cash’s 8MB block size has never been reached.

  • Litecoin is also implementing the Lightning Network.


  • The Constantinople hard fork has been delayed and is unlikely to happen before the end of February. A vulnerability has been flagged in the upgrade by a smart contract auditor, which required a rework. For details about the technical improvements included in this very significant upgrade of the protocol, refer to our previous Quarterly.

  • The Ethereum Foundation has paid 5 million USD for the development of the Parity upgrade of the Ethereum node client.


Cardano is currently claiming it can handle several hundred TPS and is working to reach several thousand.


EOS has released an upgraded version of the code, which is 35% faster.


Stellar: As we have just seen in the previous paragraph, Federated Byzantine Agreement is very scalable and fast.

MultiVAC is a solution that has been progressing fast and is receiving increased exposure. The team describes it as a “high throughput, flexible public blockchain based on trusted sharding.” By allowing nodes to accumulate their transaction power, they claim unlimited scalability for the network. A test in October showed 31,000 transactions per second for 64 shards. MultiVAC has the following features:

  • The DApp designers can decide the tradeoff they want to adopt between decentralization, scalability, and security, rather than being limited to what the underlying platform provides.

  • The sharding solution provides various dimensions of smart contract computation, including storage and transmission of data.

  • It uses Verifiable Random Functions (VRF) to establish in which shard a node is allocated, and which node gets to validate blocks.

  • A Proof of Instruction Execution (PoIE) is implemented to eliminate the need for every line of code in a smart contract to have to execute on all nodes.

  • It uses the physical discrepancy between computation and storage in modern computers to penalize malicious behavior and to ensure security and efficiency.
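As an illustration of VRF-style shard allocation, the sketch below uses a plain hash as a stand-in for a real VRF (which would additionally produce a publicly verifiable proof); the node identifier, epoch parameter, and shard count are hypothetical.

```python
import hashlib

NUM_SHARDS = 64  # matches the 64-shard test mentioned above

def shard_for(node_pubkey: str, epoch: int) -> int:
    # Hash-based stand-in for a VRF: deterministic and uniformly spread,
    # though a real VRF also yields a proof that the output is honest.
    digest = hashlib.sha256(f"{node_pubkey}:{epoch}".encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_SHARDS

print(shard_for("node-abc123", epoch=7))
print(shard_for("node-abc123", epoch=8))  # reshuffled each epoch
```

Reassigning nodes to shards pseudo-randomly each epoch is what makes it hard for an attacker to concentrate malicious nodes in a single shard.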


Devvio: this initiative claims to be able to process 8 million transactions per second.

  • Here again, the solution lies in sharding. The consensus mode is Proof of Validation. In short, on the blockchain's network (or shard), a validator is chosen randomly. It then gathers messages from the other participants in the network until 51% of validators agree; the block, including the messages received from its peers, is then passed to the blockchain.

  • Devvio also claims that a transaction cost on their network is five orders of magnitude lower than on Ethereum.

  • Every wallet is assigned to one and only one shard, and any transaction coming out of a wallet is processed by its shard. All the blocks from all the shards then become inputs to a single shard, which handles all the cross-shard transactions.

  • Of course, these performance claims have not yet been observed in real-life operation.
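A rough sketch of one such validation round as described above (the function name, vote probability, and threshold parameter are our own simplifications, not Devvio's actual protocol):

```python
import random

def proof_of_validation_round(validators, vote_prob=0.9, threshold=0.51, rng=random):
    # One toy round: a random validator proposes a block and gathers
    # agreement messages from its peers until a 51% majority is reached.
    proposer = rng.choice(validators)
    votes = 1  # the proposer endorses its own block
    for v in validators:
        if v != proposer and rng.random() < vote_prob:
            votes += 1
    accepted = votes / len(validators) >= threshold
    return proposer, accepted

validators = [f"node-{i}" for i in range(20)]
proposer, accepted = proof_of_validation_round(validators, rng=random.Random(0))
print(proposer, accepted)
```

The block is appended only when the majority condition holds; otherwise the round fails and a new proposer is drawn.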


Unit-e: another group, mostly from the US and Germany, funded by Distributed Technologies Research (DTR, based in Zug) and Pantera, claims to have resolved the scalability issue for a globally decentralized payment network. The associated cryptocurrency is Unit-e.

  • The group focuses on in-depth scientific research to achieve an increased throughput rate without sacrificing decentralization.

  • They envision using PoS in an algorithm called Prism that would be “approaching the limits of what is physically possible in a blockchain.”

  • They are also researching a new sharding technique called PolyShard. The key idea is that nodes should not store replicated data; instead, they store coded linear combinations of data, while still allowing for data recovery.


A significant effort is being made to ensure that Proof of Stake functions effectively, not least because of Ethereum's difficulty in switching from PoW to PoS. At this point, it is essential to mention the "Nothing at Stake" problem that exists in the PoS mechanism. The issue arises when two "forgers" (the equivalent of miners in the PoS consensus) reach the threshold that allows them to propose a block at the same time. In this case, a fork happens and propagates through the network. In PoW, miners dedicate their effort to one of the two branches, and after some time, one of the competing blocks loses the race and is orphaned. In PoS, however, every forger may want a stake on both sides of the fork, to ensure a reward whichever block ends up winning.

The issue is that if almost all forgers do this (say 99% forge on both branches), the remaining 1% may eventually decide the fate of the chain. An attacker could then pass a transaction on one branch, join the other branch and cause it to win, allowing the attacker to spend their cryptocurrency again. However, doing this would ruin trust in the chain and would also impair the value of the attacker's stake. In practice, this problem is not observed, and modern implementations can effectively resolve it, for example by slashing the deposits of validators who sign conflicting blocks.



Scalability in terms of ledger sizes

Bitcoin’s ledger is now approaching 200 GB. Its growth is now roughly linear and is not regarded as a concern if it continues at this pace. The low throughput of BTC has the advantage of keeping the ledger size "manageable."

MimbleWimble has potential: with this protocol, most of the transaction data can be eliminated over time without compromising security. Since transactions are verified by checking that the inputs and outputs balance, storing proof of that balance is enough; ultimately, only the unspent outputs need to be kept. Note that this is all made possible because the total amount of tokens is known in advance.
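The input/output balance check can be illustrated with toy Pedersen-style commitments over integers modulo a prime. Real MimbleWimble uses elliptic-curve points and handles the blinding-factor excess via a transaction kernel; for simplicity this sketch chooses blinding factors that cancel exactly.

```python
# Toy additive Pedersen-style commitments (illustration only; the
# modulus and "generators" are arbitrary small numbers, not real
# cryptographic parameters).
P = 2**61 - 1   # group order
G, H = 3, 7     # toy generators

def commit(value, blinding):
    # C = blinding*G + value*H hides the value behind the blinding factor.
    return (blinding * G + value * H) % P

# A transaction spending a 50-token input into outputs of 30 and 20,
# with blinding factors chosen so that r_in = r_out1 + r_out2:
r_in, r_out1, r_out2 = 111, 64, 47
c_in = commit(50, r_in)
c_out = (commit(30, r_out1) + commit(20, r_out2)) % P

# The verifier never sees the amounts; it only checks that the
# commitments cancel, proving inputs and outputs balance.
print((c_out - c_in) % P == 0)  # True
```

Because only this balance proof matters, spent outputs can later be pruned from the ledger without weakening verification.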


To prevent 51% attacks from rewriting blockchain history, some of the newer projects have implemented a finalization mechanism in which validator nodes sign the appended blocks. These mechanisms ensure the endorsement of the chain that is being followed and immediately discard any stale blocks that are not on the longest chain (called uncles). This, in turn, avoids waiting for several rounds to be sure that the transaction has been accepted.


Barclays has organized a dedicated hackathon centered on finding ways to connect Ethereum with made-for-enterprise blockchains (e.g., Hyperledger). The panel of judges included representatives from Barclays, UBS, HSBC, and Santander. Otherwise, technically speaking, there is nothing significant to report on this topic.




The implementation of the MimbleWimble protocol continues to progress, adding to the ecology of anonymity-oriented cryptocurrencies. Grin, which is community-funded, is gaining momentum, and another implementation has debuted: Beam, run by a start-up that intends to hand over operations to a dedicated non-profit foundation.

MimbleWimble looks even more promising the further one dives into it. As of early 2019, it has generated a lot of enthusiasm within the community.

Please refer to the previous Quarterly for a technical explanation on MimbleWimble.

A few additional remarks about previous comments:

  • Transactions jump from node to node before being recorded, which increases the difficulty of identifying the wallet a transaction request came from.

  • MimbleWimble does not require addresses, so people have to interact in real life to enable communication between their wallets.

  • MimbleWimble cryptos are fungible by design.

  • MimbleWimble also marginally increases scalability in throughput rate.


Also note that the elegance and lightness of the MimbleWimble solution compete directly with the burdensome, cryptographically intensive solutions of Monero, ZCash, etc., which suffer in particular from the computing demand of zero-knowledge proofs. In MimbleWimble, unless you were one of the participants, none of the transactions in a given block will be recognizable to you. If you view a block, you will not see a list of transactions; you will see one big transaction in which everything has been mixed and merged. All that is left is a list of new inputs, a list of new outputs, and a list of cryptographic signatures created with the dummy outputs mentioned above. This makes it more anonymous than Monero and the like.


Apart from traceability, which is sacrificed to obtain fungibility, MimbleWimble-based chains are likely to achieve bitcoin’s goals better than bitcoin itself. Surprisingly, projects using this technology have had little exposure, and money continues to flow to protocols like Ethereum. This may change shortly, as the framework is clearly in place for Grin, Beam, and probably many others to come.

Zero Knowledge Proof

While zk-SNARK has been among the most popular topics of discussion in the debate around anonymity implementation in blockchains, its method has been refined in a new construction called zk-STARK (Zero-Knowledge Scalable Transparent Argument of Knowledge). The differences and main advantages of zk-STARK over zk-SNARK are:


  • It does not require a trusted set-up phase.

  • It is less costly, thanks to the reduced complexity of the arithmetic involved, of the communication interaction, and of the prover and verifier tasks. We are talking here of an order-of-magnitude improvement, not just an incremental change.

  • zk-STARK relies on hash functions rather than elliptic-curve key pairings, and is therefore not as vulnerable to the advent of quantum computing as zk-SNARK is.




The race towards quantum computing continues, with IBM recently announcing the release of IBM Q System One, the world's first commercial quantum computer.

Regarding post-quantum cryptography, the following, in broad terms, is a summary of recent developments.

Hash functions are partially vulnerable. Quantum computing (Grover's algorithm) speeds up the search for a preimage of a hash, but only quadratically, and recovering an original hashed document remains infeasible for sufficiently long outputs.

Symmetric cryptographic algorithms remain relatively secure, as they do not rely on the integer factorization problem, the discrete logarithm problem, or the elliptic-curve discrete logarithm problem. Provided the symmetric key is long enough, symmetric encryption remains valid. The mathematics shows that doubling the key size is required to obtain the same level of safety post-quantum as pre-quantum.
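The doubling rule follows from Grover's algorithm, which searches N keys in roughly sqrt(N) steps, halving the effective key strength:

```python
def effective_security_bits(key_bits: int, quantum: bool = False) -> int:
    # Grover's algorithm reduces brute-force search over 2^n keys to
    # about 2^(n/2) steps, so the effective security is halved; doubling
    # the key length restores the pre-quantum level.
    return key_bits // 2 if quantum else key_bits

for key_bits in (128, 256):
    print(key_bits, "->", effective_security_bits(key_bits, quantum=True), "post-quantum bits")
```

A 128-bit symmetric key thus offers roughly 64-bit security against a quantum adversary, while a 256-bit key keeps the familiar 128-bit margin.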

The problem is that blockchains do not rely on symmetric encryption, but on asymmetric encryption to sign transactions. Asymmetric encryption that cannot be broken by quantum computing is largely still at the research stage. Possibilities being explored to find problems not easily solvable by quantum computers include:

  • Multivariate polynomials over finite fields.

  • Error-correcting codes, where the private key is the matrix used to correct errors, and the public key is a randomized version of it. A ciphered message is then the message with some errors deliberately included. This method provably requires the attacker to perform a prohibitive amount of work while keeping the legitimate user's work limited. The issue is that the resulting keys tend to be very large, especially compared to classical pre-quantum keys.


To summarize, methods already exist to create pairs of private and public keys that would allow cryptocurrencies to continue operating, but with more complex keys to manage. Meanwhile, there is still hope of finding more efficient suitable problems to generate more manageable quantum-resistant cryptography.


Analysis of volatility

The volatility of cryptocurrencies has decreased considerably in the past year, indicating that more actors are entering the crypto market and mechanically contributing to more inertia in price movements.


The volatility itself is not bad; it is how people react to it that comes into question. Some actors do prefer higher volatility.


As application design and development continues, interest in stablecoins has increased. The need for a more stable vehicle, one that lets smart contracts automate the transfer of value over the blockchain without the need to hedge the exchange rate, becomes more and more evident. The actual usage of stablecoins to settle automated transfers of value has yet to be generalized, but we can be very confident this will happen as real smart contract platforms are put into production.


Evidence of this interest can be seen in the release on the market of various simple fiat-collateralized cryptocurrencies, including Gemini's GUSD, SmartValor's tokenized Swiss franc, Paxos's PAX, and others. These tokens have progressed in the CoinMarketCap rankings, as they act as a store of value for investors while volatile cryptos continue to fall further. All these initiatives line up and happily go through audits in the hope of grabbing Tether's place should the USD-pegged cryptocurrency collapse.

Huobi has also launched HUSD, which is a basket of pooled stablecoins pegged to the USD.


While it is easy to tokenize fiat, the real challenge for a stablecoin is the creation of a self-standing mechanism that maintains a stable floating value, not linked to any particular fiat currency.



At the moment, the early maturity stage of distributed ledger technology calls for a broad exploration of all implementation alternatives. Only when the various known solutions have consolidated to produce optimized versions will “de facto standards” emerge, thanks to wider adoption. Ethereum’s standard token types are the first attempt at this.


IoT synergy

As a review, this is how the IoT will use DLTs:


  • Two devices want or need to exchange something, which could be information, a commodity, a parcel, or anything else.

  • Both have to run the same decentralized application.

  • One (or both) must acknowledge its counterpart’s identification on the supporting DLT, thanks to direct communication (sensors, Bluetooth, scanned QR code).   

The decentralized application will have the framework for the intended interaction; including:

  • On-chain acknowledgment of the identification of the two objects.

  • Choice from a range of services or interactions, and a range of steps coded into the process to obtain data and its confirmation between the parties.

  • Escrow of the exchanged crypto-value or information.

  • Feedback from one or both parties proving that the potential interaction took place satisfactorily, thanks to potentially independent sensors acting as oracles.

  • And finally, the release of information and crypto-assets, as expected.
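The interaction steps above can be sketched as a small state machine (hypothetical states and device names; a real decentralized application would run this logic on-chain):

```python
from enum import Enum, auto

class State(Enum):
    IDENTIFIED = auto()   # both devices acknowledged on the DLT
    ESCROWED = auto()     # crypto-value locked in escrow
    CONFIRMED = auto()    # oracle reports the interaction took place
    RELEASED = auto()     # escrowed value handed over

class Exchange:
    def __init__(self, device_a, device_b):
        self.parties = (device_a, device_b)
        self.state = State.IDENTIFIED

    def escrow(self, amount):
        assert self.state is State.IDENTIFIED
        self.amount, self.state = amount, State.ESCROWED

    def oracle_confirm(self):
        # An independent sensor acting as an oracle attests satisfaction.
        assert self.state is State.ESCROWED
        self.state = State.CONFIRMED

    def release(self):
        assert self.state is State.CONFIRMED
        self.state = State.RELEASED
        return self.amount

ex = Exchange("sensor-1", "drone-2")
ex.escrow(10)
ex.oracle_confirm()
print(ex.release(), ex.state)  # 10 State.RELEASED
```

The assertions enforce the ordering of the steps: value can only be released after identification, escrow, and oracle confirmation, mirroring the list above.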


When it comes to the challenges of the technology, machine-to-machine communication faces three that can be solved by DLTs:

  • Transparency: when communicating, devices use encrypted data that is not easily auditable. So, until now you’ve had to trust that the application would handle the content of the communication appropriately (typically not sharing information you do not want to share).

  • Longevity: communication up to now has passed through the vendor’s cloud, and if the vendor goes bankrupt or stops maintaining the devices, then usage can be discontinued.

  • Trust: over a conventional channel, payment details are sensitive data handled by a third party that must be trusted.


Smart contracts address the concern of transparency, since their logs are public; they offer longevity, because the blockchain cannot be stopped; and they can be trusted, thanks to the disintermediation offered by decentralized applications.


However, DLTs must solve scalability, and offer fast response times and privacy of exchanged information, before claiming to be suitable for the IoT.



Directed Acyclic Graph (Tangle)

To the crypto world, it may seem strange that DAGs have not gained more momentum, given their elegant solution to the scalability problem in the field of DLTs. They do not rely on blocks, are asynchronous to a large degree, have no time-diffusion issues, produce less mining waste and lower fees, and prompt no debates on block size or block frequency.
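The block-free structure can be illustrated with a toy tangle in which each new transaction approves two earlier ones, spreading validation work across participants instead of batching it into blocks (a simplification; real tip-selection algorithms such as IOTA's are weighted random walks):

```python
import random

random.seed(1)
approvals = {0: []}  # genesis transaction approves nothing

def add_transaction(tx_id):
    # Each new transaction approves up to two earlier transactions,
    # forming a directed acyclic graph rather than a chain of blocks.
    earlier = list(approvals)
    approvals[tx_id] = random.sample(earlier, min(2, len(earlier)))

for tx in range(1, 8):
    add_transaction(tx)

# Tips are transactions nobody has approved yet; they wait for
# future traffic to confirm them.
approved = {t for targets in approvals.values() for t in targets}
tips = [t for t in approvals if t not in approved]
print("tips awaiting approval:", tips)
</test>```

Note how confirmation depends on future traffic: the last transaction is always an unapproved tip, which is exactly why tangle security assumes a steady flow of transactions.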

So, if it is faster, cheaper, more decentralized, not threatened by quantum computing, and in theory, has no scalability limit, why have we not seen any sign of supremacy of Hashgraph, IOTA, Nano and the like?

Here are some possible answers:

  • Without getting into too much technical detail: the "gossip-on-gossip" protocol, which is the basis of the mathematical demonstration that truth recording on DAG ledgers resists attacks, assumes sufficient traffic of transactions passing through the system. A relatively high volume of transactions therefore needs to occur on the tangle to maintain its security at all times. In other words, the tangle approach can end up more vulnerable to attack than a blockchain system.

  • Today, to avoid this trap, tangles implement "witness nodes," a sort of central coordinator that negates the true decentralization of the system. The famous trilemma still holds, and as a result, detractors of DAGs claim the approach is centralized.

  • Timestamping cannot be guaranteed as it is with a blockchain.

  • DAG solutions struggle to integrate the worst-equipped and worst-located network participants. The near-instantaneous processing of transactions by DAGs means that far-away participants face a disproportionate share of unconfirmed transactions. This will prove to be a 'real-life' limitation on the effective scalability potential of DAGs.

  • There are comparatively fewer developers working on DAG than on blockchains, and the inception of the concept as a competitor to blockchain is relatively new; nobody had this family of solutions on the radar before IOTA emerged in mid-2017.


DAG implementations do not have the same "global state" as a blockchain, which ultimately prevents participants from checking a transaction against the complete ledger history.

  • Firstly, this is because the ledger changes with each transaction submitted to the network and is not gossiped at a constant pace, depending on the connectivity of each participant requesting an operation.

  • Secondly, as DAGs are designed for a high throughput rate, the size of the ledger can rapidly explode; hence pruning (snapshot summaries of states by epochs) is commonly used, deleting the history of operations. This again weakens trust in what is recorded in the ledger and empowers the nodes that take charge of pruning.


The takeaway is that, most probably, blockchains and tangles will each have their preferred fields of application, where they will perform better than the other.

Corda, Hyperledger and similar DLTs

Corda has on-boarded ING. The Dutch banking giant has decided to use the technology to help streamline its back office. BNP Paribas and Deutsche Bank are also among the large corporations adopting Corda. Hyperledger is constantly hitting the headlines, with IBM promoting it in every project it sells to its customers.

Otherwise, there is not much to highlight technically regarding these solutions.


Samsung, in addition to Huawei, is proposing built-in secure wallets in their high-end smartphones; this is logical competition arriving to challenge the market for dedicated hardware wallets like Ledger and Trezor.

Sirin Labs, a company launched to dedicate itself to the smartphone-for-crypto segment, has released its first products, which are available to the public through Sirin Labs' physical outlets.

In the same vein, a Swiss luxury watch company (A. Favre & Fils) has advertised a watch that has a built-in cryptocurrency wallet.


Double spending—the 51% attack

A 51% attack occurred on Ethereum Classic in early January. It was spotted by Coinbase and consisted of double-spending coins to the value of $500k. The attack was repeated in the following days on several transactions.


It appears that Ethereum Classic is now the chain on which the DAO hack is still considered effective; this makes the chain particularly suitable for hackers to continue attacking with confidence, knowing that the community is not going to work to reverse the hack.


The takeaway is that small blockchains with little hashing power are very vulnerable, as there is always someone who can muster more computational power than everyone else on the chain. This is even truer because, as coins weaken, they become easier to attack: in a bear market, hash power is turned off.
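How vulnerable a chain is to a double-spend can be quantified with the calculation from the bitcoin whitepaper: the probability that an attacker controlling a fraction q of the total hash power catches up after z confirmations.

```python
from math import exp, factorial

def attacker_success(q: float, z: int) -> float:
    # Probability an attacker with hash-power share q rewrites a payment
    # buried under z confirmations (bitcoin whitepaper, section 11).
    p = 1.0 - q
    if q >= p:
        return 1.0  # a majority attacker always catches up eventually
    lam = z * (q / p)  # expected attacker progress while z blocks are mined
    s = 1.0
    for k in range(z + 1):
        poisson = exp(-lam) * lam**k / factorial(k)
        s -= poisson * (1 - (q / p) ** (z - k))
    return s

for q in (0.1, 0.3, 0.45):
    print(q, round(attacker_success(q, 6), 6))
```

With a 10% attacker and six confirmations the success probability is below 0.1%, but it climbs steeply as q approaches 50%, which is why chains with little total hash power are such easy targets.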

The vulnerability is not limited to double spending: once in control, an attacker can easily use the power to manipulate prices and build a short position. This is known as the Goldfinger attack.


The latest cryptocurrency exchange to be hacked is Cryptopia, based in Christchurch, New Zealand. The hack, which occurred on January 15th, involved a few million dollars. The interesting thing here is that the tokens concerned were identified as having been transferred to Binance, which took the conservative step of freezing them.

This incident sets a very interesting precedent. Except for privacy-focused cryptocurrencies, it is evident that this safety measure can be deployed for all sorts of cryptocurrencies, and it is a feature that is easily implementable in the legacy payment system. Note that even if the thieves are not identified (and they may very well be, easily), the proceeds of the theft could be refused everywhere, destroying the profit from the theft and making it pointless.


Anonymous cryptocurrencies continue to be the currency of choice for kidnappers, scammers, and other criminals.


"Fake impersonations" of celebrities, through the creation of fake social media profiles, account for a growing number of easy scams. For example, via a fake Twitter account impersonating technology entrepreneur and billionaire Elon Musk, scammers promised to give away a total of 100 ether (ETH) to followers of US President Donald Trump once they sent 0.2 ETH to a certain address. This may sound absurd, but people constantly fall victim to such scams. One analysis shows that 468 scammers have collected more than 8,000 ETH in this way.

Fraudulent investment funds and Ponzi schemes

During and after the crypto bubble of 2017, a significant number of crypto investment funds were launched. Some were not very well managed or made promises based on the astonishing returns observed in 2017. Hence, many are collapsing, and investors are suing some of them over fraudulent promises.

Selfish mining attack

In this strategy, a miner that discovers a block withholds it and keeps mining privately on top of it, finding a second block with no competition, and, if successful, a third one, and so on. The miner publishes its private chain only when the public network is catching up to its level, trying to broadcast its version of the chain fast enough to win.


It appears that no successful selfish mining has occurred recently on bitcoin. We can, however, report that such an attack occurred on Monacoin in May 2018, with damage of around ninety thousand dollars. Small chains are more vulnerable to this kind of attack.
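A rough toy simulation of the strategy (a heavily simplified variant of the Eyal-Sirer model; ties are always resolved in the attacker's favor here, which overstates the attacker's edge):

```python
import random

def selfish_revenue_share(alpha: float, rounds: int = 200_000, seed: int = 7) -> float:
    # alpha is the selfish miner's share of total hash power; the
    # function returns its share of blocks on the final chain.
    rng = random.Random(seed)
    lead = attacker = honest = 0
    for _ in range(rounds):
        if rng.random() < alpha:
            lead += 1                    # attacker finds a block and withholds it
        elif lead == 0:
            honest += 1                  # honest network extends the public chain
        elif lead == 1:
            attacker += 1; lead = 0      # attacker publishes; assume it wins the race
        elif lead == 2:
            attacker += 2; lead = 0      # publish both blocks, orphaning the honest one
        else:
            attacker += 1; lead -= 1     # reveal one block and keep a safe lead
    return attacker / (attacker + honest)

print(round(selfish_revenue_share(0.35), 3))
```

Under these favorable assumptions, a miner with 35% of the hash power earns noticeably more than 35% of the blocks, which is what makes the strategy attractive on chains where one party can amass a large share of the (small) total hash power.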


Mining malware

With specialized mining equipment came specialized malware, specifically targeting bitcoin mining rigs and asking for a ransom under the threat of destroying the equipment by running it to the point of overheating.
