Section 6: Latest advancements in DLT technology

As usual, and particularly in this Section, it is assumed that the reader is familiar with the material covered in previous reports. Past issues of Blockchain Quarterly can be accessed through our website.


Investment in the cryptosphere shifted over 2018-2019, from speculative funding of small ventures through ICOs to large companies building their own innovation teams and paying for the development of enterprise information management solutions. Financial investment is therefore by no means down, although the current state of the cryptocurrency market is more worrisome.

Unknown Fund has been created by the Anonymous ‘hacktivist’ organization specifically to invest $75 million (in Bitcoin) in privacy-preserving technologies. In an announcement, the group said that it considers the management of data a powerful tool to manipulate people, declaring: “The Unknown Fund sees incredible opportunities to protect the rights and freedoms of people that technology such as blockchain and cryptocurrencies give us. This is a chance for humanity to create a new environment, a new and honest monetary system, and to make the world a better place”.





















[The consensus method of a DLT largely determines the scalability of its infrastructure, and scalability features in general need to be governed carefully. We therefore discuss these topics mostly within the scalability paragraphs.]


Technical developments on consensus modes

The consensus mechanism intended for the Concordium blockchain is worth examining, as it is quite elegant. The principle is to combine a Proof of Stake mechanism, which ensures fast block production and hence a decent throughput rate, with a Byzantine Fault Tolerant mechanism applied to some blocks, which allows for safe finalization. The combination of “the best of both worlds,” as they put it, results in a chain that is both lively and safe: the fast PoS consensus, which is prone to forking, is reliably converged by the BFT college, whose required percentage of consistent votes can be adjusted.
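As a toy sketch of this two-layer idea (illustrative only, not Concordium’s actual protocol; the block ids, vote counts, and 10-member committee below are our own assumptions), a fast, fork-prone producer can be paired with a committee that finalizes whichever block gathers a two-thirds vote:

```python
import hashlib

def block_hash(parent, payload):
    """Derive a block id from its parent id and payload."""
    return hashlib.sha256((parent + payload).encode()).hexdigest()

def bft_finalize(votes, committee_size, threshold=2/3):
    """Return the block id backed by at least `threshold` of the committee, if any.

    In a BFT finality layer, a block that clears the vote threshold is
    final: competing forks below it are discarded, so the fast,
    fork-prone PoS chain is forced to converge.
    """
    for block_id, count in votes.items():
        if count / committee_size >= threshold:
            return block_id
    return None  # no block reached the threshold yet

# Two competing PoS forks extend the same parent...
a = block_hash("genesis", "fork-A")
b = block_hash("genesis", "fork-B")
# ...and a 10-member finality committee votes; fork A clears 2/3.
final = bft_finalize({a: 7, b: 3}, committee_size=10)
assert final == a
```

Note that the adjustable `threshold` parameter mirrors the report’s observation that the required percentage of consistent votes can be tuned.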




















There is something of a paradox where governance is concerned. Industries and administrations, having grasped blockchain’s promises, are enthusiastic about how it could help them govern. Indeed, the more “liquid” the democracy, the more direct the contributions and votes; and tamper-proof, authenticated records will build trust in governments as well as leaders.

This, indeed, presents a very positive future, if you assume that the DLT infrastructure itself will be built appropriately. The only problem is that the governance of the blockchain itself is far from being resolved!

Governance is a problem with several aspects. For example, who is going to have the authority/prerogative to manage the deployed logic? However, the most sensitive issue is the consensus model. The more control identified parties have over the mechanism that agrees on what is written to the chain, the less decentralized it is, and the higher the risk of attacks on those individuals, potentially even from official bodies. Even with full decentralization, governance is a concern: a 51% attack, whether stake- or work-based, can never be totally ruled out, and running a business on a platform with this sword of Damocles hanging over it would likely cost those responsible for continued operations some sleep.


We believe there are two ways things are likely to develop. These are likely to progress in parallel, as there is room for both approaches to succeed in their specific contexts.


The first is the emergence of private “houses of trust,” providing their clients, in return for remuneration, with an execution environment for which they are (easily held) accountable. This approach involves consortiums, whose governance today runs through companies collaborating within a given industry; ultimately, a body will be in charge of running the execution environment, and it may or may not eventually become independent, at the discretion of its stakeholders.


The second will emerge with no compromise on decentralization. This is the aim of Ethereum, for instance. Here, concerns about the safety of the network may be resolved by the intelligent consensus mechanisms being actively researched and developed. But most would agree that the best protection will always be the sheer size of the network: a sufficient number of participants, sustained over the long term.


Hard forks

Ethereum’s hard fork, named Istanbul, has been successfully implemented. This was nothing special, really, just an expected step on the road to Ethereum 2.0.



Scalability in terms of transactions per unit of time

The news on the scalability front is not good. Analysts are starting to express the view that they do not expect to see full scalability within the next ten years.

Trusted Execution Environments

Research on providing Trusted Execution Environments (TEE) is taking off. This is somewhat paradoxical but makes perfect sense: if we can perform computation in local enclaves that trust each other to compute in parallel, can coordinate the sharing of information, and endorse transactions, then this part of the process can be removed from the main chain, thereby increasing scalability.


Companies as diverse as Visa, SuperComputingSystems, etc., are working on their own solutions in this direction. Cartesi is another project that is also proposing to rely on off-chain computation, available to any blockchain.


Importantly, the use of TEEs will also favor the introduction of privacy, as it is easier to keep information private if it is shared by only a limited number of participants.

Lightning network for Bitcoin

2019 was announced as the take-off year for the Lightning Network, driving BTC adoption, as it was seen as the solution to the scalability issue. This has not been the case.


The community has answered impatient critics in various ways:

  • With a call for patience, claiming that the current poor user experience is the main factor behind the slow progress.

  • Claiming that Bitcoin is hodled, instead of being used as an everyday exchange medium, because this is its inherent nature. In our view, this argument is not at all convincing!

  • Explaining that practical Lightning Network usage is hard to estimate because, as a second-layer protocol, its traffic is essentially “private” and therefore cannot be aggregated correctly.


But all in all, the lack of adoption of LN has come as a disappointment to many – a concrete sign of overall difficulties within the DLT sector.


A different usage is being explored for the Lightning Network: the transmission of private messages. An application called Whatsat has been released. It has always been possible to add some text to a BTC transaction and, by extension, to a Lightning transaction.



During the Ethereum DevCon held in October in Osaka, Ethereum’s main stakeholders conceded that Ethereum 2.0 was still some way down the track. Among the over 3,000 participants at the event, developers acknowledged that the post-scalability version of the platform might not be available before late 2021, two years from now.


Funding for development still comes mostly from the Ethereum Foundation, domiciled in Zug.


One of the most debated topics at DevCon was how to migrate tokens from Eth 1 to Eth 2. Many routes exist, but one has to be chosen, and no definitive answer has been provided so far by Vitalik Buterin, who stated that he would favor “close to no disruption at all”, so everyone can feel reassured.


The figure of Buterin is still as central to Ethereum as it ever was. As the Grand Priest of DevCon, his speech is listened to carefully by believers. His view is that the current version of Ethereum has been an interesting experiment, mostly paving the way for Ethereum 2.0, and that more iterations are necessary, especially with projects like Raiden and Plasma; this view leaves an impression on the community.



IOTA has announced a bridge connector for communication between IOTA and Hyperledger Fabric.



Cardano’s distinctive trait is that it is a project aimed primarily at sound academic research. In that sense, monitoring its progress is a good indicator of how confident we can be of obtaining functioning PoS, sharded mainstream solutions.


The indication is that research is ongoing; the contracted team under Hoskinson’s IOHK (Input Output Hong Kong) has grown to 200 people worldwide, with a very detailed roadmap that will still take years to bear actual fruit.

Tezos

Kathleen Breitman, co-creator of the public PoS blockchain Tezos, recently stated: “I can tell you for a fact, it’s extremely unromantic and extremely unpleasant to watch a proof-of-stake network evolve. … It’s an extraordinarily hard task to switch to a PoS network or to launch a PoS network. The reason why is because there’s so much more coordination cost, more than anything else. It’s not a trivial task.”



An airdrop of the EIDOS token has clogged the whole EOS platform. The token itself looks pretty useless, but those behind it appear to have triggered its distribution to make a point: by design, it encourages a maximum number of transactions. Users are invited to mine new EIDOS tokens by moving EOS back and forth on the network.

To get EIDOS, users send tiny amounts of EOS to the smart contract. The contract then sends back the same amount, plus a small portion of the daily release of EIDOS. Importantly, it doesn’t appear to matter how much EOS is sent; what matters is how many transactions a person can submit. EIDOS is designed to generate maximum transactions. By providing a profit motive that encourages people to act rather than spend more money, EIDOS seems to have been designed to test the capacity of EOS itself.
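The economics described above can be sketched as a toy model (all figures below are hypothetical, not actual EIDOS parameters): the payout depends only on how many transactions are sent, not on how much EOS moves.

```python
def eidos_payout(num_transactions, eos_per_tx, daily_release, total_network_txs):
    """Toy model of the EIDOS airdrop mechanic described above.

    Each transaction returns the EOS sent plus a pro-rata slice of the
    daily EIDOS release. `eos_per_tx` is deliberately unused: the payout
    scales with transaction COUNT, so the rational strategy is to spam
    many tiny transfers, clogging the network.
    """
    share_per_tx = daily_release / total_network_txs
    return num_transactions * share_per_tx

# Sending 0.0001 EOS a thousand times earns the same EIDOS as
# sending 100 EOS a thousand times: only the count matters.
small = eidos_payout(1000, eos_per_tx=0.0001,
                     daily_release=25_000.0, total_network_txs=1_000_000)
large = eidos_payout(1000, eos_per_tx=100.0,
                     daily_release=25_000.0, total_network_txs=1_000_000)
assert small == large
```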



A number of more or less credible projects are developing their approach to solving scalability, while also developing specific consensus algorithms.
Solana proposes a mechanism that runs independently on each participant’s computer so that they agree on the time, and includes this “proof of history” in the validated blocks agreed upon by PBFT. Claimed scalability is in the range of 50,000 TPS.
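A minimal sketch of the proof-of-history idea (our simplification, not Solana’s implementation): a sequential SHA-256 chain into which observed events are mixed, so that an event’s position in the chain attests to when it was observed relative to the ticks around it.

```python
import hashlib

def proof_of_history(seed, events, total_ticks):
    """Minimal proof-of-history sketch: a sequential SHA-256 chain.

    Each tick hashes the previous output; an event observed at tick t is
    mixed into the hash at that point. Because the chain can only be
    produced sequentially, the position of an event in it is verifiable
    evidence that the event happened before the later ticks.
    """
    state = hashlib.sha256(seed).digest()
    record = []
    for tick in range(total_ticks):
        data = state + events.get(tick, b"")
        state = hashlib.sha256(data).digest()
        record.append((tick, events.get(tick), state))
    return record

record = proof_of_history(b"genesis", {3: b"tx:alice->bob"}, total_ticks=6)
# Any verifier can replay the chain and confirm the event sits at tick 3.
replay = proof_of_history(b"genesis", {3: b"tx:alice->bob"}, total_ticks=6)
assert record == replay
assert record[3][1] == b"tx:alice->bob"
```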

A Korean platform, Fleta, also claims to have achieved 14,000 TPS after the launch of its mainnet.


Metahash uses a Multiple Proof of Stake (MultiPoS) consensus mechanism. Nodes in the network are ranked in a hierarchy: above the light clients used by end users sit peer nodes that choose to participate in the consensus. From among them, verification nodes are selected based on a range of criteria, including performance, but also geography and randomness. The aim is to provide a college that is both adapted to the network configuration and representative.


Verification nodes have a higher processing-capacity requirement than peer nodes, but a lower one than core nodes, the top category. With this configuration, MultiPoS relies on multilayered validation to protect against the corruption of some nodes in the network. The model allows validation and block distribution to run in parallel, thus decreasing consensus time. Core nodes generate blocks, but a voting mechanism lets lower-level nodes ask for a rebuild of the network whenever they observe corruption. Metahash claims over 50,000 TPS, with nodes on five continents.
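As an illustration only (the criteria, weights, and node names here are our invention, not Metahash’s actual algorithm), a verification college mixing performance, geographic spread, and randomness could be selected like this:

```python
import random

def select_verification_nodes(peers, k, seed=0):
    """Toy selection of verification nodes in the spirit of the
    MultiPoS description above.

    Each peer is (name, performance_score, region). We pick up to k
    nodes, favouring performance but preferring unrepresented regions
    and adding randomness so the college stays representative and hard
    to predict. (Toy caveat: may return fewer than k nodes if regions
    are scarce.)
    """
    rng = random.Random(seed)
    chosen, used_regions = [], set()
    # Shuffle first so ties are broken randomly, then sort by performance.
    pool = sorted(rng.sample(peers, len(peers)), key=lambda p: -p[1])
    for name, perf, region in pool:
        if len(chosen) == k:
            break
        # Skip already-represented regions while unrepresented ones
        # could still fill the remaining slots.
        if region in used_regions and len(used_regions) < k - len(chosen):
            continue
        chosen.append(name)
        used_regions.add(region)
    return chosen

peers = [("n1", 0.9, "EU"), ("n2", 0.8, "EU"), ("n3", 0.7, "US"), ("n4", 0.6, "ASIA")]
college = select_verification_nodes(peers, k=3)
assert college == ["n1", "n3", "n4"]  # n2 skipped: EU already represented
```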


Relictum, for its part, will have a reduced block size and claims a throughput rate of one million TPS.


Algorand, created by the renowned cryptographer Silvio Micali, was designed to solve the decentralization/security/scalability trilemma. Blocks are produced in two phases: first a block proposal phase, with one proposer chosen randomly, then a voting phase by a randomly chosen college drawn from the willing participants staking at least 1,000 tokens. The resulting blockchain confirms immediately, does not burn energy to do so, is effectively decentralized, and is highly resistant to attack, as a hacker would not know whom to target.
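A toy sketch of the selection step (Algorand actually uses verifiable random functions evaluated privately by each account, with stake weighting; the hash-based ticket, account names, and balances below are simplifications of ours):

```python
import hashlib

MIN_STAKE = 1000  # token floor for eligibility, per the description above

def committee_for_round(accounts, round_seed, committee_size):
    """Toy cryptographic sortition in the spirit of Algorand's selection.

    Eligible accounts are ranked by a per-round hash "ticket". This
    captures the key property claimed above: the committee changes every
    round and is unknown in advance, so an attacker does not know whom
    to target. (Real Algorand also weights selection by stake, which
    this sketch omits.)
    """
    eligible = {a: s for a, s in accounts.items() if s >= MIN_STAKE}
    def ticket(account):
        return hashlib.sha256(round_seed + account.encode()).digest()
    return sorted(eligible, key=ticket)[:committee_size]

accounts = {"alice": 5000, "bob": 1500, "carol": 999, "dave": 80_000}
committee = committee_for_round(accounts, b"round-42", committee_size=2)
assert "carol" not in committee   # below the 1000-token stake floor
assert len(committee) == 2
```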



Let us highlight here the fundamental difference, in terms of finalization, between what is commonly called the “Nakamoto consensus mechanism” (whether PoW or PoS) and Byzantine Fault Tolerant voting mechanisms. When the chain is extended by blocks proposed by one or more participants chosen according to a specific principle, the possible existence of competing chains can cause a given user to see a transaction rolled back after it appeared to have been validated, with the probability of such an event decreasing rapidly as confirming blocks accumulate. By contrast, when every next block is the object of a vote among a pool of chosen voters identified upstream, the block, once accepted in BFT voting mode, is by definition finalized.
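This difference can be quantified. Under the gambler’s-ruin analysis in the Bitcoin whitepaper, an attacker controlling a fraction q of the hashrate who is z blocks behind ever catches up with probability (q/(1-q))^z, which is why Nakamoto-style finality is probabilistic rather than absolute:

```python
def rollback_probability(q, z):
    """Probability that an attacker with hashrate fraction q ever
    overtakes an honest chain that is z blocks ahead, per the
    gambler's-ruin analysis in the Bitcoin whitepaper.

    Under Nakamoto consensus a transaction is never absolutely final,
    but the rollback risk falls geometrically with each confirmation.
    Under BFT voting, by contrast, an accepted block is final by
    definition.
    """
    p = 1.0 - q          # honest fraction of the hashrate
    if q >= p:           # majority attacker always catches up
        return 1.0
    return (q / p) ** z

# With 10% attacker hashrate, six confirmations already make a
# rollback overwhelmingly unlikely.
assert rollback_probability(0.1, 1) > rollback_probability(0.1, 6)
assert rollback_probability(0.1, 6) < 1e-5
```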



There is still not much discussion or research on the question of handling potentially large amounts of decentralized data securely. Admittedly, the size of the ledger will only become a pressing issue once scalability allows high volumes to be processed, but problems are already beginning to arise.


Solana proposes to incentivize “Archivers” to store the ledger history. This incentivization is probably an interesting option to explore.



This quarter, there is nothing fundamentally new to report in the field of interoperability, but we can still take the opportunity, in this calm period, to mention the main initiatives and approaches, thanks to a summary by Stephen O'Neal. Indeed, different methods are being researched, including cross-chains, sidechains, proxy tokens, swaps, etc.

  • Polkadot is a multichain, or cross-chain, technology. It allows different blockchains to plug into a broader, standardized ecosystem. It was founded by Gavin Wood, a co-founder of Ethereum. Technically, Polkadot is comprised of parachains (i.e., parallel blockchains that process transactions and transfer them to the original blockchain), a relay chain (i.e., a central component that connects parachains and ensures their security), and bridges that connect Polkadot to external blockchains.

  • Cosmos also follows the cross-chain principle. Specifically, it employs an inter-blockchain communication (IBC) protocol to establish blockchain interoperability. It serves as a TCP/IP-like messaging protocol for blockchains. Since various established blockchains (like Bitcoin) do not support IBC by design, Cosmos uses the so-called “peg zones” to connect them to the “Cosmos Hub” – as the project is called – a “flagship” blockchain that binds all the zones together and coordinates communications between them via standardized languages. However, the Cosmos Hub is part of the broader interchain ecosystem developed by Cosmos, which can contain other entities. For instance, there is also Iris Hub, which focuses on enterprise customers and Chinese clients.

  • Chainlink is a decentralized Oracle service. It allows for data to be retrieved from off-chain APIs and be put on a blockchain. In other words, Chainlink serves as a bridge between blockchains and all the infrastructure that exists off-chain: Oracle nodes receive real-world data, process it through the network and take it to the blockchain. Notably, the company cooperates with SWIFT, the global interbank data transfer and payment system used by most banks across the world.

  • Wanchain uses a different protocol to facilitate data transfers between otherwise unconnected blockchains. Thus, instead of deploying peg zones or multichain analogs, Wanchain creates so-called “wrapped” tokens that can be traded on other blockchains. For instance, to move 10 ETH to the BTC chain, the platform would first lock that amount of ETH on the Ethereum blockchain using smart contracts, which would then mint 10 Wanchain-wrapped ETH (WETH) on Wanchain. These WETH could then be traded for Wanchain-wrapped BTC (WBTC) on a trading platform. Those wrapped BTC tokens can then be turned into the original tokens located on the Bitcoin blockchain.
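The lock-and-mint flow can be sketched with a toy bridge (illustrative only, not Wanchain’s actual contracts), whose invariant is that wrapped supply never exceeds locked collateral:

```python
class WrappedTokenBridge:
    """Toy lock-and-mint bridge in the spirit of the flow above.

    Locking native ETH in a contract mints an equal amount of wrapped
    ETH (WETH) on the other chain; burning WETH releases the lock.
    """

    def __init__(self):
        self.locked_eth = 0.0   # ETH held by the locking smart contract
        self.weth_supply = 0.0  # wrapped ETH minted on the other chain

    def lock_and_mint(self, amount):
        self.locked_eth += amount
        self.weth_supply += amount

    def burn_and_release(self, amount):
        if amount > self.weth_supply:
            raise ValueError("cannot burn more WETH than exists")
        self.weth_supply -= amount
        self.locked_eth -= amount

bridge = WrappedTokenBridge()
bridge.lock_and_mint(10.0)          # 10 ETH locked -> 10 WETH minted
assert bridge.weth_supply == bridge.locked_eth == 10.0
bridge.burn_and_release(4.0)        # burning WETH frees the collateral
assert bridge.locked_eth == 6.0
```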

  • Quant: unlike the examples mentioned above, Quant is not a blockchain. It uses the Overledger protocol, a layer that runs over existing blockchains. Overledger ostensibly allows developers to create “MApps” (decentralized applications that utilize multiple blockchains at the same time) in “three lines of code” and without any additional infrastructure. This allows for more options in blockchain engineering. For instance, a MApp could rely on the Ethereum blockchain for data storage while using Bitcoin Cash (BCH) for value transfer. 

  • Other projects researching interoperability include Cardano, Aion, Icon, Ark, Bytom, Dragonchain, and Ferrum Network, among others.



Building on the possibility of grouping transactions with multiple inputs and multiple outputs, as is typical on Bitcoin, in order to obfuscate which money is going where, a number of “mixing” services have appeared.


The “fun” part is that regulators have immediately started to clamp down on such services! For instance, one such service has been shut down by the Dutch regulator. This illustrates all too well where the next crypto battle is being waged, as expected: states are not about to relinquish control over financial flows.
These moves from official bodies are considered very seriously by crypto communities. The controversial McAfee has stated: “Bitcoin mixers are now being targeted. Anonymity itself is slowly being considered a crime. The word ‘Privacy’ will soon mean ‘Criminal Intent’.” Vitalik even pressed for the creation of on-chain mixers in response to off-chain actions by regulators.



The MimbleWimble technique came under threat in November, as Ivan Bogatyy reportedly deanonymized 96% of Grin’s transactions in real time. The researcher claims that the method can no longer be considered a viable, secure alternative for private transactions, saying that MimbleWimble is “fundamentally flawed.”

In practice, MW relies on merging transactions to obfuscate them. But because transactions are continually being created and originate from separate locations, a node that picks up the individual transactions before the cut-through aggregation is finished can unwind the CoinJoin. Any sniffer node can observe the network and take note of the original transactions before they are aggregated. While this does not reveal the amounts, it exposes the addresses.
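A toy version of this observation attack (our simplification, not Bogatyy’s actual tooling; the addresses are invented): if the individual transactions are logged before cut-through, the links survive aggregation.

```python
def unwind_coinjoin(observed_txs, aggregated):
    """Toy version of the sniffing attack described above.

    A MimbleWimble block stores only the aggregated (cut-through)
    transaction, hiding who paid whom. But a sniffer node that recorded
    the individual transactions BEFORE aggregation can simply report
    them, recovering the input/output address links. Amounts stay hidden
    (they are blinded commitments); the transaction graph does not.
    """
    # Every logged transaction whose endpoints appear in the aggregate
    # is a recovered payment link.
    return [tx for tx in observed_txs
            if tx["input"] in aggregated["inputs"]
            and tx["output"] in aggregated["outputs"]]

observed = [
    {"input": "addr_A", "output": "addr_B"},
    {"input": "addr_C", "output": "addr_D"},
]
aggregated = {"inputs": {"addr_A", "addr_C"}, "outputs": {"addr_B", "addr_D"}}
links = unwind_coinjoin(observed, aggregated)
assert len(links) == 2  # both original payment links recovered
```

This also makes concrete why the attack weakens as the network grows: a sniffer must be connected widely enough to log transactions before they merge.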

The better connected the attacker is, the more effectively it can uncover transactions. This will become more difficult as the network grows in size but remains a potential breach of security.

Grin developer Daniel Lehnberg said that Bogatyy’s assertion is unfair and attention-grabbing; a story to be followed up.


Zero-Knowledge Proof

Research continues on the magical technique that is ZKP. Findora recently revealed a breakthrough: “Supersonic”, a proof system that is practical, trustless, succinct, and zero-knowledge. The proofs are smaller than 10 KB and take only milliseconds to verify, even for the most complex statements. Findora claims they are at least 25 times smaller than any other trustless zero-knowledge proof system with comparable verification times.
Ben Fisch, CTO at Findora, stated that until now, zk-SNARKs have not been practical without a trusted set-up. “Supersonic’s combined proof size and verification time improve on the state-of-the-art by more than an order of magnitude for complex statements. They are at least 50 times smaller than STARKs and 1000 times faster to verify than Bulletproofs for these kinds of applications.”



Please refer to the GDPR paragraph in the Legal Section of this report.



Without revisiting the functioning and potential impact of quantum computing, let us simply note that there is no shortage of press releases and technical development claims from the firms working to make it a reality. The advent of quantum computing is thus a matter of time, and that time may be shorter than expected.



Using distributed ledgers as an infrastructure for information management and the automation of business processes (including settlement) requires, at its core, formulating a protocol and the platform requirements. Some argue it might be too early to work on standardizing blockchain. However, DLT-based systems are, fundamentally, a method of handling business processes and communicating between independent actors. To succeed, the business processes and technical requirements have to be stated in detail and agreed upon beforehand.


This may occur informally between large actors, or officially through standardization bodies. However, if any of the promised benefits of blockchain are to be achieved, an exhaustive collaboration of stakeholders must be coordinated to formulate the information management processes and protocols, and the infrastructure requirements to deploy them.

Several initiatives are being developed in industries as diverse as power generation, insurance, and, lately, the film industry, under the patronage of the American Film Market. The case of logistics/supply chain looks emblematic to us: after a considerable number of pilot projects by Walmart, Carrefour, Maersk, IBM, SAP, and Nestlé, each focusing on its own supplier ecosystem, the lack of generally accepted principles for building the traceability mechanism has become evident.


At the most official level, that of ISO standards, ten papers have been initiated, starting with the identification of use cases and standard terminology. Most of the work is being done by AFNOR, the French standards and certification body, with oversight by Julien Bringer, focusing on issues such as the management of digital assets on DLTs, privacy, and security. The work is ongoing, but it illustrates how serious some stakeholders are about defining universal principles to help the whole blockchain sector progress in an orderly manner.



Double spending – 51% attack

It has been observed that an unidentified miner managed to control over 50% of the hash rate on Bitcoin Cash for 24 hours, mining 73 blocks, or roughly half of the blocks produced during that period. Interestingly, the miner chose not to rewrite the chain, even though this was theoretically feasible. No double spending, and hence no fraud, was reported.


We are referring to the fourth most prominent cryptocurrency, so the 51% attack is more than ever an ongoing concern, one that no one can rule out, even for the strongest, Bitcoin. Further, the halving of the Bitcoin reward in the coming months is likely to reduce the safety of the network, making this attack all the more plausible.

Market manipulations

We continue to observe indications of questionable market behavior, but with little hard evidence.


A case was filed against Alameda Research LLC, with accusations of attempting to manipulate Bitcoin’s price on Binance (albeit unsuccessfully).


Some research on the trading of Bitcoin and Tether between March 2017 and March 2018 has been conducted by US academics. Patterns were identified that led the researchers to claim that most of the impact on the Bitcoin price during that period (i.e., the exceptional run to $20k) was caused by the actions of a single large trader whom they codenamed 1LSg. This casts further doubt on the already shady Tether and Bitfinex connection. If this is true, it shows that dramatic fluctuations in the price of Bitcoin were, and may continue to be, caused by only a limited volume of transactions, and it puts in question the fundamental analysis we have recently put forward…

Other researchers, however, point out that the purchasing power of Tether (Tether’s market cap per BTC market cap) increased throughout 2017 and decreased afterward, highlighting that whenever BTC falls, this metric increases. To us, this reasoning does not seem very convincing. The only sure thing is that the crypto market is still the Wild West, and nobody can effectively control the actions of the large or small players that attempt to influence it!


























Thefts, hacks, frauds, and scams

The latest victims of exchange hacks were Vietnam-based VinDAX, which lost half a million USD on November 5th, spread across 23 different cryptocurrencies, and UpBit, a Korean platform, from which $49 million worth of crypto evaporated on 26th November 2019.
The attacks are not stopping!


Mining malware

Ever more sophisticated mining malware continues to spread on the internet.
For its part, Firefox now provides an add-on option to block crypto-mining scripts automatically.



IoT synergy

Blockchain is regarded as a technology of interest within the robotics sector for managing information transfer between machines in a manner that is secure and resistant to attack. In particular, swarm robotics poses a challenge in terms of robots working together to perform tasks and operations; it is the collective behavior of the swarm that is of interest. By using cryptographic digital signatures and public-key cryptography, blockchain promises strong security for data across the channels shared between swarm robots. So, while AI-powered robotics is emerging as a forefront technology, blockchain can provide robotics with a robust security solution.


Artificial intelligence

Beyond implementing advanced algorithms as distributed applications, artificial intelligence can enhance blockchain in a variety of ways:


  • Saving energy: AI can be highly effective in keeping a check on energy consumption, thanks to optimization.

  • Data screening: the significant amount of on-chain data, not necessarily organized, calls for new approaches to exploiting (mining) it that are compatible with machine learning.


Conversely, blockchain can help AI development, especially by providing a data repository to train algorithms, with or without homomorphic encryption to hide the data itself. This is a typical blockchain use case, including a fee paid to the data owner.



Directed Acyclic Graph (= Tangle)

The progress published by IOTA is earning the project some momentum in the push towards mainstream adoption.


Corda, Hyperledger & other permissioned non-blockchain DLTs

The CordaCon conference was held in November, and was attended by 2000 delegates, twice as many as the previous conference. The organizers congratulated themselves on the number of significant enterprise blockchain consortia preparing to go live. R3’s Chairman, Frédéric Dalibard, noted that the conversation has changed from experimentation to “how do I put my CorDapp into production, and how to iron out the legal details, which means this is real.”


Importantly, Corda is being adopted beyond finance, with big consortia currently readying to launch in sectors such as insurance, trade finance, and capital markets. Objectively, the scale and outreach of projects deployed on Corda are impressive: insurance consortium B3i, the Swiss Exchange SIX platform, the Marco Polo trade finance solution. These are all significant, top-tier systems that are going to have a considerable impact when, or if, they are put into full production.


Other permissioned non-blockchain DLTs

Hyperledger is committed to organizing a significant event in Phoenix in March 2020, capitalizing on the continuous inflow of new members, and use cases that are being developed on the platform.



There is no denying that the slow and challenging progress in the technological development of DLT infrastructures is disappointing. There are very few new concrete technological solutions to report. Ethereum’s DevCon, in particular, embodies the difficulty of coordinating a community, and how far we still are from the satisfactory scaling of blockchains. Even Bitcoin’s Lightning Network is plateauing.
So, there is not much positive development to report, either incrementally or disruptively, which is quite worrisome. This gloominess, however, explains, at least partly, 

© 2020
