Estonia is a real place

My little corner of the internet seems awash with tales of a mythical utopia that goes by the name of Estonia. Since my little corner is the digital identity corner, I've been hearing about digital identity in Estonia more and more. At meetings and conferences, on social media and in conversation, I hear people talking about the Estonian national identity scheme that uses a blockchain. The Harvard Business Review, for example, tells us that "since 2007 Estonia has been operating a universal national digital identity scheme using blockchain". This sort of thing crops up on Twitter from time to time. I'm not sure if some of the people tweeting about the Estonian national digital identity blockchain know that Estonia is actually a real place and that some people (e.g., me) have been there. In fact, here is a picture of me in Tallinn to prove it.

Me in Tallinn

The Estonian national digital ID scheme launched in 2002. A decade ago a colleague of mine at Consult Hyperion, Margaret Ford, interviewed Mart Parve from the Estonian "Look@World" Foundation in the long-standing "Tomorrow's Transactions" podcast series (available here). Mart was responsible for using the smart ID service (both online and offline) to help Estonia develop its e-society. If you listen carefully to them talking, you will notice that they never mention the blockchain, which is unsurprising since Satoshi Nakamoto's paper on the subject was not published until more than a year later, in October 2008.

The strangeness of the obsession with Estonia in blockchain circles began to bother me after I was invited along to a blockchain breakfast (seriously) at the House of Lords last year. The invitation came because I had been asked to contribute to the Parliamentary Office of Science and Technology (POST) work on distributed ledger technology, and the purpose of the breakfast was to discuss this report. The breakfast was hosted by Stephen Metcalfe MP, chair of the Science and Technology Committee. Sir Mark Walport, the Government's Chief Scientific Adviser (GCSA), opened the proceedings. Sir Mark had authored the Government Office for Science report on "Distributed Ledger Technology: beyond blockchain" earlier in the year. In it, he focused on a particular kind of distributed ledger, the Bitcoin blockchain, attempting to explain it to the general reader and then exploring some of the potential uses.

(From here on I insist on sticking to the term that Richard Brown of R3 and I started using a couple of years ago, "shared ledger technology" (SLT), as the general description, because I feel that the fact that multiple organisations share the ledger is more important than its architecture.)

Personally, I found the report slightly confusing because it jumped between ledgers, blockchains, the Bitcoin blockchain and bitcoin almost on a paragraph-by-paragraph basis. What's more (and I realise that I read the document from a very technical perspective, so I may see some of these things in the wrong context), I think the report might have benefited from some more description of shared ledgers, and of the reasons why Moore's Law and falling communications costs have made the core idea of everyone storing every transaction a plausible architecture. Here's the way that my colleagues at Consult Hyperion and I started to think about the ledger a couple of years ago: the "4Cs" model, which has worked rather well.

Consensus Computer Model

I prefer to use this layered approach to explain the key components of a shared ledger and then develop ideas around different choices in those layers. Different choices in consensus technology, for example, lead to a variety of different possibilities for implementing a shared ledger. In order to help categorise these possibilities, and narrow them down to enable useful discussions between strategists and technologists, I use the taxonomy that Consult Hyperion developed to distinguish between different kinds of public and private ledgers. Rather flatteringly, Sir Mark used a simplified version of this model on page 19 of his report.

When the report came out I said that it might be considered reckless to disagree with the GCSA, but I just did not (and do not) see cryptocurrency as a sensible government option for digital currency. Anyway, putting my nerdy criticisms to one side, Sir Mark's conclusions (which were essentially that the technology is worth exploring in government contexts) were surely correct. He said that permissioned ledgers (i.e., not the Bitcoin blockchain) are appealing for government applications and I'm sure he was right about this, although I remain sceptical about some of the suggested government uses that are based on costs or efficiency. I think that his suggestions around applications that focus on transparency are the more interesting areas to explore in the short term, and they would be my focus if I were looking to start exploratory or pilot projects in the field. I share the Open Data Institute's view on this, which is that blockchains could be used to build confidence in government services through public auditability.

House of Blockchain

When it came time for my contribution, by the way, I said that it wasn't at all clear to me that it was accurate to describe Bitcoin as a decentralised system, since almost all of the hashing power resides with a very small number of unaccountable mining pools based in China, but, more importantly, that:

  1. It seems to me that many of the efforts to move shared ledgers into the marketplace have concentrated on shaping shared ledgers to emulate existing solutions in the hope that SLTs will be faster, higher or stronger. These are all unproven assertions. It is possible that a shared ledger replacement for RTGS might be cheaper, or more resilient, or more functional than the current centralised solution, but who knows?

  2. The transparency of the shared ledger, the aspect that fits least comfortably with current solutions in current markets, may well turn out to be the most important characteristic, because it allows for ambient accountability and therefore opens up the potential for new kinds of markets that are far less costly and complex to regulate, manage, inspect and audit. This is the "shared ledger as regtech, not fintech" meme that I am rather fond of.

  3. Just as the invention of double-entry bookkeeping allowed for the creation of new kinds of enterprise, so it seems to me that the shared ledger will similarly lead to new kinds of enterprise that use the shared ledger application (the SLAPP) as the engine of progress and the focus of innovation. I assume that there are kids in basements experimenting with SLAPPs right now and that this is where the breakthrough use case will come from. As I said some time ago in a discussion about shared ledgers for land registry, turning the ledger into a platform may be the most important reason for shifting to this implementation.

At the breakfast, Sir Mark said that the goal of the POST reports is to demystify technology for policy makers, although I have to report that in his closing remarks he said that we had not been entirely successful in this enterprise, and I fully concur with his opinion. That's not why I'm talking about the breakfast at the House of Lords here, though. Back to Estonia! At one point, the breakfast discussion moved on to the Estonian electronic identity system. At this point I expressed some scepticism as to whether the Estonian electronic identity system was on a blockchain. The conversation continued on the basis that it was. Then, to my shame, I lost it and began babbling "it's not a blockchain" until the chairman, in an appropriately gentlemanly and parliamentary manner, told me to shut up.

The point that I was trying to make was that the Estonian ID scheme, launched in 2002, has nothing to do with shared ledgers or mutual distributed ledgers or blockchains. As it happens, some time after my breakfast with their lordships, I had another breakfast, this time with the new CIO of Estonia, Siim Sikkut.


I asked Siim where this "Estonian blockchain ID" myth came from, since I find it absolutely baffling that this urban legend has obtained such traction. He said that it might be something to do with people misunderstanding the use of hashes to protect the integrity of data in the Estonian system. Aha! Then I remembered something… More than a decade ago I edited the book "Digital Identity Management" and Taarvi Martens (one of the architects of the Estonian scheme) was kind enough to submit a case study for it. Here is an extract from that very case study:

Long-time validity of these [digitally-signed] documents is secured by logging of issued validity confirmations by the Validation Authority. This log is cryptographically secured by one-way hash-function and newspaper-publication to prevent back-dating and carefully backed up to preserve digital history of mankind.

Well, there we have it. It looks as if the mention of the record of document hashes has triggered an inappropriate correlation amongst observers and, as Siim observed, it may indeed be the origin of the fake news about Estonia’s non-existent digital identity blockchain.
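Since the confusion is so widespread, it is worth being concrete about what a hash-protected log actually is. Here is a minimal sketch in Python (purely illustrative, and emphatically not the Estonian system's actual design) of the technique Taarvi describes: each entry is chained to everything before it with a one-way hash, and the current digest is periodically published somewhere outside the operator's control (a newspaper, say) so that any attempt at back-dating becomes detectable.

```python
import hashlib
from datetime import datetime, timezone

class HashChainedLog:
    """A hash-chained log: every entry commits to all prior entries,
    so a retroactive edit invalidates every subsequent digest."""

    def __init__(self):
        self.entries = []
        self.head = b"\x00" * 32  # starting value for the chain

    def append(self, record: str) -> str:
        timestamp = datetime.now(timezone.utc).isoformat()
        data = self.head + record.encode() + timestamp.encode()
        self.head = hashlib.sha256(data).digest()
        self.entries.append((record, timestamp, self.head.hex()))
        return self.head.hex()

    def digest_for_publication(self) -> str:
        # "Publish in a newspaper": anyone holding yesterday's paper
        # can later detect tampering or back-dating of the log.
        return self.head.hex()

log = HashChainedLog()
log.append("validity confirmation #1")
log.append("validity confirmation #2")
print("Print this in the newspaper:", log.digest_for_publication())
```

Note what is absent: no consensus protocol, no mining, no tokens, no peer-to-peer network. Hash-chained, publicly anchored logs of this kind predate Bitcoin by many years, which is exactly why calling them a blockchain invites confusion.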

(This is a revised and edited version of a post that first appeared on Consult Hyperion's "Tomorrow's Transactions" blog in March 2017.)

Back to the future of Bitcoin

I was very excited to discover via the interweb tubes that Bitcoin is now going into geostationary orbit. In the near future, Bitcoins will be dropping as a gentle rain from heaven. Well, sort of.

Blockstream Satellite is the world’s first service that broadcasts real-time Bitcoin transactions and blocks from a group of satellites in space.

From Blockstream – Announcing Blockstream Satellite

You cannot imagine the nostalgia this story generated for me because, astonishing as it may now seem, the first 'fintech' project that I ever worked on involved using satellites to transmit financial data, and the first book chapter that I ever wrote was about the use of satellite data for business.

Settle down youngsters, and I’ll tell you the tale…

Cast your mind back to 1982. Those interweb tubes are a distant dream. Getting data from place to place is a major effort. In a faraway place (Indonesia) a group of talented 10x prima donna programmers are writing software to run on the world's first regional satellite data system, the Palapa-B1 service (a Hughes HS376, for the technical, with 24 C-band transponders). In the great city of Bandung, one of these dashing young software engineers — me — was initially tasked with writing the (and here's one for the teenagers) X.28 code and then the X.25 code to allow (amongst other things) bank terminals and other devices to connect via this new satellite network, enabling communications between bank branches on far-flung islands throughout the Indonesian archipelago and bank offices in Jakarta and elsewhere. You couldn't buy communications software for the processors we were using. You had to write it from scratch. If you tell the young people of today that, they won't believe you.


We were working at a telecoms supplier’s site in Bandung. I know it doesn’t look much from the outside.

A Japanese team were building the baseband modems and implementing the Aloha link protocol that had originally been invented for ALOHAnet. This gave me the assembly language primitives to work with to implement the CCITT protocols on top. X.28, as if you need any reminding, was the protocol for character input/output (used to connect terminals across a network to mainframes) and X.25 was the packet-switching protocol for interconnecting computers. I still think of terminals as DTEs (Data Terminating Equipment) and I still think of network connections as DCEs (Data Circuit-terminating Equipment). All of these quaint terms vanished from the pages of history about a week after TCP/IP was invented.


As you can see, inside we had access to many modern facilities.

Implementing X.28 meant that staff could log on to bank mainframes using terminals in the branches. Implementing X.25 meant that remote minicomputers could interconnect. Getting the code to work, and getting it to work quickly enough, and getting it to work in the limited memory available, was a fantastic education. I loved my time as a C ninja, interfacing with what was then leading-edge communications hardware to deliver data services to real users.


Here I am making a few small adjustments to the communications processor boards.

It was here I learned all my UNIX tricks and C programming stunts. Those were the days when if you didn’t like the way that the team wrote code you could quickly knock up a parser to force them into line (which one of my colleagues did, using YACC), when you had to pretend to the system administrator that you didn’t have root access (which we all did) and when the disk packs held 5Mb so you had to be very careful with the space available *wipes away a tear*.


As you can see, the team really appreciated my mad programming skills and their contribution to the great success of the project.

In the late 1980s and very early 1990s, I enjoyed working on a wide variety of projects around satellite data communications. I worked on technical architectures, system designs and even on regulation, in a team with the now-infamous Vicky Pryce (who was then chief economist at KPMG, and who I remember as a very impressive and really clever, but also really nice, person). The very first conference paper that I ever wrote was on the use of satellite data broadcasting to deliver stock exchange data to market participants, and I spent happy days at Telekurs, Dow Jones Telerate, the London Stock Exchange and other places working on link budgets, low-noise blocks and forward error-correcting codes (this is where I learned about convolutional coding and Viterbi decoders). One of the most interesting areas I worked in was the use of Vertical Blanking Interval (VBI) data services embedded in analogue television transmissions and the potential (abandoned) use of data space in digital television transmissions for value-added (largely financial) services.

Books about satellite communications

A few years later, I worked on a similar system using Very Small Aperture Terminals (VSATs) in K-band (too much information, ed.) for a US telecommunications provider, on one of Consult Hyperion's first US projects. In those still pre-internet days, if you wanted to get data from a branch office back to HQ reasonably quickly you had to pay for a leased data line from the phone company, which was very expensive. Putting a satellite terminal on your roof was a cheaper alternative and, as the frequencies went up from C-band to Ku-band, so the dish sizes and costs came down. The cost of installing and maintaining a six-foot dish compared very favourably with the costs of the alternatives, until the internet and mobile phones came along and spoiled all the fun.

Ah, the good old days.

Zcash and The Glass Bank

Interesting to see the cryptocurrency Zcash in the news today, since it's one of the ones I focussed on in my new book (in case I haven't mentioned it, it's called Before Babylon, Beyond Bitcoin and you can buy it from all good booksellers). As I said about Zcash in the chapter "Counting on Cryptography", written towards the end of 2016, "people, companies and governments will not use the underlying anonymous currency but instead use the privacy-enhancing kinds of money built on top of it".

This is indeed what J.P. Morgan just announced at Consensus 2017 (see "JP Morgan Chase to Integrate Zcash Technology to its Enterprise Blockchain Platform"). Or, as American Banker put it in their story: "So, just to be clear: JPMorgan isn't using Zcash". As was set out by the parties themselves, what they intend to do is to use the Zcash technology of zero-knowledge proofs on their own Quorum blockchain to deliver privacy into financial markets where the participants want the advantages of shared ledgers but do not want to disclose the contents of transactions to all participants. I think this is quite a big deal, but that's because the institutional use of these new technologies to create markets that work in more efficient ways accords with my own mental roadmap for shared ledgers.

In a paper I co-wrote a couple of years ago with Richard Brown, the CTO of R3, and Consult Hyperion colleague Salome Parulava [published as Birch, D., R. Brown and S. Parulava (2016). "Towards ambient accountability in financial services: shared ledgers, translucent transactions and the legacy of the great financial crisis." Payment Strategy and Systems 10(2): 118-131.], we adopted the term "translucent" to mean transactions that are transparent for the purposes of consensus (in other words, we can all agree that the transaction took place and on the order of transactions) but opaque to anyone who is not a party to the trade or an appropriate regulator under the relevant circumstances. I gave a talk introducing these concepts at NextBank Barcelona back in 2015.
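To make the translucency idea concrete, here is a minimal sketch in Python using a plain hash commitment. (This is an illustrative toy, not the Quorum design; Zcash's actual machinery uses zk-SNARKs, which let observers verify properties of the hidden values without any opening at all.) The ledger records a commitment that everyone can see and order, while the contents stay opaque to anyone who has not been given the opening, such as a counterparty or, under the relevant circumstances, a regulator.

```python
import hashlib
import json
import secrets

def commit(transaction: dict) -> tuple[str, bytes]:
    """Return a public commitment to a transaction plus the secret
    opening. The shared ledger stores only the commitment."""
    nonce = secrets.token_bytes(32)  # blinding factor: hides the contents
    payload = json.dumps(transaction, sort_keys=True).encode()
    return hashlib.sha256(nonce + payload).hexdigest(), nonce

def verify_opening(digest: str, transaction: dict, nonce: bytes) -> bool:
    """A counterparty or regulator, given the opening, checks that the
    on-ledger commitment matches the claimed transaction."""
    payload = json.dumps(transaction, sort_keys=True).encode()
    return hashlib.sha256(nonce + payload).hexdigest() == digest

tx = {"from": "Alice", "to": "Bob", "amount": 100}
digest, nonce = commit(tx)
# Everyone can agree that `digest` exists and where it sits in the order;
# only holders of (tx, nonce) can see what is inside.
assert verify_opening(digest, tx, nonce)
```

A real deployment would replace the bare commitment with a zero-knowledge proof, so that participants could also verify properties of the hidden transaction (that it balances, say) without ever seeing its contents.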

It seems to me that the JP Morgan / Zcash announcement takes us another step forward in this direction and moves us towards the era of "The Glass Bank" (something I used in client workshops for many years and that I first blogged about back in 2011), an era in which translucency develops as a response to the Great Financial Crisis (GFC) and as a fundamental improvement in the way that financial markets operate, and which I have already decided will be the title of my next book!

The official blockchain quatrain

The moving finger writes; and having writ

Moves on; nor all your piety nor wit

Shall lure it back to cancel half a line,

Nor all your tears wash out a word of it.

 

#Blockchain

 

OK, I added the hashtag, but the rest is from Edward FitzGerald's 1859 translation of the Rubáiyát of the mathematician, astronomer, philosopher and poet Omar Khayyám (1048-1131).