A Healthcare Utopia of Rules

Blood Draw

I have a medical condition that requires blood tests every three months. And, having recently changed jobs, my insurance, and thus the set of acceptable labs, changed. I know that this specific problem is very US-centric, but bear with me: I think the problems that I'll describe, and the architectures that lead to them, are more general than my specific situation.

My doctor sees me every 6 months, and so gives me two lab orders each time. Last week, I showed up at Revere Health's lab. They were happy to take my insurance, but not the lab order. They needed a date on it. So, I called my doctor and they said they'd fax over an order to the lab. We tried that three times, but the lab never got it. So my doctor emailed it to me. The lab wouldn't take the electronic lab order from my phone, wouldn't let me email it to them (citing privacy issues with non-secure email), and couldn't print it for me there. I ended up driving to the UPS Store to print it, then returning to the lab. Ugh.

This story is a perfect illustration of what David Graeber calls the Utopia of Rules. Designers of administrative systems do the imaginative work of defining processes, policies, and rules. But, as I wrote in Authentic Digital Relationships:

Because of the systematic imbalance of power that administrative ... systems create, administrators can afford to be lazy. To the administrator, everyone is structurally the same, being fit into the same schema. This is efficient because they can afford to ignore all the qualities that make people unique and concentrate on just their business. Meanwhile subjects are left to perform the "interpretive labor," as Graeber calls it, of understanding the system, what it allows or doesn't, and how it can be bent to accomplish their goals. Subjects have few tools for managing these relationships because each one is a little different from the others, not only technically, but procedurally as well. There is no common protocol or user experience [from one administrative system to the next].

The lab order format my doctor gave me was accepted just fine at Intermountain Health Care's labs. But Revere Health had different rules. I was forced to adapt to their rules, being subject to their administration.

Bureaucracies are often made functional by the people at the front line making exceptions or cutting corners. In my case no exceptions were made. They were polite, but ultimately uncaring and felt no responsibility to help me solve the problem. This is an example of the "interpretive labor" borne by the subjects of any administrative system.

Centralizing the system—such as having one national healthcare system—could solve my problem because the format for the order and the communication between entities could be streamlined. You can also solve the problem by defining cross-organization schema and protocols. My choice, as you might guess, would be a solution based on verifiable credentials—whether or not the healthcare system is centralized. Verifiable credentials offer a few benefits:

  • Verifiable credentials can solve the communication problem so that everyone in the system gets authentic data.
  • Because the credentials are issued to me, I can be a trustworthy conduit between the doctor and the lab.
  • Verifiable credentials allow an interoperable solution with several vendors.
  • The tools, software, and techniques for verifiable credentials are well understood.

Verifiable credentials don't solve the problem of the lab being able to understand the doctor's order or the order having all the right data. That is a governance problem outside the realm of technology. But because we've narrowed the problem to defining the schema for a given localized set of doctors, labs, pharmacies, and other health-care providers, it might be tractable.
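To make the governance point concrete, here is a minimal sketch, in Python, of what a lab-order credential and a lab's acceptance check might look like. The schema and field names are hypothetical, not any real health-care standard, and real verification would check the issuer's signature rather than just the fields.

```python
from datetime import date

# A hypothetical lab-order credential, loosely following the shape of
# the W3C Verifiable Credentials data model. Field names are
# illustrative, not a standard health-care schema.
def make_lab_order(doctor_did, patient_did, tests, issued_on):
    return {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential", "LabOrder"],
        "issuer": doctor_did,
        "issuanceDate": issued_on.isoformat(),
        "credentialSubject": {
            "id": patient_did,
            "orderedTests": tests,
        },
    }

# The lab's intake rules, encoded once as a schema check instead of
# being re-litigated at the front desk.
def acceptable(order, today, max_age_days=90):
    issued = date.fromisoformat(order["issuanceDate"])
    return (
        "LabOrder" in order["type"]
        and order["credentialSubject"]["orderedTests"]
        and (today - issued).days <= max_age_days
    )

order = make_lab_order(
    "did:example:doctor", "did:example:patient",
    ["CBC", "metabolic panel"], date(2022, 11, 1),
)
print(acceptable(order, date(2022, 11, 15)))  # recent order: accepted
print(acceptable(order, date(2023, 6, 1)))    # stale order: rejected
```

The point of the sketch is that once the participating doctors and labs agree on a schema, the "needs a date" argument happens in code, not at the front desk.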

Verifiable credentials are a no-brainer for solving problems in health care. Interestingly, many health care use cases already use the patient as the conduit for transferring data between providers. But they are stuck in a paper world because many of the solutions that have been proposed lead to central systems that require near-universal buy-in to work. Protocol-based solutions are the antidote to that and, fortunately, they're available now.


Photo Credit: Blood Draw from Linnaea Mallette (CC0 1.0)


Verifying Twitter

Bluebird on branch

This thread from Matthew Yglesias concerning Twitter's decision to charge for the blue verification checkmark got me thinking. Matthew makes some good points:

  1. Pseudonymity has value and offers protection to people who might not otherwise feel free to post if Twitter required real names like Facebook tries to.
  2. Verification tells the reader that the account is run by a person.
  3. There's value to readers in knowing the real name and professional affiliation of some accounts.

Importantly, the primary value accrues to the reader, not the tweeter. So, charging the tweeter $20/month (now $8) is charging the wrong party. In fact, the platform itself realizes even more value than the reader, because verification can make the platform more trustworthy. Twitter will make more money if the verification system helps people understand the provenance of tweets, because ads will become more valuable.

Since no one asked me, I thought I'd offer a suggestion on how to do this right. You won't be surprised that my solution uses verifiable credentials.

First, Twitter needs to make being verified worthwhile to the largest number of users possible. Maybe that means that tweets from unverified accounts are delayed or limited in some way. There are lots of options and some A/B testing would probably show what incentives work best.

Second, pick a handful (five springs to mind) of initial credential issuers that Twitter will trust and define the credential schema they'd prefer. Companies like Onfido can already do this. It wouldn't be hard for others like Equifax, ID.me, and GLEIF to issue credentials based on the "real person" or "real company" verifications they're already doing. These credential issuers could charge whatever the market would bear. Twitter might get some of this money.

Last, Twitter allows anyone with a "real person" credential from one of these credential issuers to verify their profile. The base verification would be for the holder to use zero-knowledge proof to prove they are a person or legal entity. If they choose, the credential holder might want to prove their real name and professional affiliation, but that wouldn't be required. Verifying these credentials as part of the Twitter profile would be relatively easy for Twitter to implement.
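As a sketch of how that verification step might look from Twitter's side, here is a toy Python version. The issuer DIDs, the claim names, and the proof_valid flag are hypothetical stand-ins; a real implementation would cryptographically verify a zero-knowledge proof rather than trust a boolean.

```python
# Hypothetical issuers Twitter has chosen to trust (see the discussion
# of Onfido, Equifax, ID.me, and GLEIF above).
TRUSTED_ISSUERS = {"did:example:onfido", "did:example:gleif"}

def verify_profile(presentation):
    cred = presentation["credential"]
    if cred["issuer"] not in TRUSTED_ISSUERS:
        return None                      # unknown issuer: not verified
    if not presentation["proof_valid"]:  # stand-in for ZKP verification
        return None
    # Base case: prove personhood only. A name is disclosed only if the
    # holder chose to include it in the presentation.
    return {
        "verified_person": cred["claims"]["real_person"],
        "real_name": cred["claims"].get("name"),  # may be None
    }

p = {
    "credential": {
        "issuer": "did:example:onfido",
        "claims": {"real_person": True},  # holder discloses no name
    },
    "proof_valid": True,
}
print(verify_profile(p))  # {'verified_person': True, 'real_name': None}
```

Note that in this flow Twitter records only that a trusted issuer vouched for personhood; the real name stays with the holder unless they opt to reveal it.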

Twitter would have to decide what to do about accounts that are not real people or legal entities. Some of these bots have value. Maybe there's a separate verification process for these that requires that the holder of the bot account prove who they are to Twitter so they can be held responsible for their bot's behavior.

You might be worried that the verified person would sell their verification or verify multiple accounts. There are a number of ways to mitigate this. I explained some of this in Transferable Accounts Putting Passengers at Risk.

Real person verification using verifiable credentials has a number of advantages.

  1. First, Twitter never knows anyone's real name unless that person chooses to reveal it. This means that Twitter can't be forced to reveal it to someone else. They just know they're a real person. This saves Twitter from being put in that position and building infrastructure and teams to deal with it. Yes, the police, for example, could determine who issued the Twitter Real Person credential and subpoena them, but that's the business these companies are in, so presumably they already have processes for doing this.
  2. Another nice perk from this is that Twitter jump starts an ecosystem for real person credentials that might have uses somewhere else. This has the side benefit of making fraud less likely since the more a person relies on a credential the less likely they are to use it for fraudulent purposes.
  3. A big advantage is that Twitter can now give people peace of mind that the accounts they're following are controlled by real people. Tools might let people adjust their feeds accordingly so they see more tweets by real people.
  4. Twitter also can give advertisers comfort that their engagement numbers are closer to reality. Twitter makes more money.

Yglesias says:

Charging power users for features that most people don’t need or want makes perfect sense.

But verification isn’t a power user feature, it’s a terrible implementation of what’s supposed to be a feature for the everyday user. It should help newbies figure out what’s going on.

Verifiable credentials can help make Twitter a more trustworthy place by providing authentic data about people and companies creating accounts—and do it better than Twitter's current system. I'm pretty sure Twitter won't. Elon seems adamant that they are going to charge to get the blue checkmark. But, I can dream.

Bonus Link: John Bull's Twitter thread on Trust Thermoclines



Photo Credit: tree-nature-branch-bird-flower-wildlife-867763-pxhere.com from Unknown (CC0)


The Nature of Identity

A Bundle of Sticks

Cogito, ergo sum.
—René Descartes

The Peace of Westphalia, which ended the Thirty Years' War in 1648, created the concept of Westphalian sovereignty: the principle of international law that "each state has sovereignty over its territory and domestic affairs, to the exclusion of all external powers, on the principle of non-interference in another country's domestic affairs, and that each state (no matter how large or small) is equal in international law."1

The ensuing century saw many of these states begin civil registration for their citizens, in an effort to turn their sovereignty over territory into governance over the people living in those lands. These registrations, from which our modern system of birth certificates springs, became the basis for personal identity and legal identity in a way that conflated these two concepts.

Birth certificates are a source of legal identity and a proof of citizenship, and thus the basis for individual identity in most countries. Civil registration has become the foundation for how states relate to their citizens. As modern nation-states have become more and more influential (and often controlling) in the lives of their citizens, civil registration and its attendant legal identity have come to play a larger and larger role in their lives. People present proof of civil registration for many purposes: to prove who they are and, springing from that, their citizenship.

Even so, Descartes did not say, "I have a birth certificate, therefore I am." When most people hear the word identity, they think about birth certificates, passports, driver's licenses, logins, passwords, and other sorts of credentials. But clearly, we are more than our legal identity. For most purposes and interactions, our identity is defined through our relationships. Even more deeply, we each experience these independently as an autonomous being with an individual perspective.

This dichotomy reflects identity's dual nature. While identity is something others assign to us, it is also something deep inside of us, reflecting what Descartes actually said: "I think, therefore I am."

A Bundle of Sticks?

Another way to think about the dual nature of identity is to ask, "Am I more than a set of attributes?" Property rights are often thought of as a "bundle of sticks": each right is separable from the rest and has value independent of the rest. Similarly, identity is often considered a bundle of attributes, each with independent value. This is known in philosophy as bundle theory, originated by David Hume.

Bundle theory puts attributes into a collection without worrying about what ties them together. As an example, you might identify a plum as purple, spherical, 5 centimeters in diameter, and juicy. Critics of bundle theory question how these attributes can be known to be related without knowing the underlying substance—the thing itself.

Substance theory, on the other hand, holds that attributes are borne by "an entity which exists in such a way that it needs no other entity to exist," according to our friend Descartes. Substance theory gives rise to the idea of persistence in the philosophy of personal identity. People, organizations, and things persist through time. In one sense, you are the same person you were when you were 16. But in another, you are not. The thing that makes you the same person over your lifetime is substance. The thing that makes you different is the collection of ever-changing attributes you present to the outside world over time.

I'm no philosopher, but I believe both viewpoints are useful for understanding digital identity. For many practical purposes, viewing people, organizations, and things as bundles of attributes is good enough. This view is the assumption upon which the modern web is built. You log into different services and present a different bundle of attributes to each. There is no substance, at least in the digital sense, since the only thing tying them together is you, a decidedly nondigital entity.

This lack of a digital representation of you, that you alone control, is one of the themes I'll return to several times in my book. At present, you are not digitally embodied—your digital existence depends on other entities. You have no digital substance to connect the various attributes you present online. I believe that digital identity systems must embody us and give us substance if we are to build a digital future where people can operationalize their online existence and maintain their dignity as autonomous human beings.


Notes

  1. "Nation-States and Sovereignty," History Guild, accessed October 5, 2022.
  2. Substance theory has many more proponents than Descartes, but his definition is helpful in thinking through identity’s dual nature.

Photo Credit: Smoke sticks for honey harvesting from Lucy McHugh/CIFOR (CC BY-NC-ND 2.0, photo cropped vertically)


Using OpenID4VC for Credential Exchange

OpenID Logo

In my discussions of verifiable credentials, I assume DIDs are the underlying identifier. This implies that DIDComm, the messaging protocol based on DIDs, underlies the exchange of verifiable credentials. This does not have to be the case.

The OpenID Foundation has defined protocols on top of OAuth1 for issuing and presenting credentials. These specifications support the W3C Verifiable Credentials Data Model and support both full credential and derived credential presentations. The OpenID specifications allow for other credential formats as well, such as the ISO mobile driver’s license.

In addition to defining specifications for issuing and presenting credentials, OpenID for Verifiable Credentials (OpenID4VC) introduces2 a wallet for holding and presenting credentials. OpenID Connect (OIDC) redirects interactions between the identity provider (IdP) and relying party (RP) through a user agent under a person’s control, but there was never an OIDC-specific user agent. The addition of a wallet allows OpenID4VC to break the link that has traditionally existed between the IdP and RP in the form of federation agreements and an interaction protocol wherein the IdP always knew when a person used OIDC to authenticate at the RP. OpenID4VC offers direct presentation using the wallet.

Extending OAuth and OIDC to support the issuance and presentation of verifiable credentials provides for richer interactions than merely supporting authentication. All the use cases we’ve identified for verifiable credentials are available in OpenID4VC as well.

In addition to using the OpenID4VC wallet with a traditional OIDC IdP, OpenID has also added a specification for Self-issued OpenID Providers (SIOP). A SIOP is an IdP that is controlled by the entity who uses the wallet. A SIOP might use DIDs, KERI, or something else for identifiers. The SIOP allows Alice to control the identifiers and claims she releases to the RP. As with DIDs, a SIOP-based relationship between the RP and Alice is not intermediated by an external, federated IdP as it is in the traditional OIDC model.

When Alice uses a wallet that supports OpenID4VC and SIOP to present credentials to an RP, Alice has a relationship with the RP based on a self-issued identity token she creates. SIOP allows Alice to make presentations independent of any specific IdP. As a result, she can present credentials from any issuer to the RP, not just information from a single IdP as is the case in traditional OIDC.
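To illustrate what "self-issued" means concretely, here is a sketch of a SIOP-style ID token payload in Python. The DID is a hypothetical example, and a real token would be a signed JWT whose key material the RP resolves from the identifier; only the distinguishing shape of the payload is shown.

```python
import time

# Hypothetical identifier Alice controls; in practice this might be a
# DID, a KERI identifier, or something else.
alice_did = "did:example:alice"

# A self-issued ID token payload in the SIOP model: the same identifier
# appears as both issuer and subject, so no external, federated IdP
# signs the token or learns about the authentication.
id_token = {
    "iss": alice_did,                  # self-issued: issuer is the holder...
    "sub": alice_did,                  # ...and so is the subject
    "aud": "https://rp.example.com",   # the relying party
    "iat": int(time.time()),
    "exp": int(time.time()) + 600,
}

# The RP's structural check that the token is self-issued:
def is_self_issued(token):
    return token["iss"] == token["sub"]

print(is_self_issued(id_token))  # True
```

Contrast this with a traditional OIDC token, where `iss` names a federated IdP that both signs the token and observes every login.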

Like any other credential presentation, the RP can verify the fidelity of the credential cryptographically. This can include knowing that it was issued to the wallet Alice is using to make the presentation. The RP also gets the identifier for the credential issuer inside the presentation and must decide whether to trust the information presented.

To make fidelity and provenance determinations for the credential, the RP will need the public key for the credential issuer, as is the case with any credential verification. The verifiable data registry (VDR) in an OpenID4VC credential exchange might be a ledger or other decentralized data store if the presentation uses DIDs, or the key might be obtained using PKI or web pages accessible under a domain name controlled by the issuer. Depending on how this is done, the credential issuer might or might not know which credentials the RP is verifying. The design of the VDR plays a large role in whether credential exchange has all the properties we might desire.

OpenID4VC is an important example of alternatives to DIDComm in verifiable credential exchange thanks to OIDC’s large deployment base and developers’ familiarity with its underlying protocols and procedures. Because the W3C specification for verifiable credentials does not specify an underlying mechanism for exchanging credentials, others are possible. If you find a need for an alternative, be sure to carefully vet its design to ensure it meets your privacy, authenticity, and confidentiality requirements.

  1. Recall that OpenID Connect is based on OAuth.
  2. For details on OpenID4VC, I recommend the introductory whitepaper from the OpenID Foundation: OpenID for Verifiable Credentials: A Shift in the Trust Model Brought by Verifiable Credentials (June 23, 2022)


A New Job at AWS

AWS Logo

I've been dark for a few weeks. Things have been busy. I'm retiring from BYU (after 29 years, albeit with some interruptions) and starting a new job with Amazon Web Services (AWS). The job is in AWS Identity, and involves automated reasoning (formal methods). My Ph.D. dissertation was on using formal methods to verify the correctness of microprocessors. So the new job combines two things I've spent a good portion of my professional life working on. I'm loving it.

The name of what I and the larger group (Automated Reasoning Group) are doing is "provable security." AWS Provable Security automatically generates mathematical proofs to assert universal statements about the security properties of your AWS application. For example, Access Analyzer uses automated reasoning to analyze all public and cross-account access paths to your resources and provides comprehensive analysis of those paths, making statements like "None of your S3 buckets are publicly available."
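To give a flavor of the difference between testing and proving, here is a toy Python sketch. It checks a property over every case the model admits rather than sampling, which is the spirit of the universal statements above; the actual Access Analyzer reasons over full IAM semantics with automated theorem provers, not an enumeration like this.

```python
# A toy model of bucket access policies. "*" models public access.
# All names and the policy shape are made up for illustration.
buckets = {
    "logs":    {"allowed": {"account-111", "account-222"}},
    "website": {"allowed": {"*"}},
}

def publicly_accessible(bucket):
    return "*" in bucket["allowed"]

def counterexamples_to_no_public_buckets(buckets):
    # The universal claim "none of your buckets are publicly available"
    # either holds, or we return the buckets that violate it.
    return [name for name, b in buckets.items() if publicly_accessible(b)]

print(counterexamples_to_no_public_buckets(buckets))  # ['website']
```

The value of automated reasoning is that it makes this kind of all-cases statement over state spaces far too large to enumerate, which is exactly what a sampled test suite cannot do.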

What is Automated Reasoning? How Is it Used at AWS?
What is Automated Reasoning? How Is it Used at AWS? (click to play)

To understand this better, and get a glimpse of where it could go, I recommend this talk from AWS re:Inforce by Neha Rungta and Andrew Gacek.

AWS re:Inforce 2022 - High assurance with provable security
AWS re:Inforce 2022 - High assurance with provable security (click to play)

When I was doing formal methods, we dreamed of the day when automated reasoning would handle real problems for people without access to highly trained Ph.D. researchers. Now, that's possible, and available in one click for many problems. I'm excited to be working on it.


ONDC: An Open Network for Ecommerce

Platforms to protocols

I read about the Open Network for Digital Commerce (ONDC) on Azeem Azhar's Exponential View this week and then saw a discussion of it on the VRM mailing list. I usually take multiple hits on the same thing as a sign I ought to dig in a little more.

Open Network for Digital Commerce is a non-profit established by the Indian government to develop open ecommerce. The goal is to end platform monopolies in ecommerce using an open protocol called Beckn. I'd never heard of Beckn before. From the reaction on the VRM mailing list, not many there had either.

This series of videos by Ravi Prakash, the architect of Beckn, is a pretty good introduction. The first two are largely tutorials on open networks and protocols and their application to commerce. The real discussion of Beckn starts about 5'30" into the second video. One of Beckn's core features is a way for buyers to discover sellers and their catalogs. In my experience with decentralized systems, discovery is one of the things that has to work well.

The README on the specifications indicates that buyers (identified as BAPs) address a search to a Beckn gateway of their choice. If the search doesn't specify a specific seller, then the gateway broadcasts the request to multiple sellers (labeled BPPs) whose catalogs match the context of the request. Beckn's protocol routes these requests to the sellers it believes can meet the intent of the search. Beckn also includes specifications for ordering, fulfillment, and post-fulfillment activities like ratings, returns, and support.
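Here is a toy Python sketch of that discovery flow: a buyer app (BAP) sends a search to a gateway, which broadcasts it to the seller platforms (BPPs) whose catalogs match the context of the request. The names and matching logic are mine for illustration; this is not the Beckn wire format.

```python
# Hypothetical BPP registry the gateway knows about.
bpps = [
    {"name": "kirana-a", "categories": {"grocery"}, "catalog": ["rice", "dal"]},
    {"name": "bookshop-b", "categories": {"books"}, "catalog": ["atlas"]},
]

def gateway_search(query, category):
    """Broadcast a BAP search to every BPP matching the request context."""
    results = []
    for bpp in bpps:
        if category not in bpp["categories"]:
            continue  # gateway only forwards to BPPs matching the context
        hits = [item for item in bpp["catalog"] if query in item]
        if hits:
            results.append({"bpp": bpp["name"], "items": hits})
    return results

print(gateway_search("rice", "grocery"))
# [{'bpp': 'kirana-a', 'items': ['rice']}]
```

Even in this toy form you can see why discovery is the hard part: the gateway's matching of request context to catalogs determines whether small sellers ever surface in results.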

Beckn creates shared digital infrastructure
Beckn creates shared digital infrastructure (click to enlarge)

ONDC's goal is to allow small merchants to compete with large platforms like Amazon, Google, and Flipkart. Merchants would use one of several ONDC-compatible clients to list their catalogs. When a buyer searches, products from their catalogs would show up in search results. Small and medium merchants have long held the advantage of being close to the buyer, but lacked ways to easily get their product offerings in front of online shoppers. Platforms hold these merchants hostage because of their reach, even though they often lack local options. ONDC wants to level that playing field.

Will the big platforms play? The India Times interviewed Manish Tiwary, Country Manager for Amazon's India Consumer Business. In the article he says:

I am focused on serving the next 500 million customers. Therefore, I look forward to innovations, which will lift all the boats in the ecosystem.

At this stage, we are engaging very closely with the ONDC group, and we are quite committed to what the government is wanting to do, which is to digitize kiranas, local stores...I spoke about some of our initiatives, which are preceding even ONDC... So yes, excited by what it can do. It's a nascent industry, we will work closely with the government.

An open network for ecommerce would change how we shop online. There are adoption challenges, not the least of which is getting small merchants to list what they have for sale and keep inventory up to date. Most small merchants don't have sophisticated software systems to interface for automatic updates; they'll do it by hand. If they don't see the sales, they won't spend the time maintaining their catalogs. Bringing the tens of millions of small merchants in India online will be a massive effort.

I'm fascinated by efforts like these. I spend most of my time right now writing about open networks for identity as I wrap up my forthcoming O'Reilly book. I'm not sure anyone really knows how to get them going, so it takes a lot of work with more misses than hits. But I remain optimistic that open networks will ultimately succeed. Don't ask me why. I'm not sure I can explain it.


Photo Credit: Screenshots from Beckn tutorial videos from Ravi Prakash (CC BY-SA 4.0)


The Path to Redemption: Remembering Craig Burton

When I got word that Craig Burton had died, the news wasn't unexpected. He'd been ill with brain cancer for some time and we knew his time was limited. Craig was a great man, a good person, a valued advisor, and a fabulous friend. Craig's life is an amazing story of success, challenge, and overcoming.

I first met Craig when I was CIO for Utah and he was the storied co-founder of Novell and the Burton Group. Dave Politis calls Craig "one of Utah's tech industry Original Gangsters". I was a bit intimidated. Craig was starting a new venture with his longtime friend Art Navarez, and wanted to talk to me about it. That first meeting was where I came to appreciate his famous wit and sharp, insightful mind. Over time, our relationship grew and I came to rely on him whenever I had a sticky problem to unravel. One of Craig's talents was throwing out the conventional thinking and starting over to reframe a problem in ways that made solutions tractable. That's what he'd done at Novell when he moved up the stack to avoid the tangle of competing network standards and create a market in network services.

When Steve Fulling and I started Kynetx in 2007 we knew we needed Craig as an advisor. He mentored us—sometimes gently and sometimes with a swift kick. He advised us. He dove into the technology and developed applications, even though he wasn't a developer. He introduced us to one of our most important investors, and now good friend, Roy Avondet. He was our biggest cheerleader and we were grateful for his friendship and help. Craig wasn't just an advisor. He was fully engaged.

One of Craig's favorite words was "ubiquity" and he lived his life consistent with that philosophy. Let me share three stories about Craig from the Kynetx days that I hope show a little bit of his personality:

  • Steve, Craig, and I had flown to Seattle to meet with Microsoft. Flying with Craig is always an adventure, but that's another story. We met with some people on Microsoft's identity team including Kim Cameron, Craig's longtime friend and Microsoft's Chief Identity Architect. During the meeting someone, a product manager, said something stupid and you could just see Craig come up in his chair. Kim, sitting in the corner, was trying not to laugh because he knew what was coming. Craig, very deliberately and logically, took the PM's argument apart. He wasn't mean; he was patient. But his logic cut like a knife. He could be direct. Craig always took charge of a room.
  • Craig's trademark look
    Craig's trademark look (click to enlarge)
  • We hosted a developer conference at Kynetx called Impact. Naturally, Craig spoke. But Craig couldn't just give a standard presentation. He sat in a big chair on the stage and "held forth". He even had his guitar with him and sang during the presentation. Craig loved music. The singing was all Craig. He couldn't just speak; he had to entertain and make people laugh and smile.
  • Craig and me at Kynetx Impact in 2011
    Craig and me at Kynetx Impact in 2011 (click to enlarge)
  • At Kynetx, we hosted Free Lunch Friday every week. We'd feed lunch to our team, developers using our product, and anyone else who wanted to come visit the office. We usually brought in something like Jimmy Johns, Costco pizza, or J Dawgs. Not Craig. He and Judith took over the entire break room (for the entire building), brought in portable burners, and cooked a multi-course meal. It was delicious and completely over the top. I can see him with his floppy hat and big, oversized glasses, flamboyant and happy. Ubiquity!
Craig with Britt Blaser at IIW
Craig with Britt Blaser at IIW (click to enlarge)

I've been there with Craig in some of the highest points of his life and some of the lowest. I've seen him meet his challenges head on and rise above them. Being his friend was hard sometimes. He demanded much of his friends. But he returned help, joy, and, above all, love. He regretted that his choices hurt others besides himself. Craig loved large and completely.

The last decade of Craig's life was remarkable. Craig, in 2011, was a classic tragic hero: noble, virtuous, and basking in past success but with a seemingly fatal flaw. But Craig's story didn't end in 2011. Drummond Reed, a mutual friend and fellow traveler wrote this for Craig's service:

Ten years ago, when Craig was at one of the lowest points in his life, I had the chance to join a small group of his friends to help intervene and steer him back on an upward path. It was an extraordinary experience I will never forget, both because of what I learned about Craig's amazing life, and what it proved about the power of love to change someone's direction. In fact Craig went on from there not just to another phase of his storied career, but to reconnect and marry his high school sweetheart.

Craig and his crew: Doc Searls, me, Craig, Craig's son Alex, Drummond Reed, and Steve Fulling
Craig and his crew: Doc Searls, me, Craig, Craig's son Alex, Drummond Reed, and Steve Fulling (click to enlarge)

Craig found real happiness in those last years of his life—and he deserved it.

Craig Burton was a mountain of a man, and a mountain of mind. And he moved the mountains of the internet for all of us. The digital future will be safer, richer, and more rewarding for all of us because of the gifts he gave us.

Starting with that intervention, Craig began a long, painful path to eventual happiness and redemption.

  • Craig overcame his internal demons. This was a battle royale. He had help from friends and family (especially his sisters), but in the end, he had to make the change, tamp down his darkest urges, and face his problems head on. His natural optimism and ability to see things realistically helped. When he finally turned his insightful mind on himself, he began to make progress.
  • Craig had to live and cope with chronic health challenges, many of which were the result of decisions he'd made earlier in his life. Despite the limitations they placed on him, he met them with his usual optimism and love of life.
  • Craig refound his faith. I'm not sure he ever really lost it, but he couldn't reconcile some of his choices with what he believed his faith required of him. In 2016, he decided to rejoin the Church of Jesus Christ of Latter-day Saints. I was privileged to be able to baptize him. A great honor that he was kind enough to give me.
  • Craig also refound love and his high school sweetheart, Paula. The timing couldn't have been more perfect. Earlier and Craig wouldn't have been ready. Later and it likely would have been too late. They were married in 2017 and later had the marriage sealed in the Seoul Korea Temple. Craig and Paula were living in Seoul at the time, engaged in another adventure. While Craig loved large, I believe he may have come to doubt that he was worthy of love himself. Paula gave him love and a reason to strive for a little more in the last years of his life.
  • Craig and Paula
    Craig and Paula (click to enlarge)

As I think about the last decade of Craig's life and his hard work to set himself straight, I'm reminded of the parable of the Laborers in the Vineyard. In that parable, Jesus compares the Kingdom of Heaven to a man hiring laborers for his vineyard. He goes to the marketplace and hires some, promising them a penny. He goes back later, at the 6th and 9th hours, and hires more. Finally he hires more laborers in the 11th hour. When it comes time to pay them, he gives everyone the same wage—a penny. The point of the parable is that it doesn't matter so much when you start the journey, but where you end up.

I'm a believer in Jesus Christ and the power of his atonement and resurrection. I know Craig was too. He told me once that belief had given him the courage and hope to keep striving when all seemed lost. Craig knew the highest of the highs. He knew the lowest of the lows. The last few years of his life were among the happiest I ever saw him experience. He was a new man. In the end, Craig ended up in a good place.

I will miss my friend, but I'm eternally grateful for his life and example.

Other Tributes and Remembrances


Photo Credits: Craig Burton, 1953-2022 from Doc Searls (CC BY 2.0)


The Most Inventive Thing I've Done

Pico Logo

In 2007, I co-founded a company called Kynetx and realized that the infrastructure necessary for building our product did not exist. To address that gap, I invented picos, an internet-first, persistent, actor-model programming system. Picos are the most inventive thing I've done. Being internet-first, every pico is serverless and cloud-native, presenting an API that can be fully customized by developers. Because they're persistent, picos support databaseless programming with intuitive data isolation. As an actor-model programming system, different picos can operate concurrently without the need for locks, making them a natural choice for easily building decentralized systems.

Picos can be arranged in networks supporting peer-to-peer communication and computation. A cooperating network of picos reacts to messages, changes state, and sends messages. Picos have an internal event bus for distributing those messages to rules installed in the pico. Rules in the pico are selected to run based on declarative event expressions. The pico matches events on its bus with event scenarios declared in each rule's event expression. The pico engine schedules any rule whose event expression matches the event for execution. Executing rules may raise additional events which are processed in the same way.
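As a toy illustration of that select-and-run cycle, here's a sketch in Python. This is not the actual pico engine or its rule language; the names and the simple (domain, type) event matching are stand-ins for the real declarative event expressions:

```python
# Toy sketch of pico-style event processing: rules declare which events
# they match; the engine runs matching rules, and executing rules may
# raise further events, which are processed the same way.

class Rule:
    def __init__(self, name, matches, action):
        self.name = name        # rule name, for the execution log
        self.matches = matches  # predicate over (domain, type)
        self.action = action    # action(event, raise_event)

class ToyEngine:
    def __init__(self):
        self.rules = []
        self.log = []   # names of rules that have run, in order

    def install(self, rule):
        self.rules.append(rule)

    def raise_event(self, domain, type_, attrs=None):
        # Internal event bus: a FIFO queue of pending events.
        queue = [(domain, type_, attrs or {})]
        while queue:
            event = queue.pop(0)
            for rule in self.rules:
                if rule.matches(event[0], event[1]):
                    self.log.append(rule.name)
                    # Rules may raise new events while executing.
                    rule.action(event,
                                lambda d, t, a=None: queue.append((d, t, a or {})))

engine = ToyEngine()
engine.install(Rule(
    "detect_trip_end",
    lambda d, t: d == "vehicle" and t == "ignition_off",
    lambda ev, raise_event: raise_event("trips", "trip_ended"),
))
engine.install(Rule(
    "record_trip",
    lambda d, t: d == "trips" and t == "trip_ended",
    lambda ev, raise_event: None,
))
engine.raise_event("vehicle", "ignition_off")
print(engine.log)  # ['detect_trip_end', 'record_trip']
```

The second rule never sees the external event directly; it runs because the first rule raised a new event onto the bus, which is the cascading behavior described above.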

As Kynetx reacted to market forces and trends, like the rise of mobile, the product line changed, and picos evolved and matured to match those changing needs, becoming a system that was capable of supporting complex Internet-of-Things (IoT) applications. For example, we ran a successful Kickstarter campaign in 2013 to build a connected car product called Fuse. Fuse used a cellular sensor connected to the vehicle's on-board diagnostics port (OBD2) to raise events from the car's internal bus to a pico that served as the vehicle's digital twin. Picos allowed Fuse to easily provide an autonomous processing agent for each vehicle and to organize those into fleets. Because picos support peer-to-peer architectures, putting a vehicle in more than one fleet or having a fleet with multiple owners was easy.

Fuse presented a conventional IoT user experience using a mobile app connected to a cloud service built using picos. But thanks to the inherently distributed nature of picos, Fuse offered owner choice and service substitutability. Owners could choose to move the picos representing their fleet to an alternate service provider, or even self-host if they desired, without loss of functionality. Operationally, picos proved more than capable of providing responsive, scalable, and resilient service for Fuse customers without significant effort on my part. Fuse ultimately shut down because the operator of the network supplying the OBD2 devices went out of business. But while Fuse ran, picos provided Fuse customers with an efficient, capable, and resilient infrastructure for a valuable IoT service with unique characteristics.

The characteristics of picos make them a good choice for building distributed and decentralized applications that are responsive, resilient to failure, and respond well to uneven workloads. Asynchronous messaging and concurrent operation make picos a great fit for modern distributed applications. For example, picos can synchronously query other picos to get data snapshots, but this is not usually the most efficient interaction pattern. Instead, because picos support lock-free asynchronous concurrency, a system of picos can efficiently respond to events to accomplish a task using reactive programming patterns like scatter-gather.
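A minimal sketch of the scatter-gather pattern mentioned above, using Python's asyncio as a stand-in for asynchronous pico-to-pico messaging (the vehicle/fleet names echo the Fuse example and are purely illustrative):

```python
import asyncio

# Scatter-gather sketch: a coordinator "scatters" a query to several
# workers (standing in for vehicle picos) concurrently, then "gathers"
# their answers. No locks are needed; each worker owns its own state.

async def vehicle_status(vehicle_id):
    # Stand-in for an asynchronous message to a vehicle pico.
    await asyncio.sleep(0.01)  # simulate network latency
    return {"vehicle": vehicle_id, "fuel": 50 + vehicle_id}

async def fleet_report(vehicle_ids):
    # Scatter: issue the query to every vehicle concurrently.
    tasks = [vehicle_status(v) for v in vehicle_ids]
    # Gather: collect the results (order matches the request order).
    return await asyncio.gather(*tasks)

report = asyncio.run(fleet_report([1, 2, 3]))
print(report)
```

The three queries run concurrently, so the whole report takes roughly one network round-trip rather than three sequential ones.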

The development of picos has continued, with the underlying pico engine having gone through three major versions. The current version is based on NodeJS and is open source. The latest version was designed to operate on small platforms like a Raspberry Pi as well as cloud platforms like Amazon's EC2. Over the years, hundreds of developers have used picos for their programming projects. Recent applications include a proof-of-concept system supporting intention-based ecommerce by Customer Commons.

The architecture of picos was a good fit for Customer Commons' objective to build a system promoting user autonomy and choice because picos provide better control over apps and data. This is a natural result of the pico model, where each pico represents a closure over services and data. Picos cleanly separate the data for different entities. Picos, each representing a specific entity, and rulesets, each representing a specific business capability within the pico, provide fine-grained control over data and its processing. For example, if you sell a car represented in Fuse, you can transfer the vehicle pico to the new owner after deleting the Trips application and its associated data, while leaving untouched the maintenance records, which are isolated inside the Maintenance application in the pico.
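The data isolation described here can be sketched as follows. The class and method names are hypothetical, not the pico engine's actual API; only the Trips/Maintenance names come from the Fuse example:

```python
# Sketch of pico-style data isolation: each pico holds state keyed by
# ruleset, so deleting one application's data leaves the rest intact.

class Pico:
    def __init__(self, entity):
        self.entity = entity
        self._state = {}  # ruleset name -> that ruleset's private data

    def install(self, ruleset):
        self._state.setdefault(ruleset, {})

    def put(self, ruleset, key, value):
        self._state[ruleset][key] = value

    def uninstall(self, ruleset):
        # Removing a ruleset removes its data and nothing else.
        self._state.pop(ruleset, None)

    def installed(self):
        return sorted(self._state)

car = Pico("vehicle-42")
car.install("Trips")
car.install("Maintenance")
car.put("Trips", "last_trip", "2013-06-01")
car.put("Maintenance", "oil_change", "2013-05-15")

car.uninstall("Trips")   # selling the car: drop the trip history
print(car.installed())   # maintenance records survive untouched
```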

I didn't start out in 2007 to write a programming language that naturally supports decentralized programming using the actor-model while being cloud-native, serverless, and databaseless. Indeed, if I had, I likely wouldn't have succeeded. Instead picos evolved from a simple rule language for modifying web pages to a powerful, general-purpose programming system for building any decentralized application. Picos are easily the most important technology I've invented.


Decentralized Systems Don't Care

Ballet scene at the Great Hall of the People attended by President and Mrs. Nixon during their trip to Peking, China

I love getting Azeem Azhar's Exponential View each week. There's always a few things that catch my eye. Recently, he linked to a working paper from Alberto F. Alesina, et al. called Persistence Through Revolutions (PDF). The paper looks at the fate of the children and grandchildren of the landed elite who were systematically persecuted during the Cultural Revolution (1966-1976) in an effort to eradicate wealth and educational inequality. The paper found that the grandchildren of these elite have recovered around two-thirds of the pre-Cultural Revolution status that their grandparents had. From the paper:

[T]hree decades after the introduction of economic reforms in the 1980s, the descendants of the former elite earn a 16–17% higher annual income than those of the former non-elite, such as poor peasants. Individuals whose grandparents belonged to the pre-revolution elite systematically bounced back, despite the cards being stacked against them and their parents. They could not inherit land and other assets from their grandparents, their parents could not attend secondary school or university due to the Cultural Revolution, their parents were unwilling to express previously stigmatized pro-market attitudes in surveys, and they reside in counties that have become more equal and more hostile toward inequality today. One channel we emphasize is the transmission of values across generations. The grandchildren of former landlords are more likely to express pro-market and individualistic values, such as approving of competition as an economic driving force, and willing to exert more effort at work and investing in higher education. In fact, the vertical transmission of values and attitudes — "informal human capital" — is extremely resilient: even stigmatizing public expression of values may not be sufficient, since the transmission in the private environment could occur regardless.
From Persistence Through Revolutions
Referenced 2022-06-27T11:13:05-0600

There are certainly plenty of interesting societal implications to these findings, but I love what it tells us about the interplay between institutions, even very powerful ones, and more decentralized systems like networks and tribes1. The families are functioning as tribes, but there's a larger social network in play as well, made from connections, relatives, and friends. The decentralized social structure of tribes and networks proved resilient even in the face of some of the most coercive and overbearing actions that a seemingly all-powerful state could take.

In a more IT-related story, I also recently read this article, Despite ban, Bitcoin mining continues in China. The article stated:

Last September, China seemed to finally be serious about banning cryptocurrencies, leading miners to flee the country for Kazakhstan. Just eight months later, though, things might be changing again.

Research from the University of Cambridge's Judge Business School shows that China is second only to the U.S. in Bitcoin mining. In December 2021, the most recent figures available, China was responsible for 21% of the Bitcoin mined globally (compared to just under 38% in the U.S.). Kazakhstan came in third.

From Despite ban, Bitcoin mining continues in China
Referenced 2022-06-27T11:32:29-0600

When China instituted the crackdown, some of my Twitter friends, who are less than enthusiastic about crypto, reacted with glee, believing this would really hurt Bitcoin. My reaction was "Bitcoin doesn't care what you think. Bitcoin doesn't care if you hate it."

What matters is not what actions institutions take against Bitcoin2 (or any other decentralized system), but whether or not Bitcoin can maintain coherence in the face of these actions. Social systems that are enduring, scalable, and generative require coherence among participants. Coherence allows us to manage complexity. Coherence is necessary for any group of people to cooperate. The coherence necessary to create the internet came in part from standards, but more from the actions of people who created organizations, established those standards, ran services, and set up exchange points.

Bitcoin's coherence stems from several things, including belief in the need for a currency not under institutional control, monetary rewards from mining, investment, and use cases. The resilience of Chinese miners, for example, likely rests mostly on the monetary reward. The sheer number of people involved in Bitcoin gives it staying power. They aren't organized by an institution; they're organized around the ledger and how it operates. Bitcoin core developers, mining consortiums, and BTC holders are powerful forces that balance the governance of the network. The soft and hard forks that have happened over the years represent an inefficient, but effective, governance process reflecting the core beliefs of these powerful groups.

So, what should we make of the recent crypto sell-off? I think price is a reasonable proxy for the coherence of participants in the social system that Bitcoin represents. As I said, people buy, hold, use, and sell Bitcoin for many different reasons. Price lets us condense all those reasons down to just one number. I've long maintained that stable decentralized systems need a way to transfer value from the edge to the center. For the internet, that system was telcos. For Bitcoin, it's the coin itself. The economic strength of a decentralized system (whether the internet or Bitcoin) is a good measure of how well it's faring.

Comparing Bitcoin's current situation to Ethereum's is instructive. If you look around, it's hard to find concrete reasons for Bitcoin's price doldrums other than the general miasma that is affecting all assets (especially risk assets) because of fears about recession and inflation. Ethereum is different. Certainly, there's a set of investors who are selling for the same reasons they're selling BTC. But Ethereum is also undergoing a dramatic transition, called "the merge", that will move the underlying ledger from proof-of-work to proof-of-stake. These kinds of large scale transitions have a big impact on a decentralized system's coherence since there will inevitably be people very excited about it and some who are opposed—winners and losers, if you will.

Is the design of Bitcoin sufficient for it to survive in the long term? I don't know. Stable decentralized systems are hard to get right. I think we got lucky with the internet. And even the internet is showing weakness against the long-term efforts of institutional forces to shape it in their image. Like the difficulty of killing off decentralized social and cultural traditions and systems, decentralized technology systems can withstand a lot of abuse and still function. Bitcoin, Ethereum, and a few other blockchains have proven that they can last for more than a decade despite challenges, changing expectations, and dramatic architectural transitions. I love the experimentation in decentralized system design that they represent. These systems won't die because you (or various governments) don't like them. The paradox is that they don't care what you think, even as they depend heavily on what everyone thinks.


Notes

  1. To explore this categorization further, see this John Robb commentary on David Ronfeldt's Rand Corporation paper "Tribes, Institutions, Markets, Networks" (PDF).
  2. For simplicity, I'm just going to talk about Bitcoin, but my comments largely apply to any decentralized system.

Photo Credit: Ballet scene at the Great Hall of the People attended by President and Mrs. Nixon during their trip to Peking from Byron E. Schumaker (Public Domain)


Fixing Web Login

Dual elevator door buttons

You know the conventional wisdom that the "close" button in elevators isn't really hooked up to anything, that it's just there to make you feel good? "Keep me logged in" is digital identity's version of that button. Why is using authenticated services on the web so unpleasant?

Note that I'm specifically talking about the web, as opposed to mobile apps. As I wrote before, compare your online web experience at your bank with the mobile experience from the same bank. Chances are, if you're like me, that you pick up your phone and use a biometric authentication method (e.g., Face ID) to open it. Then you select the app and the biometrics play again to make sure it's you, and you're in.

On the web, in contrast, you likely end up at a landing page where you have to search for the login button which is hidden in a menu or at the top of the page. Once you do, it probably asks you for your identifier (username). You open up your password manager (a few clicks) and fill the username and only then does it show you the password field1. You click a few more times to fill in the password. Then, if you use multi-factor authentication (and you should), you get to open up your phone, find the 2FA app, get the code, and type it in. To add insult to injury, the ceremony will be just different enough at every site you visit that you really don't develop much muscle memory for it.

As a consequence, when most people need something from their bank, they pull out their phone and use the mobile app. I think this is a shame. I like the web. There's more freedom on the web because there are fewer all-powerful gatekeepers. And, for many developers, it's more approachable. The web, by design, is more transparent in how it works, inspiring innovation and accelerating its adoption.

The core problem with the web isn't just passwords. After all, most mobile apps authenticate using passwords as well. The problem is how sessions are set up and refreshed (or not, in the case of the web). On the web, sessions are managed using cookies, or correlation identifiers. HTTP cookies are generated by the server and stored on the browser. Whenever the browser makes a request to the server, it sends back the cookie, allowing the server to correlate all requests from that browser. Web sites, over the years, have become more security conscious and, as a result, most set expirations for cookies. When the cookie has expired, you have to log in again.
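A minimal sketch of that correlation-and-expiration scheme, in Python. This is illustrative only: a real server would send the ID in a Set-Cookie header and add protections like the Secure and HttpOnly flags:

```python
import secrets
import time

# Cookie-based session correlation: the server issues a random session
# ID, the browser returns it with every request, and the server expires
# it after a fixed lifetime, forcing a fresh login.

SESSION_LIFETIME = 3600  # seconds; sites choose their own policy
sessions = {}            # session ID -> (user, expiration time)

def login(user):
    sid = secrets.token_hex(16)  # unguessable correlation identifier
    sessions[sid] = (user, time.time() + SESSION_LIFETIME)
    return sid  # in HTTP, this goes back as a Set-Cookie header

def authenticated_user(sid):
    record = sessions.get(sid)
    if record is None:
        return None
    user, expires_at = record
    if time.time() > expires_at:
        del sessions[sid]  # expired: the user must log in again
        return None
    return user

sid = login("alice")
print(authenticated_user(sid))  # 'alice' while the session is live
```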

Now, your mobile app uses HTTP as well, and so it also uses cookies to link HTTP requests and create a session. The difference is in how you're authenticated. Mobile apps (speaking generally) are driven by APIs. The app makes an HTTP request to the API and receives JSON data in return which it then renders into the screens and buttons you interact with. Most API access is protected by an identity protocol called OAuth.

Getting an access token from the authorization server
Using a token to request data from an API

You've used OAuth if you've ever used any kind of social login, like Login with Apple or Google Sign-In. Your mobile app doesn't just ask for your user ID and password and then log you in. Rather, it uses them to authenticate with the API's authorization server using OAuth. The standard OAuth flow returns an access token that the app stores and then presents to the server with each request. Like cookies, these access tokens expire. But, unlike cookies, OAuth defines a refresh token mechanism that the app can use to get a new access token. Neat, huh?
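The refresh-token mechanism can be sketched like this. The token endpoint here is a stand-in function, not a real authorization server, and the token values are fabricated for illustration:

```python
import itertools
import time

# OAuth refresh pattern: when the access token expires, the app trades
# its refresh token for a new access token instead of asking the user
# to log in again.

_token_ids = itertools.count(1)

def token_endpoint(grant_type, refresh_token=None):
    # Stand-in for POSTing to the authorization server's token endpoint.
    return {
        "access_token": f"access-{next(_token_ids)}",
        "expires_in": 3600,  # seconds until the access token expires
        "refresh_token": refresh_token or "refresh-abc",
    }

class ApiClient:
    def __init__(self):
        # Initial grant (e.g., after the user logs in once).
        self._store(token_endpoint("authorization_code"))

    def _store(self, grant):
        self.access_token = grant["access_token"]
        self.refresh_token = grant["refresh_token"]
        self.expires_at = time.time() + grant["expires_in"]

    def get_access_token(self):
        if time.time() >= self.expires_at:
            # Expired: silently exchange the refresh token for a new
            # access token; the user never sees a login screen.
            self._store(token_endpoint("refresh_token", self.refresh_token))
        return self.access_token

client = ApiClient()
first = client.get_access_token()
client.expires_at = 0            # force expiry to show the refresh
second = client.get_access_token()
print(first, second)             # different tokens, no re-login
```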

The problem with using OAuth on the web is that it's difficult to trust browsers:

  • Some are in public places and people forget to log out.
  • A token in the browser can be attacked with techniques like cross-site scripting.
  • Browser storage mechanisms are also subject to attack.

Consequently, storing the access token, refresh token, and developer credentials that are used to carry out an OAuth flow is hard—maybe impossible—to do securely.

The fix probably won't come from solving the browser security problems and then using OAuth in the browser. A more likely approach is to get rid of passwords and make repeated authentication much less onerous. Fortunately, solutions are at hand. Most major browsers on most major platforms can now be used as FIDO platform authenticators. This is a fancy way of saying you can use the same mechanisms you use to authenticate to the device (Touch ID, Face ID, or even a PIN) to authenticate to your favorite web site as well. Verifiable credentials are another up-and-coming technology that promises to significantly reduce the burdens of passwords and multi-factor authentication.

I'm hopeful that we may really be close to the end for passwords. I think the biggest obstacle to adoption is likely that these technologies are so slick that people won't believe they're really secure. If we can get adoption, then maybe we'll see a resurgence of web-based services as well.


Notes

  1. This is known as "identifier-first authentication". By asking for the identifier, the authentication service can determine how to authenticate you. So, if you're using token-based authentication instead of passwords, it can present the right option. Some places do this well, merely hiding the password field using JavaScript and CSS, so that password managers can still fill the password even though it's not visible. Others don't, and you have to use your password manager twice for a single login.

Photo Credit: Dual elevator door buttons from Nils R. Barth (CC0 1.0)