Distributing Authorities and Verifying Their Claims

Copyright (c) 1997 by Nick Szabo
permission to redistribute without alteration hereby granted


Reputation systems ultimately need to be based on fact rather than mere opinion or faith to be effective. For example, if we are to have a good credit rating system, we need to be confident that the credit record assembled by the agency is sufficiently accurate. Reputation information is typically gathered and distributed by authorities trusted to perform this task. Other kinds of specific performance are often entrusted to third parties as well; I call such third parties "authorities". We must be able to trust each authority (credit agency, anti-virus software vendor, certificate authority, digital cash mint, etc.) with its particular claims (about creditworthiness, dangerous byte patterns, identity, conservation of the money supply, etc.). As Reagan well noted, "trust but verify". To deserve our trust, authorities must convince us that their claims are true. We need to be able to "ping" their veracity, verifying that certain claimed transactions in fact occurred. An entire profession exists in market economies to perform this function: auditing.

It has long been recognized in both business and politics that authority is more trustworthy when it is distributed. Consider the following crude but effective "protocols":

Separation of powers: political authority divided into several branches, each responsible for only certain aspects of authority (e.g., one authority passes laws while a different authority enforces them).

Segregation of duties: in a large business, transactions are divided up so that no single person can commit fraud. I call this "the principle of required conspiracy". For example, the functions of warehouse/delivery, sales, and receipt of payments are each performed by different parties, with a policy that each party reports every transaction to a fourth function, accounting. Any singular reported activity (e.g., delivery without receipt of payment) indicates potential fraud (e.g., a delivery was made to a customer and the payment pocketed instead of being put into the corporate treasury). Segregation of duties is the auditor's favorite tool. Where it is absent the auditor cries "foul", just as a good engineer would react to a single point of failure. Many cryptographic systems have rightfully gone down to commercial failure because they ground down to trust in a single entity rather than segregating functions so as to require conspiracy.
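
To make the cross-check concrete, here is a toy sketch (in Python) of the accounting function flagging "singular" activity: a delivery reported by the warehouse function with no matching payment reported by the receipts function. The record layout and field names are illustrative assumptions of mine, not any accounting standard.

    # Toy cross-check of reports from segregated functions; the record
    # layout (order_id fields) is an illustrative assumption.
    def flag_singular_activity(deliveries, payments):
        """Return order IDs reported as delivered but never reported as paid."""
        paid = {p["order_id"] for p in payments}
        return [d["order_id"] for d in deliveries if d["order_id"] not in paid]

    deliveries = [{"order_id": 1001}, {"order_id": 1002}]
    payments = [{"order_id": 1001, "amount": 250}]
    print(flag_singular_activity(deliveries, payments))  # [1002] -- delivery, no payment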

The irony is that with cryptography we can greatly improve upon the traditional techniques of auditing (segregation of duties, cross-checking transactions against counterparties' books, and so on). I'll briefly mention three mechanisms:

Quorum

Quorum (a.k.a. threshold) distribution of performance or control over resources, based on the secret sharing of keys needed to perform or control a resource. Markus Jakobsson has designed a quorum of mints for signing digital coins, for example. Quorum establishes a "required conspiracy" of M out of N to perform a function, providing an option for stronger protection than the typical 2 out of N used in segregation of duties, and greater confidence in the security underlying the segregation.
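
To illustrate the M-out-of-N idea, here is a generic Shamir-style secret sharing sketch in Python (not Jakobsson's mint protocol): any M shares of a secret reconstruct it, while fewer reveal nothing. The prime and parameters below are toy-sized for readability.

    import random

    P = 2**31 - 1  # prime modulus (toy size; real systems use much larger fields)

    def split(secret, m, n):
        """Split `secret` into n shares, any m of which reconstruct it."""
        coeffs = [secret] + [random.randrange(P) for _ in range(m - 1)]
        return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
                for x in range(1, n + 1)]

    def reconstruct(shares):
        """Lagrange interpolation at x = 0 recovers the secret from m shares."""
        secret = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            secret = (secret + yi * num * pow(den, P - 2, P)) % P
        return secret

    shares = split(123456789, m=3, n=5)           # a "required conspiracy" of 3 of 5
    print(reconstruct(shares[:3]) == 123456789)   # True

An actual quorum of mints would use threshold signing, in which the key is never reassembled in any one place; this sketch shows only the sharing and reconstruction arithmetic.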

Post-unforgeable auditing logs

Traditionally, auditors have contacted counterparties in order to verify that a transaction actually took place. (The "principle of required conspiracy" at work again.) With post-unforgeable logs, via a hierarchical system of one-way hash functions, a party can publicly commit to transactions as they are completed by publishing signed cumulative hashes of the transaction stream. The confidentiality of the transaction is fully maintained until an auditor "pings" the transaction to determine its actual nature. The counterparty identity can remain confidential, because it is not required to establish the other facts of the transaction. The only attack is to forge transactions in real time, as the transaction itself takes place, which in most practical cases will be infeasible. Most accounting fraud involves analyzing sets of completed transactions and then forging them to make them compute to a desired counterfactual result.
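
The following Python sketch shows the commitment idea in its simplest, linear form. A production system would sign each published value and could use a hierarchy of hashes, as suggested above, but the forgery-resistance argument is the same: altering any earlier transaction changes every subsequent published hash.

    import hashlib

    def cumulative_hash(prev_hash, transaction):
        """Hash of the previous cumulative hash concatenated with this transaction."""
        return hashlib.sha256(prev_hash + transaction.encode()).digest()

    transactions = ["sell 10 widgets to A", "refund 2 widgets from A"]
    published = []            # in a real system each value would also be signed
    h = b"\x00" * 32          # arbitrary starting value (an assumption of this sketch)
    for tx in transactions:
        h = cumulative_hash(h, tx)
        published.append(h)   # commit publicly; the transaction text stays private

    # An auditor "pinging" an entry is shown the transactions up to it, recomputes
    # the chain, and checks it against the published value; a match shows the entry
    # existed, unaltered, when the commitment was made.
    check = b"\x00" * 32
    for tx in transactions:
        check = cumulative_hash(check, tx)
    print(check == published[-1])   # True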

Mutually confidential auditing

Multiparty secure computation (MSC) allows N parties to share a computation, each learning only what can be inferred from their own inputs and the output of the computation. For example, the parties can compute summary statistics on their shared transaction logs, including cross-checking of the logs against counterparties to a transaction, without revealing those logs. Unfortunately, straight MSC is far too slow (one Internet message per "bignum" machine instruction), but knowing that this capability exists in principle may lead us to practical solutions.
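
As a flavor of what such a shared computation looks like, here is a toy additive-secret-sharing sum in Python, in which the parties learn only the total of their private inputs (say, per-party transaction totals). This is only the arithmetic core under simplifying assumptions; real multiparty protocols add the communication rounds and protections against cheating that this sketch omits.

    import random

    P = 2**61 - 1   # all arithmetic is done modulo a large prime

    def share(value, n):
        """Split a private value into n additive shares that sum to it mod P."""
        parts = [random.randrange(P) for _ in range(n - 1)]
        parts.append((value - sum(parts)) % P)
        return parts

    inputs = [120, 340, 75]                  # each party's private transaction total
    n = len(inputs)
    all_shares = [share(v, n) for v in inputs]
    # Party i sends its j-th share to party j; each share alone is uniformly random,
    # so no party learns another's input. Each party sums the shares it receives.
    partial_sums = [sum(all_shares[i][j] for i in range(n)) % P for j in range(n)]
    # Publishing only the partial sums reveals the total and nothing else.
    print(sum(partial_sums) % P)             # 535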

By combining these cryptographic capabilities, we can gain very high confidence in the factuality of authorities' claims and reports without revealing identifying and other detailed information from the transactions underlying those reports. These capabilities provide the basis for solid reputation systems, and for other trusted third party systems, that maintain integrity across time, communications, and summarization while preserving confidentiality for transaction participants.

References:

BRICS, A BRICS Course on Secure Multi-Party Computation

Carol Brown, Internal Control Concepts

Publius (James Madison), Federalist No. 47 -- The Particular Structure of the New Government and the Distribution of Power Among Its Different Parts

Publius (James Madison), Federalist No. 48 -- These Departments Should Not Be So Far Separated as to Have No Constitutional Control Over Each Other

Douglas Stinson, Bibliography on Secret Sharing Schemes

Surety Technologies, Digital Fingerprints

Nick Szabo, Negative Reputation Systems

