These are the slides for a talk given in Munich in November 2017 at a conference on the Russian Revolution. The basic argument is that much of what John Stuart Mill said in the middle of the 19th century still sounds radical today. The reason is that Marx, Lenin, and the Russian Revolution set back the Left for a century and a half.
Talk: A Tale of Two Invalid Contracts: Coverture and Employment
These are the slides for a talk that focuses on the parallel inalienable-rights arguments against the now-outlawed coverture marriage contract and the yet-to-be-outlawed employment contract.
Talk: New Foundations for Information Theory
These are the slides for a number of talks on logical information theory as providing new foundations for information theory.
Review-Essay on Elizabeth Anderson’s “Private Government” book
In her recent book Private Government [2017], Elizabeth Anderson makes a powerful but pragmatic case against the abuses experienced by employees in conventional corporations. The purpose of this review-essay is to contrast Anderson’s pragmatic critique of many abuses in the employment relation with a principled critique of the employment relationship itself.
Brain Functors: A mathematical model of intentional perception and action
Semiadjunctions (essentially a formulation of universal mapping properties using hets) can be recombined in a new way to define the notion of a brain functor that provides an abstract model of the intentionality of perception and action (as opposed to the passive reception of sense-data or the reflex generation of behavior).
Listen Libertarians! A Review of John Tomasi’s “Free Market Fairness”
John Tomasi’s 2012 book, Free Market Fairness, has been well received. On the dust jacket, Tyler Cowen proclaims it “one of the very best philosophical treatments of libertarian thought, ever” and Deirdre McCloskey calls it a “long and friendly conversation between Friedrich Hayek and John Rawls — a conversation which, astonishingly, reaches agreement.”
Logical Information Theory: New Foundations for Information Theory
There is a new theory of information based on logic. The definition of Shannon entropy, as well as the notions of joint, conditional, and mutual entropy defined by Shannon, can all be derived by a uniform transformation from the corresponding formulas of logical information theory.
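As a rough illustration of the two entropy formulas (the helper functions below are my own sketch, not code from the paper): logical entropy is h(p) = Σ pᵢ(1 − pᵢ), and the Shannon entropy H(p) = Σ pᵢ log₂(1/pᵢ) results from uniformly replacing each factor (1 − pᵢ) by log₂(1/pᵢ).

```python
import math

def logical_entropy(p):
    # Logical entropy: h(p) = sum of p_i * (1 - p_i) = 1 - sum of p_i^2.
    return sum(pi * (1 - pi) for pi in p)

def shannon_entropy(p):
    # Shannon entropy obtained by the uniform substitution
    # (1 - p_i) -> log2(1/p_i) in the logical-entropy formula.
    return sum(pi * math.log2(1 / pi) for pi in p if pi > 0)

# For the uniform distribution on n outcomes:
# h = 1 - 1/n, while H = log2(n).
print(logical_entropy([0.25] * 4))  # 0.75
print(shannon_entropy([0.25] * 4))  # 2.0
```

The same substitution carries the logical versions of joint, conditional, and mutual entropy over to their Shannon counterparts.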
Logical Entropy: Introduction to Classical and Quantum Logical Information Theory
Logical information theory is the quantitative version of the logic of partitions, just as logical probability theory is the quantitative version of the dual Boolean logic of subsets. The resulting notion of information is about distinctions, differences, and distinguishability, and is formalized in terms of the distinctions of a partition, a distinction being a pair of points in different blocks of the partition. This paper is an introduction to the quantum version of logical information theory.
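A minimal sketch of the distinctions idea in the classical case (the function names and the example partition are illustrative assumptions, not taken from the paper): representing a partition by block labels on the points of a set U, the logical entropy is the fraction of ordered pairs of points that the partition distinguishes, which agrees with the block-probability formula 1 − Σ p_B².

```python
from collections import Counter

def logical_entropy_by_dits(labels):
    # Count the distinctions ("dits"): ordered pairs (u, u') of points
    # lying in different blocks, out of all |U|^2 ordered pairs.
    n = len(labels)
    same_block = sum(1 for a in labels for b in labels if a == b)
    return (n * n - same_block) / (n * n)

def logical_entropy_by_blocks(labels):
    # Equivalent formula using block probabilities: 1 - sum of p_B^2.
    n = len(labels)
    return 1 - sum((count / n) ** 2 for count in Counter(labels).values())

# Partition of a 4-point set into blocks {1,2}, {3}, {4}:
print(logical_entropy_by_dits(['a', 'a', 'b', 'c']))    # 0.625
print(logical_entropy_by_blocks(['a', 'a', 'b', 'c']))  # 0.625
```

Both computations give the probability that two independent random draws from U are distinguished by the partition.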
Quantum Logic of Direct-sum Decompositions
The usual quantum logic, beginning with Birkhoff and von Neumann, was the logic of closed subspaces of a Hilbert space. This paper develops the more general logic of direct-sum decompositions of a vector space. This allows the treatment of the measurement of arbitrary self-adjoint operators, rather than just the projection operators associated with subspaces.
Reframing the Labor Question
Subtitle: On Marginal Productivity Theory and the Labor Theory of Property
This paper reframes the labor question according to the normal juridical principle of imputation, whose application to property appropriation is the modern treatment of the old natural-rights or labor theory of property.