You’ve Misunderstood the Bit
Bits Are a Principle of Nature, Not a Product of Technology
People often think of bits as belonging to the digital world, but the bit itself is older and more fundamental than any technology. Bits are not just units of computer science, nor are they confined to electronics. The bit is not digital or analog; it is a level deeper: it represents a unit of difference. As such, a bit is the minimal form of sense-making in nature. It captures the distinction between presence and absence, yes and no, true and false. It is what allows any system with purpose to register change, to mark a difference, to say: this, not that. The bit is the most basic unit of difference, and so it forms the foundation not just of technology, but of all information, all memory, all perception.
A bit, at heart, is a difference that makes a difference.
What’s strange, almost unsettling, about the bit is that it is both utterly fundamental and completely scale-free. A bit is the smallest possible unit of distinction, yet it operates across every level of reality. The bit’s logic—yes or no, move or stay, approach or avoid—underwrites choices across scales, from microbial chemotaxis to geopolitical decisions such as launching nuclear weapons. It exists in the spin of a quantum particle, the flip of a genetic switch, the branching of a neural pathway, the logic gates of a microchip, the gap junctions of cells, the rhythm of a heartbeat, the punctuation of a poem. It governs behavior in cells, machines, courts, and love letters. The bit is the grammar of decision and the currency of difference. It makes no assumptions about meaning. It simply marks: something vs. nothing, this vs. that. And yet this minimal act, this marking of presence or absence, is the precondition for everything we call structure, memory, perception, intention, and judgment.
The bit’s indifference to scale or meaning is not a limitation. It is what makes it universal. The same architecture of difference scaffolds the motion of atoms and the movement of ideas. To work with bits is to work with the grammar of reality itself.
This abstraction is what made Claude Shannon’s work in the mid-20th century so revolutionary. Shannon showed that any signal whose uncertainty can be measured—be it a poem, a phone call, a genetic code, the universe itself—can be represented as a sequence of bits. His insight was not about symbolic language or digital communication per se, but about uncertainty. Information, in Shannon’s theory, is what reduces uncertainty. The more unpredictable a message, the more bits it takes to describe it. This gave us a mathematics of information: a way to measure, transmit, and compress messages without needing to interpret their meaning.
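To make that concrete, here is a small sketch (an illustration added here, not Shannon’s own notation): entropy, the sum over a source’s outcomes of −p log2 p, counts the bits that source requires on average.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: the average number of yes/no
    distinctions needed to pin down one outcome."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))    # a fair coin: 1.0 bit per flip
print(entropy_bits([0.99, 0.01]))  # a heavily biased coin: ~0.08 bits
print(entropy_bits([1/8] * 8))     # eight equally likely outcomes: 3.0 bits
```

The fair coin is maximally unpredictable and costs a full bit per flip; the biased coin is nearly predictable and costs almost nothing; eight equally likely outcomes cost exactly three bits, no matter what the outcomes mean.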
Around the same time, Alan Turing was asking a related question: What does it mean to compute? He imagined a simple abstract mathematical model, his famous Turing Machine, made up of a strip of tape, a set of symbols, and a rulebook for manipulating them. Despite its minimal design, this model machine could simulate any other computational process, given enough time and tape. This idea of a universal machine, the root of what we now call Turing completeness, means that any computable problem can, in principle, be reduced to a sequence of basic logical operations: AND, OR, NOT—operations that themselves act on bits.
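To make the design tangible, here is a toy version of such a machine (a sketch in the spirit of Turing’s model, not his formalism; the state names and rule table are invented for illustration).

```python
# A toy Turing machine: a tape of symbols, a read/write head, a current
# state, and a rule table that says what to do for each (state, symbol) pair.
def run(tape, rules, state, blank="_"):
    tape, head = list(tape), 0
    while state != "halt":
        symbol = tape[head] if head < len(tape) else blank
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1  # this toy only ever moves right
    return "".join(tape)

# (state, symbol read) -> (symbol to write, head move, next state)
# These rules walk rightward, inverting each bit, and halt at the first blank.
rules = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}

print(run("10110", rules, state="scan"))  # -> "01001_"
```

Swap in a different rule table and the same loop computes something else entirely; the machinery stays fixed, and only the rules change.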
The profound consequence is this: any computation, no matter how complex, can be broken down into operations on bits. Language translation, facial recognition, chess strategy, weather prediction—if it can be computed, it can be rendered in bits. This universality is not a quirk of digital machines. It is a principle of nature. Computation is not the property of any particular device or substrate. It is the property of patterns and rules.
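Here is one small picture of what that reduction looks like (an illustrative sketch; the helper names are invented here): ordinary integer addition assembled from nothing but AND, OR, and NOT acting on single bits.

```python
# Three bit-level primitives, and everything else built from them.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 0 if a else 1

# XOR, expressed purely in terms of AND, OR, NOT.
def XOR(a, b): return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum bit, carry-out bit)."""
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

def add(x, y, width=8):
    """Add two non-negative integers one bit at a time, using only the
    gates above. Shifts and masks merely read out and reassemble bits."""
    carry, result = 0, 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result

print(add(23, 42))  # 65: ordinary addition, rendered as operations on bits
```

Nothing in it knows what a number is; sums and carries emerge from gates marking differences.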
This leads to one of the most important ideas in modern thought: substrate independence. Computation doesn’t care what medium it’s made of. A thought encoded in neurons is computationally equivalent, in theory, to the same thought encoded in silicon. What matters is not the stuff, but the structure. Not the medium, but the pattern of distinctions. In other words: the bits.
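A loose illustration of the point (a toy comparison, not a proof): the same pattern of distinctions, exclusive-or, realized as a lookup table, as modular arithmetic, and as boolean logic. The realizations differ; the computation does not.

```python
# The same pattern of distinctions (exclusive-or) in three different guises.
xor_as_table      = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
xor_as_arithmetic = lambda a, b: (a + b) % 2
xor_as_logic      = lambda a, b: int((a or b) and not (a and b))

for a in (0, 1):
    for b in (0, 1):
        assert xor_as_table[(a, b)] == xor_as_arithmetic(a, b) == xor_as_logic(a, b)
print("Same structure, different media.")
```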
What Shannon and Turing gave us, then, is not merely a blueprint for machines, but a language for understanding sense-making itself. The principles of computation are foundational for understanding how any system—biological or artificial—can process, remember, and respond to the world. From this perspective, life is what reduces uncertainty. It detects patterns. It makes choices. It stores information. It computes.
Wherever distinctions are tracked and patterns acted upon, the bit is at work. After all, to think, to decide, to remember is to mark a difference. And to mark a difference is, at its most fundamental level, to make a bit.


A bit is a difference that makes a difference *to somebody*; i.e., we need a triadic logic that includes not just signs and signifieds but interpretants. If information ontologies are semiotic ontologies in that sense, I’m on board.