Notes on Engineering Health, January 2025: Notes on Biological Speed Limits

Geoffrey W. Smith

January 28, 2025

A recent commentary in the journal Neuron titled “The unbearable slowness of being” estimated that the speed of information flow in the human brain is just 10 bits per second (bps).

By way of comparison, dial-up Internet connections transfer data at a rate around 50,000 bps. Writing in the New York Times about this paper, the science writer Carl Zimmer points out that “[s]treaming a high-definition video takes about 5 million bps. The download rate in a typical American home is about 262 million bps.” Zimmer goes on to note that “[t]he speed of human thought is dwarfed by the flood of information that assaults our senses. The Neuron paper’s authors estimated that the millions of photoreceptor cells in a single eye can transmit 1.6 billion bps. In other words, we sift about one bit out of every 100 million we receive.”

Put differently, our conscious processing rate is roughly 100 million times slower than the rate at which sensory information arrives at the human brain, leaving a massive gap between what the brain takes in and what it uses.
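The "sifting" ratio behind these comparisons is simple arithmetic on the paper's two estimates, which can be reproduced directly (the figures below are the article's, not independent measurements):

```python
# Estimates quoted from the Neuron commentary (assumed values from the article).
photoreceptor_rate_bps = 1.6e9   # bits/s transmitted by one eye's photoreceptors
behavioral_rate_bps = 10.0       # bits/s actually used for behavior

# The "sifting number": how many incoming bits per bit used.
sifting_ratio = photoreceptor_rate_bps / behavioral_rate_bps
print(f"Sifting ratio: {sifting_ratio:.1e}")  # prints "Sifting ratio: 1.6e+08"
```

That 1.6 x 10^8 figure is what the popular accounts round to "about one bit out of every 100 million."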

In an interview in The Transmitter, Markus Meister, one of the commentary's authors, sums up the fundamental contradiction at hand:

One relevant comparison is the rate at which information gets in through our sensory organs, such as the retina or the ears, and the rate at which we make use of it for behavior. That’s this huge ratio that we call the sifting number. We think of the brain as sifting the 10^9 bits that come in every second in order to pull out the 10 bits that are actually going to be used for behavior. In terms of neuroscience, that’s the contrast that really matters, because you’re pitting one neuroscience fact—how much the eye can take in per second—against another neuroscience fact—how many characters a typist can type per second. That contrast is deeply unexplained.

Although Zimmer apparently received quite a bit of negative feedback about applying information theory to neuroscience, this story actually highlights the importance of understanding information processing in its biological context.

Information theory, first developed by Claude Shannon in the 1940s, has become increasingly important in understanding biological systems. This connection makes intuitive sense—living systems fundamentally process, store, and transmit information. Here is a short list of areas where information theory helps us better understand biology:

1. Genetic Information and DNA
The genetic code is perhaps the most obvious example. DNA serves as an information storage and transmission system, using sequences of nucleotides to encode instructions for building and maintaining organisms. Information theory helps us understand the information capacity of DNA sequences, error detection and correction mechanisms in DNA replication, and the efficiency of the genetic code's encoding scheme.

2. Molecular Recognition
The specificity of molecular interactions can be analyzed through information content of binding sites, recognition efficiency, and molecular communication channels.

3. Cellular Signaling
Cells constantly communicate through molecular signaling pathways. Information theory helps analyze signal transduction efficiency, information flow in biochemical networks, and noise handling in cellular communication.

4. Neural Information Processing
The nervous system can be viewed as an information processing network where neurons act as information channels, action potentials represent discrete signals, and synaptic connections form a complex information network. Information theory helps quantify neural coding efficiency and information transmission rates within neural networks.

5. Development and Morphogenesis
Information theory provides insights into how spatial patterns emerge during development, information flow in morphogen gradients, and pattern formation and tissue organization.

6. Ecological Systems
At a broader scale, information theory helps understand species interactions and communication, ecosystem complexity and stability, and biodiversity patterns.

7. Evolution and Natural Selection
Information theoretical concepts help explain how genetic information accumulates over generations, the role of mutations in introducing new information, and the relationship between environmental complexity and organismal complexity.

In each of these contexts, evolutionary pressures have pushed biological systems toward efficient information processing, but those systems remain constrained by a collection of speed limits.

Fundamental physical speed limits include the speed of light limiting information transmission rates between different parts of biological systems; quantum mechanics imposing limits on how much information can be stored in a given volume of space (the Bekenstein bound); and Landauer's Principle setting a theoretical minimum energy cost for any irreversible computation (including biological processes).
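Landauer's Principle puts an actual number on that minimum energy cost: erasing one bit requires at least kT ln 2, where k is the Boltzmann constant and T the absolute temperature. A quick calculation at human body temperature (the 310 K figure is an assumption for illustration; the principle itself says nothing specific about biology):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in SI since 2019)
T_body = 310.0       # approximate human body temperature, K (assumed)

# Landauer limit: minimum energy dissipated per irreversible bit erasure.
landauer_joules = k_B * T_body * math.log(2)
print(f"Minimum energy per bit erased at body temperature: {landauer_joules:.2e} J")
```

The result, on the order of 3 x 10^-21 joules per bit, is tiny, but any system processing billions of bits per second irreversibly must pay it billions of times over.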

DNA storage and processing face limits as well. DNA has a theoretical information density of about 2 bits per nucleotide, and current estimates suggest the human genome can encode roughly 750 megabytes of information. DNA replication operates remarkably close to the theoretical error minimum, with error rates around 10^-9 to 10^-10 per base pair, and the speed of replication is limited by chemical reaction rates and proofreading mechanisms.

In cellular signaling, individual protein molecules can process information at rates of about 1,000 bits per second, but cellular networks face fundamental limits due to thermal fluctuations, small numbers of available molecules, and diffusion constraints. As a result, the accuracy of cellular decisions (like chemotaxis) approaches theoretical limits set by physics.

For neural processing, the brain's information storage capacity is estimated at 2.5 petabytes (a number that is debated) and individual neurons can fire at maximum rates of about 200-300 Hz.

As noted above, sensory systems operate near physical limits as well: the human eye can transmit about 10 million bps, with photoreceptors operating near the quantum limit of light detection, and the cochlea approaches the theoretical limits of frequency discrimination.

These limits all seem to reflect an optimal balance struck across one or more sets of trade-offs:

• Speed versus Accuracy — faster processing typically increases error rates
• Speed versus Energy — faster processing generally requires more energy
• Speed versus Robustness — faster systems tend to be more vulnerable to noise

Current open questions related to information processing in the biological context tend to involve better understanding of these trade-offs. In any case, many biological systems appear to operate surprisingly close to theoretical physical limits, suggesting strong evolutionary pressure for efficient information processing.

The practical implications of better understanding of biological information processing could involve better designs for synthetic biological systems, the development of bio-inspired computing platforms, and novel therapeutic interventions.

And, returning to the speed limit question we started with, information theoretic understandings of human brain function could explain much of how we experience reality. As the Neuron paper concludes,

The discrepancy between peripheral processing and central cognition suggests that the brain operates in two distinct modes: the outer brain is closely connected to the external world through sensory inputs and motor outputs. This is a realm of high dimensionality: many millions of sensory receptors and muscle fibers and extremely high information rates. The inner brain, on the other hand, operates on a dramatically reduced data stream, filtered to the essential few bits that matter for behavior at any one moment. The challenge for the inner brain is to combine the animal’s goals with current inputs from the world and previous memories to make decisions and trigger new actions. The information rates are very low, but the processing must remain flexible because context and goals can shift at a moment’s notice.

Geoffrey W. Smith



First Five
First Five is our curated list of articles, studies, publications, and other things of interest for the month.

1/ One million U.S. adults will develop dementia each year by 2060
New cases of dementia will double by 2060, when 1 million U.S. adults are projected to develop the memory-robbing condition each year, according to a sobering new study published recently in the journal Nature Medicine. Read more here >

2/ Roadmap for the next generation of bioelectronic medicine
“From the ancient Egyptians' use of electric fish to treat headaches to the invention of pacemakers to regulate heart rhythms in the 1950s, the field of bioelectronic medicine—which makes use of electrical signals instead of drugs to diagnose and treat disease—has advanced and is starting to come into its own. Where is the field now? And what are the most promising opportunities for life-changing new therapies and diagnostics?” Read more here >

3/ Semi-Dirac fermion observed for the first time
“For the first time, scientists have observed a collection of particles, also known as a quasiparticle, that's massless when moving one direction but has mass in the other direction. The quasiparticle, called a semi-Dirac fermion, was first theorized 16 years ago, but was only recently spotted inside a crystal of semi-metal material called ZrSiS. The observation of the quasiparticle opens the door to future advances in a range of emerging technologies from batteries to sensors.” Read more here >

4/ New third class of magnetism discovered that could transform digital devices
“A new class of magnetism called altermagnetism has been imaged for the first time in a new study. The findings could lead to the development of new magnetic memory devices with the potential to increase operation speeds of up to a thousand times.” Read more here >

5/ Why easy is better than hard for marathon training
New data shows that the biggest difference between elite and middling runners is not their hard workouts, but rather how much time they spend jogging. Read more here >



Did you Know?
In this section of our newsletter, we seek to demystify common terms and practices in our work as investors.

Participating preferred shares vs. non-participating preferred shares
Participating preferred shares and non-participating preferred shares are two types of preferred stock with differing rights and benefits within a company's capital structure:

Participating Preferred Shares:
– These give investors a chance to receive extra money beyond their fixed dividend if the company does well.
– If the company is sold or liquidated, they get their initial investment back and a share of the profits.
– Investors like this because they can earn more if the company succeeds.

Non-Participating Preferred Shares:
– These only promise the fixed dividend without any extra benefits.
– If the company is sold, investors get their initial investment, but nothing more.
– It's a straightforward option, but there's no chance for extra earnings.

The choice between the two depends on negotiations and affects how profits are shared among investors and founders in different situations.
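The economic difference between the two structures shows up most clearly at exit. A minimal sketch, assuming a simple 1x liquidation preference and entirely hypothetical deal numbers ($5M invested for 20% ownership, $50M sale):

```python
def participating_payout(invested, ownership, exit_proceeds):
    """Participating preferred: 1x preference plus a pro-rata share of the rest."""
    preference = min(invested, exit_proceeds)
    remainder = exit_proceeds - preference
    return preference + ownership * remainder

def non_participating_payout(invested, ownership, exit_proceeds):
    """Non-participating preferred: the better of the 1x preference
    or converting to common and taking the pro-rata share."""
    preference = min(invested, exit_proceeds)
    as_converted = ownership * exit_proceeds
    return max(preference, as_converted)

# Hypothetical deal: $5M invested, 20% ownership, $50M exit.
print(participating_payout(5e6, 0.20, 50e6))      # prints 14000000.0 ($5M + 20% of $45M)
print(non_participating_payout(5e6, 0.20, 50e6))  # prints 10000000.0 (converts: 20% of $50M)
```

In this sketch the participating investor takes $14M versus $10M for the non-participating investor, with the difference coming out of the common holders' share—which is why the distinction is a frequent point of negotiation.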

Haiming Chen & Dylan Henderson
