Can we improve the resiliency of computer systems by mimicking squishier, carbon-based systems?
Computer systems are often fragile and susceptible to failure. Many things can quickly take a system down: hardware failures, software bugs, infrastructure misconfiguration, and unexpected spikes in user traffic. Enormous engineering effort has gone into making complex systems more durable, resilient, and flexible in the face of these changes, through both hardware and software solutions. Even though we rely on these systems every day, it’s easy to forget that behind the scenes they’re powered by sprawling networks of interconnected computers, each of which can be inherently flaky. We take for granted that these systems just seem to work - until they don’t.
In contrast, our human bodies and other multicellular organisms are relatively resilient to environmental stressors. We’re not invulnerable by any means, but human cells can adapt to rather significant changes in their external and internal environment. Bugs in our genetic code? We can thank our mismatch repair proteins for catching mistakes in DNA replication and fixing them. Our immune system continuously runs “health checks” on our somatic cells and takes cells “offline” if they start misbehaving. These processes can be imperfect and lead to diseases like cancer, but failures of these homeostatic and repair pathways are the abnormality, not the rule (we’ll dig more into this below). Large, multicellular organisms wouldn’t have made it this far if the entire organism died every time any one of their cells did! Homeostasis - the body’s ability to stay in a stable equilibrium even as it faces external challenges that threaten to destabilize it - is a remarkable, intricate process powered by complex centralized and decentralized messaging pathways.
On mimicking biological systems
One interesting aspect of computer systems is how they have evolved to mimic biological systems to improve their resiliency. For example, containerization and distributed systems resemble the architecture of cellular organisms: individual containers (or “cells”) communicate with each other through messaging, which makes the system more flexible and scalable, since any individual container can be safely removed and replaced without impacting the functioning of the system as a whole.
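To make the “replaceable cell” idea concrete, here’s a toy sketch in Python (all names are made up, and this is nothing like production orchestration code): a supervisor keeps a small pool of worker processes alive, and any worker that crashes is simply replaced without disturbing the others.

```python
import multiprocessing as mp
import random
import time

def worker(worker_id: int) -> None:
    """A 'cell': does some work, and occasionally crashes."""
    while True:
        time.sleep(0.2)
        if random.random() < 0.05:
            raise RuntimeError(f"worker {worker_id} crashed")

def supervise(pool_size: int = 4, rounds: int = 10) -> None:
    # Start the initial pool of worker "cells".
    workers = {i: mp.Process(target=worker, args=(i,)) for i in range(pool_size)}
    for proc in workers.values():
        proc.start()
    for _ in range(rounds):
        time.sleep(1)
        for wid, proc in list(workers.items()):
            if not proc.is_alive():
                # The "cell" died; replace it without touching the others.
                print(f"replacing worker {wid}")
                workers[wid] = mp.Process(target=worker, args=(wid,))
                workers[wid].start()
    for proc in workers.values():
        proc.terminate()

if __name__ == "__main__":
    supervise()
```

Container orchestrators like Kubernetes do essentially this at a much larger scale, restarting or rescheduling containers that fail their health checks.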
This idea isn’t new in computer science. Alan Kay, a computer scientist with a background in molecular biology, coined the term object-oriented programming (OOP) around the idea of message passing between distinct objects. While most programmers now associate OOP with class inheritance, Kay’s original intent was to explore message passing in computer systems - objects talking to each other the way living cells do with hormones and other signaling molecules.
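As a rough illustration of message passing in Kay’s sense (my own sketch, not Smalltalk or anything Kay wrote), here are objects that can only be influenced by the messages they receive - much like cells responding to signaling molecules rather than reaching into each other’s internals:

```python
class Cell:
    """An object that can only be influenced by sending it a message."""
    def __init__(self, name: str):
        self.name = name

    def receive(self, message: str) -> None:
        # Behavior is dispatched on the message itself, not on callers
        # poking at internal state.
        if message == "divide":
            print(f"{self.name}: dividing")
        elif message == "apoptosis":
            print(f"{self.name}: shutting down gracefully")
        else:
            print(f"{self.name}: ignoring unknown signal {message!r}")

def broadcast(cells: list, message: str) -> None:
    """A crude 'hormone': the same signal delivered to every cell."""
    for cell in cells:
        cell.receive(message)

tissue = [Cell(f"cell-{i}") for i in range(3)]
broadcast(tissue, "divide")
broadcast(tissue, "apoptosis")
```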
The downside of copying biology
If we keep pushing this metaphor, we eventually stumble upon a question: can modeling our computer systems on cellular biology have negative consequences?
For example, in multicellular organisms a well-known failure of cell replication and proliferation is cancer, where one cell line - usually due to damage to the cell’s DNA or to a DNA repair pathway - proliferates uncontrollably and drains resources from the rest of the organism. One hallmark of this in humans is the Warburg effect: cancer cells consume an abnormal amount of glucose compared to their healthy counterparts.
Does a computational equivalent of cancer exist in computer systems?
There have been cases where bugs led to unintended runaway cloud costs and resource hogging, although this doesn’t seem to be a common occurrence. This might be because changes to a computer program’s “DNA” (its code) are intentionally designed and reviewed, unlike the random mutations that can accumulate in a cell’s genome.
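As a purely hypothetical example, here’s the kind of bug that behaves a bit like a tumor: a failed task re-enqueues two copies of itself “to be safe,” so the pending workload doubles on every pass until something external caps it.

```python
from collections import deque

def flaky_task() -> bool:
    return False  # in this toy, every task fails

def buggy_retry(max_ticks: int = 10) -> None:
    queue = deque(["task"])
    for tick in range(max_ticks):
        # Run everything that is currently pending.
        for _ in range(len(queue)):
            queue.popleft()
            if not flaky_task():
                # BUG: re-enqueue two copies "to be safe" ->
                # pending work doubles every tick.
                queue.append("task")
                queue.append("task")
        print(f"tick {tick}: {len(queue)} tasks pending")

buggy_retry()
```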
If we are copying cellular-based systems anyway, can we also mimic more of their repair pathways?
On the other hand, can computer systems tap into the self-healing aspects of biological multicellular organisms? Health checks for containers in container orchestrators are a good start at replicating this functionality. But part of the reason we humans don’t develop cancer every time we acquire a mutation (despite collecting many mutations throughout our lives) is that we have multiple layers of repair processes and immune surveillance that catch issues before they can become problematic.
I’m not sure what this would look like in a computer system, but maybe we’ll see something develop in the next few years.
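That said, if I had to sketch the “multiple layers” idea in code, it might look something like this (all names and thresholds are invented): local retries as a first line of defense, a health-check-style restart as a second, and a global error budget that finally takes the component offline - loosely mirroring DNA repair, apoptosis, and immune surveillance.

```python
import random

def flaky_task() -> str:
    """Stand-in for real work that sometimes fails."""
    if random.random() < 0.7:
        raise RuntimeError("transient fault")
    return "ok"

def restart_component() -> None:
    print("health check failed: restarting component")

def layered_run(task, max_retries: int = 3, error_budget: int = 12) -> str:
    errors = 0
    # Layer 3: a global error budget, loosely like immune surveillance.
    while errors < error_budget:
        # Layer 1: local retries, loosely like DNA mismatch repair.
        for _ in range(max_retries):
            try:
                return task()
            except RuntimeError:
                errors += 1
        # Layer 2: a health-check-style restart, loosely like apoptosis.
        restart_component()
    raise RuntimeError("error budget exhausted; component taken offline")

print(layered_run(flaky_task))
```

Pieces of this already exist in practice - retries, health checks, circuit breakers, error budgets - but they’re rarely framed as a single, layered “immune system.”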
Obviously, there are places where this metaphor falls flat - I’ve made some simplifications above for the sake of comparison. I’ve always wondered whether the similarity between these biological and computer systems is a case of convergent evolution or direct inspiration. It wouldn’t be the first time biology has inspired computational equivalents (like neural networks). Overall, the idea of using biological systems as a model for improving the resiliency of computer systems is intriguing, and I hope to see more developments in this space!