Life is really weird. From a physicist’s point of view, it’s even stranger. Life is unlike any other phenomenon in physics. Stars, electrons, and black holes are all amazing in their own way. But only life invents, and the first thing life invents is itself.
Life is creative in ways no other physical system can be, and its unique use of information may be the key to understanding what sets it apart from other physical systems. Now, thanks to a new grant my colleagues and I have received from the Templeton Foundation, we will explore exactly how information makes life work its magic. I’m very excited about the project, and this essay is my first report from the frontier as we delve into terra incognita.
How Information Defines Order
If information is the key to understanding life, the first question is: what kind of information does life use? Answering this question is at the heart of our research project. To understand why it matters, you have to go back to the dawn of the information age and its first pioneer, Claude Shannon, a researcher at Bell Laboratories in the mid-twentieth century. In 1948, Shannon published an article titled "A Mathematical Theory of Communication." It was a game-changer that paved the way for the digital revolution. Shannon's first step was to define what he meant by information. To do this, he built on earlier work that approached the question in terms of probabilities: how surprising an event is.
Consider watching a string of characters appear one after another on your computer screen. If the string consisted of nothing but the digit 1 repeated over and over, the appearance of yet another 1 would not be very surprising. But if a 7 suddenly appeared, you would sit up and take notice. Something new had appeared on the channel, and for Shannon, that meant information had been transmitted. For him, there is a direct link between the probability of an event and the information it carries. Working from this link between information and events, treated as symbols in a sequence, Shannon defined what he called syntactic information. It was all about syntax, he reasoned: the correct ordering of characters, words, and sentences. From there, Shannon developed a surprisingly powerful theory of syntactic information that became the basis of modern computing.
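Shannon's link between probability and information can be made precise: the "surprisal" of a symbol is the negative logarithm of its probability, so rare symbols carry more bits than common ones. Here is a minimal sketch in Python (the probabilities for the 1s-and-7s stream are made up for illustration):

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon's self-information: -log2(p). Improbable symbols carry more bits."""
    return -math.log2(p)

# A hypothetical stream that is almost always '1' but occasionally '7'.
probabilities = {"1": 0.99, "7": 0.01}

# Yet another '1' is barely surprising; a rare '7' carries far more information.
print(round(surprisal_bits(probabilities["1"]), 4))  # 0.0145 bits
print(round(surprisal_bits(probabilities["7"]), 4))  # 6.6439 bits
```

A perfectly predictable stream (probability 1) carries zero bits per symbol, which is why the endless string of 1s tells you almost nothing new.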
Sure, but what does that mean?
What Shannon's definition didn't address, however, was meaning. Shannon was clear that this was never his intention, but meaning is central to the common-sense understanding of what information is. Information conveys something worth knowing to someone who wants to know it. In other words, information carries semantic content, and it was precisely the role of semantic information that Shannon's syntactic definition deliberately ignored.
This is a huge potential problem when it comes to trying to understand the unique role that information plays in life. The reason physicists view information as the key to understanding life as a physical system is simple: life is the only physical system that actively uses it. Of course, I can describe the network of thermonuclear reactions in a star in terms of information. But I could also choose not to describe it that way. After all, the star isn’t using the information in a way that makes such a description necessary.
Life, however, is different. Life must be described in terms of its use of information, because its ability to create and sustain itself is only possible because it processes information.
But syntactic information alone is not what matters for life; the meaning is also important. The information that life uses always has a valence, an importance for the continuation of the organism's organization. It gives life its agency and autonomy, allowing it to sense and respond to its environment. Even more fundamental is the essential division it establishes between an organism and its environment: it separates "me" from "not me."
Our goal is therefore to understand life's unique use of semantic information. Of course, "meaning" is a slippery philosophical idea, and many books have been written on the subject. But our Templeton project, building on a brilliant 2018 paper by Artemy Kolchinsky and David Wolpert, aims to find a workable, operational, mathematical definition of semantic information that researchers can use to unlock the physics of life. (Kolchinsky, by the way, is part of the team.) If the project is successful, we may be able to understand how a collection of chemicals can eventually form a cell, or how a group of individuals can form a complex technological society.
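The operational spirit of this approach can be caricatured in a toy simulation. The idea, very roughly, is to compare a system's viability when its correlations with the environment are intact against its viability when those correlations are deliberately scrambled: the drop is a proxy for how much of the information was meaningful. The sketch below is a hypothetical toy of my own construction, not the actual formalism of the Kolchinsky–Wolpert paper (which is framed in terms of mutual information and counterfactual interventions):

```python
import random

def viability(agent_readings, env_states):
    """Toy viability: the fraction of time steps the agent 'survives',
    where surviving a step requires its internal reading to match the
    environment's actual state (e.g., fleeing when a threat is present)."""
    survived = sum(1 for a, e in zip(agent_readings, env_states) if a == e)
    return survived / len(env_states)

random.seed(0)
env = [random.choice([0, 1]) for _ in range(1000)]

# Intact correlation: the agent's internal state perfectly tracks the environment.
correlated = list(env)

# Intervention: scramble the correlation while preserving the marginal statistics.
scrambled = list(env)
random.shuffle(scrambled)

v_intact = viability(correlated, env)    # 1.0 with full correlation
v_broken = viability(scrambled, env)     # near chance (about 0.5)

# The viability drop is a crude proxy for how much of the agent's
# information about the environment was semantically meaningful.
print(v_intact - v_broken)
```

The point of the toy is the intervention: the scrambled agent holds exactly the same symbols, with the same statistics, yet dies more often, because what it lost was not syntax but meaning.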
It’s a very exciting possibility, and we’re just getting started. Stay tuned to see how it goes.