Previously published in December 2014
(Image: The Lagoon Nebula)
In a world where scientific theories often sound bizarre and counter to everyday intuition, and where a wide variety of nonsense aspires to be recognized as “scientific,” it’s important to be able to separate science from non-science—what philosophers call “the demarcation problem.” Karl Popper famously suggested the criterion of “falsifiability”—a theory is scientific if it makes clear predictions that can be unambiguously falsified.
It’s a well-meaning idea, but far from the complete story. Popper was concerned with theories such as Freudian psychoanalysis and Marxist economics, which he considered non-scientific. No matter what actually happens to people or societies, Popper claimed, theories like these will always be able to tell a story in which the data are compatible with the theoretical framework. He contrasted this with Einstein’s relativity, which made specific quantitative predictions ahead of time. (One prediction of general relativity was that the universe should be expanding or contracting, leading Einstein to modify the theory because he thought the universe was actually static. So even in this example the falsifiability criterion is not as unambiguous as it seems.)
Modern physics stretches into realms far removed from everyday experience, and sometimes the connection to experiment becomes tenuous at best. String theory and other approaches to quantum gravity involve phenomena that are likely to manifest themselves only at energies enormously higher than anything we have access to here on Earth. The cosmological multiverse and the many-worlds interpretation of quantum mechanics posit other realms that are impossible for us to access directly. Some scientists, leaning on Popper, have suggested that these theories are non-scientific because they are not falsifiable.
The truth is the opposite. Whether or not we can observe them directly, the entities involved in these theories are either real or they are not. Refusing to contemplate their possible existence on the grounds of some a priori principle, even though they might play a crucial role in how the world works, is as non-scientific as it gets.
The falsifiability criterion gestures toward something true and important about science, but it is a blunt instrument in a situation that calls for subtlety and precision. It is better to emphasize two more central features of good scientific theories: they are definite, and they are empirical. By “definite” we simply mean that they say something clear and unambiguous about how reality functions. String theory says that, in certain regions of parameter space, ordinary particles behave as loops or segments of one-dimensional strings. The relevant parameter space might be inaccessible to us, but it is part of the theory that cannot be avoided. In the cosmological multiverse, regions unlike our own are unambiguously there, even if we can’t reach them. This is what distinguishes these theories from the approaches Popper was trying to classify as non-scientific. (Popper himself understood that theories should be falsifiable “in principle,” but that modifier is often forgotten in contemporary discussions.)
It’s the “empirical” criterion that requires some care. At face value it might be mistaken for “makes falsifiable predictions.” But in the real world, the interplay between theory and experiment isn’t so cut and dried. A scientific theory is ultimately judged by its ability to account for the data—but the steps along the way to that accounting can be quite indirect.
Consider the multiverse. It is often invoked as a potential solution to some of the fine-tuning problems of contemporary cosmology. For example, we believe there is a small but nonzero vacuum energy inherent in empty space itself. This is the leading theory to explain the observed acceleration of the universe, for which the 2011 Nobel Prize was awarded. The problem for theorists is not that vacuum energy is hard to explain; it’s that the predicted value is enormously larger than what we observe.
If the universe we see around us is the only one there is, the vacuum energy is a unique constant of nature, and we are faced with the problem of explaining it. If, on the other hand, we live in a multiverse, the vacuum energy could be completely different in different regions, and an explanation suggests itself immediately: in regions where the vacuum energy is much larger, conditions are inhospitable to the existence of life. There is therefore a selection effect, and we should predict a small value of the vacuum energy. Indeed, using this precise reasoning, Steven Weinberg did predict the value of the vacuum energy, long before the acceleration of the universe was discovered.
We can’t (as far as we know) observe other parts of the multiverse directly. But their existence has a dramatic effect on how we account for the data in the part of the multiverse we do observe. It’s in that sense that the success or failure of the idea is ultimately empirical: its virtue is not that it’s a neat idea or fulfills some nebulous principle of reasoning, it’s that it helps us account for the data. Even if we will never visit those other universes.
Science is not merely armchair theorizing; it’s about explaining the world we see, developing models that fit the data. But fitting models to data is a complex and multifaceted process, involving a give-and-take between theory and experiment, as well as the gradual development of theoretical understanding in its own right. In complicated situations, fortune-cookie-sized mottos like “theories should be falsifiable” are no substitute for careful thinking about how science works. Fortunately, science marches on, largely heedless of amateur philosophizing. If string theory and multiverse theories help us understand the world, they will grow in acceptance. If they prove ultimately too nebulous, or better theories come along, they will be discarded. The process might be messy, but nature is the ultimate guide.
This article originally appeared on edge.org and was written by Sean Carroll, Theoretical Physicist, Caltech; author of The Particle at the End of the Universe and From Eternity to Here: The Quest for the Ultimate Theory of Time.
Further reading suggestion: http://theundisciplined.com/2014/11/16/natural-philosophy-falsifiability-and-pseudo-science/