How does science work?
Well, that’s easy, right? We start off by collecting information about the world around us and then we try to make some sense of it: we look for patterns, search for a more general understanding. In time, we develop sufficient awareness of these patterns that we begin making predictions about what else might be out there: explicitly formulating hypotheses of what we would expect to see under specific, controlled scenarios. Then we go ahead and rigorously test our hypotheses by explicitly creating these particular conditions, coolly assessing whether or not what has been predicted does, in fact, occur.
If it doesn’t, or at least doesn’t with any sense of regularity or precisely in the way that we had envisioned, we’ll be forced to accept that at least one of our original hypotheses was incorrect and head back to the drawing board to modify things in an attempt to develop a more accurate level of understanding. Meanwhile, if all of our predictions do come true, then we’ll find ourselves with increasing confidence in our understanding. At some point, we’ll likely start calling it something more grandiose, like a ‘theory’; and if it keeps working like clockwork for everything in its applicable domain that we can imagine, we will eventually be tempted to call it a ‘law’.
Such is, in a nutshell, what most of us mean by “the scientific method.” It is famously objective, logical and eminently reliable; and the fruits of its success, both pure and applied, are the single most obvious factor in distinguishing the varying levels of progress between different human societies throughout history.
But what happens when the experimental arena becomes less and less accessible? In the world of high-energy physics, where hugely expensive laboratory facilities take decades to construct, what can we do in the meantime towards uncovering nature’s secrets? And how might we conceivably make progress and build upon our knowledge when no further experiments are in sight?
Can we just sit back and invent any hypothesis we want, secure in the knowledge that, far away from any experimental arbiter, there is no way of distinguishing, even in principle, between different theoretical possibilities? Does science at this point simply become science fiction?
To Nima Arkani-Hamed, faculty member of the Institute for Advanced Study and one of the world’s foremost theoretical particle physicists, that sort of talk is not only wrong, it demonstrates a profound misunderstanding of all that fundamental physics has accomplished since Galileo brought us into the modern era more than four centuries ago.
Arkani-Hamed emphasizes that our understanding of the physical world is based on two phenomenally successful general principles of 20th century physics – relativity and quantum mechanics – which, when put together “almost completely dictate what the world around us can possibly look like.
“You could easily imagine a world without relativity and you could easily imagine a world without quantum mechanics. Either way things would be tremendously less constrained. It’s the existence of both of them that makes things so complicated.
“In fact, if I were God – or a sub-God, given the principles of relativity and quantum mechanics by the actual God, who told me, ‘Now go, build a world’ – I’d say, ‘Sorry, can’t do it. This is just impossible.’
“They seem almost completely incompatible with each other.”
And it is precisely this near-incompatibility, this overpowering rigidity resulting from the necessity of finding common ground between these two general principles, that imposes a huge constraint on any successful underlying theoretical framework that we might conceivably imagine.
Which means that even without one single experiment we have almost no wiggle room to come up with truly innovative approaches, a state of affairs that almost never gets communicated to the non-scientist.
“I think that that’s the thing which isn’t appreciated by the general public: the rigidity. Instead, there is some sense that theoretical physicists, unencumbered by data from experiment, are just out there inventing leprechauns and fairies and nymphs and dryads around every corner, and every crazy idea you have is something that you can put out there. Of course we all agree that experiment decides, but until it does, anything goes: leprechauns and nymphs and dryads are all on the same footing.
“But it just is not like that. The incredible rigidity that we have in our framework makes it almost impossible to come up with a new idea. It’s very hard to modify things in any way without ruining everything.”
Appreciating the importance of these constraints and why today’s theorists have such overwhelming confidence in these two guiding principles not only explains how physicists operate but also gives deeper insight into misunderstandings at the core of various public controversies, like the case of the erroneous faster-than-light neutrinos that suddenly burst onto the scene in 2011.
One of the things that infuriated Arkani-Hamed the most about the whole incident was how it demonstrated a near-total lack of public understanding of how contemporary fundamental physics is done.
“How was it reported in the popular press? Here is this incredible thing, we were told, you can go faster than light – and those physicists who were skeptical of the results were largely described as people refusing to challenge the orthodoxy of big old Al, looking down on us, wagging his finger and telling us, ‘Don’t you dare go faster than light!’”
But that, Arkani-Hamed tells us, is laughably far from the truth. Indeed, precisely because of the pre-eminent role that relativity plays in our understanding of the world, many theoretical physicists had spent an enormous amount of time and trouble well beforehand explicitly examining whether it would be somehow possible to violate its effects and under what circumstances. And what they had found, after painstaking effort, was in direct contrast to what these experiments were implying.
“The reason why all of us were sure this result had to be wrong was not because we had never entertained the possibility that Einstein might be wrong. Exactly the opposite! That’s what’s so frustrating about it. We had entertained it so well, we had thought about it so much, so responsibly and in such detail, that we knew it was impossible to have an effect as humongous as they found. One part in a hundred thousand sounds like a small effect, but all our previous work on violations of special relativity showed results that were much more stringent: one part in ten to the ten, one part in ten to the fifteen, one part in ten to the twenty – just way off from these humongous-size effects that they were finding. So that was a big source of frustration. We knew it had to be wrong.
“People assumed that we were convinced it was wrong because we didn’t want to question these underlying principles. Well, that’s true in the sense that we know that you give up a lot if you do it. But we’re not ideologues: we prepare for the possibility and study matters in so much detail that we know it cannot possibly be right, compatible with all the other experimental results we’ve had all this time.
“I think that it’s important for people to understand essential aspects of how this sort of science is actually done. No one is out to suppress rebellious ideas. Quite the contrary: if you can find some even moderately rebellious ideas that have even a modicum of truth associated with them, that’s the way you make your name in the field.
“But again, it needs to be emphasized that we’re in this very, very tight straitjacket. We’re not going to, at the drop of a hat, destroy this entire incredible structure that we built up over four centuries which has served us so well unless there’s a really good reason for it.
“Almost always, experimental or theoretical challenges to the structure are bound to fail. You shouldn’t be surprised that they fail. And people shouldn’t take skepticism as evidence of turf protection. It’s really evidence of the great fact that we have this entire castle that we built over centuries that works so well.”
So what of the future? How can someone like Nima Arkani-Hamed get out of the straitjacket and unlock deeper secrets of the universe? By trying to find a new light in which to examine these very same underlying principles:
“I actually think one of the big things that should happen in the 21st century, the next really big things we have to understand in fundamental physics, in at least our part of theoretical physics, is that we have to understand where space-time comes from in a more fundamental sense.
“And I think one of the things that we will hopefully understand, when we know more how this works, is why it is that these two big ideas of relativity and quantum mechanics seem to both fight each other so much, on the one hand, which is why the world is so constrained. But on the other hand, they also go so wonderfully together, in the sense that in a world that did not have quantum mechanics, it would be easy to imagine modifying relativity somehow. Similarly, in a world without relativity, it’s really easy to modify quantum mechanics.
“Each one buttresses the other in a very strange yet really remarkable way. It’s so hard to make consistent things, and we come up with these tiny menus and it’s incredibly fertile and powerful but that’s all we can do. But why these two principles fit together in the exact way that they do is one of the greatest mysteries.”
To today’s leading theorists, the two founding principles of 20th century physics are simultaneously a constraint and an inspiration: a straitjacket and a guidepost for future discoveries.
The future awaits.
Nima Arkani-Hamed’s two-part Ideas Roadshow conversation called The Power of Principles is part of Ideas Roadshow’s award-winning database.
The eBook called The Power of Principles which contains the unabridged Ideas Roadshow conversation with Nima is available on Amazon.
Faculty and students may already have full access to all Ideas Roadshow resources on our Academic Portal through an institutional subscription, while select high school teachers and members of the general public may have access to our School and Public Library Portals. For more information about institutional subscriptions, please contact email@example.com.
The so-called “mystery of Stradivari” has long confused me. How is it possible, I wondered, that with all of the tools of modern technology, and all of our increased understanding of the way the world works, we can’t seem to find a way to make instruments that can at least equal those of some guy working in Cremona, Italy, over 300 years ago?
Not surprisingly, I’m hardly the first person to find this perplexing: it is a long-standing, indeed quasi-iconic, mystery throughout the classical music world.
It confused Joseph Curtin too. But Curtin is hardly your run-of-the-mill ruminator. He is one of the most recognized violin-makers working today, a MacArthur Fellowship-winning artisan who combines premier craftsmanship with an unabashed enthusiasm for acoustics and rigorous scientific methodology.
He is also a delightfully frank, open and approachable fellow.
It’s hard to imagine an academic institution that is more committed to undirected, unfettered scholarship – knowledge for knowledge’s sake – than the Institute for Advanced Study in Princeton, New Jersey. In an age when many are bemoaning the so-called corporatization of academe, the IAS stands proudly in the footsteps of its founding Director Abraham Flexner, who diligently leveraged the fortune of Louis Bamberger and his sister Caroline Bamberger Fuld to create the first private, independent, American center for pure research and scholarship in 1930.
The IAS is not a university. Its faculty face no teaching requirements and shoulder very limited administrative duties so as to ensure that they are best able to dedicate themselves to following their intellectual inclinations wherever they might lead. Its postdoctoral fellows (“members”) are given complete freedom to interact with whomever they so choose during their extended stay of typically three years. And its students – well, the IAS formally has no students, either undergraduate or graduate. If you want students, go to nearby Princeton University. There are lots of them there.
It’s not a place that works for everyone, of course. Inevitably some think it too stuffy and out of touch. The celebrated theoretical physicist Richard Feynman once turned down a faculty appointment there, citing the pressure of being in an environment where there was “nothing to do other than think all day long.” But you can’t argue with success: the IAS has been home to one of the most enviable collections of the world’s finest minds: logician Kurt Gödel, cultural anthropologist Clifford Geertz, statesman George Kennan, art historian Erwin Panofsky, mathematical fireball John von Neumann, physicist and Manhattan Project leader Robert Oppenheimer; and, of course, Albert Einstein.
In many ways, the march of scientific progress can be viewed as an unrelenting peeling away of our innate sense of uniqueness. In the last few hundred years, we have gone from regarding ourselves as a divinely favored species placed squarely at the center of a revolving cosmological light show, to a randomly evolved concatenation of DNA on an unremarkable planet, orbiting an average type of star, in a not terribly noteworthy position of a fairly run-of-the-mill type of spiral galaxy, in a gargantuan, and thoroughly indifferent, universe.
Such a precipitous descent from the privileged to the mundane is humbling, to say the least, but in what might be somewhat skeptically regarded as our innate predilection for the grandiose, many have now enthusiastically elevated our reluctantly concluded averageness to nothing less than a cosmological statute: the Copernican principle, sometimes called the mediocrity principle, inverts the argument to pronounce that everything about our situation is overwhelmingly, necessarily, common.
Just as grabbing someone off the street at random will, in nine out of ten cases, bring you face to face with a right-handed person, one should naturally expect that our planet, star, solar system, galaxy and particular region of the universe – taken, as it were, at random out of all the possible places we might have found ourselves – should be, well, pretty much the same as just about everywhere else.
But as fond as we are of postulating principles, modern science relies even more strongly on measurement. The philosophical pendulum might have swung from one end to the other, but now that we have the tools to carefully examine plentiful numbers of planets and planetary systems outside of our solar system, what does the experimental evidence say?
Many physicists like to throw the word “universe” around: Secrets of the universe. Mysteries of the universe. Origin of the universe. Even parallel universes.
It is a big subject. Fully befitting, we like to think, our big brains.
Sure, you might make more money than we do. You might drive fancier cars and have a bigger house. But we study the universe: top that.
But look a bit closer and the physicist’s universe tends to be a fairly arid place, riddled with abstract notions of guiding principles and fundamental constants. Spend enough time with a physicist, in fact, and it’s easy to forget that the universe actually has stuff inside it.
Jill Tarter, though, hasn’t forgotten.
The Nobel Prize has always vaguely irritated me. The idea that one’s entire research career might somehow be neatly defined by what a bunch of Swedes happen to find noteworthy has long struck me as arbitrary at best and, in my darker moments, a sad commentary on our need for self-affirmation.
Richard Feynman typically summed it up best when asked if his work on quantum electrodynamics fully merited being awarded the Prize: “I don’t know anything about the Nobel Prize and what’s worth what … I’ve already got the prize. The prize is the pleasure of finding the thing out, the kick of the discovery, the observation that other people use it. Those are the real things. The honors are unreal.”
On the other hand, it’s clear that these sorts of major prizes and awards have their uses. Life without the annual Nobel announcements, for example, would mean that the mainstream media would pay even less attention than usual to scientific discoveries, literary accomplishments, or the enlightened few who are advancing the cause of global peace.
It took a bit of time to get David Politzer to agree to sit down and talk with me. A co-recipient of the 2004 Nobel Prize in Physics, he was all too familiar with people approaching him out of the sheer excitement at being able to tell their friends that they had spent some time chatting with a Nobel Laureate.
That wasn’t my problem. By a curious twist of fate, over the years I had found myself spending enough time in the company of Nobel Laureates to know that it could often be a highly overrated experience. Indeed, other than a few notable exceptions (such as the almost overwhelmingly genial Tony Leggett), knowing that someone had a Nobel Prize invariably made me look searchingly towards the nearest exit.
Still, it was David’s Nobel that first brought him to my attention, as it were. Years ago an astute colleague urged me to read The Dilemma of Attribution, David’s clever, thoughtful, and humble Nobel lecture that detailed the communal nature of frontline scientific inquiry: using his own “Nobel-winning” work on asymptotic freedom to explicitly demonstrate how science builds shining edifices of our understanding by rigorously compiling insights by different researchers, one upon the other.
Why study history?
It’s a deceptively penetrating question, and asking it runs a serious risk of being subjected to a barrage of knee-jerk homilies, from the importance of a general cultural understanding to a basic appreciation of different ways of doing things. More often than not, however, you will hear talk of the importance of applying lessons from history to better navigate present challenges, typically invoking George Santayana’s famous quote “Those who cannot remember the past are condemned to repeat it.”
For David Armitage, Lloyd C. Blankfein Professor of History at Harvard University, the reality is rather murkier: it’s not so much that studying history will enable us to avoid committing the same mistakes as our predecessors, but that considerable effort needs to be made to get a clear sense of what these predecessors thought they were doing in the first place.
“How are we to put ideas and other cultural forms into the past in such a way that they become comprehensible in past terms, but then can also be rendered comprehensible in the present?
“Whether it’s a Shakespeare play, an epic poem by John Milton, a work of political thought by Thomas Hobbes or John Locke, or performances of 16th-century music – we can’t hear with 16th-century ears, but we can approximate to 16th-century performance practices in singing: how does one bridge that gap to recover some kind of authenticity, to understand how the original creators, or in the case of music, performers, understood what they were doing?”
People like to talk about transformative social change, but almost everyone gets it wrong. When I was a teenager, for example, there was much talk about the pressing challenges of finding meaningful ways to fill the burgeoning amount of free time that our rapidly improving technology would inevitably bring. These days, such talk produces the same sort of wistful smile as the flying cars that futurologists confidently told us that we’d be piloting, Jetson-like, to our personal helipods.
Meanwhile, gay marriage is now largely accepted throughout most Western countries, and recreational marijuana use is legal in no fewer than eight American states. Maybe somebody saw that coming, but nobody I know did (myself most definitely included).
When it comes to mental illness, on the other hand, the record is much more mixed. As neuroscience strides ahead at stunning speed and psychologists rush to embrace the likes of fMRI and other real-time imaging tools, it’s difficult to say whether or not, when all is said and done, we live in a more or less tolerant society than when I was in high school.
Blame it on my physics background. When I first arranged to sit down with world-renowned psychologist Carol Dweck, I was pretty skeptical of the prospect of any genuine insights coming out of the experience. After all, I generally make a habit of avoiding the self-help section of a bookstore with a disdainful smirk, and Carol’s 2006 bestselling work, Mindset: How You Can Fulfill Your Potential, seemed to me as good a poster boy for the pop psychology genre as one might find. Still, I told myself, as the Lewis and Virginia Eaton Professor of Psychology at Stanford University, she was a highly respected academic with a lifetime’s worth of accomplishment. Of course, I couldn’t help immediately reflexively adding, that was in psychology. Which is all to say that, awards, publications and an endowed chair at Stanford notwithstanding, expectations were still emphatically low.