In many ways, the march of scientific progress can be viewed as an unrelenting peeling away of our innate sense of uniqueness. In the last few hundred years, we have gone from regarding ourselves as a divinely-favored species placed squarely at the center of a revolving cosmological light show, to a randomly evolved concatenation of DNA on an unremarkable planet, orbiting an average type of star, in a not terribly noteworthy position of a fairly run-of-the-mill type of spiral galaxy, in a gargantuan, and thoroughly indifferent, universe.
Such a precipitous descent from the privileged to the mundane is humbling, to say the least. But in what might be somewhat skeptically regarded as our innate predilection for the grandiose, many have now enthusiastically elevated our reluctantly-concluded averageness to nothing less than a cosmological statute: the Copernican principle, sometimes called the mediocrity principle, inverts the argument to pronounce that everything about our situation is overwhelmingly, necessarily, common.
Just as grabbing someone off the street at random will result, in nine out of ten cases, in coming face to face with a right-handed person, one should naturally expect that our planet, star, solar system, galaxy and particular region of the universe – taken, as it were, at random out of all the possible places we might have found ourselves – should be, well, pretty much the same as just about everywhere else.
But as fond as we are of postulating principles, modern science relies even more strongly on measurement. The philosophical pendulum might have swung from one end to the other, but now that we have the tools to carefully examine plentiful numbers of planets and planetary systems outside of our solar system, what does the experimental evidence say?
Scott Tremaine, Richard Black Professor of Astrophysics at the Institute for Advanced Study in Princeton, New Jersey, and an internationally renowned expert in both galactic-scale and planetary-scale astronomy, was an ideal person to ask. So, does the mediocrity principle hold sway, or has our solar system veered surprisingly towards the left-handed?
“Over the last two decades, we’ve discovered hundreds, probably thousands, of planetary systems orbiting other stars. We now know, then, that planets are common. This wasn’t at all obvious. In fact, in the late 19th century and the first half of the 20th century, the standard model for planet formation involved a very rare, unusual event: the close passage of two stars, which was likely to have only happened a handful of times in the galaxy. So, in that model, the solar system was this extraordinarily unique and unusual configuration that you practically never saw.
“That model had already been superseded in the 1960s, but we now know for certain that that’s wrong, because if you look at a typical star, even with our limited abilities to detect planetary systems around other stars, probably a significant fraction – 10%, 20%, 30%, depending on how you define it – have planets around them.”
Well, that seems to settle it. After all, if huge numbers of stars come with planets orbiting them, then that seems to give us a definite push towards garden-variety status.
But then, there’s this:
“Most systems we’ve seen don’t look like the solar system. Many of them have giant planets that are much closer to the host star compared to our own giant planets. In fact, many of them have planets that are much closer to the host star even than Mercury, the innermost planet in our own system.
“We don’t have a good theory for how those formed, and we don’t have a good theory for why they’re different from our own planetary system. One of the reasons for the difference, of course, is that it would be quite hard to detect our own planetary system.
“If we were sitting on a planet orbiting a star a few light-years away and were conducting the same planetary surveys with our current technology, we would just barely be able to see Jupiter. And we would be a little puzzled, because Jupiter is in a much more circular orbit than most of the analogs to Jupiter around other stars.
“So, astrophysicists sitting on this hypothetical planet ten light-years away would probably have said, ‘That system looks a little unusual because the planet that we see is a lot more circular than most of the planets that we see.’ ”
Score one for uniqueness, then. Perhaps we’re not so common, after all? It seems that the short answer is that it’s simply too early to tell. Given that modern approaches to exoplanet detection are only about 20 years old, it could well be that the statistics will level out as time, in tandem with detection technology, evolves. But then again, they might not. Deciding between the two possibilities is simply an empirical question.
For Scott, however, there is a vital lesson to be learned here about transcending personal biases that extends far beyond the statistics of one particular solar system, however near and dear it may be to our hearts.
The goal of any scientific enterprise is to understand phenomena in the most general possible terms in accordance with the laws of nature. But in our quest to generalize, we often unthinkingly limit ourselves to our own particular experiences.
“The analogy I sometimes use is: suppose Darwin were developing the theory of evolution and all he’d ever seen was a butterfly. He might get the general idea of the theory of evolution right, but he probably would have put some things in it to explain why evolution could only produce things with thin wings of a certain size that fluttered around from plant to plant. We now know that, in addition to butterflies, there are lions and tigers and fish and birds; and that doesn’t mean that the basic idea of evolution would be different, but you have a much better idea, much better feedback, on how it must actually have worked, from the fact that you see this tremendous variety of systems.”
But, what if you don’t have the good fortune to be presented with such variety in the first place? What if we’re stuck in the dark, like Darwin and just his butterfly? What do we do then? That, says Scott, is the core intellectual challenge of astrophysics, what keeps him coming in to work each day with a smile on his face.
“This is an example of one of the things I enjoy about astrophysics: its detective nature – having to figure out to what extent you’ve been assuming that everything is the same as the things we’ve already seen. You have to have the imagination to ask if there are other things that are allowed by the laws of physics that we haven’t detected which might be quite different; and, if so, should we have seen them already, and are there techniques for detecting them?”
You also have to learn from past mistakes and lost opportunities, and the field of exoplanet astronomy has some lessons to teach us there, too.
“The irony, I think, is not so much that the technology is now good enough to detect exoplanets, it’s that these planets probably could have been detected much earlier, but the planets that you can see by these techniques are those that are quite close to the host star; and all the people who were looking for extra-solar planets in the 60s, 70s and 80s, when the technology was already good enough to detect them, thought that we should be looking for systems like our own solar system, in which case neither of these two methods would have worked.
“Because we’d only seen one planetary system, the community, then, not unusually made the assumption that all the other ones should look the same. As a result, nobody really took seriously the idea that you could use these techniques to deduce planets.
“Again, it comes down to the analogy of Darwin with the butterfly – no matter how well you understand evolution, if all you’ve seen is a butterfly, you’re not going to predict the existence of fish.”
We are all, to some extent, naturally limited by our pasts, unable to leverage first-hand encounters with the entire variety of potential systems. But the successful scientist can overcome the limitations of her past through a conscious act of creativity: harnessing her imagination to envision what might lie beyond her experiences, thoughtfully designing both theories and experiments that can capture the possible.
Such critical thinking skills may or may not be unique in the universe. But if we don’t exploit ours to the fullest, we’ll surely never find out.
For a complete list of Ideas Roadshow videos featuring Scott Tremaine, click here. The eBook Astrophysical Wonders, containing the unabridged Ideas Roadshow conversation with Scott and a range of additional resources, is available on Amazon.
University staff and students may already have full access to all Ideas Roadshow resources on our Academic Portal through an institutional subscription, while select high school teachers and members of the general public have access to our School and Public Library Portals. For more information please contact email@example.com.
It’s hard to imagine an academic institution that is more committed to undirected, unfettered scholarship – knowledge for knowledge’s sake – than the Institute for Advanced Study in Princeton, New Jersey. In an age when many are bemoaning the so-called corporatization of academe, the IAS stands proudly in the footsteps of its founding Director Abraham Flexner, who diligently leveraged the fortune of Louis Bamberger and his sister Caroline Bamberger Fuld to create the first private, independent, American center for pure research and scholarship in 1930.
The IAS is not a university. Its faculty face no teaching requirements and shoulder very limited administration so as to ensure that they are best able to dedicate themselves to following their intellectual inclinations wherever they might lead. Its postdoctoral fellows (“members”) are given complete freedom to interact with whomever they so choose during their extended stay of typically three years. And its students – well, the IAS formally has no students, either undergraduate or graduate. If you want students, go to nearby Princeton University. There are lots of them there.
It’s not a place that works for everyone, of course. Inevitably some think it too stuffy and out of touch. The celebrated theoretical physicist Richard Feynman once turned down a faculty appointment there, citing the pressure of being in an environment where there was “nothing to do other than think all day long.” But you can’t argue with success: the IAS has been home to one of the most enviable collections of the world’s finest minds: logician Kurt Gödel, cultural anthropologist Clifford Geertz, statesman George Kennan, art historian Erwin Panofsky, mathematical fireball John von Neumann, physicist and Manhattan Project leader Robert Oppenheimer; and, of course, Albert Einstein.
Many physicists like to throw the word “universe” around: Secrets of the universe. Mysteries of the universe. Origin of the universe. Even parallel universes.
It is a big subject. Fully befitting, we like to think, our big brains.
Sure, you might make more money than we do. You might drive fancier cars and have a bigger house. But we study the universe: top that.
But look a bit closer and the physicist’s universe tends to be a fairly arid place, riddled with abstract notions of guiding principles and fundamental constants. Spend enough time with a physicist, in fact, and it’s easy to forget that the universe actually has stuff inside it.
Jill Tarter, though, hasn’t forgotten.
The Nobel Prize has always vaguely irritated me. The idea that one’s entire research career might somehow be neatly defined by what a bunch of Swedes happen to find noteworthy has long struck me as arbitrary at best and, in my darker moments, a sad commentary on our need for self-affirmation.
Richard Feynman typically summed it up best when asked if his work on quantum electrodynamics fully merited being awarded the Prize: “I don’t know anything about the Nobel Prize and what’s worth what … I’ve already got the prize. The prize is the pleasure of finding the thing out, the kick of the discovery, the observation that other people use it. Those are the real things. The honors are unreal.”
On the other hand, it’s clear that these sorts of major prizes and awards have their uses. Life without the annual Nobel announcements, for example, would mean that the mainstream media would pay even less attention than usual to scientific discoveries, literary accomplishments, or the enlightened few who are advancing the cause of global peace.
It took a bit of time to get David Politzer to agree to sit down and talk with me. A co-recipient of the 2004 Nobel Prize in Physics, he was all too familiar with people approaching him out of the sheer excitement at being able to tell their friends that they had spent some time chatting with a Nobel Laureate.
That wasn’t my problem. By a curious twist of fate, over the years I had found myself spending enough time in the company of Nobel Laureates to know that it could often be a highly overrated experience. Indeed, other than a few notable exceptions (such as the almost overwhelmingly genial Tony Leggett), knowing that someone had a Nobel Prize invariably made me look searchingly towards the nearest exit.
Still, it was David’s Nobel that first brought him to my attention, as it were. Years ago an astute colleague urged me to read The Dilemma of Attribution, David’s clever, thoughtful, and humble Nobel lecture that detailed the communal nature of frontline scientific inquiry: using his own “Nobel-winning” work on asymptotic freedom to explicitly demonstrate how science builds shining edifices of our understanding by rigorously compiling insights by different researchers, one upon the other.
Why study history?
It’s a deceptively penetrating question, and asking it runs a serious risk of being subjected to a barrage of knee-jerk homilies, from the importance of a general cultural understanding to a basic appreciation of different ways of doing things. More often than not, however, you will hear talk of the importance of applying lessons from history to better navigate present challenges, typically invoking George Santayana’s famous quote: “Those who cannot remember the past are condemned to repeat it.”
For David Armitage, Lloyd C. Blankfein Professor of History at Harvard University, the reality is rather murkier: it’s not so much that studying history will enable us to avoid committing the same mistakes as our predecessors, but that considerable effort needs to be made to get a clear sense of what these predecessors thought they were doing in the first place.
“How are we to put ideas and other cultural forms into the past in such a way that they become comprehensible in past terms, but then can also be rendered comprehensible in the present?
“Whether it’s a Shakespeare play, an epic poem by John Milton, a work of political thought by Thomas Hobbes or John Locke, or performances of 16th-century music – we can’t hear with 16th-century ears, but we can approximate to 16th-century performance practices in singing: how does one bridge that gap to recover some kind of authenticity, to understand how the original creators, or in the case of music, performers, understood what they were doing?”
People like to talk about transformative social change, but almost everyone gets it wrong. When I was a teenager, for example, there was much talk about the pressing challenges of finding meaningful ways to fill the burgeoning amount of free time that our rapidly improving technology would inevitably bring. These days, such talk produces the same sort of wistful smile as the flying cars that futurologists confidently told us that we’d be piloting, Jetson-like, to our personal helipads.
Meanwhile, gay marriage is now largely accepted throughout most Western countries, and recreational marijuana use is legal in no fewer than eight American states. Maybe somebody saw that coming, but nobody I know did (myself most definitely included).
When it comes to mental illness, on the other hand, the record is much more mixed. As neuroscience strides ahead at stunning speed and psychologists rush to embrace the likes of fMRI and other real-time imaging tools, it’s difficult to say whether or not, when all is said and done, we live in a more or less tolerant society than when I was in high school.
Blame it on my physics background. When I first arranged to sit down with world-renowned psychologist Carol Dweck, I was pretty skeptical of the prospect of any genuine insights coming out of the experience. After all, I generally make a habit of avoiding the self-help section of a bookstore with a disdainful smirk, and Carol’s 2006 bestselling work, Mindset: How You Can Fulfill Your Potential, seemed to me as good a poster child of the pop psychology genre as one might find. Still, I told myself, as the Lewis and Virginia Eaton Professor of Psychology at Stanford University, she was a highly respected academic with a lifetime’s worth of accomplishment. Of course, I couldn’t help immediately reflexively adding, that was in psychology. Which is all to say that, awards, publications and an endowed chair at Stanford notwithstanding, expectations were still emphatically low.
Sometimes it’s good to take a little distance.
As we begin the week that formally ushers in an era in American politics that few saw coming and many regard as little short of catastrophic, we turn to the eminent intellectual historian Quentin Skinner, Barber Beaumont Professor of Humanities at Queen Mary University of London, for some much-needed perspective.
Not, as it happens, to try to understand how we got here, for Quentin will be the first to tell you that he is not that sort of historian.
“I’m very interested in the history of moral and political philosophy, but I’m not interested in the history because it’s what we talk about now.
“I believe that the sensibility of the historian is to try to study the past on its own terms, insofar as we can manage. If we do that, then what we find is that in our modern culture there are many paths not taken.
“We tend to write history as the history of the winners. We write the history of wars as the history of the winners, but we also write the history of our culture as the history of the winners. But did the winners always deserve to win?”
Last week’s post, Reasons for Optimism, highlighted the work of famed primatologist Frans de Waal and our ever-growing, albeit gradual, acknowledgement of the deep similarities between humans and other animals across a wide range of domains, from the intellectual to the ethical.
Language, however, seems to be different. While it’s clear that other animals have developed numerous ways to effectively communicate with each other, it’s equally clear that the gulf between animal and human communication is overwhelmingly vast. In other words, however much we narrow the gap between humans and our fellow creatures, to speak of a chimpanzee or dolphin Shakespeare seems to stretch credulity to the point of absurdity. Little wonder, then, that throughout the ages many have regarded language as the single most defining characteristic of being human.
But what, you might ask, is language anyway? This might seem an obvious question, but a few moments of reflection demonstrate that it’s hardly so simple. While most languages exist in written form, not all do (or did). And while language obviously allows us to communicate basic information to each other (The food is over there), by limiting any definition to such a rudimentary level we are running a serious risk of blurring the very distinction that gives language its unique evocative power.