My latest Mind and Matter column is on the origin of vision in animals and a vindication for Darwin:
Until recently it was possible, even plausible, to think that the faculty of vision had originated several times during the course of animal evolution. New research suggests not: vision arose only once and earlier than expected, before 700 million years ago.
Davide Pisani and colleagues from the National University of Ireland have traced the ancestry of the three kinds of "opsin" protein that animals use, in combination with a pigment, to detect light. By comparing the genome sequences of sponges, jellyfish and other animals, they tracked the origin of opsins back to the common ancestor of all animals except sponges, but including a flat, shapeless thing called a placozoan. Some time after 755 million years ago, the common ancestor of ourselves and the placozoa duplicated a gene and changed one of the copies into a recognizable opsin.
I have an article in this week's Spectator about ash trees and exotic pests:
I'm pessimistic about the ash trees. It seems unlikely that a fungus that killed 90 per cent of Denmark's ash trees and spreads by air will not be devastating here, too. There is a glimmer of hope in the fact that ash trees, unlike elms, reproduce sexually, so they are not clones uniformly vulnerable to the pathogen. But it's only a glimmer: tree parasites, from chestnut blight to pine beauty moth, have a habit of sweeping through species pretty rampantly, because trees are so long-lived they cannot evolve resistance in time.
The Forestry Commission's apologists are pleading 'cuts' as an excuse for its failure to act in a more timely way to get ahead of the threat, but as a woodland owner I am not convinced. An organisation that has the time and the budget to pore over my every felling or planting application in triplicate and come back with fussy and bossy comments could surely spare a smidgen of interest in looming threats from continental fungi that have been spreading out from Poland for 20 years. The commission was warned of the problem four years ago.
My latest Mind and Matter column at the Wall Street Journal is on wolves and "mesopredators":
The return of the wolf is one of the unexpected ecological bonuses of the modern era. So numerous are wolves that this fall Wisconsin and Wyoming have joined Idaho and Montana in opening wolf-hunting seasons for the first time in years. Minnesota follows suit next month; Michigan may do so next year. The reintroduced wolves of Yellowstone National Park have expanded to meet the expanding packs of Canada and northern Montana.
The same is happening in Europe. Wolf populations are rising in Spain, Italy and Eastern Europe, while in recent years wolves have recolonized France, Germany, Sweden and Norway, and have even been seen in Belgium and the Netherlands. Nor are wolves the only "apex predators" to boom in this way. In the U.S., bears and mountain lions are spreading, to joggers' dismay. Coyotes are reappearing even within cities like Chicago and Denver.
My latest Mind and Matter column in the Wall Street Journal:
In 1965, the computer expert Gordon Moore published his famous little graph showing that the number of "components per integrated function" on a silicon chip (a measure of computing power) seemed to be doubling every year and a half. He had only five data points, but Moore's Law has settled into an almost iron rule of innovation. Why is it so regular?
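To see how such a fixed doubling period compounds, here is a minimal sketch of the arithmetic. The year-and-a-half doubling time is the figure quoted above; the starting count of 64 components in 1965 is a made-up round number for illustration, not Moore's actual data.

```python
# Moore's Law as simple exponential growth: N(t) = N0 * 2 ** (t / doubling_period).
# The 1.5-year doubling period is the figure cited in the column; the starting
# count of 64 components in 1965 is a hypothetical round number for illustration.
def components(years_elapsed, start=64, doubling_years=1.5):
    return start * 2 ** (years_elapsed / doubling_years)

for year in (1965, 1970, 1975, 1980):
    print(year, round(components(year - 1965)))
# Doubling every 18 months multiplies the count roughly a hundredfold per decade.
```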
This week's award of the Nobel Prize for medicine to John Gurdon and Shinya Yamanaka effectively recognizes the science of epigenetics. Dr. Gurdon showed that almost any cell (in a frog) contains all the genetic information needed to make an adult. What makes the cell develop a certain way is a pattern of "epigenetic" modifications to the DNA, specific to each tissue, turning genes on and off. Dr. Yamanaka showed that if you remove those epigenetic modifications (in a mouse), you can reprogram an adult cell back to an embryonic state.
Yet to most people the word "epigenetics" has come to mean something quite different: the inheritance of nongenetic features acquired by a parent. Most scientists now think the latter effect is rare, unimportant and hugely overhyped.
There are several mechanisms of modifying DNA without altering the genetic code itself. The key point is that these modifications survive the division of cells.
My latest Mind and Matter column in the Wall Street Journal is on genetically modified crops:
Generally, technologies are judged on their net benefits, not on the claim that they are harmless: The good effects of, say, the automobile and aspirin outweigh their dangers. Today, arguably, adopting certain new technologies is harder not just because of a policy of precaution but because of a bias in much of the media against reporting the benefits.
Shale gas is one example, genetically modified food another, where the good news is deemed less newsworthy than the bad. A recent French study claimed that both pesticides and GM corn fed to cancer-susceptible strains of rats produced an increase in tumors. The study has come in for withering criticism from mainstream scientists for its opaque data, small samples, unsatisfactory experimental design and unconventional statistical analysis, yet it has still gained headlines world-wide. (In published responses, the authors have stood by their results.)
My latest Mind and Matter column in the Wall Street Journal finds that just as liberals and conservatives have predictable personalities, so do libertarians:
An individual's personality shapes his or her political ideology at least as much as circumstances, background and influences. That is the gist of a recent strand of psychological research identified especially with the work of Jonathan Haidt. The baffling (to liberals) fact that a large minority of working-class white people vote for conservative candidates is explained by psychological dispositions that override their narrow economic interests.
My latest Mind and Matter column in the Wall Street Journal is about the retreat of Arctic Sea Ice and what it means:
This week probably saw the Arctic Ocean's sea ice reach its minimum extent for the year and begin to expand again, as it usually does in mid-September. Given that the retreat of Arctic ice has become a key piece of evidence for those who take a more alarmed view of global warming, it's newsworthy that 2012's melt was the greatest since records began in 1979, with sea ice in the Northern Hemisphere shrinking to about 1.3 million square miles, or about half the 1979-2008 average.
As this column has sometimes pointed out ways in which the effects of global warming are happening more slowly than predicted, it is fair to record that this rate of decline in Arctic sea ice is faster than many predicted. Although an entirely ice-free Arctic Ocean during at least one week a year is still several decades away at this rate, we are halfway there after just three decades.
I have an article in the Spectator drawing attention to the curious fact that Rachel Carson's Silent Spring owed much to a passionate tobacco denier. It's behind a paywall, but here it is with the sources as links. Hat tip Ron Bailey.
Rachel Carson's Silent Spring, published 50 years ago this month, effectively marked the birth of the modern environmental movement. "Silent Spring came as a cry in the wilderness, a deeply felt, thoroughly researched, and brilliantly written argument that changed the course of history," wrote Al Gore in his introduction to the 1994 edition.
Bill Moggridge, who invented the laptop computer in 1982, died last week. His idea of using a hinge to attach a screen to a keyboard certainly caught on big, even if the first model was heavy, pricey and equipped with just 340 kilobytes of memory. But if Mr. Moggridge had never lived, there is little doubt that somebody else would have come up with the idea.
The phenomenon of multiple discovery is well known in science. Innovations famously occur to different people in different places at the same time. Whether it is calculus (Newton and Leibniz), or the planet Neptune (Adams and Le Verrier), or the theory of natural selection (Darwin and Wallace), or the light bulb (Edison, Swan and others), the history of science is littered with disputes over bragging rights caused by acts of simultaneous discovery.
My latest Mind and Matter column in the Wall Street Journal is a review of a remarkable new science book:
Your great-grandparents faced infectious diseases that hardly threaten you today: tuberculosis, polio, cholera, malaria, yellow fever, measles, mumps, rubella, smallpox, typhoid, typhus, tapeworm, hookworm…. But there's also a long list of modern illnesses that your great-grandparents barely knew: asthma, eczema, hay fever, food allergies, Crohn's disease, diabetes, multiple sclerosis, rheumatoid arthritis. The coincidence of the rise in these "inflammation" diseases, characterized by an overactive immune system, with the decline of infection is almost certainly not a coincidence.
My latest Mind and Matter column at the Wall Street Journal:
The astronomer Martin Rees recently coined the neat phrase "Copernican demotion" for science's habit of delivering humiliating disappointment to those who think that our planet is special. Copernicus told us the Earth was not at the center of the solar system; later astronomers found billions of solar systems in each of the billions of galaxies, demoting our home to a cosmic speck.
Mr. Rees says further Copernican demotion may loom ahead. "The entire panorama that astronomers can observe could be a tiny part of the aftermath of 'our' big bang, which is itself just one bang among a perhaps-infinite ensemble." Indeed, even our physics could be a parochial custom: Mr. Rees says that different universes could be governed by different rules and our "laws of nature" may be local bylaws.
The Times has published my article on Northumberlandia today.
My latest Mind and Matter column in the Wall Street Journal is on selfish DNA:
The theory of selfish DNA was born as a throwaway remark in the book "The Selfish Gene" by Richard Dawkins, when he pondered why there is so much surplus DNA in the genomes of some animals and plants.
My latest Mind and Matter column discusses the debate about how non-Africans got their 1-4% Neanderthal DNA:
So did we or didn't we? Last week saw the publication of two new papers with diametrically opposed conclusions about whether non-African people have Neanderthal-human hybrids among their ancestors, the result of at least some interspecies dalliance in the distant past.
That non-Africans share 1% to 4% of their genomes with Neanderthals is not in doubt, thanks to the pioneering work of paleo-geneticists led by the Max Planck Institute's Svante Paabo. At issue is how to interpret that fact. Dr. Paabo originally recognized that there are two possible explanations: hybridization (which got all the press) or "population substructure."
When the sun rises on December 22, as it surely will, do not expect apologies or even a rethink. No matter how often apocalyptic predictions fail to come true, another one soon arrives. And the prophets of apocalypse always draw a following-from the 100,000 Millerites who took to the hills in 1843, awaiting the end of the world, to the thousands who believed in Harold Camping, the Christian radio broadcaster who forecast the final rapture in both 1994 and 2011.
Predictions of global famine and the end of oil in the 1970s proved just as wrong as end-of-the-world forecasts from millennialist priests. Yet there is no sign that experts are becoming more cautious about apocalyptic promises. If anything, the rhetoric has ramped up in recent years. Echoing the Mayan calendar folk, the Bulletin of the Atomic Scientists moved its Doomsday Clock one minute closer to midnight at the start of 2012, commenting: "The global community may be near a point of no return in efforts to prevent catastrophe from changes in Earth's atmosphere."
Over the five decades since the success of Rachel Carson's Silent Spring in 1962 and the four decades since the success of the Club of Rome's The Limits to Growth in 1972, prophecies of doom on a colossal scale have become routine. Indeed, we seem to crave ever-more-frightening predictions; we are now, in writer Gary Alexander's word, "apocaholic." The past half century has brought us warnings of population explosions, global famines, plagues, water wars, oil exhaustion, mineral shortages, falling sperm counts, thinning ozone, acidifying rain, nuclear winters, Y2K bugs, mad cow epidemics, killer bees, sex-change fish, cell-phone-induced brain-cancer epidemics, and climate catastrophes.
My latest Mind and Matter column for the Wall Street Journal:
Identifying unique features of human beings is a cottage industry in psychology. In his book "Stumbling on Happiness," the Harvard psychologist Daniel Gilbert jokes that every member of his profession lives under the obligation at some time in his career to complete a sentence which begins: "The human being is the only animal that..." Those who have completed the sentence with phrases like "makes tools," "is conscious" or "can imitate" have generally now conceded that some other animals also have these traits.
Plenty of human uniqueness remains. After all, uniqueness is everywhere in the biological world: Elephants and worms also have unique features. As fast as one scientist demotes human beings from being unique in one trait, another scientist comes up with a new unique trait: grandparental care, for instance, or extra spines on the pyramidal cells of our prefrontal cortex.
My latest Mind and Matter column in the Wall Street Journal is the third in the series on confirmation bias.
I argued last week that the way to combat confirmation bias (the tendency to behave like a defense attorney rather than a judge when assessing a theory in science) is to avoid monopoly. So long as there are competing scientific centers, some will prick the bubbles of theory reinforcement in which other scientists live.
If, as I argued last week, scientists are just as prone as everybody else to confirmation bias (looking for evidence to support rather than test their ideas), then how is it that science, unlike cults and superstitions, does change its mind and find new things?
The answer was spelled out by the psychologist Raymond Nickerson of Tufts University in a paper written in 1998: "It is not so much the critical attitude that individual scientists have taken with respect to their own ideas that has given science the success it has enjoyed... but more the fact that individual scientists have been highly motivated to demonstrate that hypotheses that are held by some other scientist(s) are false."
There's a myth out there that has gained the status of a cliché: that scientists love proving themselves wrong, that the first thing they do after constructing a hypothesis is to try to falsify it. Professors tell students that this is the essence of science.
Yet most scientists behave very differently in practice. They not only become strongly attached to their own theories; they perpetually look for evidence that supports rather than challenges their theories. Like defense attorneys building a case, they collect confirming evidence.
My latest Mind and Matter column in the Wall Street Journal:
If all goes well next month, Curiosity, NASA's latest mission to Mars, will land in the Gale crater, a 3.5-billion-year-old, 96-mile-wide depression near the planet's equator. Out will roll a car-size rover to search for signs of life, among other things. It will drill into rocks and sample the contents, using a mass spectrometer, a gas chromatograph and a laser spectrometer.
In the unlikely event that the project finds evidence of life, then what? In particular, who is in charge of deciding what we should do if we encounter living Martian creatures?
The Times published my op-ed on banking reform:
It is not yet clear whether the current rage against the banks will do more harm than good: whether we are about to throw the baby of banking as a vital utility out with the bathwater of banking as a wasteful casino. But what is clear is that the current mood of Bankerdämmerung is an opportunity as well as a danger. The fact that so many people agree that some kind of drastic reform is needed, all the way along a spectrum from Milibands to mega-Tories, might just open the window through which far-reaching reform of the financial system enters.
All the actors involved bear some blame. First, investment bankers and the principals in financial companies that cluster around them have trousered an increasing share of the returns from the financial markets, leaving less for their customers and shareholders, while getting "too big to fail", so passing their risks to taxpayers.
Two rival designs of plant biochemistry compete to dominate the globe. One, called C3 after the number of carbon atoms in the initial sugars it makes, is old, but still dominant. Rice is a C3 plant. The other, called C4, is newer in evolutionary history, and now has about 21% of the photosynthesis "market." Corn is a C4 plant. In hot weather, the C3 mechanism becomes inefficient at grabbing carbon dioxide from the air, but in cool weather C4 stops working altogether. So at first glance it seems as if global warming should benefit C4.
I wrote the following op-ed in The Times (behind a paywall) on 2 July.
As I cowered in my parked car in a street in Newcastle last Thursday, nearly deafened by hail on the roof of the car, thunder from the black sky and shrieking girls from the doorway of a school, a dim recollection swam into my mind. After inching back home slowly, through the flooded streets, I googled to refresh the memory. On 23 March this year, the Meteorological Office issued the following prediction:
"The forecast for average UK rainfall slightly favours drier-than-average conditions for April-May-June as a whole, and also slightly favours April being the driest of the 3 months. With this forecast, the water resources situation in southern, eastern and central England is likely to deteriorate further during the April-May-June period."
One of the delights of science is its capacity for showing us that the world is not as it seems. A good example is the startling statistic that there are at least 10 times as many bacterial cells (belonging to up to 1,000 species) in your gut as there are human cells in your entire body: that "you" are actually an entire microbial zoo as well as a person. You are 90% microbes by cell count, though not by volume, a handy reminder of just how small bacteria are.
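The arithmetic behind that 90% figure is simple; here is a minimal sketch, assuming the 10-to-1 ratio quoted above (an estimate cited in the column, not a precise measurement).

```python
# If bacterial cells outnumber human cells 10 to 1, the microbial share of your
# total cell count is 10 / (10 + 1), i.e. roughly 91% -- hence "90% microbes".
human_cells = 1        # normalise human cells to one unit
bacterial_cells = 10   # "at least 10 times as many" bacterial cells
microbial_fraction = bacterial_cells / (human_cells + bacterial_cells)
print(f"{microbial_fraction:.0%}")  # prints 91%
```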
This fact also provides a glimpse of the symbiotic nature of our relationship with these bugs. A recent study by Howard Ochman at Yale University and colleagues found that each of the five great ape species has a distinct set of microbes in its gut, wherever it lives. So chimpanzees can be distinguished from human beings by their gut bacteria, which have been co-evolving with their hosts for millions of years.
These days the heritability of intelligence is not in doubt: Bright adults are more likely to have bright kids. The debate was not always this calm. In the 1970s, suggesting that IQ could be inherited at all was a heresy in academia, punishable by the equivalent of burning at the stake.
More than any other evidence, it was the study of twins that brought about this change. "Born Together-Reared Apart," a new book by Nancy L. Segal about the Minnesota Study of Twins Reared Apart (Mistra), narrates the history of the shift. In 1979, Thomas Bouchard of the University of Minnesota came across a newspaper report about a set of Ohio twins, separated at birth, who had been reunited and proved to possess uncannily similar habits. Dr. Bouchard began to collect case histories of twins raised apart and to invite them to Minneapolis for study.
Part of the preamble to Agenda 21, the action plan that came out of the Rio Earth Summit of 1992, reads: "We are confronted with a perpetuation of disparities between and within nations, a worsening of poverty, hunger, ill health and illiteracy, and the continuing deterioration of the ecosystems on which we depend for our well-being."
Update: a couple of small corrections inserted in square brackets below. Thanks to Stephen Coles of UCLA.
Human beings love sharing. We swap, collaborate, care, support, donate, volunteer and generally work for each other. We tend to admire sharing when it's done for free but frown upon it, or consider it a necessary evil, when it's done for profit. Some think that online, we're at the dawn of a golden age of free sharing, the wiki world, in which commerce will be replaced by mass communal sharing: what the futurist John Perry Barlow called "dot communism."