Matt Ridley is the author of provocative books on evolution, genetics and society. His books have sold over a million copies, been translated into thirty languages, and have won several awards.
The fruit of a narrow-leaved campion, buried in permafrost by a ground squirrel 32,000 years ago on the banks of the Kolyma river in Siberia, has been coaxed into growing into a new plant, which then successfully set seed itself in a Moscow laboratory. Although this plant species was not extinct, inch by inch scientists seem to be closing in on the outrageous goal of bringing a species back from the dead. I don't expect to live to see a herd of resurrected mammoths roaming the Siberian steppe, but I think my grandchildren just might.
The mammoth is the best candidate for resurrection mainly because flash-frozen ones with well-preserved tissues are regularly found in the Siberian permafrost. Occasionally these have been fresh enough to tempt scientists to cook and eat them, usually with disappointing results. Just last week a Chinese paleontologist in Canada, Xing Lida, filmed himself frying and eating what he said was a small mammoth steak. Cells from such carcasses have been recovered, encouraging a rivalry between Japanese and Russian scientists to be the first to revive one of these huge, elephant-like mammals by cloning. Four years ago the mammoth genome was sequenced, so we at least now know the genetic recipe.
The news of the resurrected flower does, apparently, remove one obstacle. After 32,000 years the plant's DNA had not been so damaged by natural radioactivity in the soil as to make it unviable, which is a surprise. Mammoth carcasses are often much younger - the youngest, on Wrangel Island, being about 4,700 years old, contemporary with the Pharaohs. So the DNA should be in even better shape.
For people who profess to be kind and tolerant, the defenders of Christianity can be remarkably unpleasant and intolerant. For all his frank and sometimes brusque bluster, I cannot think of anything that Richard Dawkins has said that is nearly as personally offensive as the insults that have been heaped upon his head in the past few days.
"Puffed-up, self-regarding, vain, prickly and militant," snaps one commentator. Running a "Foundation for Enlightening People Stupider than Professor Richard Dawkins," scoffs another. Descended from slave owners, smears a third, visiting the sins of a great-great-great-great-great-great-grandfather upon the son (who has made and given away far more money than he inherited).
In all the coverage of last week's War of Dawkins' Ear, there has been a consistent pattern of playing the man, not the ball: refusing to engage with his ideas but thinking only of how to find new ways to insult him. If this is Christian, frankly, you can keep it.
My latest Mind and Matter column for the Wall Street Journal is on the good and the bad consequences of our surprising internet honesty:
It is now well known that people are generally accurate and (sometimes embarrassingly) honest about their personalities when profiling themselves on social-networking sites. Patients are willing to be more open about psychiatric symptoms to an automated online doctor than a real one. Pollsters find that people give more honest answers to an online survey than to one conducted by phone.
But online honesty cuts both ways. Bloggers find that readers who comment on their posts are often harshly frank but that these same rude critics become polite if contacted directly. There's a curious pattern here that goes against old concerns over the threat of online dissembling. In fact, the mechanized medium of the Internet causes not concealment but disinhibition, giving us both confessional behavior and ugly brusqueness. When the medium is impersonal, people are prepared to be personal.
My latest Mind and Matter column in the Wall Street Journal is on citizen science:
The more specialized and sophisticated scientific research becomes, the farther it recedes from everyday experience. The clergymen-amateurs who made 19th-century scientific breakthroughs are a distant memory. Or are they? Paradoxically, in an increasing variety of fields, computers are coming to the rescue of the amateur, through crowd-sourced science.
Last month, computer gamers working from home redesigned an enzyme. Last year, a gene-testing company used its customers to find mutations that increase or decrease the risk of Parkinson's disease. Astronomers are drawing amateurs into searching for galaxies and signs of extraterrestrial intelligence. The modern equivalent of the Victorian scientific vicar is an ordinary person who volunteers his or her time to solving a small piece of a big scientific puzzle.
My latest Mind and Matter column for the Wall Street Journal is about the exodus from Africa, either 125,000 years ago or 65,000 years ago.
Everybody is African in origin. Barring a smattering of genes from Neanderthals and other archaic Asian forms, all our ancestors lived in the continent of Africa until 150,000 years ago. Some time after that, say the genes, one group of Africans somehow became so good at exploiting their environment that they (we!) expanded across all of Africa and began to spill out of the continent into Asia and Europe, invading new ecological niches and driving their competitors extinct.
There is plenty of dispute about what gave these people such an advantage-language, some other form of mental ingenuity, or the collective knowledge that comes from exchange and specialization-but there is also disagreement about when the exodus began. For a long time, scientists had assumed a gradual expansion of African people through Sinai into both Europe and Asia. Then, bizarrely, it became clear from both genetics and archaeology that Europe was peopled later (after 40,000 years ago) than Australia (before 50,000 years ago).
My latest Mind and Matter column in the Wall Street Journal is about the role of disease in species conservation:
Some beekeepers, worried by the collapse of their bee colonies in recent years, are pointing a finger this month at a class of insecticide (neonicotinoids) that they think is responsible for lowering the insects' resistance to disease. They may be right, but I'm cautious. History shows that, again and again, blaming chemicals for the decline of a species has prematurely exonerated the real culprit, which is often disease alone.
The role of parasites in causing species to decline is often overlooked. Native European red squirrels, for example, have long been retreating in Britain at the hands of the American gray squirrel, which menagerie-owning aristocrats introduced in the 19th century. For years it was thought to be the competition for food that prevented the squirrels' co-existence, but now scientists place most of the blame on a parapox virus that causes only a mild illness in the grays but kills the reds.
My latest Mind and Matter column for the Wall Street Journal is on gene-culture co-evolution:
Human beings, we tend to think, are at the mercy of their genes. You either have blue eyes or you do not (barring contact lenses); no amount of therapy can change it. But genes are at the mercy of us, too. From minute to minute, they switch on and off (i.e., are actively used as recipes to make proteins) in the brain, the immune system or the skin in response to experience. Sunbathing, for example, triggers the expression of genes for the pigment melanin.
As a recent study confirms, on a much longer time scale, genes are even at the mercy of culture. The paradigmatic example is lactose tolerance. All mammals can digest lactose sugars in milk as babies, but the lactase gene switches off at weaning when no longer needed. In much of Europe and parts of Africa, by contrast, most people can digest lactose even as adults, because the lactase gene remains switched on. (About 90% of East Asians and 70% of South Indians are lactose-intolerant to some degree.)
One of my favourite writers these days is Willis Eschenbach, whose essays at wattsupwiththat often combine ingenious scientific rationality with lyrical prose. Here he is on the subject of the sea ice off Alaska:
My point in this post? Awe, mostly, at the damaging power of cold. As a seaman, cold holds many more terrors than heat. When enough ice builds up on a boat's superstructure, it rolls over and men die. The sun can't do that. The Titanic wasn't sunk by a heat wave.
The thing about ice? You can't do a dang thing about it. You can't blow up a glacier, or an ice sheet like you see in the Bering Sea above. You can't melt it. The biggest, most powerful icebreaker can't break through more than a few feet of it. When the ice moves in, the game is over.
My latest Mind and Matter column in the Wall Street Journal: Even a rational optimist is pessimistic about some things. Here's one: the gradual distortion of the human sex ratio by sex-selective abortion. A new essay by the demographer Nicholas Eberstadt concludes that "the practice has become so ruthlessly routine in many contemporary societies that it has impacted their very population structures." He finds "ample room for cautious pessimism" in the fact that this phenomenon is still very much on the increase.
For obscure reasons, the human sex ratio is always slightly male-biased, but in the natural state it rarely goes above 105 male births per 100 female ones, except in small samples. In China's last mini-census in 2005, the ratio was nearly 120 to 100 and in some districts over 150. That this is caused by sex-selective abortion (and not, for example, by a hepatitis-B epidemic, which can favor male births) is proved by a ratio of 107 to 100 among first-born children but nearer 150 among ones born later.
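The birth-order evidence above is a simple piece of arithmetic: if first-borns show a near-natural ratio but later-borns a heavily skewed one, the aggregate ratio depends only on the mix of birth orders. A minimal sketch (the 60/40 split between first and later births is my illustrative assumption, not a census figure):

```python
# Combine sex ratios by birth order into an overall ratio.
# Ratios are expressed, as in the column, as boys per 100 girls.
def overall_ratio(ratio_first, ratio_later, share_first):
    # convert "boys per 100 girls" into the male share of births
    male_first = ratio_first / (ratio_first + 100)
    male_later = ratio_later / (ratio_later + 100)
    # weighted male share across all births
    male = share_first * male_first + (1 - share_first) * male_later
    # convert back to boys per 100 girls
    return 100 * male / (1 - male)

# A near-natural 107 among first-borns plus ~150 among later-borns
# (assuming later births are 40% of the total) yields an aggregate
# close to the ~120 reported in China's 2005 mini-census.
print(round(overall_ratio(107, 150, 0.6), 1))
```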
China is not the only country where this is happening. By the early 21st century, all four Asian "tigers"-South Korea, Singapore, Hong Kong and Taiwan-had a "naturally impossible" ratio of 108 or higher. India has an increasing ratio, as high as 120 in some states. Even some European and central Asian countries (including Albania, Georgia and even Italy) have unnaturally male-biased births. Nearly half the world falls in this category.
Each year, John Brockman's website, The Edge, asks a question and gets many answers to it. This year, the question is: What is your favourite deep, elegant, or beautiful explanation? Some of the answers are fascinating. Here's mine:
It's hard now to recall just how mysterious life was on the morning of 28 February 1953 and just how much that had changed by lunchtime. Look back at all the answers to the question "what is life?" from before that and you get a taste of just how we, as a species, floundered. Life consisted of three-dimensional objects of specificity and complexity (mainly proteins). And it copied itself with accuracy. How?
How do you set about making a copy of a three-dimensional object? How do you grow it and develop it in a predictable way? This is the one scientific question where absolutely nobody came close to guessing the answer. Erwin Schrödinger had a stab, but fell back on quantum mechanics, which was irrelevant. True, he used the phrase "aperiodic crystal" and if you are generous you can see that as a prediction of a linear code, but I think that's stretching generosity.
Here's my latest Mind and Matter column in the Wall Street Journal, with added links and charts. On interglacials. The entire 10,000-year history of civilization has happened in an unusually warm interlude in the Earth's recent history. Over the past million years, it has been as warm as this or warmer for less than 10% of the time, during 11 brief episodes known as interglacial periods. One theory holds that agriculture and dense settlement were impossible in the volatile, generally dry and carbon-dioxide-starved climates of the ice age, when crop plants would have grown more slowly and unpredictably even in warmer regions.
This warm spell is already 11,600 years old, and it must surely, in the normal course of things, come to an end. In the early 1970s, after two decades of slight cooling, many scientists were convinced that the moment was at hand. They were "increasingly apprehensive, for the weather aberrations they are studying may be the harbinger of another ice age," said Time in 1974. The "almost unanimous" view of meteorologists was that the cooling trend would "reduce agricultural productivity for the rest of the century," and "the resulting famines could be catastrophic," said Newsweek in 1975.
Since then, of course, warmth has returned, probably driven at least partly by man-made carbon-dioxide emissions. A new paper, from universities in Cambridge, London and Florida, drew headlines last week for arguing that these emissions may avert the return of the ice age. Less noticed was the fact that the authors, by analogy with a previous warm spell 780,000 years ago that's a "dead ringer" for our own, expect the next ice age to start "within about 1,500 years." Hardly the day after tomorrow.
My latest Mind and Matter column in the Wall Street Journal: Coral reefs around the world are suffering badly from overfishing and various forms of pollution. Yet many experts argue that the greatest threat to them is the acidification of the oceans as man-made carbon-dioxide emissions dissolve into seawater.
The effect of acidification, according to J.E.N. Veron, an Australian coral scientist, will be "nothing less than catastrophic.... What were once thriving coral gardens that supported the greatest biodiversity of the marine realm will become red-black bacterial slime, and they will stay that way."
This is a common view. The Natural Resources Defense Council has called ocean acidification "the scariest environmental problem you've never heard of."
My Mind and Matter column for the Wall Street Journal on 1 January 2012 is here:
Here's a New Year's thought. With some nine million species on the planet, and with each species lasting a million years on average, about nine species will go extinct naturally this coming year (with more, almost certainly, going extinct unnaturally). But about nine new species also will be born in 2012.
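The estimate above is simple division; a sketch, using the column's own round-number assumptions:

```python
# Back-of-envelope background extinction rate: with ~9 million species
# alive, each lasting ~1 million years on average, the expected number
# of natural extinctions per year is species divided by lifespan.
N_SPECIES = 9_000_000        # rough estimate of living species
LIFESPAN_YEARS = 1_000_000   # average species duration in the fossil record

extinctions_per_year = N_SPECIES / LIFESPAN_YEARS
print(extinctions_per_year)  # prints 9.0
```

At equilibrium, speciation balances extinction, which is why roughly the same number of new species should appear in the same year.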
Here is the Mind and Matter column in the Wall Street Journal, published on 24th December.
Which American city has more inhabitants: San Antonio or San Diego? More Germans than Americans get the answer right (San Diego). What about Hanover or Bielefeld? More Americans than Germans get the answer right (Hanover). In each case, the foreigners pick the right answer by choosing the city they have heard more about, assuming that it's bigger. The natives know too much and let the excess information get in the way.
This is an example of a "heuristic," a highfalutin name for a "rule of thumb" or "gut feeling." Most business people and physicians privately admit that many of their decisions are based on intuition rather than on detailed cost-benefit analysis. In public, of course, it's different. To stand up in court and say you made a decision based on what your thumb or gut told you is to invite damages. So both business people and doctors go to some lengths to suppress or disguise the role that intuition plays in their work.
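The recognition heuristic described above can be sketched as a tiny decision rule: pick whichever of two cities you recognize, and fall silent when recognition cannot discriminate. The city names and the "recognized" set below are illustrative, not data from the column:

```python
# Recognition heuristic for "which city is bigger?": if exactly one of
# the two options is recognized, choose it; if both or neither are
# recognized, the heuristic cannot decide and other knowledge must be used.
def recognition_heuristic(city_a, city_b, recognized):
    if (city_a in recognized) != (city_b in recognized):
        return city_a if city_a in recognized else city_b
    return None  # heuristic is silent

# A hypothetical American judging German cities may recognize only Hanover,
# so the heuristic picks it - which happens to be the right answer.
print(recognition_heuristic("Hanover", "Bielefeld", {"Hanover", "Berlin"}))
# prints Hanover
```

This is also why the natives do worse: a German recognizes both Hanover and Bielefeld, so the heuristic gives no answer and the excess information gets in the way.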
My latest Mind and Matter column for the Wall Street Journal is on metaphors for the Higgs Boson.
In 1993 a British science minister, William Waldegrave, was sitting on a train reading the speech that his staff had prepared for him for a physics conference. Finding the draft "unspeakably dull," he decided instead to challenge the assembled scientists to answer, on a single sheet of paper, the question: "What is the Higgs boson, and why do we want to find it?" He pledged to the winner a bottle of vintage Champagne.
Even before its existence was at last tentatively suggested by an experiment this week, many people had heard of the Higgs boson, the mysterious manifestation of the field that causes matter to have mass, according to a theory minted in 1964. Yet almost nobody, myself included, knows what a Higgs boson is, or at least can give a sensible description of it. This is a serious handicap if Higgsism, as I hereby christen it, is to have an impact on human culture, let alone on technology.
Prospect has published my essay on bioenergy, in which my research left me astonished at the environmental and economic harm that is being perpetrated. Biomass and biofuels are not carbon neutral, can't displace much fossil fuel, require huge subsidies, increase hunger and directly or indirectly cause rain forest destruction. Apart from that, they're fine... Here's the text: From a satellite, the border between Haiti and the Dominican Republic looks like the edge of a carpet. While the Dominican Republic is green with forest, Haiti is brown: 98 per cent deforested. One of the chief reasons is that Haiti depends on bioenergy. Wood-mostly in the form of charcoal-is used not just for cooking but for industry as well, providing 70 per cent of Haiti's energy. In contrast, in the Dominican Republic, the government imports oil and subsidises propane gas for cooking, which takes the pressure off forests.
Haiti's plight is a reminder there is nothing new about bioenergy. A few centuries ago, Britain got most of its energy from firewood and hay. Over the years the iron industry moved from Sussex to the Welsh borders to Cumberland and then Sweden in an increasingly desperate search for wood to fire its furnaces. Cheap coal and oil then effectively allowed the gradual reforestation of the country. Britain's forest cover-12 per cent-is three times what it was in 1919 and will soon rival the levels recorded in the Domesday Book of 1086.
Yet if the government has its way, we will instead emulate Haiti. In 2007, Tony Blair signed up to a European Union commitment that Britain would get 20 per cent of its energy from renewable sources by 2020. Apparently neither he nor his officials noticed this target was for "energy" not "electricity." Since much energy is used for heating, which wind, solar, hydro and the like cannot supply, this effectively committed Britain to using lots of wood and crops for both heat and electricity to hit that target. David Cameron and Chris Huhne, anxious to seem the "greenest of them all," dare not weaken the target, despite its unattainability.
I have published the following editorial in City AM, a British financial newspaper: WHEN is a job not a job? Answer: when it is a green job. Jobs in an industry that raises the price of energy effectively destroy jobs elsewhere; jobs in an industry that cuts the cost of energy create extra jobs elsewhere.
The entire argument for green jobs is a version of Frederic Bastiat's broken-window fallacy. The great nineteenth-century French economist pointed out that breaking a window may provide work for the glazier, but takes work from the tailor, because the window's owner must postpone ordering a new suit in order to pay for the window.
You will hear claims from Chris Huhne, the anti-energy secretary, and the green-greed brigade that trousers his subsidies for their wind and solar farms, about how many jobs they are creating in renewable energy. But since every one of these jobs is subsidised by higher electricity bills and extra taxes, the creation of those jobs is a cost to the rest of us. The anti-carbon and renewable agenda is not only killing jobs by closing steel mills, aluminium smelters and power stations, but preventing the creation of new jobs at hairdressers, restaurants and electricians by putting up their costs and taking money from their customers' pockets.
In a strongly worded editorial in Science magazine this week, Calestous Juma, the director of the Agricultural Innovation in Africa program at Harvard's Kennedy School, called for a government-led initiative to introduce biotechnology into Africa. "Major international agencies such as the United Nations have persistently opposed expanding biotechnology to regions most in need of its societal and economic benefits," he wrote.
Genetic modification has had a huge impact on agriculture worldwide. More than 15 million farmers now plant GM crops on almost 370 million acres, boosting yields by 10% to 25%. Despite opponents' fears that the technology would poison people, spread superweeds and entrench corporate monopolies, it's now clear that the new crops have reduced not only hunger but pesticide use, carbon emissions, collateral damage to biodiversity and rain-forest destruction.
Yet, while much of North and South America, Australia and Asia are expanding the use of GM crops, only three African countries have adopted them (a further four are conducting trials). Mr. Juma argues that Africa is the place that most needs a boost from biotech: Many of the continent's farmers cannot afford to buy pesticides, so corn and cotton that are genetically insect-resistant could make a big difference there. Over the past five decades, while Asian yields have quadrupled, African yields have barely budged.
Here's an article I wrote, published by The Times this week.
The anti-capitalists, now more than 50 days outside St Paul's, have a point:
capitalism is proving unfair. But I would like to try to persuade them that the reason is that it is not free-market enough. (Good luck, I hear you cry.) The market, when allowed to flourish, tears apart monopoly and generates freedom and fairness better than any other human institution. Today's private sector, by contrast, is increasingly dominated by companies that are privileged by government through cosy contract, soft subsidy, convenient regulation and crony conversation. That is why it is producing such unfair outcomes.
My latest column in the Wall Street Journal is on the purpose of dreams:
Chancing last week on a study about the calming effect of dreams on people with post-traumatic stress disorder, I decided to read recent research on dreams. When I looked at this topic about 20 years ago, it was clear that our ignorance of the purpose of dreaming was almost total, notwithstanding the efforts of Sigmund Freud, Francis Crick and other fine minds. Is that still true?
To my delight, the answer seems to be no. Some ingenious experiments have replaced general ignorance with specific and intriguing ignorance (as is science's habit). We now know enough to know what it is we do not know about dreams.
Here's a column in The Times, imagining what the world might look like if the UN's low-fertility scenario comes true.
The peak is in sight. Even as the population passes seven billion, the growth rate of the world population has halved since the 1960s. The United Nations Population Division issues high, medium and low forecasts. Inevitably the high one (fifteen billion people by 2100) gets more attention than the low one (six billion and falling). But given that the forecasts have generally proved too high for the past few decades, let us imagine for a moment what might happen if that proves true again.
Africa is currently the continent with the highest birth rates, but it also has the fastest economic growth. The past decade has seen Asian-tiger-style growth all across Africa. HIV is in retreat, malaria in decline. When child mortality fell and economic growth boomed like this in Europe, Latin America and Asia, the result was a rapid fall in the birth rate. For fertility to fall, contraception provides the means, but economic growth and public health provide the motive. So the current slow decline in Africa's birth rate may turn into a plummet.
As a science communicator, I found this fascinating.
The following is an email that was sent in 2003 by a very senior scientist, Stephen Schneider, to a long list of other senior scientists about an article in a newspaper by an economist. Read it and see what you think of the way the economist, Ross McKitrick, is treated. Hello all. Ah ha-the latest idiot-McKitrick-reenters the scene. He and another incompetent had a book signing party at the US Capitol-Mike MacCracken went and he can tell you about it-last summer. McKitrick also had an article-oped, highly refereed of course-in the Canadian National Post on June 4 this year. Here is the URL that worked back then: http://www.nationalpost.com/search/site/story.asp?id=045D5241-FD00-4773-B816-76222A771778
It was a scream. He argued there is no such thing as global temperature change, just local-all natural variablity mostly. To prove this he had a graph of temperature trends in Erie Pennsylvania for the past 50 years (this is from memory) which showed a cooling. THat alone proves nothing, but when reading the caption I noticed the trend was for temperature in October and November!! So one station for two months consitituted his "refutation" of global warming-another even dumber than Lomborg economist way out of depth and polemicizing. I showed it to a class of Stanford freshman, and one of them said: "I wonder how many records for various combinations of months they had to run through to find one with a cooling trend?" THe freshman was smarter than this bozo. It is improtant to get that op-ed to simply tell all reporters how unbelievably incompetent he is, and should not even be given the time of day over climate issues, for which his one "contribution" is laughably incompetent. By the way, the Henderson/Castles stuff he mentions is also mostly absurd, but that is a longer discussion you all don't need to get into-check it out in the UCS response to earlier Inhofe polemics with answers I gave them on Henderson/Castles if you want to know more about their bad economics on top of their bad climate science
My latest Mind and Matter column for the Wall Street Journal is about the possibility that big meteorites can trigger volcanic activity:
About 65 million years ago, the dinosaurs and maybe two-thirds of all other species suddenly died out. For three decades, the dominant explanation for this mass extinction has been that it was probably caused by the impact of a large meteorite.
A layer of iridium-rich rock from roughly the right date is the fingerprint that convicted this extraterrestrial killer (iridium is more common in space than in the Earth's crust). Even the bullet hole has apparently been found in the shape of a 110-mile-diameter crater called Chicxulub off the coast of Mexico. The explosion would have been the equivalent of two million hydrogen bombs.
My latest Mind and Matter column from the Wall Street Journal: Earthquakes are natural disasters. However much culpability there is afterward about the building standards that may have worsened the death toll or the response of the emergency authorities, nobody is to blame for the actual shock.
At least, not normally. An exception is the phenomenon of "induced seismicity," whereby human activity such as geothermal energy projects, mining, gas drilling or the filling of reservoirs apparently sets off swarms of very small earthquakes where there are susceptible geological faults and in certain kinds of underlying rock.
A recent report from the U.S. Geological Survey concludes, for example, that a nearby shale gas well probably caused a swarm of 43 very small earthquakes (largest magnitude, 2.8) in Garvin County, Okla., last January. A few hours before the quakes began, the well had begun hydraulic fracturing or "fracking": that is, injecting high-pressure water into the ground to crack deep rocks.
Here's an interview I did for the Globe and Mail in Toronto during my recent visit to Canada.
Joanne Nova has a really fine essay on Naomi Klein. This is great writing, easily as fluent as Klein herself, only rational. An excerpt:
By building her whole argument on un-scientific quicksand, Klein makes mindless statements that unwittingly apply more to her own arguments than anyone else's. She explores "how the right has systematically used crises-real and trumped up-to push through a brutal ideological agenda designed not to solve the problems that created the crises but rather to enrich elites."
No one uses trumped-up-crises better than the left: Which team is demanding billions to "stop the storms"? And which elites will be enriched? The carbon traders and financiers.
My latest Wall Street Journal Mind and Matter column:
The list of scientific heretics who were persecuted for their radical ideas but eventually proved right keeps getting longer. Last month, Daniel Shechtman won the Nobel Prize for the discovery of quasicrystals, having spent much of his career being told he was wrong.
"I was thrown out of my research group. They said I brought shame on them with what I was saying," he recalled, adding that the doyen of chemistry, the late Linus Pauling, had denounced the theory with the words: "There is no such thing as quasicrystals, only quasi-scientists."
Latest Mind and Matter column in the Wall Street Journal:
"You can't change human nature." The old cliché draws support from the persistence of human behavior in new circumstances. Shakespeare's plays reveal that no matter how much language, technology and mores have changed in the past 400 years, human nature is largely undisturbed. Macbeth's ambition, Hamlet's indecision, Iago's jealousy, Kate's feistiness and Juliet's love are all instantly understandable.
Recently, however, geneticists have surprised themselves by finding evidence of recent and rapid changes in human genomes in response to the pressures of civilization. For example, fair skin allows more absorption of the sun's ultraviolet rays necessary for the skin to make vitamin D. So when the northern Europeans, living in a climate with little sunshine, started to farm wheat, a food low in vitamin D, they evolved fair skin to compensate and get more of the vitamin.
There's a fine article at Spiked by Tim Black exposing what Robert* Malthus actually said. Malthus was a reactionary nostalgic pessimist who was not just wrong about population growth outstripping food supply. He was also wrong in his cynicism about helping the poor lest they breed more.
(*Everybody calls him Thomas these days, whereas his contemporaries all called him Robert, which was his second name. Calling him Thomas is like calling the first director of the FBI John Hoover.)
My lecture on scientific heresy to the RSA this week has been reprinted on bishop-hill.net and wattsupwiththat.com, where it has generated much discussion. Thanks to Andrew Montford and Anthony Watts for their interest.