While going through previously published Posts, I came across two, first published on the 17th and 18th August, 2009, that seemed a better fit rolled into one.
So here they are.
oooOOOooo
This is not my dog but it brings out the same feelings in me as if I was looking at my German Shepherd.
And this is my German Shepherd! The photograph was taken in 2006 when Pharaoh was 3 years old. The aircraft, by the way, is an L-21B, the military variant of the Piper Super Cub. The aircraft was originally delivered to the Dutch Air Force in 1954 and has dispensation from the UK CAA to retain the original registration and callsign of R151.
oooOOOooo
More on the aircraft.
Originally, when the first half of today’s Post was published separately, readers asked for more information on the aircraft.
So here it is.
Piper Super Cub, L-21B, R-151
A/C Construction No. 18-3841, Frame No. 18-3843
Original Engine, Lycoming 135, Type O-290-D2, 54/2441
Romeo 151 was one of a batch of 298 L-21s delivered in 1954. There were 584 L-21Bs produced by Piper for military use, the ‘L’ standing for Liaison. The L-21Bs were PA-18-135s with civil Lycoming O-290-D2 engines, glasswork as on most L-21As and L-18s, and a gross weight of 1,760 lbs.
This aircraft was delivered to the Koninklijke Luchtmacht, the Dutch Air Force, on the 1st July, 1954 and registered R-151. After various homes, R-151 was transferred to the Dutch civil register as PH-GER on the 1st April, 1976, with 4,458 hours, and shortly thereafter was registered to Vliegclub Hoogeveen, Certificate Number 2380.
On the 27th March, 1981 the aircraft was delivered to the UK with a total time of 5,043 hours and in September, 1981 became G-BIYR. In April, 1983 YR was the first of type to be given a Public Transport CofA and was used for training at Tollaton. YR reverted to a Private CofA in January, 1984 when purchased by Mike and Barbara Fairclough at 5,120 hours.
In 1992 YR was re-engined with a 150 hp Lycoming O-320-A2B, No. L49809-27A (zero hours). Finally, on the 2nd June, 1995, the aircraft was repainted in original Dutch insignia and given CAA (UK Civil Aviation Authority) permission to use the original call-sign, Romeo 151.
The aircraft is based in South Devon, England and owned by the five members of the Delta Foxtrot Flying Group.
A few photos of the aircraft.
Approaching home airfield in South Devon, England
Flying in the French Alps, Mt Blanc in sight
9,300 ft up in the French Alps
The grandeur of the ancient relationship between dog and man.
A couple of weeks ago, I came across a fascinating article that had been published in American Scientist magazine (online version) written by Professor Pat Shipman. The article provided the background and evidence to support the proposition that dogs may have been man’s best friend for thousands of years longer than we realized.
Very quickly I came across Pat Shipman’s website and learnt that this is one clever lady. As her About page explains,
CAREER SUMMARY
Prof. Shipman
I am internationally known as a paleoanthropologist and conducted research for many years in Africa on human evolution and the animal communities in which humans evolved.
I have conducted research on material from sites in France, Spain, the United States, Java, Ethiopia, Kenya, Tanzania, and South Africa. I have written more than 50 scholarly articles, appearing in journals such as Nature, Science, Journal of Archaeological Science, Paleobiology, Journal of Human Evolution, and Current Anthropology.
I have written more than 100 articles in popular science magazines or newspapers, including The Guardian, The New York Times, The Times Literary Supplement, American Scientist, Discover, and Natural History. Two of my books were featured on the cover of The New York Times Book Review: The Neandertals and Taking Wing. Taking Wing won the Phi Beta Kappa prize for science book of the year and was a runner-up for the LA Times Science Book prize.
My book on Homo erectus, Wisdom of the Bones, was co-authored by Alan Walker and won the Rhone-Poulenc Prize in science writing.
My books have been widely praised as compelling, accessible, and highly readable, with a strong narrative thread. Reviewers frequently comment upon the meticulous research that underpins my books, a feature I consider to be my trademark.
My most recent popular science book, The Ape in the Tree, written with Alan Walker, was called by The Vancouver Sun “part adventure story, part cutting-edge science.” In a Science magazine review, the book was praised as “a fine account of new ways to puzzle out the behaviors of fossilized animals from odd scraps of bone.” Another reviewer raved, “Wonderfully engaging and insightful, The Ape in the Tree, is sure to become a classic in the literature on human origins.” MacArthur fellow John Fleagle wrote in the Quarterly Review of Biology, “Science writing doesn’t get any better than this.” In 2009, this book was awarded the W.W. Howells Book Prize by the American Anthropological Association.
In Britain, my new biography of Mata Hari, Femme Fatale, was selected as The Book of the Week by BBC radio. Each day during the week, an actress gave dramatic readings from the book on the air for fifteen minutes.
With The Animal Connection, I return to paleoanthropology and consider the influence of our connection with animals on human evolution and the origin of modern human behavior.
See what I mean!
Anyway, as the author of a blog about what we can learn from this ancient relationship between the dog and man, you can readily understand why it struck me as wonderful to republish that article in full. Prof. Shipman promptly gave me permission to do so.
So today, I am doing just that and tomorrow I want to write more about Pat Shipman’s latest book, The Animal Connection.
oooOOOooo
The Woof at the Door
Dogs may have been man’s best friend for thousands of years longer than we realized
It’s funny how much difference a single letter makes. A “woof” at the door is a very different thing from a wolf at the door. One is familiar, domestic, reassuring; the other is a frightening apparition of imminent danger. The distinction between our fond companions and the ferocious predator of northern climes goes back a long way.
Dogs are descended from wolves, probably the gray wolf. Some scientists argue that, because dogs and wolves can and do interbreed, they shouldn’t be considered to be separate species at all. They believe that domestic dogs are only a subspecies or variant of the gray wolf, Canis lupus, and ought to be called Canis lupus familiaris (the familiar or domestic wolf) instead of Canis familiaris (the familiar or domestic dog). Although the ability to interbreed and produce fertile offspring is a tried-and-true criterion for recognizing that two populations are really variants of a single species, the reality is more nuanced. We cannot know whether dog-wolf hybrids will thrive and survive, or die out, in the long run.
Prehistoric cave paintings rarely depict wolves or other carnivores. This watercolor tracing of a cave painting was made by the archaeologist Abbé Henri Breuil in the early 1900s from the Grotte de Font-de-Gaume in France. The 17,000-year-old cave paintings number about 250 and mostly show bison and mammoths—only one is thought to be a wolf. Canids may have been domesticated by this point; it is possible that portraying wolves and humans was taboo. Paul Bahn
Certainly we expect to be able to distinguish a dog from a wolf if we see one. Of course, domestic dogs are wildly variable in size and shape, thanks to several hundred years of selective breeding. Some have long, fluffy coats; others have tightly curled, nearly waterproof coats and webbed feet. Some are leggy and swift, whereas others are solid, stoutly built guard dogs. Some fit neatly into a pocketbook, but others barely fit into a compact car. As Robert K. Wayne of the University of California at Los Angeles declares, “Dogs show more diversity in appearance than any other mammal.”
What is it that tells us this animal is “dog” and that one is “wolf?”
Modern wolves and dogs can be distinguished reasonably easily by their appearance. The most telling feature of dogs is the snout, which is significantly shorter and wider than wolves’ snouts. Only a few dog breeds with extremely elongated, slender snouts, such as Irish wolfhounds, surpass wolves in “snoutiness.”
But a crucial part of the difference we perceive is in the animals’ manner and attitude towards humans. Domesticated dogs are just that: canids that live in the house or domicile of humans. They are genetically disposed to seek out human attention and approval and to accept human leadership. Wolves are not.
How did this important change come about? Probably in the distant past, humans took in a wolf cub, or even a whole litter of cubs, and provided shelter, food and protection. As the adopted cubs matured, some were aggressive, ferocious and difficult to handle; those probably ended up in the pot or were cast out. The ones that were more accepting of and more agreeable to humans were kept around longer and fed more. In time, humans might have co-opted the natural abilities of canids, using the dogs’ keen noses and swift running skills, for example, to assist in hunting game. If only the most desirable dogs were permitted to breed, the genes encoding for “better” dogs would continue to be concentrated until the new domesticated species (or subspecies) was formed.
Time to Tame
The creation of a domestic, useful, familiar canid by years of selectively breeding wild and terrifying wolves was almost certainly unplanned. The wolf at the beginning of the process of domestication was tamed—made individually docile—but the essential fact is that, over time, the offspring of those initial wolves were genetically inclined to be more tractable.
Domestication was one of the most brilliant accidents in the entire history of humankind. What’s more, we got it right the first time: Dogs were the original trial animal, and successful product, of such an accident—the happy outcome of years of unwitting experiments and dumb luck.
How long does domestication take? Nobody knows. In an experiment, Russian biologists kept a breeding colony of silver foxes and intentionally selected for breeding those with the least fear and the least aggression toward humans. After 10 generations, 18 percent of the foxes sought human contact and showed little fear. After 30 or so generations, a “domesticated fox” had been created.
The catch is that this experiment was deliberate and strictly controlled. The foxes could not breed with wild foxes and dilute the changing gene pool. Human contact was minimized so animals could not be tamed by their handlers. And because of the experiment’s scientific intent, no one could say, “Oh this one is so cute, let’s let it breed even if it is a little aggressive.” So in the case of dogs, without all these controls, the process could have taken much longer.
Another way of estimating the time at which domestic dogs originated is to consider their genetic differences from wolves. One prominent group of researchers, including Robert Wayne, along with Carles Vilà of the Uppsala University in Sweden and their collaborators, initially estimated in 1997 that dogs diverged from gray wolves 100,000 to 135,000 years ago. After more study, they revised their divergence date to between 40,000 and 100,000 years ago. Another group, led by Peter Savolainen of the Royal Institute of Technology in Sweden, favored the Chinese wolf, a subspecies of the gray wolf, as the probable ancestor and estimated in 2002 that it was domesticated between 15,000 and 40,000 years ago.
How do these genetic estimates stack up against the fossil record? Until 2009, the oldest known remains of domestic dogs were two adult skulls dated to between 13,000 and 17,000 years ago, from Eliseevichi, a region in Russia. Both had the relatively broad, short snout typical of dogs, and both were large, heavy animals, nearly the size of great Danes.
Then a team led by Mietje Germonpré of the Royal Belgian Institute of Natural Sciences reported a stunning new finding in the February 2009 issue of Journal of Archaeological Science: a nearly complete fossil dog skull dated to 31,680 ± 250 years ago.
Another Look
Germonpré and her colleagues thought that researchers might have overlooked early prehistoric dogs in the fossil record of the Upper Paleolithic, so they analyzed skulls of large canids (wolves or dogs) from various European sites. The Upper Paleolithic time period spanned 40,000 to 10,000 years ago and is divided into sections based on the artifacts from those times. By convention, each span is named for a culture of people who made the artifacts, and the people, in turn, are usually named for the geographical location where the artifacts were found. The Epigravettian culture existed from 14,000 to 10,000 years ago; before that, the Magdalenian culture thrived from 18,000 to 10,000 years ago; and skipping back a few sections, the Aurignacian culture occurred from 32,000 to 26,000 years ago.
In order to identify the fossil skulls accurately, Germonpré’s team first analyzed a large reference sample of 48 wild, modern wolves and 53 dogs belonging to 11 different breeds. They also examined five skulls (including the ones found in Eliseevichi) that were firmly established as prehistoric domesticated dogs.
In order to establish the morphological differences between wolves and dogs, a group of researchers led by Mietje Germonpré statistically analyzed skulls from 48 modern, wild wolves and 53 modern dogs from 11 breeds, as well as five skulls that were previously established to be from prehistoric dogs. Recent wolves (pink) and prehistoric dogs (blue) clustered into their own groups, based on the length of their toothrows and the shape of their snouts. Modern dogs clustered into four groups, with some overlap in their areas. Recent dogs with archaic proportions included huskies (brown), recent dogs with wolflike snouts included German shepherds (yellow), recent dogs with short toothrows included great Danes (orange), and recent dogs with slender snouts included Doberman pinschers (green). One modern dog, a Central Asian shepherd, clustered with the prehistoric dogs. The group then classified new skulls into the established groupings; examples that fell slightly outside of the ranges but that are statistically likely to be within the group are shown as lighter-shaded areas. Recent young wolves fell into the recent-wolf group, whereas wolves kept in captivity were classified as recent dogs with wolflike snouts. Fossil canid skulls divided between the recent-wolf group and the prehistoric-dog group, with one falling in the group of recent dogs with wolflike snouts. Stephanie Freese, data courtesy of Mietje Germonpré.
The team used statistical analysis of cranial and dental measurements on the skulls to sort the reference sample into six natural clusters. One cluster contained modern wolves. Another consisted of recent dogs of archaic proportions (such as chow-chows and huskies); a single specimen of a Central Asian shepherd was closer to this group than any other but fell outside it. A third cluster included dogs, such as German shepherds and malinois, which have wolflike proportions. These three groups overlapped each other in their cranial proportions. A fourth group of modern dogs has short toothrows—the length of the jaw that contains teeth—and includes such breeds as great Danes, mastiffs and rottweilers. This group overlapped slightly with the archaic-proportioned dog group but not with the others.
The fifth and sixth clusters were completely separate from all others. One consisted of dogs with extremely long, slender snouts, such as Doberman pinschers. The final group, which had long toothrows and short, broad snouts, was made up of the prehistoric dogs. Statistically, the team’s ability to identify any individual specimen as belonging to the correct group was highly significant and accurate.
Using these clusters as reference categories, Germonpré and colleagues used a statistical technique (called discriminant function analysis) to assign 17 unknown fossil canid skulls to the established categories. Not all of the “unknowns” were truly unknown, however. Five were immature modern wolves that might have had different proportions because of their age, two were wolves that had been kept in captivity, and one was the Central Asian shepherd that didn’t cluster into any of the groups. Additional unknowns were 11 fossil skulls from sites in Belgium, the Ukraine and Russia, although two of these fossil skulls proved to be too incomplete to classify.
The technique correctly classified all of the immature wolves as wolves, but the two zoo wolves were classified as recent dogs with wolflike snouts. Five of the fossil skulls also fell easily into the modern wolf group; although two of these specimens fell into the region of measurements that overlapped with the group of recent dogs with wolflike snouts, they had a higher statistical probability of being wolves. One fossil skull fit directly into the group of recent dogs with wolflike snouts, even though this specimen was clearly ancient.
The remaining three fossil skulls—one from Goyet Cave in Belgium and one each from Mezin and Mezhirich in the Ukraine—resembled each other closely. All three were classified as prehistoric dogs with probabilities of 99 percent, 73 percent and 57 percent, respectively, as was the (modern) Central Asian shepherd, with a 64 percent probability. In addition, the Mezin skull was odd enough in appearance (for a wolf) that another researcher has suggested it might have been a captive wolf. Germonpré and her team were delighted with these results.
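For readers curious about the statistics, discriminant function analysis can be sketched in a few lines. The measurements below are invented toy values, not the team’s actual dataset; the point is only to show how a skull’s measurements yield a group assignment together with a membership probability, analogous to the percentage figures quoted above.

```python
# A minimal, illustrative sketch of discriminant function analysis using
# scikit-learn's linear discriminant classifier. All measurements are
# made-up toy values (mm), NOT the data from Germonpré et al. (2009).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy reference sample: [snout width, toothrow length] for two groups.
X = np.array([
    [38.0, 105.0], [40.0, 108.0], [37.0, 102.0],   # "recent wolf" cluster
    [46.0,  95.0], [48.0,  98.0], [45.0,  93.0],   # "prehistoric dog" cluster
])
y = ["wolf", "wolf", "wolf", "dog", "dog", "dog"]

lda = LinearDiscriminantAnalysis().fit(X, y)

# Classify an "unknown" fossil skull: the model reports the most likely
# group and a posterior probability of membership in each group.
unknown = np.array([[47.0, 96.0]])
print(lda.predict(unknown))
print(dict(zip(lda.classes_, lda.predict_proba(unknown)[0])))
```

In the real study the reference sample had many more measurements per skull and six clusters rather than two, but the logic is the same: fit discriminant axes on skulls of known group, then score each unknown against them.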
The group also successfully extracted mitochondrial DNA (mtDNA) from seven ancient canid bones from Goyet Cave and Trou des Nutons in Belgium. Rather than damage precious skulls, they sampled only bones in which wolves and dogs differ little, so they presumed all of those they sampled for mtDNA were wolves. From each sample, they sequenced a segment of the mtDNA that is highly variable in living wolves and dogs. Each fossil had a unique mtDNA sequence, or haplotype, in this region, which could not be matched with any known sequences for modern wolves (of which there are about 160) or modern dogs (of which more than 1,000 exist) stored in GenBank, a database of all publicly available nucleotide sequences.
“I was not so surprised at the rich genetic diversity of the fossil wolves,” says Germonpré, because there have been other studies with similar findings. Foxes and wolves underwent a severe bottleneck in population size at the end of the last Ice Age, and many genetic lineages went extinct at this time.
“But we were surprised at the antiquity of the Goyet dog,” Germonpré adds. “We expected it would probably be Magdalenian,” perhaps 18,000 to 10,000 years old. This outcome would fit with their results for the Mezin and Mezhirich skulls, which were found with Epigravettian artifacts roughly 14,000 to 10,000 years old. When the age of this specimen from Goyet was directly dated using accelerator mass spectrometry radiocarbon-dating techniques, the team found that it was not 18,000 years old, but almost twice as old as the next oldest dog, placing the Goyet dog in the Aurignacian period.
A Time of Change
The Goyet dog fossil shows that the domestication of the first animal was roughly contemporaneous with two fascinating developments in Europe.
Around this time, Europeans began producing objects that are recognizable as art. Some of the earliest known art objects from Europe include the remarkable cave paintings of Chauvet Cave in France, the oldest of which were made 32,900 ± 490 years ago. None of the hundreds of glorious Chauvet paintings show wolves. However, the cave preserves something even more haunting: the footprints of a human child about four-and-a-half feet tall, as well as many footprints of large canids and bears.
Around 33,000 years ago, humans began perforating teeth for use in decoration. Although canid teeth made up a very small percentage of the total fauna teeth available, they were used in a majority of the ornaments. Fangs from foxes and wolves appear to have been favorites. One example of a perforated wolf tooth (shown in two views at right) is from Abri Castanet in France and has been dated to 33,000 years ago. A strand of beads interspersed with fox teeth came from the Russian site of Sungir and has been dated to 24,000 years ago (left). There is no specific evidence that canid teeth were used in necklaces; the fox-teeth strand may have been a belt. Randall White
Michel-Alain Garcia of the Centre National de la Recherche Scientifique in Nanterre noticed in 1999 that one track of canid prints appears to accompany the child’s prints. These canid prints, unlike the others, have a shortened middle digit on the front paw: a characteristic of dogs. Garcia suggested that the child and dog might have explored the cave together. Charcoal from a torch the child carried is 26,000 years old.
The Upper Paleolithic cultures of Europe are famous for the flowering of all kinds of exquisite art: sculptures, carvings, paintings and engravings. Animals are common and readily recognizable subjects. Prehistoric art expert Paul Bahn notes that depictions of carnivores, including wolves or dogs, and of humans are rare. Bahn conjectures that portraying wolves and humans might have been taboo.
Anne Pike-Tay of Vassar College offers another perspective. She observes that the scarcity of artistic depictions of carnivores parallels their scarcity in the fossil faunas of the Upper Paleolithic. If domesticated dogs were helping humans hunt, she speculates that they might have been placed in a completely different symbolic category from other animals.
“What if dogs were put in the ‘human family’ category as an extension of the hunter, and like humans, warranted no (or very few) painted or engraved depictions?” she wonders.
The second development of the Aurignacian period is the appearance of objects of personal adornment: jewelry. Although beads and perforated objects occurred much earlier in Africa, the earliest such objects in Europe appeared about 40,000 years ago. At 33,000 years ago, early Aurignacian people began perforating animal teeth (and occasionally human teeth) to wear as pendants or other ornaments, such as belts.
Which teeth did they choose? Among their favorite sources are what have been identified as fangs of foxes and wolves. These identifications might better be termed “small or large canids,” because until now no one has considered the possibility that dogs might have been domesticated so long ago. Besides, identifying a single canid tooth specifically as dog or wolf would be difficult, if not impossible.
Randall White of New York University argues that Aurignacian and later people chose to wear objects that displayed their identity or membership in a certain group or clan. Like gang colors or a t-shirt that proclaims its wearer to be a fan of a particular band, ancient people wore things that made their allegiances clear.
Fossils have helped to establish a far earlier timeframe for dog domestication. A paleolithic canid skull from Goyet in Belgium, about 31,000 years old, has traits characteristic of a dog rather than a wolf (a). When compared to wolves from a similar era, one from Trou Ballu (b) and one from Trou des Nutons (c) in France, the Goyet dog has a relatively wider snout and larger carnassial teeth, and it also has a wider braincase. Elsevier Ltd.
White observes that the teeth Aurignacian people chose to wear were obviously not a random sample of the animals in the fauna. For example, the fauna from the Grotte des Hyènes (Cave of Hyenas) at Brassempouy, France, is dominated by horses, aurochs (a type of cattle) and reindeer—mostly as food remains that often show cutmarks or charring—as well as hyenas, which probably lived in the cave when humans did not. Wolves are rare, making up less than 3 percent of the total fauna. Of approximately 1,600 animal teeth at Brassempouy, only about 2 percent were modified for use as ornaments. However, nearly two-thirds of the ornaments are teeth of wolves or foxes. The rest of the perforated teeth are from other rare species: bear, humans and red deer. None of the teeth of the most common species were used as ornaments at Brassempouy.
Did someone who wore a perforated canid tooth 33,000 years ago proclaim him- or herself to be one of the group that domesticated dogs?
Possibly. Domesticating dogs was a remarkable human achievement that doubtless provided a definite selective advantage to those who accomplished it successfully. They might well have had reason to brag about their accomplishment by wearing canid teeth.
Bibliography
Germonpré, M., et al. 2009. Fossil dogs and wolves from Paleolithic sites in Belgium, the Ukraine and Russia: Osteometry, ancient DNA and stable isotopes. Journal of Archaeological Science 36:473–490.
Morey, D. F. 1994. The early evolution of the domestic dog. American Scientist 82:336–347.
Ostrander, E. A. 2007. Genetics and the shape of dogs. American Scientist 95:406–413.
Savolainen, P., et al. 2002. Genetic evidence for an East Asian origin of domestic dogs. Science 298:1610–1613.
Trut, L. N. 1999. Early canid domestication: The farm-fox experiment. American Scientist 87:160–169.
Vilà, C., et al. 1997. Multiple and ancient origins of the domestic dog. Science 276:1687–1689.
Like many people I had been aware of the hunt for this strange particle, the Higgs boson. Like many people as well, I suspect, I really didn’t comprehend what it was all about.
Then in The Economist print edition of July 7th, the newspaper’s primary story and leader were about the discovery of the Higgs announced on the 4th July. The leader, in particular, was both clear and compelling. I held my breath and asked for permission to republish that leader in Learning from Dogs.
Well, the good people from the relevant department at The Economist promptly gave written permission for their leader to be available here for a period of one year. Thanks, team!
oooOOOooo
The Higgs boson
Science’s great leap forward
After decades of searching, physicists have solved one of the mysteries of the universe
Jul 7th 2012 | from the print edition
HISTORICAL events recede in importance with every passing decade. Crises, political and financial, can be seen for the blips on the path of progress that they usually are. Even the horrors of war acquire a patina of unreality. The laws of physics, though, are eternal and universal. Elucidating them is one of the triumphs of mankind. And this week has seen just such a triumphant elucidation.
On July 4th physicists working in Geneva at CERN, the world’s biggest particle-physics laboratory, announced that they had found the Higgs boson. Broadly, particle physics is to the universe what DNA is to life: the hidden principle underlying so much else. Like the uncovering of DNA’s structure by Francis Crick and James Watson in 1953, the discovery of the Higgs makes sense of what would otherwise be incomprehensible. Its significance is massive. Literally. Without the Higgs there would be no mass. And without mass, there would be no stars, no planets and no atoms. And certainly no human beings. Indeed, there would be no history. Massless particles are doomed by Einstein’s theory of relativity to travel at the speed of light. That means, for them, that the past, the present and the future are the same thing.
Deus et CERN
Such power to affect the whole universe has led some to dub the Higgs “the God particle”. That, it is not. It does not explain creation itself. But it is nevertheless the most fundamental discovery in physics for decades.
Unlike the structure of DNA, which came as a surprise, the Higgs is a long-expected guest. It was predicted in 1964 by Peter Higgs, a British physicist who was trying to fix a niggle in quantum theory, and independently, in various guises, by five other researchers. And if the Higgs—or something similar—did not exist, then a lot of what physicists think they know about the universe would be wrong.
Physics has two working models of reality. One is Einstein’s general relativity, which deals with space, time and gravity. This is an elegant assembly of interlocking equations that poured out of a single mind a century ago. The other, known as the Standard Model, deals with everything else more messily.
The Standard Model, a product of many minds, incorporates the three fundamental forces that are not gravity (electromagnetism, and the strong and weak nuclear forces), and also a menagerie of apparently indivisible particles: quarks, of which protons and neutrons, and thus atomic nuclei, are made; electrons that orbit those nuclei; and more rarefied beasts such as muons and neutrinos. Without the Higgs, the maths which holds this edifice together would disintegrate.
Finding the Higgs, though, made looking for needles in haystacks seem simple. The discovery eventually came about using the Large Hadron Collider (LHC), a machine at CERN that sends bunches of protons round a ring 27km in circumference, in opposite directions, at close to the speed of light, so that they collide head on. The faster the protons are moving, the more energy they have. When they collide, this energy is converted into other particles (Einstein’s E=mc²), which then decay into yet more particles. What these decay particles are depends on what was created in the original collision, but unfortunately there is no unique pattern that shouts “Higgs!” The search, therefore, has been for small deviations from what would be seen if there were no Higgs. That is one reason it took so long.
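The energy-to-mass conversion the leader mentions can be made concrete with a back-of-envelope check. The figures used below (4 TeV per proton beam during the 2012 LHC runs, standard values for the physical constants) are public numbers, not taken from the article itself:

```python
# Back-of-envelope check of E = m c^2 for an LHC proton-proton collision.
# Beam energy (4 TeV per proton in 2012) and constants are public figures,
# not quoted in the article.

ELECTRON_VOLT = 1.602176634e-19   # joules per eV (exact in SI since 2019)
SPEED_OF_LIGHT = 2.99792458e8     # metres per second (exact)
PROTON_MASS_KG = 1.67262192e-27   # proton rest mass

beam_energy_ev = 4e12                                     # 4 TeV per proton
collision_energy_j = 2 * beam_energy_ev * ELECTRON_VOLT   # head-on: 8 TeV total

# Mass equivalent of the collision energy: m = E / c^2
mass_kg = collision_energy_j / SPEED_OF_LIGHT**2
print(f"8 TeV collision energy = {collision_energy_j:.3e} J "
      f"= {mass_kg:.3e} kg of mass-equivalent")

# That energy is several thousand times a proton's rest-mass energy,
# which is why a collision can spray out showers of new, heavier particles.
print(f"≈ {mass_kg / PROTON_MASS_KG:.0f} proton rest masses")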
Another was that no one knew how much the Higgs would weigh, and therefore how fast the protons needed to be travelling to make it. Finding the Higgs was thus a question of looking at lots of different energy levels, and ruling each out in turn until the seekers found what they were looking for.
Queerer than we can suppose?
For physicists, the Higgs is merely the LHC’s aperitif. They hope the machine will now produce other particles—ones that the Standard Model does not predict, and which might account for some strange stuff called “dark matter”.
Astronomers know dark matter abounds in the universe, but cannot yet explain it. Both theory and observation suggest that “normal” matter (the atom-making particles described by the Standard Model) is only about 4% of the total stuff of creation. Almost three-quarters of the universe is something completely obscure, dubbed “dark energy”. The rest, 22% or so, is matter of some sort, but a sort that can be detected only from its gravity. It forms a giant lattice that permeates space and controls the position of galaxies made of visible matter. It also stops those galaxies spinning themselves apart. Physicists hope that it is the product of one of the post-Standard Model theories they have dreamed up while waiting for the Higgs. Now, they will be able to find out.
For non-physicists, the importance of finding the Higgs belongs to the realm of understanding rather than utility. It adds to the sum of human knowledge—but it may never change lives as DNA or relativity have. Within 40 years, Einstein’s theories paved the way for the Manhattan Project and the scourge of nuclear weapons. The deciphering of DNA has led directly to many of the benefits of modern medicine and agriculture. The last really useful subatomic particle to be discovered, though, was the neutron in 1932. Particles found subsequently are too hard to make, and too short-lived to be useful.
This helps explain why, even at this moment of triumph, particle physics is a fragile endeavour. Gone are the days when physicists, having given politicians the atom bomb, strode confidently around the corridors of power. Today they are supplicants in a world where money is tight. The LHC, sustained by a consortium that was originally European but is now global, cost about $10 billion to build.
That is still a relatively small amount, though, to pay for knowing how things really work, and no form of science reaches deeper into reality than particle physics. As J.B.S. Haldane, a polymathic British scientist, once put it, the universe may be not only queerer than we suppose, but queerer than we can suppose. Yet given the chance, particle physicists will give it a run for its money.
Before signing off on this very important step forward for physics, here are a couple of footnotes.
First, here’s a video of the announcement that was widely shown on the 4th.
Secondly, the BBC News website had a really good piece on the 12th July written by their science correspondent, Quentin Cooper, called Higgs: What was left unsaid. Here’s a flavour, taken from the early part of the article,
So that’s it, search over, Higgs boson found. Almost 50 years after physicist Peter Higgs first theorised it was out there, public elementary number one has finally been captured in the data from two detectors at the Large Hadron Collider at Cern. Case closed. Champagne popped. Boson nova danced.
If only. That handily simplified and heavily fictionalised telling of the tale has helped transform a spectacular scientific success story into one that is also global front page news. Without it the 4 July announcement might not have generated such a frenzy of coverage and so many claims about it being a historic milestone for our species. One particle physicist only half jokingly told me that in future the date may come to be celebrated as Higgs Day, rather than anything to do with American independence.
Don’t get me wrong. What has happened at Cern represents a magnificent accomplishment; big science at its biggest and boldest. And it’s fantastic that it has been perceived and received as being of such importance. It’s just that there is more to the story from the very beginning right through to the, probably false, ending.
For starters, as Peter Higgs himself acknowledges, he was just one of several scientists who came up with the mechanism which predicted the particle which bears his name, but the others rarely get a mention*. As to the finish – well, as small children are fond of saying, are we there yet? There is very strong evidence that the LHC teams have found a new elementary particle, but while this is exciting it is far less clear that what they’ve detected is the fabled Higgs. If it is, it seems curiously lighter than expected and more work is needed to explain away the discrepancy. If it’s not, then the experimentalists and theorists are going to be even busier trying to see if it can be shoehorned into the current Standard Model of particle physics. Either way, it’s not exactly conclusive.
Do take the simple step of clicking here and read the BBC piece in full.
Well done, Mr. Peter Higgs and all those very persistent scientists associated with the Large Hadron Collider; I suspect we haven’t heard the last of this!
A guest post from Martin Lack points to the crux of the issue of denying man-caused climate change.
Introduction
I saw this post on Martin’s Blog Lack of Environment the day after I wrote a piece called In praise of fairness. In my piece I mentioned the sad case of Mr. Bob Diamond and Martin continued with the theme in such a manner that I wanted to republish his article in full. Here it is.
oooOOOooo
Are you negligent, incompetent or complicit?
This was a question posed to former Barclays CEO Bob Diamond this week, when he appeared before a Parliamentary Select Committee of MPs on Wednesday. It is a question that I would like to ask Dr Richard Lindzen… In fact, I have asked the question and – just as Bob Diamond did – he has refused to answer it… Here is the evidence on which you should decide for yourself:
Many readers will recall that, following my visit to London to hear Lindzen speak to a room full of fake sceptics in the Palace of Westminster on 22 February this year, I attempted to get some answers to questions. Unfortunately, I failed. I have been particularly frustrated by one thing; possibly the most misleading aspect of Lindzen’s entire presentation – a combination of graphs of recent atmospheric CO2 and temperature data that was mysteriously omitted from the PDF of the presentation that was initially posted on the Internet. Although Lindzen never answered any of my questions, he did insert this slide into the PDF of his presentation despite my pointing out to him – and to MIT and the AGU – that it was essentially meaningless (as the y-axes could be stretched to show either correlation or no correlation as preferred by the speaker).
Here is a screenshot of the misleading graph from the video of the presentation:
Steeply inclined Keeling curve versus apparently non-correlating temperature – if you stretched the temperature axis enough it would appear to correlate quite well. Therefore the slide neither proves nor disproves anything.
This bears more than a passing resemblance to the World Climate Widget – a very similar-looking combination of graphs (i.e. manipulated to suggest that there is no correlation between recent atmospheric CO2 and temperature data) – that can be downloaded as a widget from Anthony Watts’ Watts Up With That? (WUWT) misinformation blog.
If you go to the WUWT widget page, you will find the two graphs in both of these images (above and right) are presented separately. However, to prove my point – that anyone using these graphs to try and prove there is no correlation between long-term CO2 and temperature changes is misleading you – just look at what happens when you take the graph of University of Alabama at Huntsville (UAH) global lower atmosphere data as used by WUWT (i.e. cooler than surface temperature data) and stretch it:
Clearing the fog of data misrepresentation created by Lindzen et al. – Note the clear upward trend in the temperature graph on the left (it was there all the time).
Therefore, for anyone – including Lindzen – to try and use the original combination of graphs to suggest there is no correlation between CO2 and temperature, this suggests that they are either negligent, incompetent, or deliberately trying to mislead people. For many people who are not scientists to be fooled by this is understandable but, for a prominent scientist like Lindzen to make this mistake – and not apologise for doing so – is unforgivable. Furthermore, it would seem that, no matter how many times he is criticised, he just keeps repeating the same old mistakes: Skeptical Science: Lindzen and Choi 2011 – Party Like It’s 2009
It would appear that, despite the best efforts of the majority of prominent climate scientists, Lindzen’s London Illusions are still fooling a lot of people. If you follow that last link, it will take you to the website of what I prefer to call The Global Wonky Policy Foundation, where it is reported that only 43% of the British adult population felt able to agree with the following statement: “Global warming is a fact and is mostly caused by emissions from vehicles and industrial facilities”.
It has been suggested to me that this question is carefully phrased to deter people from saying “yes” (i.e. they might agree that warming is occurring and/or that humans are the primary cause; but they might not agree that vehicles and factories are the primary source of emissions). However, this is ‘clutching at straws’ in my opinion; and leaves me wondering what percentage of the population would feel able to agree with this statement:
“The sunrise is a fact and is mostly caused by the Earth not being flat and spinning once a day whilst orbiting the Sun”…?
oooOOOooo
I’m very grateful to Martin for allowing me to republish this.
An original idea that shouldn’t be regarded as innovative.
We live in interesting times! Whenever I use that phrase, and it seems to slip from my lips too often these days, I am reminded of the ancient Chinese curse, “May you live in interesting times!”
There are a goodly number of countries that have legislation that ‘impose’ a minimum wage for employees. Here in the USA, the Federal level for 2012 is $7.25 per hour but it isn’t necessarily the same across all States. Based on a 40-hour working week, 50 weeks a year, that comes to a gross of $14,500 for the full year.
Let’s contrast that with a person who has been in the news recently, Mr. Bob Diamond, Chief Executive of Barclays.
Mr Diamond has said he will not take a bonus for this year as a result of the scandal.
It is not the first time the 60-year-old Boston-born former academic – he began his career as a university lecturer – has made the headlines.
Mr Diamond was previously best-known for his huge wealth: last year he topped the list of the highest-paid chief executives in the FTSE 100.
‘Unacceptable face’
In 2011 Mr Diamond earned £20.9m, comprising salary, bonuses and share options, and he is reported to have a personal wealth of £105m.
There has long been controversy about the amount he earns.
In 2010, Lord Mandelson described him as the “unacceptable face of banking”, saying he had taken a £63m salary for “deal-making and shuffling paper around”.
Barclays dismissed the figure as “total fiction” saying that his salary as head of Barclays Capital was actually £250,000.
BBC business editor Robert Peston said he believed Mr Diamond had earned £6m in 2009 from a long-term incentive scheme and £27m from selling his stake in a Barclays-owned business that had been sold.
So whether he earns £20.9m, £6m or even £250,000 frankly makes no difference to the fact that the gap between what the poorest may earn and the sorts of monies that are given to Mr. Diamond and his like is just plain wrong. [And since writing this on Monday, the news broke on Tuesday morning that Mr. Diamond is now unemployed.] I don’t often quote the Bible in Learning from Dogs, but 2 Corinthians 8:13-15 is irresistible (New International Version),
Our desire is not that others might be relieved while you are hard pressed, but that there might be equality. At the present time your plenty will supply what they need, so that in turn their plenty will supply what you need. The goal is equality, as it is written: “The one who gathered much did not have too much, and the one who gathered little did not have too little.” [my emphasis]
I subscribe to Naked Capitalism and the other day there was a deeply interesting article about France pushing for a maximum wage. Let me take the liberty of quoting all of it,
SUNDAY, JULY 1, 2012
France Pushing for a Maximum Wage; Will Others Follow?
A reader pointed out a news item we missed, namely, that the new government in France is trying to implement a maximum wage for the employees of state-owned companies. From the Financial Times:
France’s new socialist government has launched a crackdown on excessive corporate pay by promising to slash the wages of chief executives at companies in which it owns a controlling stake, including EDF, the nuclear power group.
In a departure from the more boardroom-friendly approach of the previous right-of-centre administration, newly elected president François Hollande wants to cap the salary of company leaders at 20 times that of their lowest-paid worker.
According to Jean-Marc Ayrault, prime minister, the measure would be imposed on chief executives at groups such as EDF’s Henri Proglio and Luc Oursel at Areva, the nuclear engineering group. Their pay would fall about 70 per cent and 50 per cent respectively should the plan be cleared by lawyers and implemented in full…
France is unusual in that it still owns large stakes in many of its biggest global companies, ranging from GDF Suez, the gas utility; to Renault, the carmaker; and EADS, parent group of passenger jet maker Airbus.
Of course, in the US, we have companies feeding so heavily at the government trough that they hardly deserve the label of being private, but the idea that the public might legitimately have reason to want to rein in ever-rising executive pay is treated as a rabid radical idea.
For those, however, receiving bailouts, deposit insurance, government guarantees, tax breaks, tax credits, other forms of public financing, government contracts of any sort – and so on – the top paid person cannot receive more than twenty-five times the bottom paid person. This ratio, by the way, is what business visionary Peter Drucker recommended as most effective for organization performance as well as society. It also echoes Jim Collins who, in his book Good To Great, found that the most effective top leaders are paid more modestly than unsuccessful ones. And, critically, it is a ratio that is in line with various European and other nations that have dramatically lower income inequality than the United States.
In other words, the French proposal isn’t that big a change from existing norms, at least in most other advanced economies (ex the UK, which has also moved strongly in the direction of US top level pay). But despite the overwhelming evidence that corporate performance is if anything negatively correlated with CEO pay, the myth of the superstar CEO and the practical obstacles to shareholder intervention (too fragmented; too many built in protections for incumbent management, like staggered director terms; major free rider problems if any investor tries to discipline extractive CEO and C level pay, which means it’s easier to sell than protest) mean ideas like this are unlikely to get even a hearing in the US. Let the looting continue!
As Patrice Ayme commented on that Naked Capitalism article, “France will pass the 20 to 1 law, as the socialists control the entire state, senate, National Assembly, Regions, big cities, etc. Only the French Constitutional Court could stop it. That’s unlikely, why? Because one cannot have a minimum wage, without a maximum wage. It’s not a question of philosophy, but of mathematics.”
Let me go back and requote this,
…. the top paid person cannot receive more than twenty-five times the bottom paid person. This ratio, by the way, is what business visionary Peter Drucker recommended as most effective for organization performance as well as society. It also echoes Jim Collins who, in his book Good To Great, found that the most effective top leaders are paid more modestly than unsuccessful ones. And, critically, it is a ratio that is in line with various European and other nations that have dramatically lower income inequality than the United States.
Thus if society were to embrace this approach to fairness, the top paid person in the USA in 2012 would be on 25 times the minimum-wage income of $14,500 a year or, in other words, $362,500 a year.
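For anyone who wants to check the arithmetic behind those figures, here is a minimal sketch. The hourly rate, hours and weeks are the ones quoted above; the 25-to-1 ratio is the one attributed to Peter Drucker in the Naked Capitalism piece.

```python
# Sanity check of the wage arithmetic quoted in the post.
federal_min_wage = 7.25   # USD per hour, 2012 federal minimum
hours_per_week = 40
weeks_per_year = 50

annual_min = federal_min_wage * hours_per_week * weeks_per_year
print(annual_min)         # 14500.0 -- the gross annual minimum wage

max_wage = 25 * annual_min  # top pay capped at 25x the bottom-paid person
print(max_wage)             # 362500.0
```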
I’m not a raving liberal but I am bound to say that this sits pretty well with me. How about you?
As I opened, an original idea that shouldn’t be regarded as innovative.
The Declaration of Independence, CONGRESS, July 4, 1776.
The unanimous Declaration of the thirteen united States of America,
When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.
We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.–Such has been the patient sufferance of these Colonies; and such is now the necessity which constrains them to alter their former Systems of Government. The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States. To prove this, let Facts be submitted to a candid world.
He has refused his Assent to Laws, the most wholesome and necessary for the public good.
He has forbidden his Governors to pass Laws of immediate and pressing importance, unless suspended in their operation till his Assent should be obtained; and when so suspended, he has utterly neglected to attend to them.
He has refused to pass other Laws for the accommodation of large districts of people, unless those people would relinquish the right of Representation in the Legislature, a right inestimable to them and formidable to tyrants only.
He has called together legislative bodies at places unusual, uncomfortable, and distant from the depository of their public Records, for the sole purpose of fatiguing them into compliance with his measures.
He has dissolved Representative Houses repeatedly, for opposing with manly firmness his invasions on the rights of the people.
He has refused for a long time, after such dissolutions, to cause others to be elected; whereby the Legislative powers, incapable of Annihilation, have returned to the People at large for their exercise; the State remaining in the mean time exposed to all the dangers of invasion from without, and convulsions within.
He has endeavoured to prevent the population of these States; for that purpose obstructing the Laws for Naturalization of Foreigners; refusing to pass others to encourage their migrations hither, and raising the conditions of new Appropriations of Lands.
He has obstructed the Administration of Justice, by refusing his Assent to Laws for establishing Judiciary powers.
He has made Judges dependent on his Will alone, for the tenure of their offices, and the amount and payment of their salaries.
He has erected a multitude of New Offices, and sent hither swarms of Officers to harrass our people, and eat out their substance.
He has kept among us, in times of peace, Standing Armies without the Consent of our legislatures.
He has affected to render the Military independent of and superior to the Civil power.
He has combined with others to subject us to a jurisdiction foreign to our constitution, and unacknowledged by our laws; giving his Assent to their Acts of pretended Legislation:
For Quartering large bodies of armed troops among us:
For protecting them, by a mock Trial, from punishment for any Murders which they should commit on the Inhabitants of these States:
For cutting off our Trade with all parts of the world:
For imposing Taxes on us without our Consent:
For depriving us in many cases, of the benefits of Trial by Jury:
For transporting us beyond Seas to be tried for pretended offences
For abolishing the free System of English Laws in a neighbouring Province, establishing therein an Arbitrary government, and enlarging its Boundaries so as to render it at once an example and fit instrument for introducing the same absolute rule into these Colonies:
For taking away our Charters, abolishing our most valuable Laws, and altering fundamentally the Forms of our Governments:
For suspending our own Legislatures, and declaring themselves invested with power to legislate for us in all cases whatsoever.
He has abdicated Government here, by declaring us out of his Protection and waging War against us.
He has plundered our seas, ravaged our Coasts, burnt our towns, and destroyed the lives of our people.
He is at this time transporting large Armies of foreign Mercenaries to compleat the works of death, desolation and tyranny, already begun with circumstances of Cruelty & perfidy scarcely paralleled in the most barbarous ages, and totally unworthy the Head of a civilized nation.
He has constrained our fellow Citizens taken Captive on the high Seas to bear Arms against their Country, to become the executioners of their friends and Brethren, or to fall themselves by their Hands.
He has excited domestic insurrections amongst us, and has endeavoured to bring on the inhabitants of our frontiers, the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction of all ages, sexes and conditions.
In every stage of these Oppressions We have Petitioned for Redress in the most humble terms: Our repeated Petitions have been answered only by repeated injury. A Prince whose character is thus marked by every act which may define a Tyrant, is unfit to be the ruler of a free people.
Nor have We been wanting in attentions to our Brittish brethren. We have warned them from time to time of attempts by their legislature to extend an unwarrantable jurisdiction over us. We have reminded them of the circumstances of our emigration and settlement here. We have appealed to their native justice and magnanimity, and we have conjured them by the ties of our common kindred to disavow these usurpations, which, would inevitably interrupt our connections and correspondence. They too have been deaf to the voice of justice and of consanguinity. We must, therefore, acquiesce in the necessity, which denounces our Separation, and hold them, as we hold the rest of mankind, Enemies in War, in Peace Friends.
We, therefore, the Representatives of the united States of America, in General Congress, Assembled, appealing to the Supreme Judge of the world for the rectitude of our intentions, do, in the Name, and by Authority of the good People of these Colonies, solemnly publish and declare, That these United Colonies are, and of Right ought to be Free and Independent States; that they are Absolved from all Allegiance to the British Crown, and that all political connection between them and the State of Great Britain, is and ought to be totally dissolved; and that as Free and Independent States, they have full Power to levy War, conclude Peace, contract Alliances, establish Commerce, and to do all other Acts and Things which Independent States may of right do. And for the support of this Declaration, with a firm reliance on the protection of divine Providence, we mutually pledge to each other our Lives, our Fortunes and our sacred Honor.
The 56 signatures on the Declaration of Independence
It’s some time ago now that I read Lester Brown’s book World on the Edge but I still recall the effect it had on me. Namely, this is not some environmentalist’s ‘willy waving’ but something that has the potential to hurt, I mean HURT! Since the book was published the stream of information and evidence has turned into a flood of awareness that if we don’t change our ways soon then we, as in the vastness of human life, will go over the edge.
So it was a good reminder to come across a recent extract on the Earth Policy Institute website that is republished in full, as follows:
No previous civilization has survived the ongoing destruction of its natural supports. Nor will ours. Yet economists look at the future through a different lens. Relying heavily on economic data to measure progress, they see the near 10-fold growth in the world economy since 1950 and the associated gains in living standards as the crowning achievement of our modern civilization. During this period, income per person worldwide climbed nearly fourfold, boosting living standards to previously unimaginable levels. A century ago, annual growth in the world economy was measured in the billions of dollars. Today, it is measured in the trillions. In the eyes of mainstream economists, our present economic system has not only an illustrious past but also a promising future.
Mainstream economists see the 2008–09 global economic recession and near-collapse of the international financial system as a bump in the road, albeit an unusually big one, before a return to growth as usual. Projections of economic growth, whether by the World Bank, Goldman Sachs, or Deutsche Bank, typically show the global economy expanding by roughly 3 percent a year. At this rate the 2010 economy would easily double in size by 2035. With these projections, economic growth in the decades ahead is more or less an extrapolation of the growth of recent decades.
But natural scientists see that as the world economy expanded some 20-fold over the last century, it has revealed a flaw—a flaw so serious that if it is not corrected it will spell the end of civilization as we know it. At some point, what had been excessive local demands on environmental systems when the economy was small became global in scope.
A study by a team of scientists led by Mathis Wackernagel aggregates the use of the earth’s natural assets, including carbon dioxide overload in the atmosphere, into a single indicator—the ecological footprint. The authors concluded that humanity’s collective demands first surpassed the earth’s regenerative capacity around 1980. By 2007, global demands on the earth’s natural systems exceeded sustainable yields by 50 percent. Stated otherwise, it would take 1.5 Earths to sustain our current consumption. If we use environmental indicators to evaluate our situation, then the global decline of the economy’s natural support systems—the environmental decline that will lead to economic decline and social collapse—is well under way.
How did we get into this mess? Our market-based global economy as currently managed is in trouble. The market does many things well. It allocates resources with an efficiency that no central planner could even imagine, much less achieve.
However the market, which sets prices, is not telling us the truth. It is omitting indirect costs that in some cases now dwarf direct costs. Consider gasoline. Pumping oil, refining it into gasoline, and delivering the gas to U.S. service stations may cost, say, $3 per gallon. The indirect costs, including climate change, treatment of respiratory illnesses, oil spills, and the U.S. military presence in the Middle East to ensure access to the oil, total $12 per gallon. Similar calculations can be done for coal.
We delude ourselves with our accounting system. Leaving such huge costs off the books is a formula for bankruptcy. Environmental trends are the lead indicators telling us what lies ahead for the economy and ultimately for society itself. Falling water tables today signal rising food prices tomorrow. Shrinking polar ice sheets are a prelude to falling coastal real estate values.
Beyond this, mainstream economics pays little attention to the sustainable yield thresholds of the earth’s natural systems. Modern economic thinking and policymaking have created an economy that is so out of sync with the ecosystem on which it depends that it is approaching collapse. How can we assume that the growth of an economic system that is shrinking the earth’s forests, eroding its soils, depleting its aquifers, collapsing its fisheries, elevating its temperature, and melting its ice sheets can simply be projected into the long-term future? What is the intellectual process underpinning these extrapolations?
We are facing a situation in economics today similar to that in astronomy when Copernicus arrived on the scene, a time when it was believed that the sun revolved around the earth. Just as Copernicus had to formulate a new astronomical worldview after several decades of celestial observations and mathematical calculations, we too must formulate a new economic worldview based on several decades of environmental observations and analyses.
The archeological record indicates that civilizational collapse does not come suddenly out of the blue. Archeologists analyzing earlier civilizations talk about a decline-and-collapse scenario. Economic and social collapse was almost always preceded by a period of environmental decline.
For past civilizations it was sometimes a single environmental trend that was primarily responsible for their decline. Sometimes it was multiple trends. For Sumer, rising salt concentrations in the soil, as a result of an environmental flaw in the design of their otherwise extraordinary irrigation system, led to a decline in wheat yields. The Sumerians then shifted to barley, a more salt-tolerant crop. But eventually barley yields also began to decline. The collapse of the civilization followed.
For the Mayans, it was deforestation and soil erosion. As more and more land was cleared for farming to support the expanding empire, soil erosion undermined the productivity of their tropical soils. A team of scientists from the National Aeronautics and Space Administration has noted that the extensive land clearing by the Mayans likely also altered the regional climate, reducing rainfall. In effect, the scientists suggest, it was the convergence of several environmental trends, some reinforcing others, that led to the food shortages that brought down the Mayan civilization.
Although we live in a highly urbanized, technologically advanced society, we are as dependent on the earth’s natural support systems as the Sumerians and Mayans were. If we continue with business as usual, civilizational collapse is no longer a matter of whether but when. We now have an economy that is destroying its natural support systems, one that has put us on a decline and collapse path.
The reality of our situation may soon become clearer for mainstream economists as we begin to see some of the early economic effects of overconsuming the earth’s resources, such as rising world food prices. On the social front, the most disturbing trend is spreading hunger.
As rapid population growth continues, cropland becomes scarce, wells go dry, forests disappear, soils erode, unemployment rises, and hunger spreads. As environmental degradation and economic and social stresses mount, the more fragile governments are losing their capacity to govern. They become failing states—countries whose governments can no longer provide personal security, food security, or basic social services, such as education and health care. As the list of failing states grows longer each year, it raises a disturbing question: How many states must fail before our global civilization begins to unravel?
How much longer can we remain in the decline phase, whether measured in natural asset liquidation, spreading hunger, or failing states, before our global civilization begins to break down? We are dangerously close to the edge. Peter Goldmark, former Rockefeller Foundation president, puts it well: “The death of our civilization is no longer a theory or an academic possibility; it is the road we’re on.”
Adapted from World on the Edge by Lester R. Brown. The message is clear.
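Incidentally, the extract’s claim that roughly 3 percent annual growth would let the 2010 economy “easily double in size by 2035” is simple compound interest, and easy to verify for yourself:

```python
import math

growth = 0.03  # ~3% a year, the projection quoted in the extract

# Years needed for the economy to double at this rate
years_to_double = math.log(2) / math.log(1 + growth)
print(round(years_to_double, 1))  # ~23.4 years

# Growth factor over the 25 years from 2010 to 2035
factor = (1 + growth) ** (2035 - 2010)
print(round(factor, 2))  # ~2.09, i.e. slightly more than double
```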
But if you haven’t read the book it’s available online, for free! Just go here and not only will you find the link to the book but also links to other valuable materials.
Founder and President of the Earth Policy Institute, Lester Brown, speaks about his new book World on the Edge: How to Prevent Environmental and Economic Collapse. The issues, says Brown, are critical — and the big question is whether we can change direction before “we go over the edge.” Among his points: solar, wind and geothermal energy, with energy efficiency, can provide all the power we need, but a massive effort must be made now to fully shift to these clean, safe, renewable energy technologies. He strongly rejects nuclear power.
Before going to a recent BBC report about this important subject, let me offer a personal anecdote.
A couple of months ago I had cause to be seen by a neurologist. I wanted to get a professional opinion as to whether a degree of forgetfulness that I was experiencing was normal for a person of my age (68 next birthday). Dr. G. not only confirmed that there was absolutely no sign of dementia but that my forgetfulness was perfectly normal for someone of my age who had been through some major life changes in the last few years.
Dr. G. stressed (probably not the best word but you know what I mean!) that my worrying about forgetting stuff, and the resulting anxiety, was a self-feeding issue. I had to stop being anxious. Indeed, Dr. G. said the following (and this I haven’t forgotten!):
Anxiety is the killer of good bodies and the killer of good brains!
So with those words ringing in your ears, have a read of this recent report from the BBC News website.
Role of stress in dementia investigated
By Michelle Roberts, Health editor, BBC News online
UK experts are to begin a study to find out if stress can trigger dementia.
The investigation, funded by the Alzheimer’s Society, will monitor 140 people with mild cognitive impairment or “pre-dementia” and look at how stress affects their condition.
The researchers will take blood and saliva samples at six-monthly intervals over the 18 months of the study to measure biological markers of stress.
They hope their work will reveal ways to prevent dementia.
The results could offer clues to new treatments or better ways of managing the condition, they say.
Dementia triggers
People who have mild cognitive impairment are at an increased risk of going on to develop dementia – although some will remain stable and others may improve.
And past work suggests mid-life stress may increase a person’s risk of Alzheimer’s disease.
A Swedish study that followed nearly 1,500 women for a period of 35 years found the risk of dementia was about 65% higher in women who reported repeated periods of stress in middle age than in those who did not.
Scottish scientists, who have done studies in animals, believe the link may be down to hormones the body releases in response to stress, which interfere with brain function.
Prof Clive Holmes, from the University of Southampton, who will lead the study, said: “All of us go through stressful events. We are looking to understand how these may become a risk factor for the development of Alzheimer’s.
“Something such as bereavement or a traumatic experience – possibly even moving home – are also potential factors.
“This is the first stage in developing ways in which to intervene with psychological or drug-based treatments to fight the disease.
“We are looking at two aspects of stress relief – physical and psychological – and the body’s response to that experience.”
Dr Simon Ridley, of Alzheimer’s Research UK, said: “We welcome any research that could shed new light on Alzheimer’s disease and other causes of dementia.
“Understanding the risk factors for Alzheimer’s could provide one piece of the puzzle we need to take us closer to a treatment that could stop the disease in its tracks.”
The sad story of the death of Lonesome George, a giant tortoise.
When you’re gone, you’re gone, it is said. But in the case of this example of the beauty of Mother Nature, the idea of being gone is as final as it comes; George was the last of his species.
Staff at the Galapagos National Park in Ecuador say Lonesome George, a giant tortoise believed to be the last of its subspecies, has died.
Scientists estimate he was about 100 years old.
Park officials said they would carry out a post-mortem to determine the cause of his death.
With no offspring and no known individuals from his subspecies left, Lonesome George became known as the rarest creature in the world.
For decades, environmentalists unsuccessfully tried to get the Pinta Island tortoise to reproduce with females from a similar subspecies on the Galapagos Islands.
Park officials said the tortoise was found dead in his corral by his keeper of 40 years, Fausto Llerena.
While his exact age was not known, Lonesome George was estimated to be about 100, which made him a young adult as the subspecies can live up to an age of 200.
Lonesome George was first seen by a Hungarian scientist on the Galapagos island of Pinta in 1972.
Environmentalists had believed his subspecies (Chelonoidis nigra abingdoni) had become extinct.
Lonesome George became part of the Galapagos National Park breeding programme.
After 15 years of living with a female tortoise from the nearby Wolf volcano, Lonesome George did mate, but the eggs were infertile.
He also shared his corral with female tortoises from Espanola island, which are genetically closer to him than those from Wolf volcano, but Lonesome George failed to mate with them.
He became a symbol of the Galapagos Islands, which attract some 180,000 visitors a year.
Galapagos National Park officials said that with George’s death, the Pinta tortoise subspecies has become extinct.
They said his body would probably be embalmed to conserve him for future generations.
Tortoises were plentiful on the Galapagos islands until the late 19th century, but were later hunted for their meat by sailors and fishermen to the point of extinction.
Their habitat furthermore suffered when goats were introduced from the mainland.
The differences in appearance between tortoises from different Galapagos islands were among the features which helped the British naturalist Charles Darwin formulate his theory of evolution.
Some 20,000 giant tortoises of other subspecies still live on the Galapagos.
Continuing the tribute, Chris Mazzarella had some stunning photographs on his wonderful photographic blogsite Fast Forward. (Do take a look!) I held my breath and asked Chris for permission to republish his article and was delighted to be given his approval. Thanks Chris, thanks very much.
oooOOOooo
An Ode To George
To pay tribute to our late friend Lonesome George, I thought it would be appropriate to write a post in celebration of turtles. George was the last tortoise of the subspecies Chelonoidis nigra abingdoni from Pinta Island in the Galapagos. Sadly, George passed away yesterday at the tender age of 100 years. This could be considered middle-aged for a tortoise whose counterparts can live beyond 200 years.
This is easily the smallest one I’ve seen all year. To give you some perspective, this lily pad is about eight inches across.
In Vermont, we have seven species of turtles, and I run into many of them while kayaking around the state. The one I see most often is the painted turtle. I spot these guys by the dozen basking in the sun while I’m paddling throughout the northeast. They are very cooperative subjects, but will head for a swim if you get too close. I don’t like spoiling anyone’s sunbath so I do my best to keep a respectable distance out on the water.
You can check out the biggest turtle I’ve seen all year in an April post entitled Snappers.
I’ve read that snapping turtles are the most common turtle in Vermont, yet I do not see quite as many in my travels. When I do see them, they are usually trolling underwater, covered in algae.
I have encountered a few snappers above the surface this year. I found this old guy lounging on a log in Bradford, Vermont a few days ago.
One of the rarer species of turtle I encountered this spring was a wood turtle in Magalloway Brook. I didn’t have much time to prepare for this shot before he launched off the log and into the water. It was a brief meeting, but certainly a memorable one as this is the only wood turtle I’ve ever photographed.
While turtles are not known for their speed they do offer unique challenges for photographers, particularly when shooting in the sun. Their reflective carapace makes them easy to spot, but difficult to expose for. A polarizing filter is sometimes necessary to reduce the glare on their wet shells. While this will help, the ideal situation is to shoot them under overcast skies.
Another thing to keep in mind is the angle of your shot. The kayak makes a great vehicle for wildlife photography because it keeps you low on the water. I often try to shoot wildlife at eye level. This gives you the same perspective from which the animal views the world. It’s much more interesting than a bird’s eye view, for example, and imbues the subject with the sense of pride that it deserves.
George’s passing marks the end of an important legacy, as the Galapagos turtles played a very important role in the foundation of Darwin’s theory of Natural Selection.
Living in an American Age of Techno-Wonder and Unreason
Introductory note from yours truly!
It’s becoming a regular item in the agenda of Learning from Dogs to republish essays that appear on Tom Engelhardt’s blog, TomDispatch. But as the following was, in turn, a republication by Tom of a very interesting essay by Lewis Lapham, I took the precaution of asking Mr. Lapham’s office for permission to republish. The very prompt reply from Michelle Legro, Associate Editor of Lapham’s Quarterly, showed the standing in which Tom is held; she confirmed, “If you have permission from Tom Engelhardt to republish the piece, then it is fine with us.”
So it’s important that today I include Tom’s plea that headed up Lewis Lapham’s article, as follows:
[Note for TomDispatch readers: The 30,000 of you who get email notices whenever a new piece is posted, as well as the tens of thousands who bookmark TD or read its pieces reposted elsewhere, can support this site by encouraging new readers to sign on. TomDispatch spreads mainly thanks to word of mouth, a formidable force in the online world. For those of you already hooked, I urge you to lend the site a little more of that word-of-mouth power. I hope you’ll consider putting together a modest list of friends, colleagues, relatives, or, for that matter, people you like to argue with who might benefit from getting TomDispatch regularly. Urge them to go to the “subscribe” window to the right of the main screen, put in their e-mail addresses, hit “submit,” answer the confirmation letter that will quickly arrive in email boxes (or, fair warning, spam folders), and join the TD crew. Many thanks in advance for your efforts. They do matter! Tom]
So please do subscribe to Tom’s deeply interesting blog. The home page is here and the ‘subscribe’ window is slightly down on the right-hand side of the ‘home’ page. You will not be disappointed.
OK, now on to Nick Turse’s introduction to Lewis Lapham’s article.
It is said, Lewis Lapham tells us, that Abbot John Trithemius of Sponheim, a fifteenth-century scholar and mage, devised a set of incantations to carry “messages instantaneously… through the agency of the stars and planets who rule time.” In 1962, Lapham adds, Bell Labs “converted the thought into Telstar, the communications satellite relaying data, from earth to heaven and back to earth, in less than six-tenths of a second.” Magic had become science. Today, the Pentagon is picking up the centuries-old gauntlet, asking the brightest minds in academe — through its far-out research arm, the Defense Advanced Research Projects Agency or DARPA — to come up with a means for a 20-something-kid-cum-lieutenant or perhaps the military’s much-lauded “strategic corporal” to be wired into unprecedented amounts of information beamed down from the heavens above.
At some level, even the language of DARPA’s solicitation for its SeeMe program seems to conjure up the visions that danced in Trithemius’s head. Its goal, we are told, “is to provide useful on-demand imagery information directly to the lowest echelon warfighter in the field from a very low cost satellite constellation launched on a schedule that conforms to DoD [Department of Defense] operational tempos.” Those heavenly-sounding constellations are, however, tempered by the reality of what the Pentagon is really after.
Yesterday’s future of high-tech satellites that would allow our thoughts to slip “the surly bonds of Earth,” while connecting the far reaches of the planet and linking minds globally in ways even Trithemius couldn’t imagine, is now being exchanged for a low-bid, low-rent system of military satellites. These will be capable of allowing a kid just out of high school to more efficiently target a kid who probably never went to high school — all courtesy of a well-educated university scientist who never bothered to think of the implications of his tenure-producing, tax-payer-funded research. This can’t be what Trithemius had in mind. And yet, that’s where we’re at.
If the Pentagon has its way, SeeMe will eventually fill the skies with cheap, disposable “satellites at very low altitudes, networked to existing fielded communications systems and handheld platforms.” So much for “the high untrespassed sanctity of space.” But let Lewis Lapham explore further the borderlands of science and magic that have somehow been fused into the very center of our lives. The famed former editor of Harper’s Magazine now edits Lapham’s Quarterly, which, four times a year, brilliantly unites some of the most provocative and original voices in history around a single topic. (You can subscribe to it by clicking here.) TomDispatch thanks the editors of that journal for allowing us to offer an exclusive online first look at Lapham’s elegant history of unreason in this techno-age of ours. Nick Turse
oooOOOooo
Magic and the Machine Living in an American Age of Techno-Wonder and Unreason
By Lewis H. Lapham
[A longer version of this essay appears in “Magic Shows,” the Summer 2012 issue of Lapham’s Quarterly, and is posted at TomDispatch.com with the kind permission of that magazine.]
As between the natural and the supernatural, I’ve never been much good at drawing firm distinctions. I know myself to be orbiting the sun at the speed of 65,000 miles per hour, but I can’t shake free of the impression shared by Pope Urban VIII, who in 1633 informed Galileo that the earth doesn’t move. So also the desk over which I bend to write, seemingly a solid mass of wood but in point of fact a restless flux of atoms bubbling in a cauldron equivalent to the one attended by the witches in Macbeth.
Nor do I separate the reality from the virtual reality when conversing with the airy spirits in a cell phone, or while gazing into the wizard’s mirror of a television screen. What once was sorcery maybe now is science, but the wonders technological of which I find myself in full possession, among them indoor plumbing and electric light, I incline to regard as demonstrations magical.
This inclination apparently is what constitutes a proof of being human, a faculty like the possession of language that distinguishes man from insect, guinea hen, and ape. In the beginning was the word, and with it the powers of enchantment. I take my cue from Christopher Marlowe’s tragical drama Doctor Faustus because his dreams of “profit and delight,/Of power, of honor, of omnipotence,” are the stuff that America is made of, as was both the consequence to be expected and the consummation devoutly to be wished when America was formed in the alembic of the Elizabethan imagination. Marlowe was present at the creation, as were William Shakespeare, the navigators Martin Frobisher and Francis Drake, and the Lord Chancellor Francis Bacon envisioning a utopian New Atlantis on the coast of Virginia.
It was an age that delighted in the experiment with miracles, fiction emerging into fact on the far shores of the world’s oceans, fact eliding into fiction in the Globe Theatre on an embankment of the Thames. London toward the end of the sixteenth century served as the clearinghouse for the currencies of the new learning that during the prior 150 years had been gathering weight and value under the imprints of the Italian Renaissance and the Protestant Reformation in Germany. The Elizabethans had in hand the writings of Niccolò Machiavelli and Martin Luther as well as those of Ovid and Lucretius, maps drawn by Gerardus Mercator and Martin Waldseemüller, the observations of Nicolaus Copernicus, Johannes Kepler, Giordano Bruno, and Paracelsus.
The medieval world was dying an uneasy death, but magic remained an option, a direction, and a technology not yet rendered obsolete. Robert Burton, author of The Anatomy of Melancholy, found the air “not so full of flies in summer as it is at all times of invisible devils.” To the Puritan dissenters contemplating a departure to a new and better world the devils were all too visible in a land that “aboundeth with murders, slaughters, incests, adulteries, whoredom, drunkenness, oppression, and pride.”
Think Tanks of the Sixteenth and Twentieth Centuries
In both the skilled and unskilled mind, astronomy and astrology were still inseparable, as were chemistry and alchemy, and so it is no surprise to find Marlowe within the orbit of inquisitive “intelligencers” centered on the wealth and patronage of Henry Percy, “the Wizard Earl” of Northumberland, who attracted to his estate in Sussex the presence of Dr. John Dee, physician to Queen Elizabeth blessed with crystal showstones occupied by angels, as well as that of Walter Raleigh, court poet and venture capitalist outfitting a voyage to Guiana to retrieve the riches of El Dorado.
The earl had amassed a library of nearly 2,000 books and equipped a laboratory for his resident magi, chief among them Thomas Hariot, as an astronomer known for his improvement of the telescope (the “optic tube”), and as a mathematician for his compilation of logarithmic tables. As well versed in the science of the occult as he was practiced in the study of geography, Hariot appears in Charles Nicholl’s book The Reckoning as a likely model for Marlowe’s Faustus.
During the same month last spring in which I was reading Nicholl’s account of the Elizabethan think tank assembled by the Wizard Earl, I came across its twentieth-century analog in Jon Gertner’s The Idea Factory: Bell Labs and the Great Age of American Innovation. As in the sixteenth century, so again in the twentieth: a gathering of forces both natural and supernatural in search of something new under the sun.
The American Telephone and Telegraph Company undertook to research and develop the evolving means of telecommunication, and to that end it established an “institute of creative technology” on a 225-acre campus in Murray Hill, New Jersey, by 1942 recruiting nearly 9,000 magi of various description (engineers and chemists, metallurgists, and physicists) set to the task of turning sand into light, the light into gold.
All present were encouraged to learn and borrow from one another, to invent literally fantastic new materials to fit the trajectories of fanciful new hypotheses. Together with the manufacture of the laser and the transistor, the labs derived from Boolean algebra the binary code that allows computers to speak to themselves of more things in heaven and earth than were dreamed of in the philosophies of either Hamlet or Horatio.
Gertner attributes the epistemological shape-shifting to the mathematician Claude Shannon, who intuited the moving of “written and spoken exchanges ever deeper into the realm of ciphers, symbols, and electronically enhanced puzzles of representation” — i.e., toward the “lines, circles, scenes, letters, and characters” that Faustus most desired. The correspondence is exact, as is the one to be drawn from John Crowley’s essay, “A Well Without a Bottom,” that recalls the powers of the Abbot Trithemius of Sponheim, a fifteenth-century mage who devised a set of incantations “carrying messages instantaneously… through the agency of the stars and planets who rule time.” Bell Labs in 1962 converted the thought into Telstar, the communications satellite relaying data, from earth to heaven and back to earth, in less than six-tenths of a second.
Between the 1940s and the 1980s, Bell Labs produced so many wonders both military and civilian (the DEW line and the Nike missile as well as the first cellular phone) that AT&T’s senior management was hard put to correct the news media’s tendency to regard the Murray Hill estate as “a house of magic.” The scientists in residence took pains to discount the notion of rabbits being pulled from hats, insisting that the work in hand followed from a patient sequence of trial and error rather than from the silk-hatted magician Eisenheim’s summoning with cape and wand the illusions of “The Magic Kettle” and “The Mysterious Orange Tree” to theater stages in nineteenth-century Paris, London, and Berlin.
The disavowals fell on stony ground. Time passed; the wonders didn’t cease, and by 1973 Arthur C. Clarke, the science-fiction writer believed by his admirers to be the twentieth-century avatar of Shakespeare’s Prospero, had confirmed the truth apparent to both Ariel and Caliban: “Any sufficiently advanced technology is indistinguishable from magic.”
As chairman of the British Interplanetary Society during the 1950s, Clarke had postulated stationing a communications satellite 22,300 miles above the equator in what is now recognized by the International Astronomical Union as “The Clarke Orbit,” and in 1968 he had co-written the film script for 2001: A Space Odyssey. The opening sequence — during which an ape heaves into thin air a prehistoric bone that becomes a spaceship drifting among the stars — encompasses the spirit of an age that maybe once was Elizabethan but lately has come to be seen as a prefiguration of our own.
The New World’s Magical Beginnings (and Endings)
New philosophies call all in doubt, the more so as the accelerating rates of technological advance — celestial, terrestrial, and subliminal — overrun the frontiers between science, magic, and religion. The inventors of America’s liberties, their sensibilities born of the Enlightenment, understood the new world in America as an experiment with the volatile substance of freedom. Most of them were close students of the natural sciences: Thomas Paine an engineer, Benjamin Rush a physician and chemist, Roger Sherman an astronomer, Thomas Jefferson an architect and agronomist.
Intent upon enlarging the frame of human happiness and possibility, they pursued the joy of discovery in as many spheres of reference as could be crowded onto the shelves of a Philadelphia library or a Boston philosophical society. J. Hector St. John de Crèvecoeur, colonist arriving from France in 1755, writes in his Letters from an American Farmer to express gratitude for the spirit in which Benjamin Franklin’s invention of the lightning rod — “by what magic I know not” — was both given and received: “Would you believe that the great electrical discoveries of Mr. Franklin have not only preserved our barns and our houses from the fire of heaven but have even taught our wives to multiply their chickens?”
A similar approach to the uses of learning informed Jefferson’s best hopes for the new nation’s colleges and schools, and for the better part of the last two centuries it has underwritten the making of America into what the historian Henry Steele Commager named “the empire of reason.” An empire that astonishes the world with the magnificence of its scientific research laboratories, but one never safe from frequent uprisings in the rebel provinces of unreason.
Like England in the late sixteenth century, America in the early twenty-first has in hand a vast store of new learning, much of it seemingly miraculous — the lines and letters that weave the physics and the metaphysics into strands of DNA, Einstein’s equations, Planck’s constant and the Schwarzschild radius, the cloned sheep and artificial heart. America’s scientists come away from Stockholm nearly every year with a well-wrought wreath of Nobel prizes, and no week goes by without the unveiling of a new medical device or weapons system.
The record also suggests that the advancement of our new and marvelous knowledge has been accompanied by a broad and popular retreat into the wilderness of smoke and mirrors. The fear of new wonders technological — nuclear, biochemical, and genetic — gives rise to what John Donne presumably would have recognized as the uneasy reawakening of a medieval belief in magic.
We find our new Atlantis within the heavenly books of necromancy inscribed on walls of silicon and glass, the streaming data on an iPad or a television screen lending itself more readily to the traffic in spells and incantation than to the distribution of reasoned argument. The less that can be seen and understood of the genies escaping from their bottles at Goldman Sachs and MIT, the more headlong the rush into the various forms of wishful thinking that increasingly have become the stuff of which we make our politics and social networking, our news and entertainment, our foreign policy and gross domestic product.
How else to classify the Bush administration’s invasion of Iraq if not as an attempt at alchemy? At both the beginning and end of the effort to transform the whole of the Islamic Middle East into a democratic republic like the one pictured in the ads inviting tourists to Colonial Williamsburg, the White House and the Pentagon issued press releases in the voice of the evil angel counseling Faustus, “Be thou on earth as Jove is in the sky,/Lord and commander of these elements.”
Charles Krauthammer, neoconservative newspaper columnist and leading soloist in the jingo chorus of the self-glorifying news media, amplified the commandment for the readers of Time magazine in March 2001, pride going before the fall six months later of the World Trade Center: “America is in a position to reshape norms, alter expectations, and create new realities. How? By unapologetic and implacable demonstrations of will.”
So again four years later, after it had become apparent that Saddam Hussein’s weapons of mass destruction were made of the same stuff as Eisenheim’s projection of “The Vanishing Lady.” The trick had been seen for what it was, but Defense Secretary Donald Rumsfeld emerged from the cloud of deluded expectation, unapologetic and implacable, out of which he had spoken to the groundlings at a NATO press conference in 2002: “The message is that there are no ‘knowns.’ There are things we know that we know. There are known unknowns… but there are also unknown unknowns… The absence of evidence is not evidence of absence.”
“Perform What Desperate Enterprise I Will”
The Rumsfeldian message accounts not only for what was intended as a demonstration magical in Iraq, but also for the Obama administration’s current purpose in Afghanistan, which is to decorate a wilderness of tribal warfare with the potted plant of a civilized and law-abiding government that doesn’t exist. Choosing to believe in what isn’t there accords with the practice adopted on Wall Street that brought forth the collapse of the country’s real-estate and financial markets in 2008.
The magnitude of the losses measured the extent to which America assigns to the fiction of its currency the supernatural powers of a substance manufactured by a compensation committee of sixteenth-century alchemists. The debacle was not without precedent. Thomas Paine remarked on the uses of paper money (“horrid to see, and hurtful to recollect”) that made a mess of America’s finances during its War of Independence, “It is like putting an apparition in place of a man; it vanishes with looking at, and nothing remains but the air.”
Paine regarded the “emissions” of paper money as toxic, fouling the air with the diseases (vanity, covetousness, and pride) certain to destroy the morals of the country as well as its experiment with freedom. A report entitled “Scientific Integrity in Policy Making,” issued in February 2004 by the Union of Concerned Scientists, advanced Paine’s argument against what it diagnosed as the willed ignorance infecting the organism of the Bush administration.
Signed by more than 60 of the country’s most accomplished scientists honored for their work in many disciplines (molecular biology, superconductivity, particle physics, zoology), the report bore witness to their experience when called upon to present a federal agency or congressional committee with scientific data bearing on a question of the public health and welfare. Time and again in the 40-page report, the respondents mention the refusal on the part of their examiners to listen to, much less accept, any answers that didn’t fit with the administration’s prepaid and prerecorded political agenda.
Whether in regard to the lifespan of a bacteria or the trajectory of a cruise missile, ideological certainty overruled the objections raised by counsel on behalf of logic and deductive reasoning. On topics as various as climate change, military intelligence, and the course of the Missouri River, the reincarnations of Pope Urban VIII reaffirmed their conviction that if the science didn’t prove what it had been told to prove, then the science had been tampered with by Satan.
The report spoke to the disavowal of the principle on which the country was founded, but it didn’t attract much notice in the press or slow down the retreat into the provinces of unreason. The eight years that have passed since its publication have brought with them not only the illusion of “The Magic Kettle” on Wall Street, but also the election of President Barack Obama in the belief that he would enter the White House as the embodiment of Merlin or Christ.
To the extent that more people become more frightened of a future that calls all into doubt, they exchange the force of their own thought for the power they impute to supernatural machines. To wage the war against terror the Pentagon sends forth drones, robots, and surveillance cameras, hard-wired as were the spirits under the command of Faustus, “to fetch me what I please,/Resolve me of all ambiguities,/Perform what desperate enterprise I will.”
Wall Street clerks subcontract the placing of $100 billion bets to the judgment of computer databanks that stand as silent as the stones on Easter Island, while calculating at the speed of light the rates of exchange between the known unknowns and the unknown unknowns. By way of projecting a federal budget deficit into both the near and distant future, the season’s presidential candidates float cloud-capped towers of imaginary numbers destined to leave not a rack behind.
The American body politic meanwhile dissolves into impoverished constituencies of one, stripped of “profit and delight” in the realm of fact, but still sovereign in the land of make-believe. Every once and future king is possessed of a screen like the enchanted mirror that Lady Galadriel shows to Frodo Baggins in the garden at Caras Galadhon; the lost and wounded self adrift in a sea of troubles but equipped with the remote control that once was Prospero’s; blessed, as was the tragical Doctor Faustus, with instant access to the dreams “of power, of honor, of omnipotence.”
Lewis H. Lapham is editor of Lapham’s Quarterly. Formerly editor of Harper’s Magazine, he is the author of numerous books, including Money and Class in America, Theater of War, Gag Rule, and, most recently, Pretensions to Empire. The New York Times has likened him to H.L. Mencken; Vanity Fair has suggested a strong resemblance to Mark Twain; and Tom Wolfe has compared him to Montaigne. This essay, shortened for TomDispatch, introduces “Magic Shows,” the Summer 2012 issue of Lapham’s Quarterly.