A massive cull of pet cats and dogs in the UK during WWII.
Out of the blue the other day Margaret from Tasmania sent me an email.
I happened to come across this rather sad but interesting story.
Thought you might like to read it.
– Margaret (from Tasmania)
The email contained a link to this very sad information.
The little-told story of the massive WWII pet cull
By Alison Feeney-Hart
BBC News Magazine
12th October, 2013
At the beginning of World War II, a government pamphlet led to a massive cull of British pets. As many as 750,000 British pets were killed in just one week. This little-discussed moment of panic is explored in a new book.
The cull came as the result of a public information campaign that caused an extraordinary reaction among anxious Britons.
In the summer of 1939, just before the outbreak of war, the National Air Raid Precautions Animals Committee (NARPAC) was formed. It drafted a notice – Advice to Animal Owners.
The pamphlet said: “If at all possible, send or take your household animals into the country in advance of an emergency.” It concluded: “If you cannot place them in the care of neighbours, it really is kindest to have them destroyed.”
The advice was printed in almost every newspaper and announced on the BBC. It was “a national tragedy in the making”, says Clare Campbell, author of the new book Bonzo’s War: Animals Under Fire 1939-1945.
Campbell recalls a story about her uncle. “Shortly after the invasion of Poland, it was announced on the radio that there might be a shortage of food. My uncle announced that the family pet Paddy would have to be destroyed the next day.”
After war was declared on 3 September 1939, pet owners thronged to vets’ surgeries and animal homes.
“Animal charities, the PDSA, the RSPCA and vets were all opposed to the killing of pets and very concerned about people just dumping animals on their doorsteps at the start of the war,” says historian Hilda Kean.
Battersea Dogs and Cats Home opened its doors in 1860 and survived both wars. “Many people contacted us after the outbreak of war to ask us to euthanise their pets – either because they were going off to war, they were bombed, or they could no longer afford to keep them during rationing,” a spokesman says.
“Battersea actually advised against taking such drastic measures and our then manager Edward Healey-Tutt wrote to people asking them not to be too hasty.”
But Campbell cites an Arthur Moss of the RSPCA who “gloomily pronounced that the primary task for them all would be the destruction of animals”.
In the first few days of war, PDSA hospitals and dispensaries were overwhelmed by owners bringing their pets for destruction. PDSA founder Maria Dickin reported: “Our technical officers called upon to perform this unhappy duty will never forget the tragedy of those days.”
In Memoriam notices started to appear in the press. “Happy memories of Iola, sweet faithful friend, given sleep September 4th 1939, to be saved suffering during the war. A short but happy life – 2 years, 12 weeks. Forgive us little pal,” said one in Tail-Wagger Magazine.
The first bombing of London in September 1940 prompted more pet owners to rush to have their pets destroyed.
Many people panicked, but others tried to restore calm. “Putting your pets to sleep is a very tragic decision. Do not take it before it is absolutely necessary,” urged Susan Day in the Daily Mirror.
But the government pamphlet had sowed a powerful seed.
“People were basically told to kill their pets and they did. They killed 750,000 of them in the space of a week – it was a real tragedy, a complete disaster,” says Christy Campbell, who helped write Bonzo’s War.
Historian Hilda Kean says that it was just another way of signifying that war had begun. “It was one of the things people had to do when the news came – evacuate the children, put up the blackout curtains, kill the cat.”
It was the lack of food, not bombs, that posed the biggest threat to wartime pets. There was no food ration for cats and dogs.
But many owners were able to make do. Pauline Caton was just five years old at the time and lived in Dagenham. She remembers “queuing up with the family at Blacks Market in Barking to buy horsemeat to feed the family cat”.
And even though there were just four staff at Battersea, the home managed to feed and care for 145,000 dogs during the course of the war.
In the middle of the pet-culling mayhem, some people tried desperately to intervene. The Duchess of Hamilton – both wealthy and a cat lover – rushed from Scotland to London with her own statement to be broadcast on the BBC. “Homes in the country urgently required for those dogs and cats which must otherwise be left behind to starve to death or be shot.”
“Being a duchess she had a bit of money and established an animal sanctuary,” says historian Kean. The “sanctuary” was a heated aerodrome in Ferne. The duchess sent her staff out to rescue pets from the East End of London. Hundreds and hundreds of animals were taken back initially to her home in St John’s Wood. She apologised to the neighbours who complained about the barking.
But at a time of such uncertainty, many pet owners were swayed by the worst-case scenario.
“People were worried about the threat of bombing and food shortages, and felt it inappropriate to have the ‘luxury’ of a pet during wartime,” explains Pip Dodd, senior curator at the National Army Museum.
“The Royal Army Veterinary Corps and the RSPCA tried to stop this, particularly as dogs were needed for the war effort.”
Ultimately, given the unimaginable human suffering that followed over the six years of the war, it is perhaps understandable that the extraordinary cull of pets is not better known.
But the episode brought another sadness to people panicked and fearful at the start of hostilities.
The story is not more widely known because it was a difficult story to tell, says Kean.
“It isn’t well known that so many pets were killed because it isn’t a nice story, it doesn’t fit with this notion of us as a nation of animal lovers. People don’t like to remember that at the first sign of war we went out to kill the pussycat,” she says.
It’s very difficult to make one’s mind up. As was written, there were no food ration cards for pets.
But at the same time this huge pet cull was too much, too soon.
As was written: “The story is not more widely known because it was a difficult story to tell, says (Hilda) Kean.
“‘It isn’t well known that so many pets were killed because it isn’t a nice story, it doesn’t fit with this notion of us as a nation of animal lovers. People don’t like to remember that at the first sign of war we went out to kill the pussycat,’ she says.”
It was a most interesting link albeit a very sad one.
People just seem to love snub-nosed dogs. From bulldogs and pugs to Boston terriers and Cavalier King Charles spaniels, these flat-faced breeds are regulars at the dog parks and stars on social media.
According to the American Kennel Club, French bulldogs and bulldogs are the fourth and fifth most popular breeds in the U.S. (following only Labrador retrievers, German shepherds and golden retrievers). Their faces are just so photogenic and cute.
Breeds with broad, short skulls are called brachycephalic. They have flat faces and large, wide-set eyes that give them somewhat of a baby-like appearance. As common as these breeds are in public, they’re also regular patients at the veterinarian’s office because they’re more likely to have an array of health conditions, often because of breathing problems called brachycephalic syndrome. A survey of five years of Australian pet health insurance claims found that the average annual veterinary bill for a British bulldog was $965 compared to $445 for a mixed breed.
Here are some of the medical problems that come along with those photogenic faces.
Heat and summer
Dogs with short snouts are at a higher risk of heat-related issues because their anatomy makes it harder for them to breathe easily, especially in heat and humidity. Make sure to have plenty of water on hand, keep pets in the shade and, ideally, indoors during the hottest hours of the day.
Pugs and other brachycephalic breeds often make snoring, wheezing noises. (Photo: fongleon356/Shutterstock)
Narrowed nostrils and elongation of the soft palate in snub-nosed dogs obstruct the passage of air through the nose and throat. That’s why these dogs often seem to be making snoring, wheezing or snorting noises. It’s a good idea to have your vet closely monitor what’s going on, to make sure the noises don’t change and there isn’t an obstruction.
With their big, wide-set eyes, brachycephalic breeds are more likely to develop certain ophthalmologic issues. Because they have a shallow eye socket that gives them the “bulging eyes” look, many of these dogs can’t always fully blink. This can lead to dry corneas and corneal ulcers, according to The Kennel Club. Their unusual eye and eyelid anatomy also makes them more likely to have conjunctivitis and eye injuries.
Along with breathing problems, flat-faced dogs are also more likely to have skin problems, according to an American Veterinary Medical Association (AVMA) analysis of pet insurance claims. That’s because these dogs often have deep skin folds and wrinkles, which make them more prone to fungal skin disease, allergic dermatitis, ear infections and pyoderma (a skin disease marked by painful pustules).
What are the brachycephalic breeds?
Not sure if that smushy-faced pup is one to worry about? Nationwide Pet Insurance identifies two dozen breeds that fall under the brachycephalic breed description:
Bulldog (Olde English)
Cavalier King Charles spaniel
Dogue de Bordeaux
Olde English bulldog
There are many more brachycephalic breeds than I realised. This was an important article, methinks. Many, many readers of this place will have one.
A deeply fascinating essay from an individual at the University of Oxford.
I have long read the daily output from The Conversation. It’s a very useful way of keeping one’s brain cells functioning in some sort of fashion.
Yesterday morning I read an essay put out by Thomas Moynihan, a PhD Candidate at the University of Oxford.
It was fascinating and I am republishing it here.
Now it’s not for everyone. It is long and has a number of videos to watch. And there’s not a dog mentioned!
But if you are interested in where we, as in human beings, are ‘going’, so to speak, then this is for you.
And I’m ready to admit that it may be an age thing; something that is of much interest to me because I shall be 75 in November and one naturally wonders about the end of life. Both individually and of society!
The end of the world: a history of how a silent cosmos led humans to fear the worst.
It is 1950 and a group of scientists are walking to lunch against the majestic backdrop of the Rocky Mountains. They are about to have a conversation that will become scientific legend. The scientists are at the Los Alamos Ranch School, the site for the Manhattan Project, where each of the group has lately played their part in ushering in the atomic age.
They are laughing about a recent cartoon in the New Yorker offering an unlikely explanation for a slew of missing public trash cans across New York City. The cartoon had depicted “little green men” (complete with antenna and guileless smiles) having stolen the bins, assiduously unloading them from their flying saucer.
By the time the party of nuclear scientists sits down to lunch, within the mess hall of a grand log cabin, one of their number turns the conversation to matters more serious. “Where, then, is everybody?”, he asks. They all know that he is talking – sincerely – about extraterrestrials.
Bin-stealing UFOs notwithstanding, humanity still hasn’t found any evidence of intelligent activity among the stars. Not a single feat of “astro-engineering”, no visible superstructures, not one space-faring empire, not even a radio transmission. It has been argued that the eerie silence from the sky above may well tell us something ominous about the future course of our own civilisation.
Such fears are ramping up. Last year, the astrophysicist Adam Frank implored an audience at Google that we see climate change – and the newly baptised geological age of the Anthropocene – against this cosmological backdrop. The Anthropocene refers to the effects of humanity’s energy-intensive activities upon Earth. Could it be that we do not see evidence of space-faring galactic civilisations because, due to resource exhaustion and subsequent climate collapse, none of them ever get that far? If so, why should we be any different?
A few months after Frank’s talk, in October 2018, the Intergovernmental Panel on Climate Change’s update on global warming caused a stir. It predicted a sombre future if we do not decarbonise. And in May, amid Extinction Rebellion’s protests, a new climate report upped the ante, warning: “Human life on earth may be on the way to extinction.”
Meanwhile, NASA has been publishing press releases about an asteroid set to hit New York within a month. This is, of course, a dress rehearsal: part of a “stress test” designed to simulate responses to such a catastrophe. NASA is obviously fairly worried by the prospect of such a disaster event – such simulations are costly.
Space-tech entrepreneur Elon Musk has also been relaying his fears about artificial intelligence to YouTube audiences of tens of millions. He and others worry that the ability for AI systems to rewrite and self-improve themselves may trigger a sudden runaway process, or “intelligence explosion”, that will leave us far behind – an artificial superintelligence need not even be intentionally malicious in order to accidentally wipe us out.
In 2015, Musk donated to Oxford’s Future of Humanity Institute, headed up by transhumanist Nick Bostrom. Nestled within the university’s medieval spires, Bostrom’s institute scrutinises the long-term fate of humanity and the perils we face at a truly cosmic scale, examining the risks of things such as climate, asteroids and AI. It also looks into less well-publicised issues. Universe destroying physics experiments, gamma-ray bursts, planet-consuming nanotechnology and exploding supernovae have all come under its gaze.
So it would seem that humanity is becoming more and more concerned with portents of human extinction. As a global community, we are increasingly conversant with increasingly severe futures. Something is in the air.
But this tendency is not actually exclusive to the post-atomic age: our growing concern about extinction has a history. We have been becoming more and more worried for our future for quite some time now. My PhD research tells the story of how this began. No one has yet told this story, yet I feel it is an important one for our present moment.
I wanted to find out how current projects, such as the Future of Humanity Institute, emerge as offshoots and continuations of an ongoing project of “enlightenment” that we first set ourselves over two centuries ago. Recalling how we first came to care for our future helps reaffirm why we should continue to care today.
Extinction, 200 years ago
In 1816, something was also in the air. It was a 100-megaton sulfate aerosol layer. Girdling the planet, it was made up of material thrown into the stratosphere by the eruption of Mount Tambora, in Indonesia, the previous year. It was one of the biggest volcanic eruptions since civilisation emerged during the Holocene.
Almost blotting out the sun, Tambora’s fallout caused a global cascade of harvest collapse, mass famine, cholera outbreak and geopolitical instability. And it also provoked the first popular fictional depictions of human extinction. These came from a troupe of writers including Lord Byron, Mary Shelley and Percy Shelley.
The group had been holidaying together in Switzerland when titanic thunderstorms, caused by Tambora’s climate perturbations, trapped them inside their villa. Here they discussed humanity’s long-term prospects.
Clearly inspired by these conversations and by 1816’s hellish weather, Byron immediately set to work on a poem entitled “Darkness”. It imagines what would happen if our sun died:
I had a dream, which was not all a dream
The bright sun was extinguish’d, and the stars
Did wander darkling in the eternal space
Rayless, and pathless, and the icy earth
Swung blind and blackening in the moonless air
Detailing the ensuing sterilisation of our biosphere, it caused a stir. And almost 150 years later, against the backdrop of escalating Cold War tensions, the Bulletin of the Atomic Scientists again called upon Byron’s poem to illustrate the severity of nuclear winter.
Two years later, Mary Shelley’s Frankenstein (perhaps the first book on synthetic biology) refers to the potential for the lab-born monster to outbreed and exterminate Homo sapiens as a competing species. By 1826, Mary went on to publish The Last Man. This was the first full-length novel on human extinction, depicted here at the hands of a pandemic pathogen.
Beyond these speculative fictions, other writers and thinkers had already discussed such threats. Samuel Taylor Coleridge, in 1811, daydreamed in his private notebooks about our planet being “scorched by a close comet and still rolling on – cities men-less, channels riverless, five mile deep”. In 1798, Mary Shelley’s father, the political thinker William Godwin, queried whether our species would “continue forever”?
While just a few years earlier, Immanuel Kant had pessimistically proclaimed that global peace may be achieved “only in the vast graveyard of the human race”. He would, soon after, worry about a descendent offshoot of humanity becoming more intelligent and pushing us aside.
Earlier still, in 1754, philosopher David Hume had declared that “man, equally with every animal and vegetable, will partake” in extinction. Godwin noted that “some of the profoundest enquirers” had lately become concerned with “the extinction of our species”.
In 1816, against the backdrop of Tambora’s glowering skies, a newspaper article drew attention to this growing murmur. It listed numerous extinction threats. From global refrigeration to rising oceans to planetary conflagration, it spotlighted the new scientific concern for human extinction. The “probability of such a disaster is daily increasing”, the article glibly noted. Not without chagrin, it closed by stating: “Here, then, is a very rational end of the world!”
Before this, we thought the universe was busy
So if people first started worrying about human extinction in the 18th century, where was the notion beforehand? There is enough apocalypse in scripture to last until judgement day, surely. But extinction has nothing to do with apocalypse. The two ideas are utterly different, even contradictory.
For a start, apocalyptic prophecies are designed to reveal the ultimate moral meaning of things. It’s in the name: apocalypse means revelation. Extinction, by direct contrast, reveals precisely nothing and this is because it instead predicts the end of meaning and morality itself – if there are no humans, there is nothing humanly meaningful left.
And this is precisely why extinction matters. Judgement day allows us to feel comfortable knowing that, in the end, the universe is ultimately in tune with what we call “justice”. Nothing was ever truly at stake. On the other hand, extinction alerts us to the fact that everything we hold dear has always been in jeopardy. In other words, everything is at stake.
Extinction was not much discussed before 1700 due to a background assumption, widespread prior to the Enlightenment, that it is the nature of the cosmos to be as full of moral value and worth as possible. This, in turn, led people to assume that all other planets are populated with “living and thinking beings” exactly like us.
Although it only became a truly widely accepted fact after Copernicus and Kepler in the 16th and 17th centuries, the idea of plural worlds certainly dates back to antiquity, with intellectuals from Epicurus to Nicholas of Cusa proposing them to be inhabited with lifeforms similar to our own. And, in a cosmos that is infinitely populated with humanoid beings, such beings – and their values – can never fully go extinct.
Galileo confidently declared that an entirely uninhabited or unpopulated world is “naturally impossible” on account of it being “morally unjustifiable”. Gottfried Leibniz later pronounced that there simply cannot be anything entirely “fallow, sterile, or dead in the universe”.
Along the same lines, the trailblazing scientist Edmond Halley (after whom the famous comet is named) reasoned in 1692 that the interior of our planet must likewise be “inhabited”. It would be “unjust” for any part of nature to be left “unoccupied” by moral beings, he argued.
Around the same time, Halley provided the first theory on a “mass extinction event”. He speculated that comets had previously wiped out entire “worlds” of species. Nonetheless, he also maintained that, after each previous cataclysm, “human civilisation had reliably re-emerged”. And it would do so again. Only this, he said, could make such an event morally justifiable.
Later, in the 1760s, the philosopher Denis Diderot was attending a dinner party when he was asked whether humans would go extinct. He answered “yes”, but immediately qualified this by saying that after several millions of years the “biped animal who carries the name man” would inevitably re-evolve.
This is what the contemporary planetary scientist Charles Lineweaver identifies as the “Planet of the Apes Hypothesis”. This refers to the misguided presumption that “human-like intelligence” is a recurrent feature of cosmic evolution: that alien biospheres will reliably produce beings like us. This is what is behind the wrong-headed assumption that, should we be wiped out today, something like us will inevitably return tomorrow.
Back in Diderot’s time, this assumption was pretty much the only game in town. It was why one British astronomer wrote, in 1750, that the destruction of our planet would matter as little as “Birth-Days or Mortalities” do down on Earth.
This was typical thinking at the time. Within the prevailing worldview of eternally returning humanoids throughout an infinitely populated universe, there was simply no pressure or need to care for the future. Human extinction simply couldn’t matter. It was trivialised to the point of being unthinkable.
For the same reasons, the idea of the “future” was also missing. People simply didn’t care about it in the way we do now. Without the urgency of a future riddled with risk, there was no motivation to be interested in it, let alone attempt to predict and preempt it.
It was the dismantling of such dogmas, beginning in the 1700s and ramping up in the 1800s, that set the stage for the enunciation of Fermi’s Paradox in the 1900s and leads to our growing appreciation for our cosmic precariousness today.
But then we realised the skies are silent
In order to truly care about our mutable position down here, we first had to notice that the cosmic skies above us are crushingly silent. Slowly at first, though soon after gaining momentum, this realisation began to take hold around the same time that Diderot had his dinner party.
One of the first examples of a different mode of thinking I’ve found is from 1750, when the French polymath Claude-Nicholas Le Cat wrote a history of the earth. Like Halley, he posited the now familiar cycles of “ruin and renovation”. Unlike Halley, he was conspicuously unclear as to whether humans would return after the next cataclysm. A shocked reviewer picked up on this, demanding to know whether “Earth shall be re-peopled with new inhabitants”. In reply, the author facetiously asserted that our fossil remains would “gratify the curiosity of the new inhabitants of the new world, if there be any”. The cycle of eternally returning humanoids was unwinding.
In line with this, the French encyclopaedist Baron d’Holbach ridiculed the “conjecture that other planets, like our own, are inhabited by beings resembling ourselves”. He noted that precisely this dogma – and the related belief that the cosmos is inherently full of moral value – had long obstructed appreciation that the human species could permanently “disappear” from existence. By 1830, the German philosopher F W J Schelling declared it utterly naive to go on presuming “that humanoid beings are found everywhere and are the ultimate end”.
And so, where Galileo had once spurned the idea of a dead world, the German astronomer Wilhelm Olbers proposed in 1802 that the Mars-Jupiter asteroid belt in fact constitutes the ruins of a shattered planet. Troubled by this, Godwin noted that this would mean that the creator had allowed part of “his creation” to become irremediably “unoccupied”. But scientists were soon computing the precise explosive force needed to crack a planet – assigning cold numbers where moral intuitions once prevailed. Olbers calculated a precise timeframe within which to expect such an event befalling Earth. Poets began writing of “bursten worlds”.
The cosmic fragility of life was becoming undeniable. If Earth happened to drift away from the sun, one 1780s Parisian diarist imagined that interstellar coldness would “annihilate the human race, and the earth rambling in the void space, would exhibit a barren, depopulated aspect”. Soon after, the Italian pessimist Giacomo Leopardi envisioned the same scenario. He said that, shorn of the sun’s radiance, humanity would “all die in the dark, frozen like pieces of rock crystal”.
Galileo’s inorganic world was now a chilling possibility. Life, finally, had become cosmically delicate. Ironically, this appreciation came not from scouring the skies above but from probing the ground below. Early geologists, during the later 1700s, realised that Earth has its own history and that organic life has not always been part of it. Biology hasn’t even been a permanent fixture down here on Earth – why should it be one elsewhere? Coupled with growing scientific proof that many species had previously become extinct, this slowly transformed our view of the cosmological position of life as the 19th century dawned.
Seeing death in the stars
And so, where people like Diderot looked up into the cosmos in the 1750s and saw a teeming petri dish of humanoids, writers such as Thomas de Quincey were, by 1854, gazing upon the Orion nebula and reporting that they saw only a gigantic inorganic “skull” and its lightyear-long rictus grin.
The astronomer William Herschel had, already in 1814, realised that looking out into the galaxy one is looking into a “kind of chronometer”. Fermi would spell it out a century after de Quincey, but people were already intuiting the basic notion: looking out into dead space, we may just be looking into our own future.
People were becoming aware that the appearance of intelligent activity on Earth should not be taken for granted. They began to see that it is something distinct – something that stands out against the silent depths of space. Only through realising that what we consider valuable is not the cosmological baseline did we come to grasp that such values are not necessarily part of the natural world. Realising this was also realising that they are entirely our own responsibility. And this, in turn, summoned us to the modern projects of prediction, preemption and strategising. It is how we came to care about our future.
As soon as people first started discussing human extinction, possible preventative measures were suggested. Bostrom now refers to this as “macrostrategy”. However, as early as the 1720s, the French diplomat Benoît de Maillet was suggesting gigantic feats of geoengineering that could be leveraged to buffer against climate collapse. The notion of humanity as a geological force has been around ever since we started thinking about the long-term – it is only recently that scientists have accepted this and given it a name: “Anthropocene”.
Will technology save us?
It wasn’t long before authors began conjuring up highly technologically advanced futures aimed at protecting against existential threat. The eccentric Russian futurologist Vladimir Odoevskii, writing in the 1830s and 1840s, imagined humanity engineering the global climate and installing gigantic machines to “repulse” comets and other threats, for example. Yet Odoevskii was also keenly aware that with self-responsibility comes risk: the risk of abortive failure. Accordingly, he was also the very first author to propose the possibility that humanity might destroy itself with its own technology.
Acknowledgement of this possibility, however, is not necessarily an invitation to despair, then as now. It simply demonstrates appreciation of the fact that, ever since we realised that the universe is not teeming with humans, we have come to appreciate that the fate of humanity lies in our hands. We may yet prove unfit for this task, but – then as now – we cannot rest assured believing that humans, or something like us, will inevitably reappear – here or elsewhere.
Beginning in the late 1700s, appreciation of this has snowballed into our ongoing tendency to be swept up by concern for the deep future. Current initiatives, such as Bostrom’s Future of Humanity Institute, can be seen as emerging from this broad and edifying historical sweep. From ongoing demands for climate justice to dreams of space colonisation, all are continuations and offshoots of a tenacious task that we first began to set for ourselves two centuries ago during the Enlightenment when we first realised that, in an otherwise silent universe, we are responsible for the entire fate of human value.
It may be solemn, but becoming concerned for humanity’s extinction is nothing other than realising one’s obligation to strive for unceasing self-betterment. Indeed, ever since the Enlightenment, we have progressively realised that we must think and act ever better because, should we not, we may never think or act again. And that seems – to me at least – like a very rational end of the world.
I hope you have read it all. There’s much to engage one. And the message to me is very clear: We have to regard this race, correction: our race, as unique. As is put in the penultimate paragraph:
“Enlightenment when we first realised that, in an otherwise silent universe, we are responsible for the entire fate of human value.”
Now there’s a thought for an atheist on a Saturday morning!
The more we rely on technology to make us efficient, the fewer skills we have to confront the unexpected, says writer and entrepreneur Margaret Heffernan. She shares why we need less tech and more messy human skills — imagination, humility, bravery — to solve problems in business, government and life in an unpredictable age. “We are brave enough to invent things we’ve never seen before,” she says. “We can make any future we choose.”
Later on it explains: “The former CEO of five businesses, Margaret Heffernan explores the all-too-human thought patterns that lead organizations and managers astray.”
In doing more research I came upon this:
Margaret Heffernan was born in Texas, grew up in the Netherlands and was educated at Cambridge University. She produced drama and documentary programs for the BBC for 13 years, then moved back to the US where she became a serial entrepreneur and CEO in the early days of the internet.
All of Heffernan’s work challenges accepted wisdom about good lives and good work. Willful Blindness: Why We Ignore the Obvious at Our Peril, named one of the most important business books of the decade by the Financial Times, looked at how our most cherished beliefs, behaviors and rules blind us to what matters most.
In 2015, she was awarded the Transmission Prize for A Bigger Prize: How We Can Do Better than the Competition, a book that upended the idea that competition forces the best to the top, arguing that it mostly proves wasteful and destructive where collaboration is more sustainable and creative.
Her forthcoming book, Uncharted: How to Map the Future will be published in February 2020 in the UK and May 2020 in the US. It addresses the fundamental unpredictability of life, challenges technological determinism and asks how we can find in ourselves the freedom and imagination to create the futures we want. An early reader called it “Karl Popper for the 21st century.”
As lead faculty for the Forward Institute’s Responsible Leadership Programme, Heffernan mentors CEOs and senior executives of major global organizations.
I am not a great Facebook user. I have nothing against the app; I just prefer not to broadcast my comings and goings. However, I do automatically send posts from this blog across to Facebook, and some of my followers come from FB.
As was the case with Michelle Orcutt.
I went across to her FB ‘page’ to leave my thanks for her follow and read a wonderful account of Diya.
Michelle kindly gave me permission to republish the article in this place. Here it is.
By Michelle Orcutt
Most of my friends know I adopted Diya, originally a street puppy from India (aka a desi dog, an Indie, a native Indian dog, pariah dog, or a streetie), almost 4 years ago. I wrote this for her rescue’s private Facebook group a year ago, in hopes of encouraging a better understanding of street dogs, and have had some requests to make it public, so here it is:
What follows are my own musings, not anything coming from ISDF. I think about dogs a lot!😁 Through Diya’s rescuer’s visit to the Twin Cities, I was able to meet and observe 7 other Indian-Minnesotan dogs. Thinking about some common difficulties voiced by these dogs’ people, and also about some of the worries and frustrations recently expressed in the group and my own challenges with Diya, I feel like sharing my perspective.
To a person, the Desi adopters I met here have been patient and accommodating towards their dogs, yet most of the dogs continue to have difficulty in certain situations. These are dogs, working from what their DNA and experience gives them to go on, in a wholly other environment from where they emerged into the world. Especially with random-bred, pariah type dogs like many from India, Oman, and Thailand, these dogs’ lives center around finding food and water, protecting themselves and their territory, avoiding harm, and successfully breeding, bearing, and raising pups. Certainly the pursuit of pleasure and comfortable resting spots plays into their lives too.
We ask these dogs, who are dogs as dogs are truly meant to be, to become “ours” when they arrive in America. We subject them to foreign constraints like crates and leashes, and saddle them with our own expectations. We spend a lot of time telling dogs they are good and that they are bad. But bottom line, they’re dogs, not just our fur babies, or our charges, but entities deserving of respect in their own right. This isn’t to minimize the difficulty and emotional toll of trying to change worrisome behaviors.
Our dogs think hard to get a handle on us; they interpret and build their own sense of the meanings behind our facial expressions, movements, words, tone, touch, habits, clothing, and smells. Their language is far broader than English, Hindi, Arabic, or Thai. They are another form of intelligent life in our midst, in our cars, on our sofas, under the covers and curled into the bend of our knees. Yet they can also be incredibly distressing as they bark at our friends, growl at our guests, lunge at terriers, chase cats, and destroy door frames.
Dogs are incredibly adaptable; this is one of the reasons for their success as a species. A terrified dog rescued from meat trade smugglers in Thailand can transform into a remarkable beauty at ease in a Chanel boutique (😉😉Sparkle Stern); a dog from torrid Muscat can thrive in snow (you Omani pups know who you are). A Delhi puppy fated to starve in the same spot her mother died can instead run miles through the Michigan woods and “go to work” in an air-conditioned office with her human mom and other people with their own dogs (yes, that’s you, Miss Lily). These changes don’t happen magically or automatically (except in the case of snow), but through initial acts of grace followed by steady and hope-fuelled progression.
The things that come easiest to most of these former street puppies and dogs are the ones that overlap with their natural instincts. Bonding with people who treat them well and provide their food, comfort, and positive mental stimulation is relatively straightforward, though many of our dogs retain more of a capacity for independence than common American companion breeds. Diya is always watching for suspicious people and crows; I live in a part of St. Paul where it’s not uncommon for neighborhood Facebook group posts to start out, “Was that gunshots or fireworks?” so I appreciate her sharp eyes and formidable-sounding bark (I love my city neighborhood, by the way. I love crows too—this is one of the points at which Diya and I differ).
It’s the things that are really weird for street dogs that are hard: being expected to be outgoing, friendly, and trusting of all people and other dogs…always having to stifle your growl…tolerating being left in a wire or plastic box for hours…not being able to run away when you get nervous or to sniff as many spots as you think you need to gather information…to have people decide what you need…going to the vet, going to dog parks, etc. Dogs are social animals, but their idea of social life is different from ours (and also very different from wolves’), and each has their own unique relationship with their person or people, and to the other animals in the household.
So when you are flustered and upset by your dog’s behavior, step back, and think of all we are expecting them to learn and all we are asking them to put aside. Learning new things can be very uncomfortable and anxiety-provoking, especially when going against strong instincts. Living alone with my dogs and cats, and being an introvert by nature, I’ve tended to avoid some trying situations that other families have to work through, but Diya and I have still come a long way. I look forward to finding out where all we’ll go and what we’ll teach each other.
If you just glanced at this post then make a note, a firm note, to come back and read it fully and carefully.
For Michelle captures precisely what it is to be a dog, especially a street dog.
It is a profoundly wise article and it is a great honour to be able to republish it in this place.