Category: Technology

Is your community no-kill?

A timely republication of a helpful article.

The Best Friends website has a useful article under its 2025 Goal banner.

It follows on nicely from yesterday’s post.

ooOOoo

2025 Goal

No-Kill for Cats and Dogs in America’s Shelters

You believe that animals deserve compassion and good quality of life. You also love your community and want to take action for the pets and people in it. Here’s how.

Last year, about 733,000 dogs and cats were killed in our nation’s animal shelters, simply because they didn’t have safe places to call home. Together, we can change that and achieve no-kill for dogs and cats nationwide by 2025.

Is your community no-kill?

Explore lifesaving nationwide using the interactive tool below and see which shelters in your community need your support. When every shelter in a community achieves a 90% save rate for all cats and dogs, that community is designated as no-kill. This provides a simple, effective benchmark for our lifesaving progress.
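For the numerically minded, here is a minimal sketch of how that benchmark works, assuming the common definition of save rate (live outcomes divided by total outcomes); Best Friends’ published methodology may differ in its details. The point is that a community qualifies only when every one of its shelters clears the threshold.

```python
# Minimal sketch of the 90% save-rate benchmark (assumed definition:
# live outcomes divided by total outcomes; the official methodology
# may differ). All figures below are made up for illustration.

def save_rate(live_outcomes, total_outcomes):
    """Save rate as a percentage of all outcomes."""
    if total_outcomes == 0:
        return 0.0
    return 100.0 * live_outcomes / total_outcomes

def community_is_no_kill(shelters):
    """True only if every shelter in the community meets the 90% benchmark."""
    return all(save_rate(live, total) >= 90.0 for live, total in shelters)

# Two hypothetical shelters: one at 92%, one at 89%.
shelters = [(920, 1000), (890, 1000)]
print(community_is_no_kill(shelters))  # False: the 89% shelter holds the community back
```

Note that the benchmark is applied shelter by shelter, not as a community-wide average: a strong shelter cannot offset a struggling one.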

This dashboard presents a dynamic data set that is updated regularly with the most current information available. We welcome your feedback to help ensure that our data is as current and accurate as possible.

Go here to access the map!

Common elements of a no-kill community
All no-kill communities embrace and promote:

Collective responsibility: We hold ourselves accountable for the welfare of pets in our animal shelters and communities.

  • Individual community members are willing to participate in lifesaving programs.
  • State and local governments are poised to support those programs.
  • Transparent shelter staff work with their community to save more lives.

Progressive lifesaving: We value compassionate and responsible actions to save animals.

  • Decision-making is data-driven and anchored by best practices in the field.
  • Quality care is provided to every pet and quality of life is a priority.
  • Programs are designed to save the animals most at risk of being killed.
  • Programs are designed to tackle the root of the problem rather than the symptoms.

True euthanasia: We recognize that, for some animals, euthanasia is the most compassionate choice. This is why the no-kill benchmark for save rate is 90% and not 100%. In some cases, shelters may not meet the 90% benchmark, but do meet the philosophical principles of no-kill, which are:

  • Ending the life of an animal only to end irremediable suffering.
  • Ending the life of an animal when the animal is too dangerous to rehabilitate and place in the community safely.

End-of-life decisions are made by animal welfare professionals engaging in best practices and protocols.

Visit the “Community Lifesaving Dashboard Frequently Asked Questions” page to learn more.

Working together to save more pets

About the lifesaving community maps

These community maps are the first of their kind in animal welfare. They represent an enormous undertaking by compassionate organizations and individuals throughout the country, and a commitment to collaboration and transparency from more than 3,200 shelters.

Learn more about how these maps were created and how you can help make them more accurate and powerful for the pets in your community and beyond.

ooOOoo

Once again, go here to view the maps.

The best of luck!

It’s very quiet out there!

A deeply fascinating essay from an individual at the University of Oxford.

I have long read the daily output from The Conversation. It’s a very useful way of keeping one’s brain cells functioning in some sort of fashion.

Yesterday morning I read an essay written by a PhD candidate at the University of Oxford.

It was fascinating and I am republishing it here.

Now it’s not for everyone. It is also long and has a number of videos to watch. And there’s not a dog mentioned!

But if you are interested in where we, as in human beings, are ‘going’, so to speak, then this is for you.

And I’m ready to admit that it may be an age thing; something that is of much interest to me because I shall be 75 in November, and one naturally wonders about the end of life. Both for the individual and for society!

ooOOoo

The end of the world: a history of how a silent cosmos led humans to fear the worst

By
PhD Candidate, University of Oxford

August 7th, 2019

It is 1950 and a group of scientists are walking to lunch against the majestic backdrop of the Rocky Mountains. They are about to have a conversation that will become scientific legend. The scientists are at the Los Alamos Ranch School, the site for the Manhattan Project, where each of the group has lately played their part in ushering in the atomic age.

They are laughing about a recent cartoon in the New Yorker offering an unlikely explanation for a slew of missing public trash cans across New York City. The cartoon had depicted “little green men” (complete with antennae and guileless smiles) having stolen the bins, assiduously unloading them from their flying saucer.

By the time the party of nuclear scientists sits down to lunch, within the mess hall of a grand log cabin, one of their number turns the conversation to matters more serious. “Where, then, is everybody?”, he asks. They all know that he is talking – sincerely – about extraterrestrials.

The question, which was posed by Enrico Fermi and is now known as Fermi’s Paradox, has chilling implications.

Bin-stealing UFOs notwithstanding, humanity still hasn’t found any evidence of intelligent activity among the stars. Not a single feat of “astro-engineering”, no visible superstructures, not one space-faring empire, not even a radio transmission. It has been argued that the eerie silence from the sky above may well tell us something ominous about the future course of our own civilisation.

Such fears are ramping up. Last year, the astrophysicist Adam Frank implored an audience at Google to see climate change – and the newly baptised geological age of the Anthropocene – against this cosmological backdrop. The Anthropocene refers to the effects of humanity’s energy-intensive activities upon Earth. Could it be that we do not see evidence of space-faring galactic civilisations because, due to resource exhaustion and subsequent climate collapse, none of them ever get that far? If so, why should we be any different?

A few months after Frank’s talk, in October 2018, the Intergovernmental Panel on Climate Change’s update on global warming caused a stir. It predicted a sombre future if we do not decarbonise. And in May, amid Extinction Rebellion’s protests, a new climate report upped the ante, warning: “Human life on earth may be on the way to extinction.”

Meanwhile, NASA has been publishing press releases about an asteroid set to hit New York within a month. This is, of course, a dress rehearsal: part of a “stress test” designed to simulate responses to such a catastrophe. NASA is obviously fairly worried by the prospect of such a disaster event – such simulations are costly.

Space-tech entrepreneur Elon Musk has also been relaying his fears about artificial intelligence to YouTube audiences of tens of millions. He and others worry that the ability of AI systems to rewrite and improve themselves may trigger a sudden runaway process, or “intelligence explosion”, that will leave us far behind – an artificial superintelligence need not even be intentionally malicious in order to accidentally wipe us out.

In 2015, Musk donated to Oxford’s Future of Humanity Institute, headed up by transhumanist Nick Bostrom. Nestled within the university’s medieval spires, Bostrom’s institute scrutinises the long-term fate of humanity and the perils we face at a truly cosmic scale, examining the risks of things such as climate change, asteroids and AI. It also looks into less well-publicised issues. Universe-destroying physics experiments, gamma-ray bursts, planet-consuming nanotechnology and exploding supernovae have all come under its gaze.

So it would seem that humanity is becoming more and more concerned with portents of human extinction. As a global community, we are increasingly conversant with increasingly severe futures. Something is in the air.

But this tendency is not actually exclusive to the post-atomic age: our growing concern about extinction has a history. We have been becoming more and more worried for our future for quite some time now. My PhD research tells the story of how this began. No one has yet told this story, but I feel it is an important one for our present moment.

I wanted to find out how current projects, such as the Future of Humanity Institute, emerge as offshoots and continuations of an ongoing project of “enlightenment” that we first set ourselves over two centuries ago. Recalling how we first came to care for our future helps reaffirm why we should continue to care today.

Extinction, 200 years ago

In 1816, something was also in the air. It was a 100-megaton sulfate aerosol layer. Girdling the planet, it was made up of material thrown into the stratosphere by the eruption of Mount Tambora, in Indonesia, the previous year. It was one of the biggest volcanic eruptions since civilisation emerged during the Holocene.

Mount Tambora’s crater. Wikimedia Commons/NASA

Almost blotting out the sun, Tambora’s fallout caused a global cascade of harvest collapse, mass famine, cholera outbreak and geopolitical instability. And it also provoked the first popular fictional depictions of human extinction. These came from a troupe of writers including Lord Byron, Mary Shelley and Percy Shelley.

The group had been holidaying together in Switzerland when titanic thunderstorms, caused by Tambora’s climate perturbations, trapped them inside their villa. Here they discussed humanity’s long-term prospects.

Clearly inspired by these conversations and by 1816’s hellish weather, Byron immediately set to work on a poem entitled “Darkness”. It imagines what would happen if our sun died:

I had a dream, which was not all a dream
The bright sun was extinguish’d, and the stars
Did wander darkling in the eternal space
Rayless, and pathless, and the icy earth
Swung blind and blackening in the moonless air

Detailing the ensuing sterilisation of our biosphere, it caused a stir. And almost 150 years later, against the backdrop of escalating Cold War tensions, the Bulletin of the Atomic Scientists again called upon Byron’s poem to illustrate the severity of nuclear winter.

Two years later, Mary Shelley’s Frankenstein (perhaps the first book on synthetic biology) refers to the potential for the lab-born monster to outbreed and exterminate Homo sapiens as a competing species. In 1826, Mary Shelley went on to publish The Last Man, the first full-length novel on human extinction, depicted here at the hands of a pandemic pathogen.

Boris Karloff plays Frankenstein’s monster, 1935. Wikimedia Commons

Beyond these speculative fictions, other writers and thinkers had already discussed such threats. Samuel Taylor Coleridge, in 1811, daydreamed in his private notebooks about our planet being “scorched by a close comet and still rolling on – cities men-less, channels riverless, five mile deep”. In 1798, Mary Shelley’s father, the political thinker William Godwin, queried whether our species would “continue forever”.

Just a few years earlier, Immanuel Kant had pessimistically proclaimed that global peace may be achieved “only in the vast graveyard of the human race”. He would, soon after, worry about a descendant offshoot of humanity becoming more intelligent and pushing us aside.

Earlier still, in 1754, philosopher David Hume had declared that “man, equally with every animal and vegetable, will partake” in extinction. Godwin noted that “some of the profoundest enquirers” had lately become concerned with “the extinction of our species”.

In 1816, against the backdrop of Tambora’s glowering skies, a newspaper article drew attention to this growing murmur. It listed numerous extinction threats. From global refrigeration to rising oceans to planetary conflagration, it spotlighted the new scientific concern for human extinction. The “probability of such a disaster is daily increasing”, the article glibly noted. Not without chagrin, it closed by stating: “Here, then, is a very rational end of the world!”

Tambora’s dust-cloud created ominous sunsets, such as this one painted by Turner, c. 1830–5. © Tate, CC BY-NC-ND

Before this, we thought the universe was busy

So if people first started worrying about human extinction in the 18th century, where was the notion beforehand? There is enough apocalypse in scripture to last until judgement day, surely. But extinction has nothing to do with apocalypse. The two ideas are utterly different, even contradictory.

For a start, apocalyptic prophecies are designed to reveal the ultimate moral meaning of things. It’s in the name: apocalypse means revelation. Extinction, by direct contrast, reveals precisely nothing and this is because it instead predicts the end of meaning and morality itself – if there are no humans, there is nothing humanly meaningful left.

And this is precisely why extinction matters. Judgement day allows us to feel comfortable knowing that, in the end, the universe is ultimately in tune with what we call “justice”. Nothing was ever truly at stake. On the other hand, extinction alerts us to the fact that everything we hold dear has always been in jeopardy. In other words, everything is at stake.

Extinction was not much discussed before 1700 due to a background assumption, widespread prior to the Enlightenment, that it is the nature of the cosmos to be as full of moral value and worth as is possible. This, in turn, led people to assume that all other planets are populated with “living and thinking beings” exactly like us.

Although it only became truly widely accepted after Copernicus and Kepler in the 16th and 17th centuries, the idea of plural worlds certainly dates back to antiquity, with intellectuals from Epicurus to Nicholas of Cusa proposing them to be inhabited with lifeforms similar to our own. And, in a cosmos that is infinitely populated with humanoid beings, such beings – and their values – can never fully go extinct.

Star cluster Messier 13 in Hercules, 1877. Wikimedia Commons

Galileo confidently declared that an entirely uninhabited or unpopulated world is “naturally impossible” on account of it being “morally unjustifiable”. Gottfried Leibniz later pronounced that there simply cannot be anything entirely “fallow, sterile, or dead in the universe”.

Along the same lines, the trailblazing scientist Edmond Halley (after whom the famous comet is named) reasoned in 1692 that the interior of our planet must likewise be “inhabited”. It would be “unjust” for any part of nature to be left “unoccupied” by moral beings, he argued.

Around the same time, Halley provided the first theory of a “mass extinction event”. He speculated that comets had previously wiped out entire “worlds” of species. Nonetheless, he also maintained that, after each previous cataclysm, “human civilisation had reliably re-emerged”. And it would do so again. Only this, he said, could make such an event morally justifiable.

Later, in the 1760s, the philosopher Denis Diderot was attending a dinner party when he was asked whether humans would go extinct. He answered “yes”, but immediately qualified this by saying that after several millions of years the “biped animal who carries the name man” would inevitably re-evolve.

This is what the contemporary planetary scientist Charles Lineweaver identifies as the “Planet of the Apes Hypothesis”. This refers to the misguided presumption that “human-like intelligence” is a recurrent feature of cosmic evolution: that alien biospheres will reliably produce beings like us. This is what is behind the wrong-headed assumption that, should we be wiped out today, something like us will inevitably return tomorrow.

Back in Diderot’s time, this assumption was pretty much the only game in town. It was why one British astronomer wrote, in 1750, that the destruction of our planet would matter as little as “Birth-Days or Mortalities” do down on Earth.

This was typical thinking at the time. Within the prevailing worldview of eternally returning humanoids throughout an infinitely populated universe, there was simply no pressure or need to care for the future. Human extinction simply couldn’t matter. It was trivialised to the point of being unthinkable.

For the same reasons, the idea of the “future” was also missing. People simply didn’t care about it in the way we do now. Without the urgency of a future riddled with risk, there was no motivation to be interested in it, let alone attempt to predict and preempt it.

It was the dismantling of such dogmas, beginning in the 1700s and ramping up in the 1800s, that set the stage for the enunciation of Fermi’s Paradox in the 1900s and led to our growing appreciation of our cosmic precariousness today.

But then we realised the skies are silent

In order to truly care about our mutable position down here, we first had to notice that the cosmic skies above us are crushingly silent. Slowly at first, though soon after gaining momentum, this realisation began to take hold around the same time that Diderot had his dinner party.

One of the first examples of a different mode of thinking I’ve found is from 1750, when the French polymath Claude-Nicolas Le Cat wrote a history of the earth. Like Halley, he posited the now familiar cycles of “ruin and renovation”. Unlike Halley, he was conspicuously unclear as to whether humans would return after the next cataclysm. A shocked reviewer picked up on this, demanding to know whether “Earth shall be re-peopled with new inhabitants”. In reply, the author facetiously asserted that our fossil remains would “gratify the curiosity of the new inhabitants of the new world, if there be any”. The cycle of eternally returning humanoids was unwinding.

In line with this, the French encyclopaedist Baron d’Holbach ridiculed the “conjecture that other planets, like our own, are inhabited by beings resembling ourselves”. He noted that precisely this dogma – and the related belief that the cosmos is inherently full of moral value – had long obstructed appreciation that the human species could permanently “disappear” from existence. By 1830, the German philosopher F W J Schelling declared it utterly naive to go on presuming “that humanoid beings are found everywhere and are the ultimate end”.

Figures illustrating articles on astronomy, from the 1728 Cyclopaedia. Wikimedia Commons

And so, where Galileo had once spurned the idea of a dead world, the German astronomer Wilhelm Olbers proposed in 1802 that the Mars-Jupiter asteroid belt in fact constitutes the ruins of a shattered planet. Troubled by this, Godwin noted that this would mean that the creator had allowed part of “his creation” to become irremediably “unoccupied”. But scientists were soon computing the precise explosive force needed to crack a planet – assigning cold numbers where moral intuitions once prevailed. Olbers calculated a precise timeframe within which to expect such an event befalling Earth. Poets began writing of “bursten worlds”.

The cosmic fragility of life was becoming undeniable. If Earth happened to drift away from the sun, one 1780s Parisian diarist imagined that interstellar coldness would “annihilate the human race, and the earth rambling in the void space, would exhibit a barren, depopulated aspect”. Soon after, the Italian pessimist Giacomo Leopardi envisioned the same scenario. He said that, shorn of the sun’s radiance, humanity would “all die in the dark, frozen like pieces of rock crystal”.

Galileo’s inorganic world was now a chilling possibility. Life, finally, had become cosmically delicate. Ironically, this appreciation came not from scouring the skies above but from probing the ground below. Early geologists, during the later 1700s, realised that Earth has its own history and that organic life has not always been part of it. Biology hasn’t even been a permanent fixture down here on Earth – why should it be one elsewhere? Coupled with growing scientific proof that many species had previously become extinct, this slowly transformed our view of the cosmological position of life as the 19th century dawned.

Copper engraving of a pterodactyl fossil discovered by the Italian scientist Cosimo Alessandro Collini in 1784. Wikimedia Commons

Seeing death in the stars

And so, where people like Diderot looked up into the cosmos in the 1750s and saw a teeming petri dish of humanoids, writers such as Thomas de Quincey were, by 1854, gazing upon the Orion nebula and reporting that they saw only a gigantic inorganic “skull” and its lightyear-long rictus grin.

The astronomer William Herschel had, already in 1814, realised that looking out into the galaxy one is looking into a “kind of chronometer”. Fermi would spell it out a century after de Quincey, but people were already intuiting the basic notion: looking out into dead space, we may just be looking into our own future.

Early drawings of Orion’s nebula by R.S. Newall, 1884. © Cambridge University, CC BY

People were becoming aware that the appearance of intelligent activity on Earth should not be taken for granted. They began to see that it is something distinct – something that stands out against the silent depths of space. Only through realising that what we consider valuable is not the cosmological baseline did we come to grasp that such values are not necessarily part of the natural world. Realising this was also realising that they are entirely our own responsibility. And this, in turn, summoned us to the modern projects of prediction, preemption and strategising. It is how we came to care about our future.

As soon as people first started discussing human extinction, possible preventative measures were suggested. Bostrom now refers to this as “macrostrategy”. However, as early as the 1720s, the French diplomat Benoît de Maillet was suggesting gigantic feats of geoengineering that could be leveraged to buffer against climate collapse. The notion of humanity as a geological force has been around ever since we started thinking about the long-term – it is only recently that scientists have accepted this and given it a name: “Anthropocene”.

Will technology save us?

It wasn’t long before authors began conjuring up highly technologically advanced futures aimed at protecting against existential threat. The eccentric Russian futurologist Vladimir Odoevskii, writing in the 1830s and 1840s, imagined humanity engineering the global climate and installing gigantic machines to “repulse” comets and other threats, for example. Yet Odoevskii was also keenly aware that with self-responsibility comes risk: the risk of abortive failure. Accordingly, he was also the very first author to propose the possibility that humanity might destroy itself with its own technology.

Acknowledgement of this possibility, however, is not necessarily an invitation to despair – and it never has been. It simply reflects the fact that, ever since we realised that the universe is not teeming with humans, we have come to appreciate that the fate of humanity lies in our own hands. We may yet prove unfit for the task, but – then as now – we cannot rest assured believing that humans, or something like us, will inevitably reappear, here or elsewhere.

Beginning in the late 1700s, appreciation of this has snowballed into our ongoing tendency to be swept up by concern for the deep future. Current initiatives, such as Bostrom’s Future of Humanity Institute, can be seen as emerging from this broad and edifying historical sweep. From ongoing demands for climate justice to dreams of space colonisation, all are continuations and offshoots of a tenacious task that we first began to set for ourselves two centuries ago during the Enlightenment when we first realised that, in an otherwise silent universe, we are responsible for the entire fate of human value.

It may be solemn, but becoming concerned for humanity’s extinction is nothing other than realising one’s obligation to strive for unceasing self-betterment. Indeed, ever since the Enlightenment, we have progressively realised that we must think and act ever better because, should we not, we may never think or act again. And that seems – to me at least – like a very rational end of the world.

ooOOoo

I hope you have read it all. There’s much to engage one. And the message to me is very clear: We have to regard this race, correction: our race, as unique. As is put in the penultimate paragraph:

… the Enlightenment when we first realised that, in an otherwise silent universe, we are responsible for the entire fate of human value.

Now there’s a thought for an atheist on a Saturday morning!

Margaret Heffernan

Margaret who?

I must admit that I hadn’t heard of Margaret Heffernan before.

But while browsing TED Talks one evening recently, we came across one by her. And it was riveting!

Here’s how it was introduced:

The more we rely on technology to make us efficient, the fewer skills we have to confront the unexpected, says writer and entrepreneur Margaret Heffernan. She shares why we need less tech and more messy human skills — imagination, humility, bravery — to solve problems in business, government and life in an unpredictable age. “We are brave enough to invent things we’ve never seen before,” she says. “We can make any future we choose.”

Later on it explains: “The former CEO of five businesses, Margaret Heffernan explores the all-too-human thought patterns that lead organizations and managers astray.”

In doing more research I came upon this:

Margaret Heffernan was born in Texas, grew up in the Netherlands and was educated at Cambridge University. She produced drama and documentary programs for the BBC for 13 years, then moved back to the US where she became a serial entrepreneur and CEO in the early days of the internet.

All of Heffernan’s work challenges accepted wisdom about good lives and good work. Willful Blindness: Why We Ignore the Obvious at Our Peril, named one of the most important business books of the decade by the Financial Times, looked at how our most cherished beliefs, behaviors and rules blind us to what matters most.

In 2015, she was awarded the Transmission Prize for A Bigger Prize: How We Can Do Better than the Competition, a book that upended the idea that competition forces the best to the top, arguing that it mostly proves wasteful and destructive, whereas collaboration is more sustainable and creative.

In 2015, TED published Beyond Measure: The Big Impact of Small Changes, which argued that organizational change can, and should, happen at all levels.

Her forthcoming book, Uncharted: How to Map the Future will be published in February 2020 in the UK and May 2020 in the US. It addresses the fundamental unpredictability of life, challenges technological determinism and asks how we can find in ourselves the freedom and imagination to create the futures we want. An early reader called it “Karl Popper for the 21st century.”

As lead faculty for the Forward Institute’s Responsible Leadership Programme, Heffernan mentors CEOs and senior executives of major global organizations.

Trust me, you will find this talk fascinating.

Reflections on the future

Father’s Day …

… was OK in the morning, but for some reason I was in a dark mood in the afternoon.

(And if you want to skip today’s post I don’t blame you at all. This is not my usual style, although it is important.)

I was reflecting on the state of the world. The global population is well in excess of seven billion people. The longevity of those people is increasing. That’s good news. Health standards are improving. That’s also good news.

However, the pressure on farming is intense. More and more land is required. The natural world is under severe pressure. Extinction rates among many species are soaring.

Planet Earth has far too many people!

OK, maybe in time the population level will come down but right now it is too high.

Then in came Tom Engelhardt’s latest essay. I read it and reflected. Was it too dark to post? Jeannie said that if I really wanted to share it, then I should publish it.

Here it is, published with Tom’s kind permission.

ooOOoo

Tomgram: Engelhardt, Trump Change

Posted by Tom Engelhardt at 4:23pm, June 16, 2019.
Follow TomDispatch on Twitter @TomDispatch.

If Donald Trump Is the Symptom…
Then What’s the Disease?

By Tom Engelhardt
Don’t try to deny it! The political temperature of this country is rising fast. Call it Trump change or Trump warming, if you want, but grasp one thing: increasingly, you’re in a different land and, whatever happens to Donald Trump, the results down the line are likely to be ever less pretty. Trump change isn’t just an American phenomenon, it’s distinctly global. After all, from Australia to India, the Philippines to Hungary, Donald Trumps and their supporters keep getting elected or reelected and, according to a recent CNN poll, a majority of Americans think Trump himself will win again in 2020 (though, at the moment, battleground-state polls look grim for him).

Still, whether or not he gets a second term in the White House, he only seems like the problem, partially because no president, no politician, no one in history has ever gotten such 24/7 media coverage of every twitch, tweet, bizarre statement, falsehood, or fantasy he expresses (or even the clothes he wears). Think of it this way: we’re in a moment in which the only thing the media can’t imagine saying about Donald Trump is: “You’re fired!” And believe me, that’s just one sign of a media — and a country — with a temperature that’s anything but 98.6.

Since you-know-who is always there, always being discussed, always @(un)realdonaldtrump, it’s easy enough to imagine that everything that’s going wrong — or, if you happen to be part of his famed base, right (even if that right isn’t so damned hot for you) — is due to him. When we’re gripped by such thinking and the temperature’s rising, it hardly matters that just about everything he’s “done” actually preceded him. That includes favoring the 1%, deporting record numbers of illegal immigrants, and making war (unsuccessfully) or threatening to do so across significant parts of the planet.

Here, then, is the question of the day, the sort you’d ask about any patient with a rising temperature: If Donald Trump is only the symptom, what’s the disease?

Blowback Central

Let me say that the late Chalmers Johnson would have understood President Trump perfectly. The Donald clearly arrived on the scene as blowback — the CIA term of tradecraft Johnson first put into our everyday vocabulary — from at least two things: an American imperium gone wrong with its never-ending wars, ever-rising military budgets, and ever-expanding national security state, and a new “gilded age” in which three men (Bill Gates, Jeff Bezos, and Warren Buffett) have more wealth than the bottom half of society and the .01% have one of their own, a billionaire, in the Oval Office. (If you want to add a third blowback factor, try a media turned upside down by new ways of communicating and increasingly desperate to glue eyes to screens as ad revenues, budgets, and staffs shrank and the talking heads of cable news multiplied.)

Now, I don’t mean to sell Donald Trump short in any way. Give that former reality TV star credit. Unlike either Hillary Clinton or any of his Republican opponents in the 2016 election campaign, he sensed that there were voters in profusion in the American heartland who felt that things were not going well and were eager for a candidate just like the one he was ready to become. (There were, of course, other natural audiences for a disruptive, self-promoting billionaire as well, including various millionaires and billionaires ready to support him, the Russians, the Saudis… well, you know the list). His skill, however, never lay in what he could actually do (mainly, in these years, cut taxes for the wealthy, impose tariffs, and tweet his head off). It lay in his ability to catch the blowback mood of that moment in a single slogan — Make America Great Again, or MAGA — that he trademarked in November 2012, only days after Mitt Romney lost his bid for the presidency to Barack Obama.

Yes, four years later in the 2016 election, others began to notice the impact of that slogan. You couldn’t miss the multiplying MAGA hats, after all. Hillary Clinton’s advisers even briefly came up with the lamest response imaginable to it: Make America Whole Again, or MAWA. But what few at the time really noted was the crucial word in that phrase: “again.” Politically speaking, that single blowback word might then have been the most daring in the English language. In 2016, Donald Trump functionally said what no other candidate or politician of any significance in America dared to say: that the United States was no longer the greatest, most indispensable, most exceptional nation or superpower or hyper-power ever to exist on Planet Earth.

That represented a groundbreaking recognition of reality. At the time, it didn’t matter whether you were Barack Obama, Hillary Clinton, or Marco Rubio, you had to acknowledge some version of that formula of exceptionalism. Trump didn’t and, believe me, that rang a bell in the American heartland, where lots of people had felt, however indirectly, the blowback from all those years of taxpayer-funded fruitless war, while not benefitting from infrastructure building or much of anything else. They experienced blowback from a country in which new billionaires were constantly being created, while the financial distance between CEO salaries and those of workers grew exponentially vaster by the year, and the financing of the political system became a 1% affair.

With that slogan, The Donald caught the spirit of a moment in which both imperial and economic decline, however unacknowledged by the Washington political elite, had indeed begun. In the process, as I wrote at that time, he crossed a psychologically taboo line and became America’s first declinist candidate for president. MAGA captured a feeling already at large that tomorrow would be worse than today, which was already worse than yesterday. As it turned out, it mattered not at all that the billionaire conman spouting that trademarked phrase had long been part of the problem, not the solution.

He caught the essence of the moment, in other words, but certainly didn’t faintly cause it in the years when he financed Trump Tower, watched his five Atlantic City casinos go bankrupt, and hosted The Apprentice. In that election campaign, he captured a previously forbidden reality of the twenty-first century. For example, I was already writing this in June 2016, five months before he was elected president:

“In its halcyon days, Washington could overthrow governments, install Shahs or other rulers, do more or less what it wanted across significant parts of the globe and reap rewards, while (as in the case of Iran) not paying any price, blowback-style, for decades, if at all. That was imperial power in the blaze of the noonday sun. These days, in case you hadn’t noticed, blowback for our imperial actions seems to arrive as if by high-speed rail (of which by the way, the greatest power on the planet has yet to build a single mile, if you want a quick measure of decline).

“Despite having a more massive, technologically advanced, and better funded military than any other power or even group of powers on the planet, in the last decade and a half of constant war across the Greater Middle East and parts of Africa, the U.S. has won nothing, nada, zilch. Its unending wars have, in fact, led nowhere in a world growing more chaotic by the second.”

Mind you, three years later the United States remains a staggeringly powerful imperial force, with hundreds of military bases still scattered across the globe, while its economic clout — its corporations control about half the planet’s wealth — similarly remains beyond compare. Yet, even in 2016, it shouldn’t have been hard to see that the American Century was indeed ending well before its 100 years were up. It shouldn’t have been hard to grasp, as Donald Trump intuitively did, that this country, however powerful, was already both a declining empire — thank you, George W. Bush for invading Iraq! Mission Accomplished! — and a declining economic system (both of which still looked great indeed, if you happened to be profiting from them). That intuition and that slogan gave Trump his moment in… well, dare I call it “the afternoon sun”? They made him president.

MTPGA

In a sense, all of this should have been expectable enough. Despite the oddity of Donald Trump himself, there was little new in it, even for the imperial power that its enthusiasts once thought stood at “the end of history.” You don’t need to look far, after all, for evidence of the decline of empires. You don’t even have to think back to the implosion of the Soviet Union in 1991, almost three decades ago in what now seems like the Stone Age. (Admittedly, Russian President Vladimir Putin, a brilliant imagineer, has brought back a facsimile of the old Soviet Union, even if, in reality, Russia is now a rickety, fraying petro-state.)

Just take a glance across the Atlantic at Great Britain at this moment. And imagine that three-quarters of a century ago, that modest-sized island nation still controlled all of India, colonies across the planet, and an impressive military and colonial service. Go back even further and you’ll find yourself in a time when it was the true superpower of planet Earth. What a force it was — industrially, militarily, colonially — until, of course, it wasn’t.

If you happen to be looking for imperial lessons, you could perhaps say that some empires end not with a bang but with a Brexit. Despite all the pomp and circumstance (tweeting and insults) during the visit of the Trump royal family (Donald, Melania, Ivanka, Jared, Donald Jr., Eric, and Tiffany) to the British royals, led by a queen who, at 93, can remember better days, here’s something hard to deny: with Brexit (no matter how it turns out), the Earth’s former superpower has landed in the sub-basement of history. Great Britain? Obviously that adjective has to change.

In the meantime, across the planet, China, another once great imperial power, perhaps the greatest in the long history of this planet, is clearly on the rise again from another kind of sub-basement. That, in turn, is deeply worrying the leadership, civilian and military, of the planet’s “lone superpower.” Its president, in response, is wielding his weapon of choice — tariffs — while the U.S. military prepares for an almost unimaginable future war with that upstart nation, possibly starting in the South China Sea.

Meanwhile, the still-dominant power on the planet is, however incrementally, heading down. It’s nowhere near that sub-basement, of course — anything but. It’s still a rich, immensely powerful land. Its unsuccessful wars, however, go on without surcease, the political temperature rises, and democratic institutions continue to fray — all of which began well before Donald Trump entered the Oval Office and, in fact, helped ensure that he would make it there in the first place.

And yet none of this, not even imperial decline itself, quite captures the “disease” of which The Donald is now such an obvious symptom. After all, while the rise and fall of imperial powers has been an essential part of history, the planetary context for that process is now changing in an unprecedented way. And that’s not just because, since the 1945 atomic bombings of Hiroshima and Nagasaki, growing numbers of countries have come to possess the power to take the planet down in a cataclysm of fire and ice (as in nuclear winter). It’s also because history, as we’ve known it, including the rise and fall of empires, is now, in a sense, melting away.

Trump change, the rising political temperature stirred by the growing populist right, is taking place in the context of (and, worse yet, aiding and abetting) record global temperatures, the melting of ice across the planet, the rise of sea levels and the future drowning of coastlines (and cities), the creation of yet more refugees, the increasing fierceness of fires and droughts, and the intensification of storms. In the midst of it all, an almost unimaginable wave of extinctions is occurring, with a possible million plant and animal species, some crucial to human existence, already on the verge of departure.

Never before in history has the rise and decline of imperial powers taken place in the context of the decline of the planet itself. Try, for instance, to imagine what a “risen” China will look like in an age in which one of its most populous regions, the north China plain, may by century’s end be next to uninhabitable, given the killing heat waves of the future.

In the context of both Trump change and climate change, we’re obviously still awaiting our true transformative president, the one who is not a symptom of decline, but a factor in trying to right this country and the Earth before it’s too late. You know, the one who will take as his or her slogan, MTPGA (Make The Planet Great Again).

Tom Engelhardt is a co-founder of the American Empire Project and the author of a history of the Cold War, The End of Victory Culture. He runs TomDispatch.com and is a fellow of the Type Media Center. His sixth and latest book is A Nation Unmade by War (Dispatch Books).

Follow TomDispatch on Twitter and join us on Facebook. Check out the newest Dispatch Books, John Feffer’s new dystopian novel (the second in the Splinterlands series) Frostlands, Beverly Gologorsky’s novel Every Body Has a Story, and Tom Engelhardt’s A Nation Unmade by War, as well as Alfred McCoy’s In the Shadows of the American Century: The Rise and Decline of U.S. Global Power and John Dower’s The Violent American Century: War and Terror Since World War II.

Copyright 2019 Tom Engelhardt

ooOOoo

I’m 74. I don’t know how long I’ve got.

Part of me wants to live for a long time. That’s why I am vegan and trying to stay as fit as I can. (I’m also aware that Jeannie’s Parkinson’s is a terminal disease and that in the latter stages she will need me to look after her.)

But then again, I’m not sure I want to live in a world that continues to degrade, especially in its natural environment.

What’s the answer?

What do others who are on or around my age think about it?

What is the disease?

More on meteorites.

I saw this story very late yesterday.

I read it quickly towards the end of the day, but thought it well worthwhile rescheduling my doggie article until Saturday and putting this one in for today.

Later I read it more thoroughly. It is full of fascinating information, such as the weight of meteorites that falls onto Planet Earth each day. I wasn’t aware of that.

Anyway, hope you too find it of interest.

ooOOoo

The tell-tale clue to how meteorites were made, at the birth of the solar system

By

Professor of Astronomy, Wesleyan University

and

Assistant Professor of Earth and Environmental Sciences, Wesleyan University

June 6th, 2019.

April 26, 1803, was an unusual day in the small town of L’Aigle in Normandy, France – it rained rocks.

Over 3,000 of them fell out of the sky. Fortunately no one was injured. The French Academy of Sciences investigated and proclaimed, based on many eyewitness stories and the unusual look of the rocks, that they had come from space.

The Earth is pummeled with rocks incessantly as it orbits the Sun, adding around 50 tons to our planet’s mass every day. Meteorites, as these rocks are called, are easy to find in deserts and on the ice plains of Antarctica, where they stick out like a sore thumb. They can even land in backyards, treasures hidden among ordinary terrestrial rocks. Amateurs and professionals collect meteorites, and the more interesting ones make it to museums and laboratories around the world for display and study. They are also bought and sold on eBay.

Despite decades of intense study by thousands of scientists, there is no general consensus on how most meteorites formed. As an astronomer and a geologist, we have recently developed a new theory of what happened during the formation of the solar system to create these valuable relics of our past. Since planets form out of collisions of these first rocks, this is an important part of the history of the Earth.

This meteor crater in Arizona was created 50,000 years ago when an iron meteorite struck the Earth. It is about one mile across. W. Herbst, CC BY-SA

The mysterious chondrules

Drew Barringer (left), owner of Arizona meteor crater, his wife, Clare Schneider, and author William Herbst in the Van Vleck Observatory Library of Wesleyan University, where an iron meteorite from the crater is on display. W. Herbst

About 10% of meteorites are pure iron. These form through a multi-step process in which a large molten asteroid has enough gravity to cause iron to sink to its center. This builds an iron core just like the Earth’s. After this asteroid solidifies, it can be shattered into meteorites by collisions with other objects. Iron meteorites are as old as the solar system itself, proving that large asteroids formed quickly and fully molten ones were once abundant.

The other 90% of meteorites are called “chondrites” because they are full of mysterious, tiny spheres of rock known as “chondrules.” No terrestrial rock has anything like a chondrule inside it. It is clear that chondrules formed in space during a brief period of intense heating when temperatures reached the melting point of rock, around 3,000 degrees Fahrenheit (about 1,650 degrees Celsius), for less than an hour. What could possibly account for that?

A closeup of the Semarkona meteorite showing dozens of chondrules. Kenichi Abe

Researchers have come up with many hypotheses over the last 40 years. But no consensus has been reached on how this brief flash of heating happened.

The chondrule problem is so famously difficult and contentious that when we announced to colleagues a few years ago that we were working on it, their reaction was to smile, shake their heads and offer their condolences. Now that we have proposed a solution we are preparing for a more critical response, which is fine, because that’s the way science advances.

The flyby model

Our idea is quite simple. Radioactive dating of hundreds of chondrules shows that they formed between 1.8 and 4 million years after the beginning of the solar system – some 4.6 billion years ago. During this time, fully molten asteroids, the parent bodies of the iron meteorites, were abundant. Volcanic eruptions on these asteroids released tremendous amounts of heat into the space around them. Any smaller objects passing by during an eruption would experience a short, intense blast of heat.

To test our hypothesis, we split up the challenge. The astronomer, Herbst, crunched the numbers to determine how much heating was necessary and for how long to create chondrules. Then the geologist, Greenwood, used a furnace in our lab at Wesleyan to recreate the predicted conditions and see if we could make our own chondrules.
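As a purely illustrative aside, a temperature program for such a run might look something like the sketch below. The article gives only the constraint of roughly 3,000°F (about 1,650°C) sustained for less than an hour; the ramp rates and hold time here are assumptions, not the authors’ actual furnace settings.

```python
# Hypothetical furnace schedule for a flyby-heating experiment. The only
# constraint taken from the article is a peak near 1,650 °C held for under
# an hour; ramp rates and hold duration are invented for illustration.

PEAK_C = 1650          # roughly 3,000 degrees Fahrenheit
HOLD_MINUTES = 45      # "less than an hour" at peak (assumed)
RAMP_C_PER_MIN = 50    # heating and cooling rate (assumed)

def temperature_schedule(start_c=25):
    """Yield (minute, temperature in C): ramp up, hold at peak, ramp down."""
    minute, temp = 0, start_c
    yield minute, temp
    while temp < PEAK_C:                              # ramp up
        minute, temp = minute + 1, min(PEAK_C, temp + RAMP_C_PER_MIN)
        yield minute, temp
    for _ in range(HOLD_MINUTES - 1):                 # hold at peak
        minute += 1
        yield minute, temp
    while temp > start_c:                             # ramp back down
        minute, temp = minute + 1, max(start_c, temp - RAMP_C_PER_MIN)
        yield minute, temp

points = list(temperature_schedule())
peak_minutes = sum(1 for _, t in points if t == PEAK_C)
print(f"{len(points)} minutes total, {peak_minutes} minutes at peak")
```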

Laboratory technician Jim Zareski (top) loads a programmable furnace as co-author Jim Greenwood looks on, in his laboratory at Wesleyan University. This is where the synthetic chondrules are made. W. Herbst

The experiments turned out to be quite successful.

We put some fine dust from Earth rocks with compositions resembling space dust into a small capsule, placed it in our furnace and cycled the temperature through the predicted range. Out came a nice-looking synthetic chondrule. Case closed? Not so fast.

Two problems emerged with our model. In the first place, we had ignored the bigger issue of how chondrules came to be part of the whole meteorite. What is their relationship to the stuff between chondrules – called matrix? In addition, our model seemed a bit too chancy to us. Only a small fraction of primitive matter will be heated in the way we proposed. Would it be enough to account for all those chondrule-packed meteorites hitting the Earth?

A comparison of a synthetic chondrule (left) made in the Wesleyan lab with a heating curve from the flyby model, with an actual chondrule (right) from the Semarkona meteorite. The crystal structure is quite similar, as shown in the enlargements (bottom row). J. Greenwood

Making whole meteorites

To address these issues, we extended our initial model to consider flyby heating of a larger object, up to a few miles across. As this material approaches a hot asteroid, parts of it will vaporize like a comet, resulting in an atmosphere rich in oxygen and other volatile elements. This turns out to be just the kind of atmosphere in which chondrules form, based on previous detailed chemical studies.

We also expect the heat and gas pressure to harden the flyby object into a whole meteorite through a process known as hot isostatic pressing, which is used commercially to make metal alloys. As the chondrules melt into little spheres, they will release gas to the matrix, which traps those elements as the meteorite hardens. If chondrules and chondrites form together in this manner, we expect the matrix to be enriched in exactly the same elements in which the chondrules are depleted. This phenomenon, known as complementarity, has, in fact, been observed for decades, and our model provides a plausible explanation for it.
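To make the idea of complementarity concrete, here is a small mass-balance sketch. Every number in it is hypothetical (the mass fractions and abundances are not from the study); it simply shows that when chondrules lose some of an element to the matrix, the chondrules end up depleted and the matrix enriched while the bulk composition of the rock stays fixed.

```python
# Illustrative mass balance behind "complementarity". All numbers are
# hypothetical, chosen only to show the bookkeeping: element lost by the
# chondrules is gained by the matrix, so the bulk rock is unchanged.

chondrule_fraction = 0.7                    # assumed mass fraction of chondrules
matrix_fraction = 1.0 - chondrule_fraction  # the rest is matrix

start_concentration = 100.0   # both components start alike (arbitrary units)
transferred = 30.0            # element moved from chondrules to matrix (assumed)

# Concentration in each component after the transfer.
chondrule_conc = (start_concentration * chondrule_fraction - transferred) / chondrule_fraction
matrix_conc = (start_concentration * matrix_fraction + transferred) / matrix_fraction

# Whole-rock composition is conserved: depletion mirrors enrichment.
bulk = chondrule_conc * chondrule_fraction + matrix_conc * matrix_fraction
print(round(chondrule_conc, 1), round(matrix_conc, 1), round(bulk, 1))
# -> 57.1 (chondrules depleted), 200.0 (matrix enriched), 100.0 (bulk unchanged)
```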

The authors’ model for forming chondrules. A small piece of rock (right) — a few miles across or less — swings close to a large hot asteroid erupting lava at its surface. Infrared radiation from the hot lava briefly raises the temperature on the small piece of rock high enough to form chondrules and harden part of that object into a meteorite. W. Herbst/Icarus

Perhaps the most novel feature of our model is that it links chondrule formation directly to the hardening of meteorites. Since only well-hardened objects from space can make it through the Earth’s atmosphere, we would expect the meteorites in our museums to be full of chondrules, as they are. But hardened meteorites full of chondrules would be the exception, not the rule, in space, since they form by a relatively chancy process – the hot flyby. We should know soon enough if this idea holds water, since it predicts that chondrules will be rare on asteroids. Both Japan and the United States have ongoing missions to nearby asteroids that will return samples over the next few years.

If those asteroids are full of chondrules, like the hardened meteorites that make it to the Earth’s surface, then our model can be discarded and the search for a solution to the famous chondrule problem can go on. If, on the other hand, chondrules are rare on asteroids, then the flyby model will have passed an important test.

ooOOoo

Perfect timing!

Funny what falls from the sky!

Mud ball meteorites!

We took a rafting trip down the Rogue River yesterday (the 4th), and when I have finished transferring the photographs from my iPhone to my computer I will write a post about the journey.

But for today’s post I want to republish an item that also appeared yesterday in EarthSky.

And it does involve dogs!

ooOOoo

Mud ball meteorites rain down in Costa Rica

“Mud ball” meteorites – full of clays, organics and water – are unique among space rocks. And a lot of them fell in April 2019 on a small town in Costa Rica, much to the delight of scientists.

This meteorite from the fall at Aguas Zarcas, Costa Rica, in April hit a doghouse. Luckily, the dog – Rocky – was unharmed. Image via Michael Farmer/ASU.

Meteorite falls on Earth are fairly common, but not all meteorites are the same. Some of them are “mud balls,” rich in clays, organic compounds and water-bearing minerals, called carbonaceous chondrites. They are of great interest to scientists, due to their unique composition, and now a bunch more prime specimens have been found, which rained down after a large fireball was seen over Aguas Zarcas, a small town in Costa Rica, on April 23, 2019.

The fireball was a meteor – a space rock entering the Earth’s atmosphere – that broke apart into hundreds of smaller pieces. When the pieces of this rock hit the ground, they became meteorites. One meteorite fragment weighed about two pounds and smashed through the roof of a house, destroying the owner’s dining table. Another one crashed through the roof of a doghouse, narrowly missing a sleeping dog. Close calls!

The doghouse with the hole in its roof from the April 2019 meteorite in Costa Rica. The dog, Rocky, was sleeping in the doghouse at the time; he was unharmed, but probably surprised! Image via Michael Farmer/ASU.

Several of the meteorites were collected and sent to Arizona State University (ASU) for study, donated by meteorite collector Michael Farmer. ASU will also be able to purchase additional meteorites from the fall, thanks to a private donor. This is the first time in 50 years that the university has had a chance to analyze such pristine samples of extraterrestrial mud balls. As Laurence Garvie, a research professor at ASU and a curator for its Center for Meteorite Studies, said:

Many carbonaceous chondrites are mud balls that are between 80 and 95 percent clay. Clays are important because water is an integral part of their structure. These had to be collected quickly and before they got rained on. Because they are mostly clay, as soon as these types of meteorites get wet, they fall apart.

Luckily, the researchers were able to collect their samples before it rained again, and they got a nice little haul, too, about 55 pounds (25 kilograms) of the precious space rocks.

A composite element map from one of the meteorites showing the distribution of different minerals. Orange-yellow colors show tochilinite, deep-blue colors represent olivine, and red colors are pentlandite and pyrrhotite. Image via ASU.

Analysis of the meteorites was carried out at ASU’s campus in Tempe, Arizona. According to Garvie:

I was in the lab by 5 a.m. the next morning after picking up the samples to get them ready for the initial analyses. Classification of new meteorites can be like a race with other institutions, and I needed ASU to be first so that we’ll have the recognition of being the collection that holds and curates the type specimen material.

Air-sensitive meteorites like these are kept in special nitrogen cabinets. The nitrogen gas helps to preserve the meteorites, which can degrade easily due to their composition. As Garvie explained:

If you left this carbonaceous chondrite in the air, it would lose some of its extraterrestrial affinities. These meteorites have to be curated in a way that they can be used for current and future research, and we have that ability here at ASU.

This mud ball meteorite fragment from April’s meteorite fall in Aguas Zarcas, Costa Rica, looks a bit like an arrowhead. Image via ASU.

The classification of these meteorites is part of a broader international classification effort. Garvie is also working with Karen Ziegler from the Institute of Meteoritics at the University of New Mexico. They studied the oxygen isotopes of the meteorites, to determine how similar they are to other carbonaceous chondrites.

Sandra Pizzarello, an organic chemist from ASU’s School of Molecular Sciences, is also involved in the studies, focusing on the organic content of the meteorites. These kinds of organics could have provided the material needed for life to begin on Earth.

Additional scientific analysis will follow later, but first the meteorites need to be approved, classified and named by The Meteoritical Society’s nomenclature committee. This group of 12 scientists is responsible for approving all meteorite samples for study.

These new meteorite samples are currently on display at ASU’s Tempe campus in the Center for Meteorite Studies collection.

So, why are mud ball carbonaceous chondrite meteorites so significant?

They are thought to originate from asteroids that are leftovers of early planetesimals – bodies that began forming into planets in the early solar system billions of years ago but no longer exist. Those bodies had organic materials and water, making them places where the chemical precursors to life could have started. Regarding the asteroid from which these new meteorites originated, Garvie said:

It formed in an environment free of life, then was preserved in the cold and vacuum of space for 4.56 billion years, and then dropped in Costa Rica last week.

As CMS Director Meenakshi Wadhwa also said:

Carbonaceous chondrites are relatively rare among meteorites but are some of the most sought-after by researchers because they contain the best-preserved clues to the origin of the solar system. This new meteorite represents one of the most scientifically significant additions to our wonderful collection in recent years.

Because these meteorites contain so much mineral-bound water, they could also be useful in learning how water can be extracted from asteroids, a great resource for future astronauts. According to Garvie:

Having this meteorite in our lab gives us the ability, with further analysis, to ultimately develop technologies to extract water from asteroids in space.

Location of Aguas Zarcas in Costa Rica. Image via Google Maps.

The last time a carbonaceous chondrite meteorite fall similar to this one occurred was in 1969 near Murchison, Australia. Those meteorites were curated by another ASU professor and founding director of ASU’s Center for Meteorite Studies, Carleton Moore.

The meteorites in Aguas Zarcas have also been found to be similar in composition to asteroid Bennu, now being explored by NASA’s OSIRIS-REx spacecraft. Bennu is thought to be a remnant carbonaceous chondrite planetesimal. OSIRIS-REx carries the Thermal Emission Spectrometer (OTES), an instrument designed by ASU’s Phil Christensen, which is being used to make mineral and temperature maps of the asteroid.

Garvie and other scientists will be studying these mud ball meteorites for years to come, unlocking more secrets as to how our solar system formed and evolved, and how the ingredients of life originated and were spread throughout the solar system, including to Earth.

Bottom line: This new meteorite fall in Costa Rica has provided scientists with a great opportunity to study multiple mud ball meteorites, one of the most unusual kinds of meteorites known to exist, and one that could help answer the question of how life started on Earth.

Via ASU

ooOOoo

I don’t know about you, but I found this very interesting indeed. I guess I had never before thought of meteorites as distinct kinds of objects, depending on where they came from.

Fascinating!

That closing paragraph bears repeating: “Garvie and other scientists will be studying these mud ball meteorites for years to come, unlocking more secrets as to how our solar system formed and evolved, and how the ingredients of life originated and were spread throughout the solar system, including to Earth.”

Breathing problems in certain dogs

This is of interest to many, but especially to lovers of bulldogs and similar breeds.

There was an article on May 17th in The Smithsonian that caught my eye. So much so that I wanted to republish it for you.

Here it is.

ooOOoo

Breathing Problems in Pugs and Bulldogs Might Have a Genetic Component

It might not be their smushed-up snouts after all

They’re all good dogs. (Frank Gaglione / Getty)

By Jason Daley

smithsonian.com
May 17, 2019
Smushed-up faces are what make certain dog breeds, like French and English bulldogs or pugs, so ugly-cute. But those good looks come with a cost. Many dogs in these breeds suffer from a disease called Brachycephalic Obstructive Airway Syndrome (BOAS). The compact architecture of their skulls results in deformations that make their nostrils or soft palate too small, obstructing airflow and leaving the pups gasping for breath. Researchers long thought that the main cause was their shortened faces. But genes found in another breed suggest that the shortness of breath might be in their DNA, according to a new study published in the journal PLOS Genetics.

Ed Cara at Gizmodo reports that veterinarians began to notice that another small breed of dog, the Norwich terrier, was increasingly coming down with similar respiratory symptoms, a disease called upper airway obstructive syndrome. Unlike flat-faced pugs and bulldogs, however, the Norwich—bred for chasing rodents—has a nice, proportional skull. That got study author Jeffrey Schoenebeck, a veterinary scientist at the University of Edinburgh, wondering if the breathing problems in all the small dogs were genetic.

“That made us wonder if there was something similar shared across these different breeds, or if we were seeing two different diseases that just looked very similar,” he says.

Schoenebeck and his team decided to dig into the terrier’s DNA to find out. The team assessed 401 Norwich terriers for signs of the airway syndrome and also examined their genomes. Cassie Martin at Science News reports the researchers discovered that one gene mutation in particular, in ADAMTS3, was associated with the breathing disorder. Dogs with two copies of the mutation showed signs of fluid retention and swelling around the lungs. They had worse breathing scores than dogs with just one copy of the mutation or the normal gene.

When the team examined the genomes of bulldogs and pugs, they also found that the ADAMTS3 mutation was common, meaning their funky faces might not be the only cause of BOAS.

“BOAS is a complex disease. Although skull shape remains an important risk factor, our study suggests that the status of ADAMTS3 should be considered as well,” Schoenebeck says in a press release. “More studies are needed to dissect the complex nature of this devastating disease.”

Cara reports that Norwich terrier breeders are already inadvertently combating the mutation. In Switzerland, Schoenebeck’s team has been working with breeders to give dogs breathing tests, identifying pups likely to develop the disease. As a result, the younger generation of terriers is less likely to develop the disease than older dogs.

“In the 90s, something like 80 percent of the Norwich terriers that came into their clinic had poor breathing and this mutation,” Schoenebeck tells Cara. “But it’s decreasing further and further over time. They didn’t know it at the time, but they were actually selecting against this thing that we think is causing this disease.”

The genetic finding means that researchers can now screen directly for the mutation, and perhaps rid the terrier population of the disease.

The problem in flat-faced breeds may not be quite as simple to deal with. Wonky skull shape still makes the risk of developing BOAS higher, and the gene mutation adds to that risk. The team needs to do a similar study with bulldogs to figure out how much of their breathing problems come from their genes and how much comes from their cute little smushed-up skulls.

Jason Daley is a Madison, Wisconsin-based writer specializing in natural history, science, travel, and the environment. His work has appeared in Discover, Popular Science, Outside, Men’s Journal, and other magazines.

ooOOoo

How great it would be if the problem facing these flat-faced breeds were slowly done away with.

G3PUK!

The story of me gaining my radio amateur licence.

As I spoke about yesterday in my introduction, when my mother remarried my sister and I had a new man about the house, so to speak. He was Richard Mills.

I was 13 or thereabouts and already struggling with my school work (the result of my father’s sudden death). And ‘Dad’, as we called him, was finding his feet in the strange world of going from having no children to instantly having two stepchildren!

Anyway, Dad found a theme with me that I enjoyed: building a shortwave radio receiver. It was full of learning for me, and over the years I became hooked on listening to radio stations both near and far transmitting in Morse code. I also joined the Harrow Radio Society and went across to their weekly meetings by tube and bus. (Although the Society is no longer at its old Harrow address, it is amazing that they are still going strong.)

It was also a time when there was a great deal of ‘radio surplus’ equipment going for next to nothing, and I ‘upgraded’ to an R1155 receiver.

War surplus R1155 receiver.

In time I became old enough to take driving lessons and pass my driving test, and I then got a secondhand car. It helped because then I could drive up to Bushey and spend Sunday mornings at the house of Ron Ray. Ron was a keen radio amateur. On Sunday mornings Ron hosted a small group of people who wanted to pass the Morse code test and apply for a licence.

I was already a member of the RSGB, the Radio Society of Great Britain, and that surely encouraged me further to study for my amateur licence.

In time, I sat the exam and much to my amazement passed!

So that is the story of me and amateur radio.

Well, almost the full story.

In 1963 I volunteered for the Royal Naval Reserve, London Division. In time I was accepted and chose to join the radio branch, my G3PUK status coming in useful, because I reckoned that when we went to sea, on flat-bottomed minesweepers, it was better to be sick into a bucket between the knees than to be sick on deck!

So there you are – G3PUK!

The Morse Code is 175 years old!

Two days of nostalgia follow! (You have been warned!)

As many of you already know, my father died fairly suddenly on December 20th, 1956. I had turned 12 some six weeks previously.

After about a year my mother remarried. Her new husband was Richard Mills. Richard came to live at the house in Toley Avenue and had the unenviable task of taking on a new ‘son’ and ‘daughter’. (My sister, Elizabeth, is some four years younger than I.)

Richard was a technical author in the newly arrived electronics industry, and one day he asked me if I would like to build a short-wave receiver. He coached me in the strange art of soldering wires, radio valves and other components, and in the end I had a working receiver. That led, in turn, to me studying for an amateur radio licence. More of that tomorrow.

But the point of this introduction is to relay that Morse code turns 175 years old on the 24th May.

Read more:

ooOOoo

Simply elegant, Morse code marks 175 years and counting

The elegantly simple code works whether flashing a spotlight or blinking your eyes—or even tapping on a smartphone touchscreen

There’s still plenty of reason to know how to use this Morse telegraph key. (Jason Salmon/Shutterstock.com)

By a Ph.D. student in Electrical Engineering, University of South Carolina

May 21st, 2019

The first message sent by Morse code’s dots and dashes across a long distance traveled from Washington, D.C., to Baltimore on Friday, May 24, 1844 – 175 years ago. It signaled the first time in human history that complex thoughts could be communicated at long distances almost instantaneously. Until then, people had to have face-to-face conversations; send coded messages through drums, smoke signals and semaphore systems; or read printed words.

Thanks to Samuel F.B. Morse, communication changed rapidly, and has been changing ever faster since. He invented the electric telegraph in 1832. It took six more years for him to standardize a code for communicating over telegraph wires. In 1843, Congress gave him US$30,000 to string wires between the nation’s capital and nearby Baltimore. When the line was completed, he conducted a public demonstration of long-distance communication.

Morse wasn’t the only one working to develop a means of communicating over the telegraph, but his is the one that has survived. The wires, magnets and keys used in the initial demonstration have given way to smartphones’ on-screen keyboards, but Morse code has remained fundamentally the same, and is still – perhaps surprisingly – relevant in the 21st century. Although I have learned, and relearned, it many times as a Boy Scout, an amateur radio operator and a pilot, I continue to admire it and strive to master it.

Samuel F.B. Morse’s own handwritten record of the first Morse code message ever sent, on May 24, 1844. Library of Congress

Easy sending

Morse’s key insight in constructing the code was considering how frequently each letter is used in English. The most commonly used letters have shorter symbols: “E,” which appears most often, is signified by a single “dot.” By contrast, “Z,” the least used letter in English, was signified by the much longer and more complex “dot-dot-dot (pause) dot.”

In 1865, the newly formed International Telegraph Union (today’s International Telecommunication Union) changed the code to account for different character frequencies in other languages. There have been other tweaks since, but “E” is still “dot,” though “Z” is now “dash-dash-dot-dot.”

The reference to letter frequency makes for extremely efficient communications: Simple words with common letters can be transmitted very quickly. Longer words can still be sent, but they take more time.
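
To make that efficiency point concrete, here is a minimal Python sketch (my own illustration, not part of the original article) that counts how long a letter takes to send in dot-units, assuming the standard timing: a dot lasts one unit, a dash three, and the elements within a letter are separated by one silent unit.

```python
# Illustrative subset of the modern International Morse table.
MORSE = {
    "E": ".", "T": "-", "A": ".-", "I": "..", "N": "-.",
    "O": "---", "S": "...", "Z": "--..",
}

def units(letter: str) -> int:
    """Time to send one letter, in dot-units, including intra-letter gaps."""
    symbol = MORSE[letter.upper()]
    on_time = sum(1 if element == "." else 3 for element in symbol)
    gaps = len(symbol) - 1  # one silent unit between elements
    return on_time + gaps

print(units("E"))  # 1  -- the most common English letter is the quickest
print(units("Z"))  # 11 -- a rare letter costs far more air time
```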

Going wireless

The communications system that Morse code was designed for – analogue connections over metal wires that carried a lot of interference and needed a clear on-off type signal to be heard – has evolved significantly.

The first big change came just a few decades after Morse’s demonstration. In the late 19th century, Guglielmo Marconi invented radio-telegraph equipment, which could send Morse code over radio waves, rather than wires.

The shipping industry loved this new way to communicate with ships at sea, either from ship to ship or to shore-based stations. By 1910, U.S. law required many passenger ships in U.S. waters to carry wireless sets for sending and receiving messages.

After the Titanic sank in 1912, an international agreement required some ships to assign a person to listen for radio distress signals at all times. That same agreement designated “SOS” – “dot-dot-dot dash-dash-dash dot-dot-dot” – as the international distress signal, not as an abbreviation for anything but because it was a simple pattern that was easy to remember and transmit. The Coast Guard discontinued monitoring for Morse distress signals in 1995. The requirement that ships monitor for distress signals was removed in 1999, though the U.S. Navy still teaches at least some sailors to read, send and receive Morse code.

The arrow points at the chart label indicating the Morse code equivalent to the ‘BAL’ signal for a radio beacon near Baltimore. Edited screenshot of an FAA map, CC BY-ND

Aviators also use Morse code to identify automated navigational aids. These are radio beacons that help pilots follow routes, traveling from one transmitter to the next on aeronautical charts. They transmit their identifiers – such as “BAL” for Baltimore – in Morse code. Pilots often learn to recognize familiar-sounding patterns of beacons in areas they fly frequently.

There is a thriving community of amateur radio operators who treasure Morse code, too; among them it is a cherished tradition tracing back to the earliest days of radio. Some of them may have begun in the Boy Scouts, which over the years has made learning Morse sometimes optional and sometimes required. The Federal Communications Commission used to require all licensed amateur radio operators to demonstrate proficiency in Morse code, but that ended in 2007. The FCC does still issue commercial licenses that require Morse proficiency, but no jobs require it anymore.

Blinking Morse

Because its signals are so simple – on or off, long or short – Morse code can also be sent by flashing lights. Many navies around the world use blinker lights to communicate from ship to ship when they don’t want to use radios or when radio equipment breaks down. The U.S. Navy is actually testing a system that would let a user type words and convert them to blinker light; a receiver would read the flashes and convert them back to text.
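
As a rough sketch of how such a text-to-blinker converter might work (an illustration of mine under assumed standard Morse timings, not the Navy’s actual system), a word translates naturally into a list of on/off intervals:

```python
# Hypothetical text-to-blinker converter: turn a word into (state, duration)
# pairs using standard Morse timing (dot = 1 unit on, dash = 3 units on,
# 1 unit off inside a letter, 3 units off between letters).
MORSE = {"S": "...", "O": "---"}  # illustrative subset only

def blink_schedule(word: str) -> list:
    schedule = []
    for i, letter in enumerate(word.upper()):
        if i > 0:
            schedule.append(("off", 3))      # gap between letters
        for j, element in enumerate(MORSE[letter]):
            if j > 0:
                schedule.append(("off", 1))  # gap inside a letter
            schedule.append(("on", 1 if element == "." else 3))
    return schedule

print(blink_schedule("SOS"))
# Starts: [('on', 1), ('off', 1), ('on', 1), ('off', 1), ('on', 1),
#          ('off', 3), ('on', 3), ...]
```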

Skills learned in the military helped an injured man communicate with his wife across a rocky beach using only his flashlight in 2017.

Other Morse messages

Perhaps the most notable modern use of Morse code was by Navy pilot Jeremiah Denton, while he was a prisoner of war in Vietnam. In 1966, about one year into a nearly eight-year imprisonment, Denton was forced by his North Vietnamese captors to participate in a video interview about his treatment. While the camera focused on his face, he blinked the Morse code symbols for “torture,” confirming for the first time U.S. fears about the treatment of service members held captive in North Vietnam.

Navy pilot Jeremiah Denton, a prisoner of war, blinks Morse code spelling out ‘torture’ during a forced interview with his captors.

Blinking Morse code is slow, but has also helped people with medical conditions that prevent them from speaking or communicating in other ways. A number of devices – including iPhones and Android smartphones – can be set up to accept Morse code input from people with limited motor skills.
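
Decoding such input is simply the reverse lookup. By way of illustration only (the table below is a small subset, and the space/“/” separators are an assumption of this sketch, not any device’s actual convention):

```python
# Hypothetical decoder: dot/dash groups separated by spaces form letters,
# and "/" separates words. The table is a small illustrative subset.
CODE_TO_CHAR = {
    ".-": "A", ".": "E", "....": "H", ".-..": "L",
    "---": "O", ".-.": "R", "...": "S", "-": "T",
}

def decode(message: str) -> str:
    words = []
    for word in message.split("/"):
        letters = [CODE_TO_CHAR.get(code, "?") for code in word.split()]
        words.append("".join(letters))
    return " ".join(words)

print(decode("... --- ..."))           # SOS
print(decode(".... . .-.. .-.. ---"))  # HELLO
```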

There are still many ways people can learn Morse code, and practice using it, even online. In emergency situations, it can be the only mode of communications that will get through. Beyond that, there is an art to Morse code, a rhythmic, musical fluidity to the sound. Sending and receiving it can have a soothing or meditative feeling, too, as the person focuses on the flow of individual characters, words and sentences. Overall, sometimes the simplest tool is all that’s needed to accomplish the task.

ooOOoo

I do hope you read this article in full because it contains much interesting information. Many people will not have a clue about The Morse Code and, as you can see above, it is still relevant.

Finally, I can still remember the Morse Code after all these years!

It’s time to change our habits.

Funny how things evolve!

A week ago I was casually reading a copy of our local newspaper, the Grants Pass Daily Courier, and inside was a piece by Kathleen Parker, a syndicated columnist, entitled It’s the end of everything – or not.

I found it particularly interesting, especially a quotation in her piece from Robert Watson, a British chemist who chaired the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES). The IPBES had recently published the results of a three-year study by 145 authors from 50 countries.

So I wrote to Kathleen Parker asking if I might have permission to quote that excerpt and, in turn, received her permission to do so.

Here it is:

Robert Watson wrote in a statement that:

“the health of ecosystems on which we and all species depend is deteriorating more rapidly than ever. We are eroding the very foundation of our economies, livelihoods, food security, health and quality of life worldwide.”

But, Watson also said, it’s not too late to repair and sustain nature – if we act now in transformative ways.

It is time to change our habits, both at the individual level and at the level of countries working together.

Moreover, we haven’t got decades. We have got to do it now!