Yesterday’s edition of The World This Weekend was entirely devoted to a speech that John Major gave on February 16th. His theme was: “We are moving into a more dangerous world.”
From that post: Venus, named for the Roman goddess of love, reaches its greatest brilliancy on Valentine’s Day, February 14. Venus is currently blazing, low in the west after sunset, with Saturn below.
I first met Richard Maugham in 1982, some 43 years ago, when we were being treated to a private jet flight to the Hannover International Fair. We were both English; I was living in Tollesbury, Essex, near Colchester, and Richard was living near Ealing, West London.
The common thread was that all the passengers were major sellers of the Commodore ‘PET’.
Richard and I hit it off straight away. Richard was a fellow salesman. I was ex-IBM Office Products Division and Richard was ex-Olivetti.
Both of us also volunteered for the Prince’s Youth Business Trust, a charity headed by the Prince of Wales, as he was then, helping young people start their own businesses.
The Prince of Wales, me, and Richard Maugham
Gillie and Colin, a couple who know Richard, recently sent me the following email:
Dear Paul,
I write to inform you that Richard passed away today at 2.50 GMT.
Both Colin and I were at his hospital bedside.
Our thoughts and condolences are with you.
Gillie and Colin.
The ‘today’ in the above email was Sunday, 9th February, 2025.
Thinking it through thanks to a recent issue of Skeptical Inquirer.
Melanie Trecek-King is the creator of Thinking Is Power and an associate professor of biology at Massasoit Community College, where she teaches a science course designed to equip her students with essential critical thinking, information literacy, and science literacy skills.
The article was published in the November/December 2024 issue of the magazine. I believe it is free to share.
ooOOoo
Most people agree that critical thinking is an important skill that should be taught in schools. And most educators think they teach critical thinking. I know I did. After all, I was a science educator, and science is critical thinking. Isn’t it?
For years, I taught general-education biology, a course commonly taken by undergraduates who aren’t science majors. And while I love biology, I grew more and more frustrated with the content. I asked myself: If I had one semester to teach the average student what they need to know about the process of science and critical thinking, what would it look like?
Thankfully, my college allowed me to replace my traditional introductory biology course with a course titled Science for Life, designed to teach critical thinking, information literacy, and science literacy skills (Trecek-King 2022). Since my conversion, I’ve been sharing with anyone who will listen the value of teaching critical thinking.
Yet conversations with Bertha Vazquez, director of education for the Center for Inquiry, gave me pause. In a recent podcast conversation with the two of us and Daniel Reed (of the West Virginia Skeptics Society), Vazquez was adamant. Educators do teach critical thinking: the Next Generation Science Standards (NGSS) require students to ask questions, plan and carry out investigations, analyze and interpret data, construct explanations, and engage in arguments from evidence.
As a science communicator, I constantly fight misconceptions around certain terms. Theory and skepticism are prime examples. So imagine my surprise (and embarrassment) when I realized that, as a critical thinking educator, I had overlooked an important first step in critical thinking: defining terms. The irony.
What Is Critical Thinking?
While we can all agree that it’s important to teach critical thinking, there’s not always agreement on what we mean by the term.
In his book Critical Thinking, Jonathan Haber (2020) explains how the concept emerged and some of the ways it’s currently defined. John Dewey, in his 1910 work How We Think, proposed one of the first modern definitions of reflective thinking, describing it as an “active, persistent, and careful consideration of any belief” (Dewey 1910, 6).
In his 1941 dissertation, Edward Glaser identified three components of critical thinking: “(1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one’s experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods” (Glaser 1941, 5–6). That same year, Glaser and Goodwin Watson published the Watson-Glaser Tests of Critical Thinking (now the Watson-Glaser Critical Thinking Appraisal), a widely used standardized test for assessing critical thinking skills.
Critical thinking’s “big bang” moment, according to Haber, came in the early 1980s when the state of California (Harmon 1980, 3) mandated that all students in its university system complete a course that teaches “an understanding of the relationship of language to logic, leading to the ability to analyze, criticize and advocate ideas, reason inductively and deductively, and reach factual or judgmental conclusions based on sound inferences drawn from unambiguous statements of knowledge or belief.”
And the Delphi Report (Facione 1990, 2), in which Peter Facione worked with critical thinking experts to create a consensus definition, concludes that critical thinking is a “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based.”
From these (and many other) definitions, Haber identifies three interconnected parts of critical thinking: knowledge of critical thinking components, such as logic and argumentation; the skills to put the knowledge to use in real-world situations; and the dispositions needed to prioritize critical thinking honestly and ethically.
This problem-solving view of critical thinking forms the basis of many of the current educational standards, including the NGSS and Common Core, which ask students to think deeply within a specific domain. And it is one scientists themselves use when trying to understand issues.
These are worthy educational goals to be sure. However, in my experience teaching general-education biology, I’ve come to realize that this approach is incomplete.
If critical thinking requires deep knowledge, then our ability to analyze topics is limited to areas in which we possess sufficient expertise. Pedagogy that encourages “independent” thinking outside these areas can have the unintended consequence of teaching students to overestimate their abilities. The best minds know they can’t know everything. Even experts rely on other experts and sources.
Additionally, in the classroom, students are provided with reliable content to analyze critically. In the “real world,” these guardrails are nonexistent. Not only is misinformation ubiquitous, but disinformation purveyors also exploit our biases and emotions to manipulate our reasoning.
And finally, we can’t address science misinformation, from evolution to vaccines to climate change, by giving students more content knowledge. We don’t fall for science denial and pseudoscience because we don’t have the facts but because of our emotions, desires, identities, and biases.
I now have a better understanding of what my colleague and friend Andy Norman means when he says that critical thinking suffers from a branding problem.
Yes, And …?
Using the above definition(s), I was teaching my biology students how to think critically. For example, I didn’t just ask them to memorize the stages of mitosis but to explore the mutations that could disrupt the cell cycle and lead to cancer. But to what end? If (or when) my former students are touched by cancer, will they remember how proto-oncogenes and tumor suppressor genes can lead to unregulated cell growth? Is that even what they need to know? I argue that, especially for students who aren’t going to be scientists, it’s far more important to teach students how and why the process of science results in reliable knowledge … and how to find it.
My Science for Life course and my Thinking Is Power resource are both based on the same premise. Knowledge may be power, but there’s too much to know. Even more, knowledge is a process; it’s not just what we know but how we know. It’s not just a noun but a verb. When we need reliable knowledge, can we find it and use it to make wiser decisions? And how do we know what information to ignore?
As a science educator, I want my students to understand how the process of science produces knowledge and why it’s reliable. Why aren’t comments such as “it worked for me” or “I know what I saw” sufficient evidence? I’ve come to realize that an essential—and often overlooked—ingredient is why we need science in the first place.
Richard Feynman famously said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.” Science is how we correct for this tendency toward self-deception. That’s why I spend the first third of the semester exploring how we come to our beliefs, the limits of our perception and memory, the importance of skepticism, the cognitive biases that can lead our thinking astray, and the logical fallacies we use to convince ourselves (and others) that our conclusions are justified.
Influential voices in the skeptical community played a crucial role in shaping the ingredients of critical thinking I use in Science for Life and Thinking Is Power, which include the following:
Being aware of our limitations: Understanding that our perception and memory are flawed, and the biases and heuristics our brains rely on to make fast and easy decisions can lead us astray.
Arguing with evidence and logic: Using arguments that are well-structured and supported by evidence. This includes understanding how the different types of arguments work (i.e., deductive, inductive, and abductive) and avoiding logical fallacies.
Thinking about our thinking (metacognition): Actively examining and questioning our own thought processes—including the source of our knowledge, assumptions, intuitions, motivations, emotions, and biases—and how they might influence our judgments.
Embracing nuance and uncertainty: Avoiding the black-or-white thinking that can lead to oversimplified conclusions and accepting that our knowledge is never perfect or complete.
Seeking objectivity: Actively working to counter the limitations that prevent us from accurately understanding the world. This includes seeking diverse perspectives, separating our identity from our beliefs, and prioritizing accuracy over ego.
Having curiosity and open-mindedness: Possessing a desire to learn and understand by asking questions and seeking out information, even if it contradicts what we want to believe.
Maintaining healthy skepticism: Balancing gullibility and doubt and proportioning our beliefs to the available evidence. And remembering that claims made without evidence can be dismissed without evidence and extraordinary claims require extraordinary evidence.
Exhibiting intellectual humility: Recognizing the limitations of our knowledge, being open to the expertise of others, and being willing to change our minds with evidence.
In my experience, giving students this foundation is essential for helping them become better consumers of information and science. Without an awareness that our emotions and existing beliefs can drive our reasoning, search engines and low-quality sources become tools to confirm our biases. And without an understanding of how our identities and worldviews can alter our standards of evidence, pseudoscience and science denial provide cover for what we want or don’t want to believe.
The logic of science’s practices, from carefully controlling experimental variables to making the findings available to other experts for scrutiny and replication, falls into place once students understand the problems it’s addressing. Simply put, science is our shield against self-deception.
Now instead of asking students to think critically about the biology of cancer, I teach them how to evaluate sources to find reliable information, how to recognize pseudoscientific “treatments,” and how their need for hope and answers makes them vulnerable to misinformation.
The Take-Home Message
I have no doubt that most educators teach critical thinking. But for pedagogical and communication purposes, it would be beneficial to clarify what we mean—and just as importantly to ask ourselves what we want our students to learn.
The dominant view of critical thinking in education is problem-solving in specific domains, which is absolutely a valuable skill. However, many skeptics view critical thinking as good thinking in a broader sense. My own teaching shifted toward this latter framework after I realized that problem-solving skills are insufficient without a foundation in better thinking. We may be born with the ability to think, but we must be taught to think well, and our primate brains aren’t adapted to today’s tidal wave of misinformation.
I’m grateful to the skeptical community for challenging my assumptions about critical thinking and my friend Bertha Vazquez for encouraging me to think more deeply about the good work science educators do in their classrooms every day. This article is the result of my attempt to reconcile what critical thinking means to educators and what it means to skeptics, and my hope is that it opens a conversation about how we can better serve our students. Maybe it will even start a critical thinking revolution, especially in science education.
Let the critical thinking revolution begin!
Acknowledgments
My thanks go to those on this brief list of skeptical thinkers/authors who’ve influenced my understanding of critical thinking: James Alcock, Timothy Caulfield, John Cook, Brian Dunning, Julia Galef, Adam Grant, David Robert Grimes, Jon Guy, Harriet Hall, Guy Harrison, Daniel Kahneman, David McRaney, Steven Novella, Carl Sagan, Michael Shermer, and Carol Tavris.
Special thanks to Bertha Vazquez, Daniel Pimentel, Andy Norman, Daniel Reed, and Jon Guy for their helpful feedback on this article.
References
Dewey, John. 1910. How We Think. Lexington, MA: D.C. Heath and Company.
Facione, Peter A. 1990. Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction—The Delphi Report. Millbrae, CA: California Academic Press.
Glaser, Edward M. 1941. An Experiment in the Development of Critical Thinking. New York, NY: Teachers College, Columbia University.
Haber, Jonathan. 2020. Critical Thinking. Cambridge, MA: The MIT Press.
Harmon, Harry. 1980. Executive Order No. 338: General Education-Breadth Requirements. The California State University and Colleges.
I hope others have read this fascinating article and will echo the statement: ‘While we can all agree that it’s important to teach critical thinking, there’s not always agreement on what we mean by the term.’
We live on a profoundly ancient and beautiful planet.
I have followed the photographic website Ugly Hedgehog for some time. Recently, in its Photo Gallery section, ‘greymule’ from Colorado posted an entry called ‘A Couple of Desert Scenes’, and I will display just one of his images from that post.
It makes a wonderful connection to today’s post which is from The Conversation.
ooOOoo
Evidence from Snowball Earth found in ancient rocks on Colorado’s Pikes Peak – it’s a missing link
Rocks can hold clues to history dating back hundreds of millions of years. Christine S. Siddoway
Around 700 million years ago, the Earth cooled so much that scientists believe massive ice sheets encased the entire planet like a giant snowball. This global deep freeze, known as Snowball Earth, endured for tens of millions of years.
The Snowball Earth hypothesis has been largely based on evidence from sedimentary rocks exposed in areas that once were along coastlines and shallow seas, as well as climate modeling. Physical evidence that ice sheets covered the interior of continents in warm equatorial regions had eluded scientists – until now.
In new research published in the Proceedings of the National Academy of Sciences, our team of geologists describes the missing link, found in an unusual pebbly sandstone encapsulated within the granite that forms Colorado’s Pikes Peak.
Earth iced over during the Cryogenian Period, but life on the planet survived. NASA illustration
Solving a Snowball Earth mystery on a mountain
Pikes Peak, originally named Tavá Kaa-vi by the Ute people, lends its ancestral name, Tava, to these notable rocks. They are composed of solidified sand injectites, which formed in a similar manner to a medical injection when sand-rich fluid was forced into underlying rock.
A possible explanation for what created these enigmatic sandstones is the immense pressure of an overlying Snowball Earth ice sheet forcing sediment mixed with meltwater into weakened rock below.
Dark red to purple bands of Tava sandstone dissect pink and white granite. The Tava is also cross-cut by silvery-gray veins of iron oxide. Liam Courtney-Davies
An obstacle for testing this idea, however, has been the lack of an age for the rocks to reveal when the right geological circumstances existed for sand injection.
We found a way to solve that mystery, using veins of iron found alongside the Tava injectites, near Pikes Peak and elsewhere in Colorado.
A 5-meter-tall, almost vertical Tava dike is evident in this section of Pikes Peak granite. Liam Courtney-Davies
Iron minerals contain very low amounts of naturally occurring radioactive elements, including uranium, which slowly decays to the element lead at a known rate. Recent advancements in laser-based radiometric dating allowed us to measure the ratio of uranium to lead isotopes in the iron oxide mineral hematite to reveal how long ago the individual crystals formed.
The iron veins appear to have formed both before and after the sand was injected into the Colorado bedrock: We found veins of hematite and quartz that both cut through Tava dikes and were crosscut by Tava dikes. That allowed us to figure out an age bracket for the sand injectites, which must have formed between 690 million and 660 million years ago.
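The age bracketing described above rests on the standard uranium–lead decay relation: uranium-238 decays to lead-206 with a half-life of about 4.47 billion years, so the measured daughter-to-parent ratio in a crystal gives its age as t = (1/λ)·ln(1 + Pb/U). A minimal sketch of that arithmetic (the ratio value below is purely illustrative, chosen to land near the 690-million-year end of the bracket; it is not a measurement from the paper):

```python
import math

# Half-life of uranium-238 in years (a well-established physical constant)
U238_HALF_LIFE = 4.468e9
DECAY_CONSTANT = math.log(2) / U238_HALF_LIFE  # lambda, per year

def u_pb_age(pb206_u238_ratio):
    """Age in years from a measured 206Pb/238U ratio,
    via the standard decay relation t = (1/lambda) * ln(1 + D/P)."""
    return math.log(1 + pb206_u238_ratio) / DECAY_CONSTANT

# A hypothetical ratio of ~0.113 corresponds to roughly 690 million years
age_years = u_pb_age(0.113)
print(f"{age_years / 1e6:.0f} million years")
```

In practice, laser-ablation dating of hematite involves many corrections (common lead, instrument calibration) beyond this simple relation, but the core logic of turning an isotope ratio into an age is as above.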
So, what happened?
The time frame means these sandstones formed during the Cryogenian Period, from 720 million to 635 million years ago. The name is derived from “cold birth” in ancient Greek and is synonymous with climate upheaval and disruption of life on our planet – including Snowball Earth.
University of Exeter professor Timothy Lenton explains why the Earth was able to freeze over.
The Tava found on Pikes Peak would have formed close to the equator within the heart of an ancient continent named Laurentia, which, over long tectonic cycles, gradually moved to its current northerly position in North America.
The origin of Tava rocks has been debated for over 125 years, but the new technology allowed us to conclusively link them to the Cryogenian Snowball Earth period for the first time.
The scenario we envision for how the sand injection happened looks something like this:
A giant ice sheet with areas of geothermal heating at its base produced meltwater, which mixed with quartz-rich sediment below. The weight of the ice sheet created immense pressures that forced this sandy fluid into bedrock that had already been weakened over millions of years. Similar to fracking for natural gas or oil today, the pressure cracked the rocks and pushed the sandy meltwater in, eventually creating the injectites we see today.
Clues to another geologic puzzle
Not only do the new findings further cement the global Snowball Earth hypothesis, but the presence of Tava injectites within weak, fractured rocks once overridden by ice sheets provides clues about other geologic phenomena.
Time gaps in the rock record created through erosion, referred to as unconformities, can be seen today across the United States, most famously at the Grand Canyon, where in places over a billion years of time is missing. Unconformities occur when a sustained period of erosion removes rock and prevents newer layers from forming, leaving an unconformable contact.
Unconformity in the Grand Canyon is evident here where horizontal layers of 500-million-year-old rock sit on top of a mass of 1,800-million-year-old rocks. The unconformity, or ‘time gap,’ demonstrates that years of history are missing. Mike Norton via Wikimedia, CC BY-SA
Our results support the view that the Great Unconformity near Pikes Peak must have formed prior to Cryogenian Snowball Earth. That’s at odds with hypotheses that attribute the formation of the Great Unconformity to large-scale erosion by Snowball Earth ice sheets themselves.
We hope the secrets of these elusive Cryogenian rocks in Colorado will lead to the discovery of further terrestrial records of Snowball Earth. Such findings can help develop a clearer picture of our planet during climate extremes and the processes that led to the habitable planet we live on today.
Back in the late 1950s, when my mother remarried after my father’s death, ‘Dad’, as he was called when he came to live at Toley Avenue, Preston Road, London, taught me how to construct a crystal set: a simple radio receiver built around a crystal detector. I have a memory of listening to Quincy Jones on my crystal set and loving the rhythm.
“Music is sacred to me,” Quincy Jones once said. “Melody is God’s voice.”
He certainly had the divine touch.
Jones, who died at the age of 91, was the right-hand man to both Frank Sinatra and Michael Jackson, and helped to shape the sound of jazz and pop over more than 60 years.
His recordings revolutionised music by crossing genres, promoting unlikely collaborations and shaping modern production techniques.
Some of the best evidence for this comes from the behavior of two of the most powerful beings of the Maya world: The first is a creator god whose name is still spoken by millions of people every fall – Huracán, or “Hurricane.” The second is a god of lightning, K’awiil, from the early first millennium C.E.
As a scholar of the Indigenous religions of the Americas, I recognize that these beings, though separated by over 1,000 years, are related and can teach us something about our relationship to the natural world.
Huracán, the ‘Heart of Sky’
Huracán was once a god of the K’iche’, one of the Maya peoples who today live in the southern highlands of Guatemala. He was one of the main characters of the Popol Vuh, a religious text from the 16th century. His name probably originated in the Caribbean, where other cultures used it to describe the destructive power of storms.
The K’iche’ associated Huracán, which means “one leg” in the K’iche’ language, with weather. He was also their primary god of creation and was responsible for all life on earth, including humans.
Because of this, he was sometimes known as U K’ux K’aj, or “Heart of Sky.” In the K’iche’ language, k’ux was not only the heart but also the spark of life, the source of all thought and imagination.
Yet, Huracán was not perfect. He made mistakes and occasionally destroyed his creations. He was also a jealous god who damaged humans so they would not be his equal. In one such episode, he is believed to have clouded their vision, thus preventing them from being able to see the universe as he saw it.
Huracán was one being who existed as three distinct persons: Thunderbolt Huracán, Youngest Thunderbolt and Sudden Thunderbolt. Each of them embodied different types of lightning, ranging from enormous bolts to small or sudden flashes of light.
Despite the fact that he was a god of lightning, there were no strict boundaries between his powers and the powers of other gods. Any of them might wield lightning, or create humanity, or destroy the Earth.
Another storm god
The Popol Vuh implies that gods could mix and match their powers at will, but other religious texts are more explicit. One thousand years before the Popol Vuh was written, there was a different version of Huracán called K’awiil. During the first millennium, people from southern Mexico to western Honduras venerated him as a god of agriculture, lightning and royalty.
Illustrations of K’awiil can be found everywhere on Maya pottery and sculpture. He is almost human in many depictions: He has two arms, two legs and a head. But his forehead is the spark of life – and so it usually has something that produces sparks sticking out of it, such as a flint ax or a flaming torch. And one of his legs does not end in a foot. In its place is a snake with an open mouth, from which another being often emerges.
Indeed, rulers, and even gods, once performed ceremonies to K’awiil in order to try and summon other supernatural beings. As personified lightning, he was believed to create portals to other worlds, through which ancestors and gods might travel.
Representation of power
For the ancient Maya, lightning was raw power. It was basic to all creation and destruction. Because of this, the ancient Maya carved and painted many images of K’awiil. Scribes wrote about him as a kind of energy – as a god with “many faces,” or even as part of a triad similar to Huracán.
He was everywhere in ancient Maya art. But he was also never the focus. As raw power, he was used by others to achieve their ends.
Moreover, Maya artists always had K’awiil doing something or being used to make something happen. They believed that power was something you did, not something you had. Like a bolt of lightning, power was always shifting, always in motion.
An interdependent world
Because of this, the ancient Maya thought that reality was not static but ever-changing. There were no strict boundaries between space and time, the forces of nature or the animate and inanimate worlds.
Residents wade through a street flooded by Hurricane Helene, in Batabano, Mayabeque province, Cuba, on Sept. 26, 2024. AP Photo/Ramon Espinosa
Everything was malleable and interdependent. Theoretically, anything could become anything else – and everything was potentially a living being. Rulers could ritually turn themselves into gods. Sculptures could be hacked to death. Even natural features such as mountains were believed to be alive.
The illusion is not that different things exist. Rather it is that they exist independent from one another. Huracán, in this sense, damaged himself by damaging his creations.
Hurricane season every year should remind us that human beings are not independent from nature but part of it. And like Huracán, when we damage nature, we damage ourselves.
My mother and father were atheists so when I was born in 1944 it was obvious that I would be brought up as an atheist. Same for my sister, Elizabeth, born in 1948. It was amazing that when I met Jean in Mexico in 2007 that she, too, was an atheist. That was on top of the fact that we were both born in North London some 26 miles apart. Talk about fate!
I was abused as a child. The abuse to which I was subjected is called “child indoctrination,” a type of brainwashing considered noble and necessary and, therefore, the most natural thing in the world.
My mother took me to the Seventh-day Adventist Church, an American denomination known for keeping the Sabbath and emphasizing the advent, or return, of Jesus. Adventists boast that they are the only ones to interpret the Bible the way its author wanted. Consequently, they deem themselves the most special creatures to God—so special that they’ll soon arouse the envy and wrath of all other denominations and religions, which, under the command of the beasts of Revelation (the American government and the Catholic Church), will persecute them. Adventists believe that the Earth was created in six days, that it is 6,000 years old, and that dinosaurs are extinct because they were too big to be saved on Noah’s ark.
It closes thus:
I don’t want to believe; I want to know. Atheism is a natural result of intellectual honesty.
The author of the article, Paulo Bittencourt, is described as:
Paulo Bittencourt was born in Castro, Brazil, spent his childhood in Rio de Janeiro, and studied theology in São Paulo. Close to becoming a pastor, he went on an adventure to Europe and ended up settling in Austria, where he trained as an opera singer. Bittencourt is the author of the books Liberated from Religion: The Inestimable Pleasure of Being a Freethinker and Wasting Time on God: Why I Am an Atheist.
I am writing this having listened to a programme on BBC Radio 4. (It was broadcast on Tuesday, August 13th.) It shows how many, many people can have a really positive response to a dastardly negative occurrence such as the COVID pandemic.
Every Friday, volunteers gather on the Albert Embankment at the River Thames in London to lovingly retouch thousands of red hearts inscribed on a Portland stone wall directly opposite the Houses of Parliament. Each heart is dedicated to a British victim of COVID. It is a deeply social space – a place where the COVID bereaved come together to honour their dead and share memories.
The so-called National Covid Memorial Wall is not, however, officially sanctioned. In fact, ever since activists from COVID-19 Bereaved Families for Justice (CBFFJ) daubed the first hearts on the wall in March 2021, it has been a thorn in the side of the authorities.
Featured in the media whenever there is a new revelation about partygate, the wall is a symbol of the government’s blundering response to the pandemic and an implicit rebuke to former prime minister Boris Johnson and other government staff who breached coronavirus restrictions.
As one writer put it, viewed from parliament the hearts resemble “a reproachful smear of blood”. Little wonder that the only time Johnson visited the wall was under the cover of darkness to avoid the TV cameras. His successor Rishi Sunak has been similarly reluctant to acknowledge the wall or say what might take its place as a more formal memorial to those lost in the pandemic.
Though in April the UK Commission on COVID Commemoration presented Sunak with a report on how the pandemic should be remembered, he has yet to reveal the commission’s recommendations.
Lady Heather Hallett, the former high court judge who chairs the public inquiry into COVID, has attempted to acknowledge the trauma of the bereaved by commissioning a tapestry to capture the experiences of people who “suffered hardship and loss” during the pandemic. Yet such initiatives are no substitute for state-sponsored memorials.
What is remembered and what is forgotten?
This political vacuum is odd when you consider that the United Kingdom, like other countries, engages in many other commemorative activities central to national identity. The fallen of the first world war and other military conflicts are commemorated in a Remembrance Sunday ceremony held every November at the Cenotaph in London, for example.
But while wars lend themselves to compelling moral narratives, it is difficult to locate meaning in the random mutations of a virus. And while wars draw on a familiar repertoire of symbols and rituals, pandemics have few templates.
For instance, despite the fact that it killed more than 50 million people globally, the 1918–1919 “Spanish” influenza pandemic has virtually no memorials. Nor does the UK have a memorial to victims of HIV/AIDS. As the memory studies scholar Astrid Erll puts it, pandemics have not been sufficiently “mediated” in collective memory.
As a rule, they do not feature in famous paintings, novels or films or in the oral histories passed down as part of family lore. Nor are they able to draw on familiar cultural materials such as poppies, gun carriages, catafalques and royal salutes. Without such symbols and schemata, Erll argues, we struggle to incorporate pandemics into our collective remembering systems.
This lacuna was brought home to me last September when tens of thousands of Britons flocked to the south bank of the Thames to pay their respects to Britain’s longest-serving monarch. By coincidence, the police directed the queue for the late Queen’s lying-in-state in Westminster Hall over Lambeth Bridge and along Albert Embankment.
But few of the people I spoke to in the queue seemed to realise what the hearts signified. It was as if the spectacle of a royal death had eclipsed the suffering of the COVID bereaved, rendering the wall all but invisible.
Waiting for answers
Another place where the pandemic could be embedded in collective memory is at the public inquiry. Opening the preliminary hearing last October into the UK’s resilience and preparedness for a pandemic, Lady Hallett promised to put the estimated 6.8 million Britons mourning the loss of a family member or friend to COVID at the heart of the legal process. “I am listening to them; their loss will be recognised,” she said.
But though Lady Hallett has strategically placed photographs of the hearts throughout the inquiry’s offices in Bayswater and has invited the bereaved to relate their experiences to “Every Story Matters”, the hearing room is dominated by ranks of lawyers. And except when a prominent minister or official is called to testify, the proceedings rarely make the news.
This is partly the fault of the inquiry process itself. The hearings are due to last until 2025, with the report on the first stage of the process not expected until the summer of 2024. As Lucy Easthope, an emergency planner and veteran of several disasters, puts it: “one of the most painful frustrations of the inquiry will be temporal. It will simply take too long.”
The inquiry has also been beset by bureaucratic obfuscation, not least by the Cabinet Office which attempted (unsuccessfully in the end) to block the release of WhatsApp messages relating to discussions between ministers and Downing Street officials in the run-up to lockdown.
To the inquiry’s critics, the obvious parallel is with the Grenfell inquiry, which promised to “learn lessons” from the devastating fire that engulfed the west London tower in 2017 but has so far ended up blurring the lines of corporate responsibility and forestalling a political reckoning.
The real work of holding the government to account and making memories takes place every Friday at the wall and the other places where people come together to spontaneously mourn and remember absent loved ones. These are the lives that demand to be “seen”. They are the ghosts that haunt our amnesic political culture.
I have followed Patrice Ayme for many years. Some of his posts are highly intellectual, and those I struggle to understand.
But a post published on September 8th, 2024 was very easy for me, and countless others no doubt, to understand and I have pleasure in republishing it on Learning from Dogs today.
I doubt that there are civilizations around. We are facing a galaxy devoid of intelligent aliens: little green mats out there, not little green men.
First, we don’t see them. With foreseeable technology (hibernation, nuclear propulsion, compact thermonuclear reactors, bioengineering, AI, quantum computers) we should be able to send very large interstellar spaceships at 1,000 kilometers per second… Thus it would take a millennium to colonize the Centaur tri-star system… 25,000 years to colonize a ball 200 light-years across… And the entire galaxy in ten million years… Wars would only accelerate the expansion. So if there were a galactic civilization, within ten million years it would have spanned the entire galaxy and its presence should be in sight.
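As a rough sanity check of the timescales quoted above, a ship at 1,000 km/s travels about 300 times slower than light, so a distance in light-years takes roughly 300 times that many years to cross. The sketch below (my own back-of-envelope arithmetic, not Patrice’s; the 4.37 light-year figure is the standard distance to the Alpha Centauri system) shows the quoted estimates are the right order of magnitude:

```python
# Back-of-envelope travel times at an assumed cruise speed of 1,000 km/s,
# ignoring acceleration, deceleration, and stopovers along the way.

C_KM_S = 299_792.458            # speed of light, km/s
SHIP_KM_S = 1_000.0             # assumed ship speed
SLOWDOWN = C_KM_S / SHIP_KM_S   # ~300x slower than light

def travel_years(light_years: float) -> float:
    """Years to cross a distance given in light-years at SHIP_KM_S."""
    return light_years * SLOWDOWN

print(f"Alpha Centauri (~4.37 ly):  {travel_years(4.37):,.0f} years")
print(f"100 ly (radius of 200 ly ball): {travel_years(100):,.0f} years")
print(f"Galactic disc (~100,000 ly):    {travel_years(100_000):,.0f} years")
```

The first figure comes out near 1,300 years (“a millennium”), the second near 30,000 years, and the third near 30 million — close to the post’s 25,000 years and ten million years once you allow for a colonization front advancing from many settled systems at once rather than a single ship crossing end to end.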
Second, life took nearly four billion years to evolve animals. Bacterial life could have been nearly extinguished on Earth many times… if only during the Snowball Earth episodes. A star whizzing by could have launched a thousand large comets. The large planets could have fallen inward.
Third, ultra-intelligent life may not have hands or tentacles and thus may be unable to develop industry. Once intelligent life forms have evolved, say sea lions or parrots, let alone wolves, they may just be sitting ducks for the next disaster, which would revert life to the bacterial level, erasing billions of years of evolution.
Fourth, once civilization is launched, it can fail… And not get a second chance (for lack of easily mined ores after the easy pickings were exhausted by the initial civilization).
So I don’t expect little green men… Besides those sent by the perverse Putin…
Just when we thought we knew of all the stress, here is another one: if we go extinct, the universe loses its soul! We have thus found a new Superior Moral Directive: SUS, Save Unique Soul!
Expect little green mats, not little green men.
Patrice Ayme
ooOOoo
A couple of weeks ago I gave a talk to our local Freethinkers group and called it The Next Ten Years. It began thus:
This presentation is about the world of the future; of the near future. And the biggest issue, most agree, is the change in the climate.
The global temperature anomaly, as of last year, 2023, is 1.17°C (2.11°F) above the long-term average from 1951 to 1980. The 10 most recent years are the warmest years on record.
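One small point worth noting about those two figures: a temperature *anomaly* is a difference, so converting it from Celsius to Fahrenheit uses only the 9/5 scaling factor, without the +32 offset that applies to absolute temperatures. A minimal check:

```python
def anomaly_c_to_f(delta_c: float) -> float:
    """Convert a temperature DIFFERENCE from Celsius to Fahrenheit.

    Differences scale by 9/5 only; the +32 offset applies to
    absolute temperatures, not to anomalies.
    """
    return delta_c * 9 / 5

print(round(anomaly_c_to_f(1.17), 2))  # -> 2.11
```

This is why 1.17°C corresponds to 2.11°F here, rather than the 34.11°F a naive absolute-temperature conversion would give.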
Antoine de Saint-Exupéry is quoted as saying that ‘a goal without a plan is just a wish.’ So my plan is to show you how we can change, no, let me put that more strongly, how we must change in the next ten years. Because our present habits are ruining the world.