Category: History

The DNA of dogs.

What is revealed in most dogs’ genes.

On November 24th this year, The Conversation published an article that spoke of the ancient genetic closeness of wolves and dogs.

I share it with you. It is a fascinating read.

ooOOoo

Thousands of genomes reveal the wild wolf genes in most dogs’ DNA.

Modern wolves and dogs both descend from an ancient wolf population that lived alongside woolly mammoths and cave bears. Iza Lyson/500px Prime via Getty Images

Audrey T. Lin, Smithsonian Institution and Logan Kistler, Smithsonian Institution

Dogs were the first of any species that people domesticated, and they have been a constant part of human life for millennia. Domesticated species are the plants and animals that have evolved to live alongside humans, providing nearly all of our food and numerous other benefits. Dogs provide protection, hunting assistance, companionship, transportation and even wool for weaving blankets.

Dogs evolved from gray wolves, but scientists debate exactly where, when and how many times dogs were domesticated. Ancient DNA evidence suggests that domestication happened twice, in eastern and western Eurasia, before the groups eventually mixed. That blended population was the ancestor of all dogs living today.

Molecular clock analysis of the DNA from hundreds of modern and ancient dogs suggests they were domesticated between around 20,000 and 22,000 years ago, when large ice sheets covered much of Eurasia and North America. The first dog identified in the archaeological record is a 14,000-year-old pup found in Bonn-Oberkassel, Germany, but it can be difficult to tell based on bones whether an animal was an early domestic dog or a wild wolf.

Despite the shared history of dogs and wolves, scientists have long thought these two species rarely mated and gave birth to hybrid offspring. As an evolutionary biologist and a molecular anthropologist who study domestic plants and animals, we wanted to take a new look at whether dog-wolf hybridization has really been all that uncommon.

Little interbreeding in the wild

Dogs are not exactly descended from modern wolves. Rather, dogs and wolves living today both derive from a shared ancient wolf population that lived alongside woolly mammoths and cave bears.

In most domesticated species, there are often clear, documented patterns of gene flow between the animals that live alongside humans and their wild counterparts. Where wild and domesticated animals’ habitats overlap, they can breed with each other to produce hybrid offspring. In these cases, the genes from wild animals are folded into the genetic variation of the domesticated population.

For example, pigs were domesticated in the Near East over 10,000 years ago. But when early farmers brought them to Europe, they hybridized so frequently with local wild boar that almost all of their Near Eastern DNA was replaced. Similar patterns can be seen in the endangered wild Anatolian and Cypriot mouflon that researchers have found to have high proportions of domestic sheep DNA in their genomes. It’s more common than not to find evidence of wild and domesticated animals interbreeding through time and sharing genetic material.

That wolves and dogs wouldn’t show that typical pattern is surprising, since they live in overlapping ranges and can freely interbreed.

Dog and wolf behavior are completely different, though, with wolves generally organized around a family pack structure and dogs reliant on humans. When hybridization does occur, it tends to be when human activities – such as habitat encroachment and hunting – disrupt pack dynamics, leading female wolves to strike out on their own and breed with male dogs. People intentionally bred a few “wolf dog” hybrid types in the 20th century, but these are considered the exception.

Luna Belle, a resident of the Wolf Sanctuary of Pennsylvania, which is home to both wolves and wolf dogs. Audrey Lin.

Tiny but detectable wolf ancestry

To investigate how much gene flow there really has been between dogs and wolves after domestication, we analyzed 2,693 previously published genomes, making use of massive publicly available datasets.

These included 146 ancient dogs and wolves covering about 100,000 years. We also looked at 1,872 modern dogs, including golden retrievers, Chihuahuas, malamutes, basenjis and other well-known breeds, plus more unusual breeds from around the world such as the Caucasian ovcharka and Swedish vallhund.

Finally, we included genomes from about 300 “village dogs.” These are not pets but are free-living animals that are dependent on their close association with human environments.

We traced the evolutionary histories of all of these canids by looking at maternal lineages via their mitochondrial genomes and paternal lineages via their Y chromosomes. We used highly sensitive computational methods to dive into the dogs’ and wolves’ nuclear genomes – that is, the genetic material contained in their cells’ nuclei.

We found the presence of wild wolf genes in most dog genomes and the presence of dog genes in about half of wild wolf genomes. The sign of the wolf was small but it was there, in the form of tiny, almost imperceptible chunks of continuous wolf DNA in dogs’ chromosomes. About two-thirds of breed dogs in our sample had wolf genes from crossbreeding that took place roughly 800 generations ago, on average.

While our results showed that larger, working dogs – such as sled dogs and large guardian dogs that protect livestock – generally have more wolf ancestry, the patterns aren’t universal. Some massive breeds such as the St. Bernard completely lack wolf DNA, but the tiny Chihuahua retains detectable wolf ancestry at 0.2% of its genome. Terriers and scent hounds typically fall at the low end of the spectrum for wolf genes.

A street – or free-ranging – dog in Tbilisi, Georgia. Alexkom000/Wikimedia Commons, CC BY

We were surprised that every single village dog we tested had pieces of wolf DNA in their genomes. Why would this be the case? Village dogs are free-living animals that make up about half the world’s dogs. Their lives can be tough, with short life expectancy and high infant mortality. Village dogs are also associated with pathogenic diseases, including rabies and canine distemper, making them a public health concern.

More often than predicted by chance, the stretches of wolf DNA we found in village dog genomes contained genes related to olfactory receptors. We imagine that olfactory abilities influenced by wolf genes may have helped these free-living dogs survive in harsh, volatile environments.

The intertwining of dogs and wolves

Because dogs evolved from wolves, all of dogs’ DNA is originally wolf DNA. So when we’re talking about the small pieces of wolf DNA in dog genomes, we’re not referring to that original wolf gene pool that’s been kicking around over the past 20,000 years, but rather evidence for dogs and wolves continuing to interbreed much later in time.

A wolf-dog hybrid with one of each kind of parent would carry 50% dog and 50% wolf DNA. If that hybrid then lived and mated with dogs, its offspring would be 25% wolf, and so on, until we see only small snippets of wolf DNA present.
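The halving described above is simple arithmetic. Here is a minimal Python sketch, purely my own illustration and not the study's method, of how the expected wolf fraction dilutes with each generation of backcrossing into dogs:

```python
# Illustrative sketch only: expected wolf fraction of the genome
# after an F1 wolf-dog hybrid's line keeps mating with pure dogs.
def wolf_fraction(backcrosses: int) -> float:
    """An F1 hybrid is 50% wolf; each subsequent mating with a
    pure dog halves the expected wolf fraction."""
    return 0.5 ** (backcrosses + 1)

for n in range(4):
    print(n, wolf_fraction(n))  # 0.5, 0.25, 0.125, 0.0625
```

After only a handful of generations, the expected contribution drops below 1%, which is why only small snippets of wolf DNA remain detectable.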

The situation is similar to one in human genomes: Neanderthals and humans share a common ancestor around half a million years ago. However, Neanderthals and our species, Homo sapiens, also overlapped and interbred in Eurasia as recently as a few thousand generations ago, shortly before Neanderthals disappeared. Scientists can spot the small pieces of Neanderthal DNA in most living humans in the same way we can see wolf genes within most dogs.

Even tiny Chihuahuas contain a little wolf within their doggy DNA. Westend61 via Getty Images

Our study updates the previously held belief that hybridization between dogs and wolves is rare; interactions between these two species do have visible genetic traces. Hybridization with free-roaming dogs is considered a threat to conservation efforts of endangered wolves, including Iberian, Italian and Himalayan wolves. However, there also is evidence that dog-wolf mixing might confer genetic advantages to wolves as they adapt to environments that are increasingly shaped by humans.

Though dogs evolved as human companions, wolves have served as their genetic lifeline. When dogs encountered evolutionary challenges such as how to survive harsh climates, scavenge for food in the streets or guard livestock, it appears they’ve been able to tap into wolf ancestry as part of their evolutionary survival kit.

Audrey T. Lin, Research Associate in Anthropology, Smithsonian Institution and Logan Kistler, Curator of Archaeobotany and Archaeogenomics, National Museum of Natural History, Smithsonian Institution

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

Well thanks to Audrey Lin and Logan Kistler for this very interesting study. So even modern dogs have visible traces of wolf in their DNA. It is yet another example of the ability of modern science to discover facts that were unknown a few decades ago.

A worldwide myth.

An incredible fact, as in the truth, that almost nobody will accept.

Until the 22nd November 2025, that is last Saturday, I believed this lie: a lie that spoke of the dangers, the hazards, the imminent end of the world as I believed it, as in Climate Change!

Very few of you will change your minds, of that I’m sure.

Nonetheless, I am going to republish a long article that was sent to me by my buddy, Dan Gomez.

ooOOoo

Latest Science Further Exposes Lies About Rising Seas

By Vijay Jayaraj

It’s all too predictable: A jet-setting celebrity or politician wades ceremoniously into hip-deep surf for a carefully choreographed photo op, while proclaiming that human-driven sea-level rise will soon swallow an island nation. Of course, the water is deeper than the video’s pseudoscience, which is as shallow as the theatrics.

The scientific truth is simple: Sea levels are rising, but the rate of rise has not accelerated. A new peer-reviewed study confirms what many other studies have already shown – that the steady rise of oceans is a centuries-long process, not a runaway crisis triggered by modern emissions of carbon dioxide (CO2).

For the past 12,000 years, during our current warm epoch known as the Holocene, sea levels have risen and fallen dramatically. For instance, during the 600-year Little Ice Age, which ended in the mid-19th century, sea levels dropped quite significantly.

The natural warming that began in the late 1600s reached a point around 1800 where summer loss of glacial ice began to exceed winter accumulation; glaciers began to shrink and seas to rise. By 1850, full-on glacial retreat was underway.

Thus, the current period of gradual sea-level increase began between 1800 and 1860, preceding any significant anthropogenic CO2 emissions by many decades. The U.S. Department of Energy’s 2025 critical review on carbon dioxide and climate change confirms this historical perspective.

“There is no good, sufficient or convincing evidence that global sea level rise is accelerating – there is only hypothesis and speculation. Computation is not evidence and unless the results can be practically viewed and measured in the physical world, such results must not be presented as such,” notes Kip Hansen, researcher and former U.S. Coast Guard captain.

New Study Confirms No Crisis

While activists speak of “global sea-level rise,” the ocean’s surface does not behave like water in a bathtub. Regional currents, land movements, and local hydrology all influence relative sea level. This is why local tide gauge data is important. As Hansen warns, “Only actually measured, validated raw data can be trusted. … You have to understand exactly what’s been measured and how.”

In addition, local tide-gauge data cannot be extrapolated to represent global sea level. This is because the geographic coverage of suitable locations for gauges is often poor, with the majority concentrated in the Northern Hemisphere. Latin America and Africa are severely under-represented in the global dataset. Hansen says, “The global tide gauge record is quantitatively problematic, but individual records can be shown as qualitative evidence for a lack of sea-level rise acceleration.”

A new 2025 study provides confirmation. Published in the Journal of Marine Science and Engineering, the study systematically dismantles the narrative of accelerating sea-level rise. It analyzed empirically derived long-term rates from datasets of sufficient length – at least 60 years – and incorporated long-term tide signals from suitable locations.

The startling conclusion: Approximately 95% of monitoring locations show no statistically significant acceleration of sea-level rise. It was found that the steady rate of sea-level rise – averaging around 1 to 2 millimeters per year globally – mirrors patterns observed over the past 150 years.

The study suggests that projections by the Intergovernmental Panel on Climate Change (IPCC), which often predicts rates as high as 3 to 4 millimeters per year by 2100, overestimate the annual rise by approximately 2 millimeters.
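To put that gap in perspective, here is a quick back-of-the-envelope calculation. This is my own illustration, using midpoints of the rates quoted above and an assumed 2025-2100 horizon, not a figure from the study:

```python
# Back-of-the-envelope illustration only; midpoint rates are taken from
# the article's ranges, and the 2025-2100 horizon is an assumption.
years = 2100 - 2025                       # 75-year horizon
observed_rate = 1.5                       # mm/yr, midpoint of 1-2 mm/yr
projected_rate = 3.5                      # mm/yr, midpoint of 3-4 mm/yr

observed_total = observed_rate * years    # 112.5 mm, about 11 cm
projected_total = projected_rate * years  # 262.5 mm, about 26 cm
print(projected_total - observed_total)   # prints 150.0 (mm of difference)
```

A difference on the order of 15 centimeters by century's end is the kind of margin that drives very different infrastructure decisions.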

This discrepancy is not trivial. It translates into billions of dollars in misguided infrastructure investments and adaptation policies that assume a far worse scenario than the data support, especially now that we know local, non-climatic phenomena are a plausible cause of the accelerated sea-level rise measured locally.

Rather than pursuing economically destructive initiatives to reduce greenhouse gas emissions on the basis of questionable projections and erroneous climate science, money and time should be invested in supporting coastal communities with accurate data for practical planning to adapt to local sea level rise.

Successful adaptation strategies have existed for centuries in regions prone to flooding and sea-level variations. The Netherlands is an excellent example of how engineering solutions can protect coastal populations even living below sea level.

Rising seas are real but not a crisis. What we have is a manageable, predictable phenomenon to which societies have adapted for centuries. To inflate it into an existential threat is to mislead, misallocate, and ultimately harm the communities that policymakers claim to protect.

This commentary was first published by PJ Media on September 10, 2025.

Vijay Jayaraj is a Science and Research Associate at the CO₂ Coalition, Fairfax, Virginia. He holds an M.S. in environmental sciences from the University of East Anglia and a postgraduate degree in energy management from Robert Gordon University, both in the U.K., and a bachelor’s in engineering from Anna University, India.

ooOOoo

I shall be returning to this important topic soon. Probably by republishing that 2025 Study referred to in the above article.

I hope that you read this post.

Thank you, Dan.

We humans are still evolving.

An article in The Conversation caught my eye.

We must never forget that evolution is always happening.

So without any more from me, here is that article.

ooOOoo

If evolution is real, then why isn’t it happening now? An anthropologist explains that humans actually are still evolving

Inuit people such as these Greenlanders have evolved to be able to eat fatty foods with a low risk of getting heart disease. Olivier Morin/AFP via Getty Images

Michael A. Little, Binghamton University, State University of New York


If evolution is real, then why is it not happening now? – Dee, Memphis, Tennessee


Many people believe that we humans have conquered nature through the wonders of civilization and technology. Some also believe that because we are different from other creatures, we have complete control over our destiny and have no need to evolve. Even though lots of people believe this, it’s not true.

Like other living creatures, humans have been shaped by evolution. Over time, we have developed – and continue to develop – the traits that help us survive and flourish in the environments where we live.

I’m an anthropologist. I study how humans adapt to different environments. Adaptation is an important part of evolution. Adaptations are traits that give someone an advantage in their environment. People with those traits are more likely to survive and pass those traits on to their children. Over many generations, those traits become widespread in the population.

The role of culture

We humans have two hands that help us skillfully use tools and other objects. We are able to walk and run on two legs, which frees our hands for these skilled tasks. And we have large brains that let us reason, create ideas and live successfully with other people in social groups.

All of these traits have helped humans develop culture. Culture includes all of our ideas and beliefs and our abilities to plan and think about the present and the future. It also includes our ability to change our environment, for example by making tools and growing food.

Although we humans have changed our environment in many ways during the past few thousand years, we are still changed by evolution. We have not stopped evolving, but we are evolving right now in different ways than our ancient ancestors. Our environments are often changed by our culture.

We usually think of an environment as the weather, plants and animals in a place. But environments include the foods we eat and the infectious diseases we are exposed to.

A very important part of the environment is the climate and what kinds of conditions we can live in. Our culture helps us change our exposure to the climate. For example, we build houses and put furnaces and air conditioners in them. But culture doesn’t fully protect us from extremes of heat, cold and the sun’s rays.

The Turkana people in Kenya have evolved to survive with less water than other people, which helps them live in a desert environment. Tony Karumba/AFP via Getty Images

Here are some examples of how humans have evolved over the past 10,000 years and how we are continuing to evolve today.

The power of the sun’s rays

While the sun’s rays are important for life on our planet, ultraviolet rays can damage human skin. Those of us with pale skin are in danger of serious sunburn and equally dangerous kinds of skin cancer. In contrast, those of us with a lot of skin pigment, called melanin, have some protection against damaging ultraviolet rays from sunshine.

People in the tropics with dark skin are more likely to thrive under frequent bright sunlight. Yet, when ancient humans moved to cloudy, cooler places, the dark skin was not needed. Dark skin in cloudy places blocked the production of vitamin D in the skin, which is necessary for normal bone growth in children and adults.

The amount of melanin pigment in our skin is controlled by our genes. So in this way, human evolution is driven by the environment – sunny or cloudy – in different parts of the world.

The food that we eat

Ten thousand years ago, our human ancestors began to tame or domesticate animals such as cattle and goats to eat their meat. Then about 2,000 years later, they learned how to milk cows and goats for this rich food. Unfortunately, like most other mammals at that time, human adults back then could not digest milk without feeling ill. Yet a few people were able to digest milk because they had genes that let them do so.

Milk was such an important source of food in these societies that the people who could digest milk were better able to survive and have many children. So the genes that allowed them to digest milk increased in the population until nearly everyone could drink milk as adults.

This process, which occurred and spread thousands of years ago, is an example of what is called cultural and biological co-evolution. It was the cultural practice of milking animals that led to these genetic or biological changes.

Other people, such as the Inuit in Greenland, have genes that enable them to digest fats without suffering from heart diseases. The Turkana people herd livestock in Kenya in a very dry part of Africa. They have a gene that allows them to go for long periods without drinking much water. This practice would cause kidney damage in other people because the kidney regulates water in your body.

These examples show how the remarkable diversity of foods that people eat around the world can affect evolution.

These bacteria caused a devastating pandemic nearly 700 years ago that led humans to evolve resistance to them.
Image Point FR/NIH/NIAID/BSIP/Universal Images Group via Getty Images

Diseases that threaten us

Like all living creatures, humans have been exposed to many infectious diseases. During the 14th century a deadly disease called the bubonic plague struck and spread rapidly throughout Europe and Asia. It killed about one-third of the population in Europe. Many of those who survived had a specific gene that gave them resistance against the disease. Those people and their descendants were better able to survive epidemics that followed for several centuries.

Some diseases have struck quite recently. COVID-19, for instance, swept the globe in 2020. Vaccinations saved many lives. Some people have a natural resistance to the virus based on their genes. It may be that evolution increases this resistance in the population and helps humans fight future virus epidemics.

As human beings, we are exposed to a variety of changing environments. And so evolution in many human populations continues across generations, including right now.


Michael A. Little, Distinguished Professor Emeritus of Anthropology, Binghamton University, State University of New York

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

This was published for the Curious Kids section of The Conversation.

However, I believe this is relevant for adults as well who are interested in the subject. I'm in my 80s and find this deeply interesting.

The Truth about Gods, part two

The concluding part of this essay by Patrice Ayme.

ooOOoo

Endowing aspects of the universe with spirituality, a mind of their own, is stupid in this day and age only if one forgets there are natural laws underlying them. But if one wants to feel less alone and more purposeful, it is pretty smart.

Patrice Ayme

Here is the inventor of monotheism: Nefertiti. Once a fanatic of Aten, the sun god, she turned cautious once Pharaoh in her own right, backpedalled and re-authorized Egyptian polytheism. (The sun-god, Sol Invictus, was revived by the Roman emperor Diocletian 17 centuries later, in his refounding of Romanitas and the empire. His much younger successor and contemporary, emperor Constantine, used the revived monotheism to impose his invention of Catholicism. Funny how small the conceptual world is.)

***

The preceding part (see Part One yesterday) contains many iconoclastic statements which made the Artificial Intelligence (AI) I consulted try to correct me with what were conventional, but extremely erroneous, ill-informed data points. The AI also used the deranged upside-down meta-argument that it is well known that Christianism is not like that, so I have got to be wrong. Well, no: I was raised as a Catholic child in two different Muslim countries, and also in a Pagan one; the Muslim faiths I knew as a child were as different from Sunni/Shia faiths as Christianism is, overall, from Islamism. In other words, I know the music of monotheism. So here are:

***

TECHNICAL NOTES: 

[1] To speak in philosophical lingo, we capture two civilizational “ontologies” (logics of existence):

  1. Polytheistic-personal: relational, distributed, ecological.
  2. Monotheistic-fascistic: hierarchical, authoritarian, abstracted.

[2] Paganus, in a religious sense, appears first in the Christian author Tertullian, around 202 CE, evoking paganus in Roman military jargon for ‘civilian, incompetent soldier’, by opposition to the competent soldiers (milites) of Christ that Tertullian was calling for.

[3] ‘FAIR OF FACE, Joyous with the Double Plume, Mistress of Happiness, Endowed with Favour, at hearing whose voice one rejoices, Lady of Grace, Great of Love, whose disposition cheers the Lord of the Two Lands.

With these felicitous epithets, inscribed in stone more than 3,300 years ago, on the monumental stelae marking the boundaries of the great new city at Tell el Amarna on the Nile in central Egypt, the Pharaoh Akhenaten extolled his Great Royal Wife, the chief queen, the beautiful Nefertiti.

Nefertiti (‘the beautiful one has come’) co-ruled Egypt with her older (and apparently famously ugly, deformed by disease) husband. Egypt was at its wealthiest. She was considered to be a DIVINITY. All her life is far from known, and is revealed one fragment of DNA or old text at a time. She ruled as sole Pharaoh after her husband’s death, and seems to have offered to marry a Hittite prince (as revealed by a recently found fragment: “I do not wish to marry one of my subjects. I am afraid…” she confessed in a letter to the amazed Hittite emperor). She apparently decided to re-allow the worship of the many Egyptian gods, and her adoptive son and successor Tutankhaten switched his name to Tutankhamen. Both she and Tutankhamen died, and they were replaced by a senior top general of Akhenaten, who relieved the dynasty both of too much inbreeding (hence the deformed Akhenaten) and of too much centralism focused on the sun-disk (‘Aten’).

[4] Those who do not know history have a small and ridiculous view of FASCISM. Pathetically they refer to simpletons, such as Hitler and Mussolini, to go philosophical on the subject. Google’s Gemini tried to pontificate that ‘labeling the structure of monotheism (especially its early forms) as fascistic is anachronistic and highly inflammatory. Fascism is a specific 20th-century political ideology. While the author means authoritarian and hierarchical, using “fascistic” distracts from the historical and philosophical points by introducing modern political baggage. It would be clearer and less polemical to stick to “Hierarchical” or “Authoritarian-Centralized”.’

I disagree virulently with this cognitive shortsightedness of poorly programmed AI. The Romans were perfectly aware of the meaning that the fasces symbolized (they got them from the Etruscans). So were the founders of the French and American republics aware of the importance of fascism and the crucial capabilities it provided to the powerful republics which, in the end, succeeded the Roman Republic (which died slowly under the emperors until it couldn’t get up); those two republics gave the basic mentality now ruling the planet.

Fascism is actually an instinct. Its malevolent and dumb confiscation by ignorant morons such as Hitler and Mussolini ended pathetically under the blows of regimes (the democracies on one side, the fascist USSR on the other) which were capable of gathering enough, and much more, and higher-quality fascism of their own to smother the cretinism of the genocidal tyrants under a carpet of bombs. It is actually comical, when reading old battle stories, to see the aghast Nazis out-Nazified by their Soviet opponents (discipline on the Soviet side was a lethal affair at any and all moments), or to see SS commanders outraged by the ferocity of their US opponents. At Bir Hakeim, a tiny French army, 3,000 strong, buried in the sands, blocked the entire Afrika Korps and the Italian army for weeks, under a deluge of bombs and shells, killing the one and only chance the Nazis had to conquer the Middle East. Hitler ordered the survivors executed; Rommel, who knew he was finished, disobeyed him.

***

Early Christianism was highly genocidal. The Nazi obsession with the Jews was inherited from Nero (who, unsatisfied with just crucifying Christians (64 CE), launched the annihilation wars against Israel) and then from the Christians themselves. There were hundreds of thousands of Samaritans, a type of Jew, with their own capital and temple (above Haifa). Warming up, after centuries of rage against civilization, the Christians under emperor Justinian, in the Sixth Century, nearly annihilated the Samaritans: a genocide by any definition.

Later, by their own count, at a time when Europe and the greater Mediterranean counted around 50 million inhabitants, the Christians, over centuries, killed no less than 5 million Cathars from Spain to Anatolia. Cathars, ‘the pure ones’ in Greek (a name given to them by their genociders), were a type of Christian. In France alone, in a period of twenty years, up to a million were killed (not all of them Cathars, but that accentuates the homicidal character). As a commander famously said: “Tuez-les tous, Dieu reconnaîtra les siens” (Kill them all, God will recognize his own). The anti-Cathar genocide drive in France, an aptly named ‘crusade’, something about the cross, lasted more than a century (and boosted the power of the Inquisition and the Dominicans). The extinction of Catharism was so great that we have only a couple of texts left.

Want to know about Christianism? Just look at the torture and execution device they brandish, the cross. Christianism literally gave torture and execution a bad name, and it’s all the most cynical hypocrisy hard at work. 

And so on. To abstract it in an impactful way, one could say that much of Christianism instigated Nazism. That’s one of the dirty little secrets of history, and rather ironical, as the dumb Hitler was anti-Christian and still acted like one, unbeknownst to himself, his public, and his critics; those in doubt can consult the descriptions of the Crusades by the Franks themselves, when roasting children was found to relieve hunger.

Chroniclers like Radulph of Caen (a Norman historian writing around 1118) described it vividly: “In Ma’arra our troops boiled pagan adults in cooking-pots; they impaled children on spits and devoured them grilled.” Other sources, such as Joinville, Fulcher of Chartres and Albert of Aachen, corroborate the desperation and brutality, though they express varying degrees of horror or justification. These acts were not systematic policy but extreme responses to the hunger and chaos of war, and they were preserved in Frankish narratives as part of the Crusade’s grim legacy. (There were also cases of cannibalism in WW2.)

Christianism, when not actively genocidal, certainly instigated a mood, a mentality, of genocide; read the Roman emperor Theodosius I on heresy. Here is the end of Theodosius’ famous quote: ‘According to the apostolic teaching and the doctrine of the Gospel, let us believe in the one deity of the Father and of the Son and of the Holy Spirit, in equal majesty and in a holy Trinity. We order the followers of this law to embrace the name of Catholic Christians; but as for the others, we judge them to be demented and ever more insane (dementes vesanosque iudicantes), we decree that they shall be branded with the ignominious name of heretics, and shall not presume to give to their conventicles the name of churches. They will suffer in the first place the chastisement of the divine condemnation and in the second the PUNISHMENT OF OUR AUTHORITY which in accordance with the will of Heaven WE SHALL DECIDE TO INFLICT.’

The ‘Men In Black’ of the Fourth Century destroyed books, libraries and intellectuals, ensuring the smothering of civilization, as intended (destruction of the Serapeum in Alexandria, home of the world’s largest library, around 391 CE). Contemporary writers like Eunapius and Libanius lamented the ‘rage for destruction’ of the Men In Black. Some non-Christian texts were preserved in monasteries, true, but the point is that Christianism made possible the destruction of non-Yahweh knowledge. This is the problem king David himself already had: the fascism, the power-obsessed little mind, of Yahweh. Monasteries were often built with a covert anti-Christian mentality; things were complicated. When queen Bathilde outlawed slavery (657 CE), her closest allies were bishops, yet she had to execute other, slave-holding bishops. She also founded and funded four monasteries. Soon the Frankish government passed a law enforcing secular teaching by religious establishments.

The uniforms of the Men In Black were copied later by the Dominicans (‘Black Friars’) who led the genocide of the Cathars, in co-operation with the Inquisition, also dressed in black, and then by the SS. Saint Louis expressed explicitly that nothing would bring him more joy than Jews suffering to death, evoking a knife planted in the belly of the unbeliever, and great pleasure. In Joinville’s Life of Saint Louis (c. 1309), Louis recounts the story of a knight who, during a debate with Jews, stabbed one in the belly with a dagger, saying it was better to “kill him like that” than argue theology. Louis presented this approvingly as zeal for the faith, and wished he could partake, though he warned he would not do it himself, as that would be illegal. With a faith like that, Louis IX could only be canonized, in 1297 CE. And, following Saint Louis’ hint, the Nazis removed his legal objection by changing the law in 1933, when they got to power.

Luther gave multiple and extensive ‘sincere advices‘ on how to proceed with the genocide of the Jews in his book “The Jews and Their Lies”. Here is a sample: “If I had to baptize a Jew, I would take him to the bridge of the Elbe, hang a stone around his neck and push him over, with the words, ‘I baptize thee in the name of Abraham.’”

But Musk’s AI, ‘Grok’, informed me that its basic axiom was that Christianism was never genocidal, but instead ‘suppressive‘. Then it thought hard for ‘nine seconds‘ to try to prove to me, with biased context, that I had exaggerated.

I had not.  

***

The entire church was into assassination madness, glorying in its own cruelty; the chief assassin of Hypatia, a sort of Charlie Manson to the power 1,000, was made into a saint: Saint Cyril. Cyril’s minions grabbed Hypatia, who had just finished giving a lecture, dragged her through the streets, stripped her of her clothing, and then of her skin, flaying her with oyster shells, causing her death. She was the top intellectual of the age. Hypatia met her torturous end in 415 CE. Cyril was made a saint 29 years later, in 444 CE! With saints like that, who needs Hitler?

Not to say Catholicism was useless; the jealous and genocidal, yet loving and all-knowing Yahweh was always a good excuse to massacre savages and extend civilization on the cheap. The Teutonic Knights, finding the Yahweh fanatics known as Muslims too hard a nut to crack, regrouped in Eastern Germany and launched a very hard fought crusade against the Prussian Natives, who were Pagans. After mass atrocities on both sides, the Teutons won.

The Franks embraced the capabilities of the cross, fully. Having converted to Catholicism, they were in a good position to subdue other Germans, who were Arians (and that they did, conquering Goths and Burgundians, Ostrogoths and Lombards). Three centuries later, Charlemagne used Christianism as an instrument to kill Saxons on an industrial scale, in the name of God, to finally subdue them, after the Saxons had terminally aggravated Romanitas for 800 years, driving Augustus crazy.

Charlemagne, in daily life, showing how relative Christianism was, and its true Jewish origins, used the nickname ‘David’, after King David, the monarch who refused to obey Yahweh, who had ordered David to genocidize a people (the petty, jealous God Yahweh then tortured David’s son to death).

Charlemagne lived the life of a hardened Pagan, with a de facto harem, etc. More viciously, Charlemagne passed laws pushing for secular, and thus anti-Christian, education, following in these respects a well-established Frankish custom. Charlemagne knew Christianism was a weapon, and he was careful to use it only on the Saxons; internally, there was maximum tolerance: Christians could become Jews, if they so wished.

PA

ooOOoo

I found the full essay quite remarkable. Jean has heard me rattle on about it numerous times since I first read the essay on November 2nd. I sincerely hope you will read it soon.

Finally, let me reproduce what I wrote in a response to Patrice’s post:

Patrice, in your long and fascinating article above, you have educated me in so many ways. My mother was an atheist and I was brought up in like fashion. But you have gone so much further in your teachings.

Your article deserves a further reading. But I am going to share it with my readers on LfD so many more can appreciate what you have written. Plus, I am going to republish it over two days.

Thank you, thank you, thank you!


The Truth about Gods, part one.

A brilliant essay by Patrice Ayme.

Patrice writes amazing posts, some of which are beyond me. But this one, The Personification Of The World, PAGANISM, Gives Us Friends Everywhere, is incredible.

My own position is that my mother and father were atheists and I was brought up as one, apart from a slip-up when I was married to my third wife, a Catholic: she left me, and I thought that by joining the Catholic Church I might win her back. (My subconscious fear of rejection.)

My subconscious fear of rejection was not revealed to me until the 50th anniversary of my father’s death in 2006 when I saw a local psychotherapist. Then I met Jean in December, 2007 and she was the first woman I truly loved!

Back to the essay; it is a long essay and I am going to publish the first half today and the second half tomorrow. (And I have made some tiny changes.)

ooOOoo

The Personification Of The World, PAGANISM, Gives Us Friends Everywhere

Abstract: Personification of the world (polytheism/paganism) is more pragmatic, psychologically rewarding, and ecologically sound than the hierarchical, abstracted structure of monotheism, which the author labels “fascistic.” [4]

***

Switching to a single fascist God, E Pluribus Unum, a single moral order replacing the myriad spirits of the world, was presented as great progress: now everybody could, and should, line up below the emperor, God’s representative on Earth, and obey the monarch and his or her gang. The resulting organization was called civilization. Submitting to God was the only way Rome could survive, because it endowed a shrinking army and tax base with more authority than they deserved.

However, peasants had to predict the world, and it was more judicious to personalize aspects of it. The resulting logico-emotional relationship had another advantage: the supportive presence of proximal Gods… all over! [1]

*** 

personification

/pəˌsɒnɪfɪˈkeɪʃn/ noun

1. the attribution of a personal nature or human characteristics to something non-human, or the representation of an abstract quality in human form.

***

Before Christianism, Gods were everywhere. When the Christians took over, they imposed their all-powerful, all-knowing Jewish God. Some present the Jewish God as a great invention, symbolizing some sort of progress of rationality that nobody had imagined before.

However, the single God concept was not that new. Native Americans in North America had the notion of a chief God, the Aztecs had a similar concept, and even Zeus was a kind of chief God. Zoroastrianism had Ahura Mazda, who did not control Angra Mainyu, but still was somewhat more powerful. The Hindus had Vishnu and his many avatars.

Eighteen centuries before those great converters to Christianism, Constantine, Constantius II, and Theodosius I, there was an attempt to forcefully convert the Egyptians to a single God. Pharaoh Akhenaten’s monotheistic experiment (worship of Aten) caused turmoil and was erased by his immediate successors.

According to the latest research it seems likely that the famous Nefertiti became a Pharaoh in her own right after the death of her husband, and retreated from monotheism by re-establishing Egyptian polytheism [3]. In the fullness of time, the infernal couple was struck by what the Romans, 15 centuries later, would call damnatio memoriae. Their very names and faith were erased from hundreds of monuments.

Shortly after the Aten episode, there was another confrontation between polytheism and monotheism. The colonizers of Gaza were apparently Greek, of Aegean origin, and, as such, over more than a millennium of conflict with the Jewish states in the hills, Greek Gods confronted Yahweh. The Greeks obviously did not see Yahweh as a major conceptual advance, as they did not adopt Him (until Constantine imposed Him, 15 centuries later).

While the area experienced enormous turmoil, including the permanent eclipse of Troy after the war with Greece (see Homer), and later its replacement by Phrygia, then followed by the Bronze Age collapse, then the rise of Tyre, and the Assyrian conquest, the Greeks survived everything, and their civilization kept sprawling (the early Christian writings were in Greek).

Ultimately, the lack of ideological bending, the obsession with pigs and other silliness, helped bring about the devastating Judeo-Roman wars. By comparison, the much larger Gaul bent like a reed when confronted with the Greco-Romans, absorbing the good stuff. Mercury, the God of trade, preceded Roman merchants. Gaul didn’t take religion too seriously, and went on with civilizational progress.

The lack of elasticity of the single God religion of the Jews brought their quasi-eradication by Rome; Judaism was tolerated, but Jewish nationalism got outlawed. By comparison, the Greeks played the long game, and within a generation or so of Roman conquest, they had spiritually conquered their conqueror. Christianism was actually an adaptation of Judaism to make Yahweh conquer the heart and soul of fascist Rome.

***

To have Gods everywhere? Why not? Is not the Judeo-Christian God everywhere too? Doesn’t it speak through fountains, and the immensity of the desert, and the moon, and the stars, too?

***

Yahweh, the Jewish God Catholic Romans called “Deus”, was deemed to be also the ultimate love object. Yahweh had promised land to the Jews; Deus promised eternal life of the nicest sort – to all those who bought the program.

Christians were city dwellers and their power over the countryside and barbarians came from those who had imposed Christianism, the imperial powers that be (at the time, more than 90% of the people worked in agriculture). Already as a teenager, Constantine, a sort of superman born from imperial purple, terrified the court which was supposed to hold him hostage. Such a brute and excellent general could only get inspired by Yahweh’s dedication to power.

The country dwellers, the villagers, disagreed that they needed to submit to a God organized, celebrated and imposed by the all-powerful government (god-vernment?). In classical Latin paganus meant ‘villager, rustic; civilian, non-combatant’. In late imperial Latin it came to mean non-Judeo-Christian (and later non-Judeo-Christo-Islamist) [2].

Christianism found it very difficult to penetrate the countryside, where the food was produced. It never quite succeeded (Even in Muslim Albania, Pagan rituals survived until the 20th century; much of the cult of saints is barely disguised Paganism).

Peasants knew that power was distributed throughout nature, and they had to understand those powers, thus love them – That enabled them to predict phenomena.

Peasants could ponder the mood of a river, and even predict it; flooding was more of a possibility in some circumstances, and then it was no time to indulge in activities next to the river. Peasants had to guess the weather, and the earlier, the better. Peasants had to know which part of the astronomical cycle they were in, to be able to plant crops accordingly; that was not always clear just looking outside, but the stars would tell and could be trusted to tell the truth.

We can be friends to human beings, and sometimes it’s great, but sometimes we feel betrayed and abandoned. But a mountain or a sea? They will always be there, they are not running away, they are never deliberately indifferent, and generally exhibit predictable moods. It is more pragmatic and rewarding to love them more rather than an abstract Dog in Heavens. Call them Gods if you want.

ooOOoo

Part two will be published tomorrow.

I am publishing the notes, on both days, so you can look them up now rather than waiting another day.

TECHNICAL NOTES: 

[1] To speak in philosophical lingo, we capture two civilizational ‘ontologies’ (logics of existence):

  1. Polytheistic-personal: relational, distributed, ecological.
  2. Monotheistic-fascistic: hierarchical, authoritarian, abstracted.

[2] Paganus, in a religious sense, appears first in the Christian author Tertullian, around 202 CE, evoking paganus in Roman military jargon, ‘civilian, incompetent soldier‘, in opposition to the competent soldiers (milites) of Christ that Tertullian was calling for.

[3] ‘FAIR OF FACE, Joyous with the Double Plume, Mistress of Happiness, Endowed with Favour, at hearing whose voice one rejoices, Lady of Grace, Great of Love, whose disposition cheers the Lord of the Two Lands.

With these felicitous epithets, inscribed in stone more than 3,300 years ago on the monumental stelae marking the boundaries of the great new city at Tell el Amarna on the Nile in central Egypt, the Pharaoh Akhenaten extolled his Great Royal Wife, the chief queen, the beautiful Nefertiti.

Nefertiti (‘the beautiful one has come‘) co-ruled Egypt with her older (and apparently famously ugly, deformed by disease) husband. Egypt was at its wealthiest. She was considered to be a DIVINITY. Her life is far from fully known, being revealed one fragment of DNA or old text at a time. She ruled as sole Pharaoh after her husband’s death, and seems to have offered to marry a Hittite prince (”I do not wish to marry one of my subjects. I am afraid…” she confessed in a letter to the amazed Hittite emperor, as revealed by a recently found fragment). She apparently decided to re-allow the worship of the many Egyptian gods, and her adoptive son and successor Tutankhaten switched his name to Tutankhamen. Both she and Tutankhamen died, and they were replaced by a senior top general of Akhenaten, who relieved the dynasty both of too much inbreeding (hence the deformed Akhenaten) and of too much centralism focused on the sun-disk (‘Aten’).

[4] Those who do not know history have a small and ridiculous view of FASCISM. Pathetically, they refer to simpletons, such as Hitler and Mussolini, to go philosophical on the subject. Google’s Gemini tried to pontificate that ‘labeling the structure of monotheism (especially its early forms) as fascistic’ is anachronistic and highly inflammatory, that fascism is a specific 20th-century political ideology, that while the author means authoritarian and hierarchical, using ‘fascistic’ distracts from the historical and philosophical points by introducing modern political baggage, and that it would be clearer and less polemical to stick to ‘Hierarchical’ or ‘Authoritarian-Centralized’.

The start of it all!

A wonderful documentary of the formation of Planet Earth.

From the website The Earth through time, I quote: The Earth was formed about 4.6 billion years ago. 4.6 billion is 4,600,000,000 years ago. It was formed by collisions of particles in a large cloud of material. Slowly gravity gathered together all these particles of dust and gas and formed larger clumps. These clumps continued to collide and gradually grew bigger and bigger eventually forming the Earth. The Earth at this time was very different to how we know it today.

I left a comment on the site: What a wonderful story. So many comments in support of this fantastic film, and rightly so.


Prehistory

We all live in the Quaternary period. From Wikipedia I quote a small piece:

It follows the Neogene Period and spans from 2.6 million years ago to the present.

I don’t know about you but 2.6 million years ago (Ma) seems like a very long time. But then the prior period was the Neogene, which went from 23 Ma to 2.6 Ma.

But if one wants to think ‘old’ then try the Ordovician period:

The Ordovician spans 43.75 million years from the end of the Cambrian Period 486.85 Ma (million years ago) to the start of the Silurian Period 443.1 Ma.
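The span of each period is simply the older boundary minus the younger one. A tiny Python sketch, using only the boundary dates quoted above (the dictionary of periods is just an illustrative structure, not from any geological library):

```python
# Geological periods mentioned above, as (start, end) boundaries in Ma
# (millions of years ago). Dates are the figures quoted in the text.
periods = {
    "Quaternary": (2.6, 0.0),
    "Neogene": (23.0, 2.6),
    "Ordovician": (486.85, 443.1),
}

for name, (start, end) in periods.items():
    # A period's span is its older boundary minus its younger boundary.
    print(f"{name}: {start} Ma to {end} Ma, span {start - end:.2f} million years")
```

Running this makes it easy to compare how long each period lasted relative to the roughly six million years of human evolution discussed below.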

***

Just to put us humans into context, human evolution covers a very much shorter span; I take it as starting some six million years ago. But here are two videos, courtesy of YouTube. The first one is a short one:

Scientists use fossils to reconstruct the evolutionary history of hominins—the group that includes modern humans, our immediate ancestors, and other extinct relatives. Today, our closest living relatives are chimpanzees, but extinct hominins are even closer. Where and when did they live? What can we learn about their lives? Why did they go extinct? Scientists look to fossils for clues.

 The second video is a 54-minute one from PBS.

They have both been watched thousands of times.

Now on to today’s post.

ooOOoo

Giant ground sloths’ fossilized teeth reveal their unique roles in the prehistoric ecosystem

Harlan’s ground sloth fossil skeleton excavated and displayed at the La Brea Tar Pits in Los Angeles. Larisa DeSantis

Larisa R. G. DeSantis, Vanderbilt University and Aditya Reddy Kurre, University of Pennsylvania

A two-toed sloth at the Nashville Zoo. Larisa R. G. DeSantis

Imagine a sloth. You probably picture a medium-size, tree-dwelling creature hanging from a branch. Today’s sloths – commonly featured on children’s backpacks, stationery and lunch boxes – are slow-moving creatures, living inconspicuously in Central American and South American rainforests.

But their gigantic Pleistocene ancestors that inhabited the Americas as far back as 35 million years ago were nothing like the sleepy tree huggers we know today. Giant ground sloths – some weighing thousands of pounds and standing taller than a single-story building – played vital and diverse roles in shaping ecosystems across the Americas, roles that vanished with their loss at the end of the Pleistocene.

In our new study, published in the journal Biology Letters, we aimed to reconstruct the diets of two species of giant ground sloths that lived side by side in what’s now Southern California. We analyzed remains recovered from the La Brea Tar Pits of what are colloquially termed the Shasta ground sloth (Nothrotheriops shastensis) and Harlan’s ground sloth (Paramylodon harlani). Our work sheds light on the lives of these fascinating creatures and the consequences their extinction in Southern California 13,700 years ago has had on ecosystems.

Dentin dental challenges

Studying the diets of extinct animals often feels like putting together a jigsaw puzzle with only a portion of the puzzle pieces. Stable isotope analyses have revolutionized how paleoecologists reconstruct the diets of many ancient organisms. By measuring the relative ratios of light and heavy carbon isotopes in tooth enamel, we can figure out what kinds of foods an animal ate – for instance, grasses versus trees or shrubs.

Drilling teeth provides a sample for stable isotope analyses. Aditya Kurre

But the teeth of giant ground sloths lack enamel, the highly inorganic and hard outer layer on most animal teeth – including our own. Instead, sloth teeth are made primarily of dentin, a more porous and organic-rich tissue that readily changes its chemical composition with fossilization.

Stable isotope analyses are less dependable in sloths because dentin’s chemical composition can be altered postmortem, skewing the isotopic signatures.

Another technique researchers use to glean information about an animal’s diet relies on analyzing the microscopic wear patterns on its teeth. Dental microwear texture analysis can infer whether an animal mostly ate tough foods such as leaves and grass or hard foods such as seeds and fruit pits. This technique is also tricky when it comes to sloths’ fossilized teeth because signs of wear may be preserved differently in the softer dentin than in harder enamel.

Prior to studying fossil sloths, we vetted dental microwear methods in modern xenarthrans, a group of animals that includes sloths, armadillos and anteaters. This study demonstrated that dentin microwear can reveal dietary differences between leaf-eating sloths and insect-consuming armadillos, giving us confidence that these tools could reveal dietary information from ground sloth fossils.

Distinct dietary niches revealed

Previous research suggested that giant ground sloths were either grass-eating grazers or leaf-eating browsers, based on the size and shape of their teeth. However, more direct measures of diet – such as stable isotopes or dental microwear – were often lacking.

Our new analyses revealed contrasting dental wear signatures between the two co-occurring ground sloth species. The Harlan’s ground sloth, the larger of the two, had microwear patterns dominated by deep pitlike textures. This kind of wear is indicative of chewing hard, mechanically challenging foods such as tubers, seeds, fungi and fruit pits. Our new evidence aligns with skeletal adaptations that suggest powerful digging abilities, consistent with foraging foods both above and below ground.

The fossil teeth of the Harlan’s ground sloth typically showed deeper pitlike textures, bottom, while the Shasta ground sloth teeth had shallower wear patterns, top. DeSantis and Kurre, Biology Letters 2025

In contrast, the Shasta ground sloth exhibited dental microwear textures more akin to those in leaf-eating and woody plant-eating herbivores. This pattern corroborates previous studies of its fossilized dung, demonstrating a diet rich in desert plants such as yucca, agave and saltbush.

Next we compared the sloths’ microwear textures to those of ungulates such as camels, horses and bison that lived in the same region of Southern California. We confirmed that neither sloth species’ dietary behavior overlapped fully with other herbivores. Giant ground sloths didn’t perform the same ecological functions as the other herbivores that shared their landscape. Instead, both ground sloths partitioned their niches and played complementary ecological roles.

Extinctions brought ecological loss

The Harlan’s ground sloth was a megafaunal ecosystem engineer. It excavated soil and foraged underground, thereby affecting soil structure and nutrient cycling, even dispersing seed and fungal spores over wide areas. Anecdotal evidence suggests that some anachronistic fruits – such as the weird, bumpy-textured and softball-size Osage orange – were dispersed by ancient megafauna such as giant ground sloths. When the Pleistocene megafauna went extinct, the loss contributed to the regional restriction of these plants, since no one was around to spread their seeds.

The broader consequence is clear: Megafaunal extinctions erased critical ecosystem engineers, triggering cascading ecological changes that continue to affect habitat resilience today. Our results resonate with growing evidence that preserving today’s living large herbivores and understanding the diversity of their ecological niches is crucial for conserving functional ecosystems.

Studying the teeth of lost giant ground sloths has illuminated not only their diets but also the enduring ecological legacies of their extinction. Today’s sloths, though charming, only hint at the profound environmental influence of their prehistoric relatives – giants that shaped landscapes in ways we are only beginning to appreciate.

Larisa R. G. DeSantis, Associate Professor of Biological Sciences, Vanderbilt University and Aditya Reddy Kurre, Dental Student, University of Pennsylvania

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

I am going to finish with a link, and a small extract, from a Wikipedia article on the evolution of Homo sapiens

The timeline of human evolution outlines the major events in the evolutionary lineage of the modern human species, Homo sapiens, throughout the history of life, beginning some 4 billion years ago down to recent evolution within H. sapiens during and since the Last Glacial Period.

The beautiful moon, but …

… does it make us sleepless?

As has been mentioned previously, my dear wife’s Parkinson’s means that we go to bed early and get up early the following morning. Thus a recent item on The Conversation fascinated me, and it is shared with you now.

ooOOoo

Does the full moon make us sleepless? A neurologist explains the science behind sleep, mood and lunar myths

How much does the moon cycle affect sleep? Probably less than your screen time at night. Muhammad Khazin Alhusni/iStock via Getty Images Plus

Joanna Fong-Isariyawongse, University of Pittsburgh

Have you ever tossed and turned under a full moon and wondered if its glow was keeping you awake? For generations, people have believed that the Moon has the power to stir up sleepless nights and strange behavior – even madness itself. The word “lunacy” comes directly from luna, Latin for Moon.

Police officers, hospital staff and emergency workers often swear that their nights get busier under a full moon. But does science back that up?

The answer is, of course, more nuanced than folklore suggests. Research shows a full moon can modestly affect sleep, but its influence on mental health is much less certain.

I’m a neurologist specializing in sleep medicine who studies how sleep affects brain health. I find it captivating that an ancient myth about moonlight and madness might trace back to something far more ordinary: our restless, moonlit sleep.

What the full moon really does to sleep

Several studies show that people really do sleep differently in the days leading up to the full moon, when moonlight shines brightest in the evening sky. During this period, people sleep about 20 minutes less, take longer to fall asleep and spend less time in deep, restorative sleep. Large population studies confirm the pattern, finding that people across different cultures tend to go to bed later and sleep for shorter periods in the nights before a full moon.

The most likely reason is light. A bright moon in the evening can delay the body’s internal clock, reduce melatonin – the hormone that signals bedtime – and keep the brain more alert.

The changes are modest. Most people lose only 15 to 30 minutes of sleep, but the effect is measurable. It is strongest in places without artificial light, such as rural areas or while camping. Some research also suggests that men and women may be affected differently. For instance, men seem to lose more sleep during the waxing phase, while women experience slightly less deep and restful sleep around the full moon.

Sleep loss from a bright moon is modest but measurable. Yuliia Kaveshnikova/iStock via Getty Images Plus

The link with mental health

For centuries, people have blamed the full moon for stirring up madness. Folklore suggested that its glow could spark mania in bipolar disorder, provoke seizures in people with epilepsy or trigger psychosis in those with schizophrenia. The theory was simple: lose sleep under a bright moon and vulnerable minds might unravel.

Modern science adds an important twist. Research is clear that sleep loss itself is a powerful driver of mental health problems. Even one rough night can heighten anxiety and drag down mood. Ongoing sleep disruption raises the risk of depression, suicidal thoughts and flare-ups of conditions like bipolar disorder and schizophrenia.

That means even the modest sleep loss seen around a full moon could matter more for people who are already at risk. Someone with bipolar disorder, for example, may be far more sensitive to shortened or fragmented sleep than the average person.

But here’s the catch: When researchers step back and look at large groups of people, the evidence that lunar phases trigger psychiatric crises is weak. No reliable pattern has been found between the Moon and hospital admissions, discharges or lengths of stay.

But a few other studies suggest there may be small effects. In India, psychiatric hospitals recorded more use of restraints during full moons, based on data collected between 2016 and 2017. In China, researchers noted a slight rise in schizophrenia admissions around the full moon, using hospital records from 2012 to 2017. Still, these findings are not consistent worldwide and may reflect cultural factors or local hospital practices as much as biology.

In the end, the Moon may shave a little time off our sleep, and sleep loss can certainly influence mental health, especially for people who are more vulnerable. That includes those with conditions like depression, bipolar disorder, schizophrenia or epilepsy, and teenagers who are especially sensitive to sleep disruption. But the idea that the full moon directly drives waves of psychiatric illness remains more myth than reality.

The sleep/wake cycle is synchronized with lunar phases.

Other theories fall short

Over the years, scientists have explored other explanations for supposed lunar effects, from gravitational “tidal” pulls on the body to subtle geomagnetic changes and shifts in barometric pressure. Yet, none of these mechanisms hold up under scrutiny.

The gravitational forces that move oceans are far too weak to affect human physiology, and studies of geomagnetic and atmospheric changes during lunar phases have yielded inconsistent or negligible results. This makes sleep disruption from nighttime light exposure the most plausible link between the Moon and human behavior.

Why the myth lingers

If the science is so inconclusive, why do so many people believe in the “full moon effect”? Psychologists point to a concept called illusory correlation. We notice and remember the unusual nights that coincide with a full moon but forget the many nights when nothing happened.

The Moon is also highly visible. Unlike hidden sleep disruptors such as stress, caffeine or scrolling on a phone, the Moon is right there in the sky, easy to blame.

Screen-time habits are far more likely to have detrimental effects on sleep than a full moon. FanPro/Moment via Getty Images

Lessons from the Moon for modern sleep

Even if the Moon does not drive us “mad,” its small influence on sleep highlights something important: Light at night matters.

Our bodies are designed to follow the natural cycle of light and dark. Extra light in the evening, whether from moonlight, streetlights or phone screens, can delay circadian rhythms, reduce melatonin and lead to lighter, more fragmented sleep.

This same biology helps explain the health risks of daylight saving time. When clocks “spring forward,” evenings stay artificially brighter. That shift delays sleep and disrupts circadian timing on a much larger scale than the Moon, contributing to increased accidents and cardiovascular risks, as well as reduced workplace safety.

In our modern world, artificial light has a much bigger impact on sleep than the Moon ever will. That is why many sleep experts argue for permanent standard time, which better matches our biological rhythms.

So if you find yourself restless on a full moon night, you may not be imagining things – the Moon can tug at your sleep. But if sleeplessness happens often, look closer to home. The culprit is likely the light in your hand rather than the one in the sky.

Joanna Fong-Isariyawongse, Associate Professor of Neurology, University of Pittsburgh

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

Ever since I became an adult I have wondered what the purpose was of daylight time and standard time. The University of Colorado has the history of the time change and, as I suspected, it was brought about by war: World War I.

Here’s part of that article:

It was first introduced in Germany in 1916 during World War I as an energy saving measure, according to CU Boulder sleep researcher Kenneth Wright. The U.S. followed suit, adopting DST in 1918. Initially implemented as a wartime measure, it was repealed a year later. 

Daylight saving time was reinstituted in 1942 during World War II. The next couple decades were a free-for-all, when states and localities switched between DST and standard time (ST) at will. To put an end to the clock chaos, Congress finally passed the Uniform Time Act in 1966, which standardized daylight saving time and its start and end dates across the country — with the exception of Hawaii and Arizona, which opted to keep standard time year-round. 

The evolution of the human brain

This is a deeply fascinating history.

There are many articles on this subject and it is one that I will return to from time to time.

But for today I just want to share a seven-minute video from YouTube.

Amazing!

More about Jane Goodall

An article published by The Conversation is offered today.

ooOOoo

Jane Goodall, the gentle disrupter whose research on chimpanzees redefined what it meant to be human

Jane Goodall appears on stage at 92NY in New York on Oct. 1, 2023.
Charles Sykes/Invision/AP

Mireya Mayor, Florida International University

Anyone proposing to offer a master class on changing the world for the better, without becoming negative, cynical, angry or narrow-minded in the process, could model their advice on the life and work of pioneering animal behavior scholar Jane Goodall.

Goodall’s life journey stretches from marveling at the somewhat unremarkable creatures – though she would never call them that – in her English backyard as a wide-eyed little girl in the 1930s to challenging the very definition of what it means to be human through her research on chimpanzees in Tanzania. From there, she went on to become a global icon and a United Nations Messenger of Peace.

Until her death on Oct. 1, 2025 at age 91, Goodall retained a charm, open-mindedness, optimism and wide-eyed wonder that are more typical of children. I know this because I have been fortunate to spend time with her and to share insights from my own scientific career. To the public, she was a world-renowned scientist and icon. To me, she was Jane – my inspiring mentor and friend.

Despite the massive changes Goodall wrought in the world of science, upending the study of animal behavior, she was always cheerful, encouraging and inspiring. I think of her as a gentle disrupter. One of her greatest gifts was her ability to make everyone, at any age, feel that they have the power to change the world.

https://www.youtube.com/embed/rcL4jnGTL1U?wmode=transparent&start=0

Jane Goodall documented that chimpanzees not only used tools but also made them – an insight that altered thinking about animals and humans.

Discovering tool use in animals

In her pioneering studies in the lush rainforest of Tanzania’s Gombe Stream Game Reserve, now a national park, Goodall noted that the most successful chimp leaders were gentle, caring and familial. Males that tried to rule by asserting their dominance through violence, tyranny and threat did not last.

I also am a primatologist, and Goodall’s groundbreaking observations of chimpanzees at Gombe were part of my preliminary studies. She famously recorded chimps taking long pieces of grass and inserting them into termite nests to “fish” for the insects to eat, something no one else had previously observed.

It was the first time an animal had been seen using a tool, a discovery that altered how scientists differentiated between humanity and the rest of the animal kingdom.

Renowned anthropologist Louis Leakey chose Goodall to do this work precisely because she was not formally trained. When she turned up in Leakey’s office in Kenya in 1957, at age 23, Leakey initially hired her as his secretary, but he soon spotted her potential and encouraged her to study chimpanzees. Leakey wanted someone with a completely open mind, something he believed most scientists lost over the course of their formal training.

Because chimps are humans’ closest living relatives, Leakey hoped that understanding the animals would provide insights into early humans. In a predominantly male field, he also thought a woman would be more patient and insightful than a male observer. He wasn’t wrong.

Six months in, when Goodall wrote up her observations of chimps using tools, Leakey wrote, “Now we must redefine tool, redefine Man, or accept chimpanzees as human.”

Goodall spoke of animals as having emotions and cultures, and in the case of chimps, communities that were almost tribal. She also named the chimps she observed, an unheard-of practice at the time, garnering ridicule from scientists who had traditionally numbered their research subjects.

One of her most remarkable observations became known as the Gombe Chimp War. It was a four-year-long conflict in which eight adult males from one community killed all six males of another community, taking over their territory, only to lose it to another, bigger community with even more males.

Confidence in her path

Goodall was persuasive, powerful and determined, and she often advised me not to succumb to people’s criticisms. Her path to groundbreaking discoveries did not involve stepping on people or elbowing competitors aside.

Rather, her journey to Africa was motivated by her wonder, her love of animals and a powerful imagination. As a little girl, she was entranced by Edgar Rice Burroughs’ 1912 story “Tarzan of the Apes,” and she loved to joke that Tarzan married the wrong Jane.

When I was a 23-year-old former NFL cheerleader, with no scientific background at that time, and looked at Goodall’s work, I imagined that I, too, could be like her. In large part because of her, I became a primatologist, co-discovered a new species of lemur in Madagascar and have had an amazing life and career, in science and on TV, as a National Geographic explorer.
When it came time to write my own story, I asked Goodall to contribute the introduction. She wrote:

“Mireya Mayor reminds me a little of myself. Like me she loved being with animals when she was a child. And like me she followed her dream until it became a reality.”

In a 2023 interview, Jane Goodall answers TV host Jimmy Kimmel’s questions about chimpanzee behavior.

Storyteller and teacher

Goodall was an incredible storyteller and saw it as the most successful way to help people understand the true nature of animals. With compelling imagery, she shared extraordinary stories about the intelligence of animals, from apes and dolphins to rats and birds, and, of course, the octopus. She inspired me to become a wildlife correspondent for National Geographic so that I could share the stories and plights of endangered animals around the world.

Goodall inspired and advised world leaders, celebrities, scientists and conservationists. She also touched the lives of millions of children.

Jane Goodall and primatologist Mireya Mayor with Mayor’s book ‘Just Wild Enough,’ a memoir aimed at young readers. Mireya Mayor, CC BY-ND

Through the Jane Goodall Institute, which works to engage people around the world in conservation, she launched Roots & Shoots, a global youth program that operates in more than 60 countries. The program teaches children about connections between people, animals and the environment, and ways to engage locally to help all three.

Along with Goodall’s warmth, friendship and wonderful stories, I treasure this comment from her: “The greatest danger to our future is our apathy. Each one of us must take responsibility for our own lives, and above all, show respect and love for living things around us, especially each other.”

It’s a radical notion from a one-of-a-kind scientist.

This article has been updated to add the date of Goodall’s death.

Mireya Mayor, Director of Exploration and Science Communication, Florida International University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

That comment by Jane, treasured by Mireya, is so important: “The greatest danger to our future is our apathy. Each one of us must take responsibility for our own lives, and above all, show respect and love for living things around us, especially each other.”