Category: Education

We humans are still evolving.

An article in The Conversation caught my eye.

We must never forget that evolution is always happening.

So, without any more from me, here is that article.

ooOOoo

If evolution is real, then why isn’t it happening now? An anthropologist explains that humans actually are still evolving

Inuit people such as these Greenlanders have evolved to be able to eat fatty foods with a low risk of getting heart disease. Olivier Morin/AFP via Getty Images

Michael A. Little, Binghamton University, State University of New York


If evolution is real, then why is it not happening now? – Dee, Memphis, Tennessee


Many people believe that we humans have conquered nature through the wonders of civilization and technology. Some also believe that because we are different from other creatures, we have complete control over our destiny and have no need to evolve. Even though lots of people believe this, it’s not true.

Like other living creatures, humans have been shaped by evolution. Over time, we have developed – and continue to develop – the traits that help us survive and flourish in the environments where we live.

I’m an anthropologist. I study how humans adapt to different environments. Adaptation is an important part of evolution. Adaptations are traits that give someone an advantage in their environment. People with those traits are more likely to survive and pass those traits on to their children. Over many generations, those traits become widespread in the population.

The role of culture

We humans have two hands that help us skillfully use tools and other objects. We are able to walk and run on two legs, which frees our hands for these skilled tasks. And we have large brains that let us reason, create ideas and live successfully with other people in social groups.

All of these traits have helped humans develop culture. Culture includes all of our ideas and beliefs and our abilities to plan and think about the present and the future. It also includes our ability to change our environment, for example by making tools and growing food.

Although we humans have changed our environment in many ways during the past few thousand years, we are still changed by evolution. We have not stopped evolving, but we are evolving right now in different ways than our ancient ancestors. Our environments are often changed by our culture.

We usually think of an environment as the weather, plants and animals in a place. But environments include the foods we eat and the infectious diseases we are exposed to.

A very important part of the environment is the climate and what kinds of conditions we can live in. Our culture helps us change our exposure to the climate. For example, we build houses and put furnaces and air conditioners in them. But culture doesn’t fully protect us from extremes of heat, cold and the sun’s rays.

The Turkana people in Kenya have evolved to survive with less water than other people, which helps them live in a desert environment. Tony Karumba/AFP via Getty Images

Here are some examples of how humans have evolved over the past 10,000 years and how we are continuing to evolve today.

The power of the sun’s rays

While the sun’s rays are important for life on our planet, ultraviolet rays can damage human skin. Those of us with pale skin are in danger of serious sunburn and equally dangerous kinds of skin cancer. In contrast, those of us with a lot of skin pigment, called melanin, have some protection against damaging ultraviolet rays from sunshine.

People in the tropics with dark skin are more likely to thrive under frequent bright sunlight. Yet, when ancient humans moved to cloudy, cooler places, the dark skin was not needed. Dark skin in cloudy places blocked the production of vitamin D in the skin, which is necessary for normal bone growth in children and adults.

The amount of melanin pigment in our skin is controlled by our genes. So in this way, human evolution is driven by the environment – sunny or cloudy – in different parts of the world.

The food that we eat

Ten thousand years ago, our human ancestors began to tame or domesticate animals such as cattle and goats to eat their meat. Then about 2,000 years later, they learned how to milk cows and goats for this rich food. Unfortunately, like most other mammals at that time, human adults back then could not digest milk without feeling ill. Yet a few people were able to digest milk because they had genes that let them do so.

Milk was such an important source of food in these societies that the people who could digest milk were better able to survive and have many children. So the genes that allowed them to digest milk increased in the population until nearly everyone could drink milk as adults.

This process, which occurred and spread thousands of years ago, is an example of what is called cultural and biological co-evolution. It was the cultural practice of milking animals that led to these genetic or biological changes.

Other people, such as the Inuit in Greenland, have genes that enable them to digest fats without suffering from heart diseases. The Turkana people herd livestock in Kenya in a very dry part of Africa. They have a gene that allows them to go for long periods without drinking much water. This practice would cause kidney damage in other people because the kidney regulates water in your body.

These examples show how the remarkable diversity of foods that people eat around the world can affect evolution.

These bacteria caused a devastating pandemic nearly 700 years ago that led humans to evolve resistance to them.
Image Point FR/NIH/NIAID/BSIP/Universal Images Group via Getty Images

Diseases that threaten us

Like all living creatures, humans have been exposed to many infectious diseases. During the 14th century a deadly disease called the bubonic plague struck and spread rapidly throughout Europe and Asia. It killed about one-third of the population in Europe. Many of those who survived had a specific gene that gave them resistance against the disease. Those people and their descendants were better able to survive epidemics that followed for several centuries.

Some diseases have struck quite recently. COVID-19, for instance, swept the globe in 2020. Vaccinations saved many lives. Some people have a natural resistance to the virus based on their genes. It may be that evolution increases this resistance in the population and helps humans fight future virus epidemics.

As human beings, we are exposed to a variety of changing environments. And so evolution in many human populations continues across generations, including right now.


Michael A. Little, Distinguished Professor Emeritus of Anthropology, Binghamton University, State University of New York

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

This was published for the Curious Kids section of The Conversation.

However, I believe this is relevant for adults who are interested in the subject as well. I’m in my 80s and find this deeply interesting.

Death – it comes to all of us!

Irrespective of our beliefs.

There are only two days in our lives when we live for less than twenty-four hours: the day we are born and the day when we die!

I was born in November 1944, and that makes me eighty-one. I was born as a result of an affair between my mother and my father. The family genes favour girls over boys, as in seven girls for every boy, and the son is normally the first born. My mother lost her first child, a boy. Then my mother had a second baby. Surprise, surprise, it was another son – me!!

I say this as an introduction to a post on The Conversation.

ooOOoo

Americans are unprepared for the expensive and complex process of aging – a geriatrician explains how they can start planning

It’s important for older adults to plan for their care as they age. Maskot/Maskot via Getty Images

Kahli Zietlow, University of Michigan

Hollywood legend Gene Hackman and his wife, Betsy Arakawa, were found dead in their home in February 2025. Hackman had been living with Alzheimer’s and depended on Arakawa as his full-time caregiver.

Disturbingly, postmortem data suggests that Arakawa died of complications from hantavirus pulmonary syndrome several days before her husband passed. The discordant times of death point to a grim scenario: Hackman was left alone and helpless, trapped in his home after his wife’s death.

The couple’s story, while shocking, is not unique. It serves as a warning for our rapidly aging society. The U.S. population is aging, but most Americans are not adequately planning to meet the needs of older adulthood.

As a geriatric physician and medical educator, I care for older adults in both inpatient and outpatient settings. My research and clinical work focus on dementia and surrogate decision-making.

In my experience, regardless of race, education or socioeconomic status, there are some universal challenges that all people face with aging and there are steps everyone can take to prepare.

Aging is inevitable but unpredictable

Aging is an unpredictable, highly individualized process that varies depending on a person’s genetics, medical history, cognitive status and socioeconomic factors.

The majority of older Americans report a strong sense of purpose and self-worth. Many maintain a positive view of their overall health well into their 70s and 80s.

But at some point, the body starts to slow down. Older adults experience gradual sensory impairment, loss of muscle mass and changes in their memory. Chronic diseases are more likely with advancing age.

According to the U.S. Census Bureau, 46% of adults over age 75 live with at least one physical disability, and this proportion grows with age. Even those without major health issues may find that routine tasks like yard work, housekeeping and home repairs become insurmountable as they enter their 80s and 90s.

Some may find that subtle changes in memory make it difficult to manage household finances or keep track of their medications. Others may find that vision loss and slowed reaction time make it harder to safely drive. Still others may struggle with basic activities needed to live independently, such as bathing or using a toilet. All of these changes threaten older adults’ ability to remain independent.

The costs of aging

Nearly 70% of older Americans will require long-term care in their lifetime, whether through paid, in-home help or residence in an assisted living facility or nursing home.

But long-term care is expensive. In 2021, the Federal Long Term Care Insurance Program reported that the average hourly rate for in-home care was US$27. An assisted living apartment averaged $4,800 per month, and a nursing home bed cost nearly double that, at a rate of $276 per day.

Many Americans may be shocked to discover that these costs are not covered by Medicare or other traditional medical insurance. Long-term care insurance covers the cost of long-term care, such as in-home care or nursing home placement. However, what is covered varies from plan to plan. Currently, only a small minority of Americans have long-term care insurance due to high premiums and complex activation rules.

I am not aware of any high-quality, peer-reviewed studies that have demonstrated the cost effectiveness of long-term care insurance. Yet, for many Americans, paying for care out of pocket is simply not an option.

Medicaid can provide financial support for long-term care but only for older adults with very low income and minimal assets – criteria most Americans don’t meet until they have nearly exhausted their savings.

Those receiving Medicaid to cover the costs of long-term care have essentially no funds for anything other than medical care, room and board. And proposed federal financial cuts may further erode the limited support services available. In Michigan, for example, Medicaid-covered nursing home residents keep only $60 per month for personal needs. If individuals receive monthly income greater than $60 – for instance, from Social Security or a pension – the extra money would go toward the cost of nursing home care.

Those who don’t qualify for Medicaid or cannot afford private care often rely on family and friends for unpaid assistance, but not everyone has such support systems.

Older adults may end up needing help with day-to-day personal care. Klaus Vedfelt/DigitalVision via Getty Images

Planning for the care you want

Beyond financial planning, older adults can make an advance directive. This is a set of legal documents that outlines preferences for medical care and asset management if a person becomes incapacitated. However, only about 25% of Americans over 50 have completed such documentation.

Without medical and financial powers of attorney in place, state laws determine who makes critical decisions, which may or may not align with a person’s wishes. For instance, an estranged child may have more legal authority over an incapacitated parent than their long-term but unmarried partner. Seniors without clear advocates risk being placed under court-appointed guardianship – a restrictive and often irreversible process.

In addition to completing advance directives, it is important that older adults talk about their wishes with their loved ones. Conversations about disability, serious illness and loss of independence can be difficult, but these discussions allow your loved ones to advocate for you in the event of a health crisis.

Who’s going to care for you?

Finding a caregiver is an important step in making arrangements for aging. If you are planning to rely on family or friends for some care, it helps to discuss this with them ahead of time and to have contingency plans in place. As the Hackman case demonstrates, if a caregiver is suddenly incapacitated, the older adult may be left in immediate danger.

Caregivers experience higher rates of stress, depression and physical illness compared with their peers. This is often exacerbated by financial strain and a lack of support. It helps if the people you will be relying on have expectations in place about their role.

For instance, some people may prefer placement in a facility rather than relying on a loved one if they can no longer use the bathroom independently. Others may wish to remain in their homes as long as this is a feasible option.

Connecting with available resources

There are local and federal initiatives designed to help aging adults find and get the help they need. The Centers for Medicare & Medicaid Services recently launched the GUIDE Model to improve care and quality of life for both those suffering from dementia and their caregivers.

This program connects caregivers with local resources and provides a 24-hour support line for crises. While GUIDE, which stands for Guiding an Improved Dementia Experience, is currently in the pilot stage, it is slowly expanding, and I am hopeful that it will eventually expand to provide enhanced coverage for those suffering from dementia nationwide.

The Program for All-Inclusive Care of the Elderly helps dual-eligible Medicare and Medicaid recipients remain at home as they age. This program provides comprehensive services including medical care, a day center and home health services.

Area agencies on aging are regionally located and can connect older adults with local resources, based on availability and income, such as meals, transportation and home modifications that help maintain independence.

Unfortunately, all of these programs and others that support older adults are threatened by recent federal budget cuts. The tax breaks and spending cuts bill, which was signed into law in July 2025, will result in progressive reductions to Medicaid funding over the next 10 years. These cuts will decrease the number of individuals eligible for Medicaid and negatively affect how nursing homes are reimbursed.

The government funding bill passed on Nov. 13 extends current Medicare funding through Jan. 30, 2026, at which point Medicare funding may be reduced.

Even as the future of these programs remains uncertain, it’s important for older adults and their caregivers to be intentional in making plans and to familiarize themselves with the resources available to them.

Kahli Zietlow, Physician and Clinical Associate Professor of Geriatrics & Internal Medicine, University of Michigan

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

This article is a wakeup call for me, because I have no plan in place.

While I think about death more frequently than I used to, the fact that I don’t have a plan is naive: I must get myself to a stage where I have a plan, and soon! I guess I am not the only person in their 80s without a plan!

What if we die before our pets?

We love our dogs and can never envisage being without one.

So what happens to them after the last one of us dies?

I have just turned 81 and, although I am fit, think more seriously about this matter than I used to. Jean has no children and my son and daughter, from a previous marriage, are living in the U.K.

So an article from The Conversation caught my eye and I wanted to share it with you.

ooOOoo

Diane Keaton’s $5M pet trust would be over the top if reports prove true – here’s how to ensure your beloved pet is safe after you are gone

Allison Anna Tait, University of Richmond

Diane Keaton loved her dog, Reggie.

The award-winning actor, director and real estate entrepreneur frequently posted photos and video clips of the golden retriever on her social media accounts. After she died on Oct. 11, 2025, at 79, some news outlets reported that she left US$5 million of her estimated $100 million estate to her dog.

I’m a law professor who teaches about wills, trusts and other forms of inheritance law. Every semester, I teach my students how they can help clients provide for their pets after death. Because they, like many Americans, love their pets and want to know how to take care of them, this topic always piques their interest.

https://www.youtube.com/embed/FYJGCvpJIV0?wmode=transparent&start=0

Diane Keaton was very open about her devotion to her dog, Reggie.

Writing pets into a will

An estimated 66% of all U.S. households include at least one pet. Many Americans consider their cats, dogs, tortoises or other animals to be part of their family, and their spending on those nonhuman relatives is immense. In 2024, they paid a total of about $152 billion for goods and services to feed and otherwise support their pets.

Taking good care of your pets can go beyond buying them treats and sweaters. It can include leaving clear directions to ensure their needs are met once you’re gone. There are several ways that you can do this.

The first is through your will. You can’t give your pet money directly in your will, because the law says that pets are property, like your books or your dishes.

You can, however, leave a bequest, the technical term for a gift to a person or a cause listed in a will, to someone who will be the animal’s caretaker. That bequest can include directions that the money be spent meeting the pet’s needs.

It’s worth it to also name an alternate or contingent caretaker in case the first person you name does not want to or cannot take on that responsibility, or they die before you or the animals you’ve provided for in the will.

Choupette’s life of luxury

German fashion designer, photographer and creative director Karl Lagerfeld, who died in 2019 at 85, was someone who made the mistake of leaving money directly to his fluffy Birman cat, Choupette. It worked out for Choupette, though.

The cat was, according to several reports, still alive in 2025 and eating meals out of the porcelain bowls that Lagerfeld bought for her. Choupette is cared for at great expense and in the utmost luxury by Françoise Caçote, the designer’s former housekeeper. The cat even had a 13th birthday party at Versailles.

Another pet owner who did right by her pet was the comedian, producer and red carpet interviewer Joan Rivers.

Rivers had two rescue dogs in Manhattan and two more dogs in California when she died in 2014 at age 81. Rivers had made provisions for their care in her will.

The late Joan Rivers, right, seen on the set of her short-lived talk show in 1987, planned ahead for her dogs’ care. Bettmann via Getty Images

Creating pet trusts

If you’d like an arrangement that’s more secure than a will, then you might want to opt for a pet trust, another celebrity favorite. These kinds of trusts were not possible until the 1990s, because pets were not considered true beneficiaries – meaning they couldn’t sue the trustee.

But in the 1990s, states began to change their rules to allow for pet trusts. Today, pet trusts are valid in the whole country, although the rules vary slightly from state to state.

To establish a pet trust, you or a lawyer must draw up a trust document that names two important people: a trustee and a caretaker. The trustee is the person who will manage the money you leave in trust. They will make distributions to the caretaker that you select.

You must also specify how the money is to be spent meeting the animal’s needs and who would get any money that could be left in the trust when the pet dies. Typically, these trusts take effect at the owner’s death, just like other provisions in a will.

Drafting a pet trust can be free, if you use an online template and get no legal guidance. The same thing might cost around $100 if you use an online service such as Legal Zoom that provides directions. More commonly, however, pet trusts are part of a broader estate plan, and costs range depending on how complicated your estate is.

When the rich go overboard

One of the most over-the-top pet trusts came from Leona Helmsley, the New York hotel and real estate mogul known widely as the “Queen of Mean.” She was famous for her pettiness and tough management style and for landing in prison for tax evasion.

When Helmsley died in 2007, she left her dog, a Maltese named Trouble who had reportedly bitten members of her staff, a $12 million trust fund. Most of Helmsley’s estate went to the Helmsley Charitable Trust, but she made individual gifts to several relatives, and the gift to Trouble was larger than any of those.

The grandchildren, upset that Trouble got more money than they did, took the case to court, where the probate judge was less than impressed by Trouble’s luxury lifestyle and knocked down the amount in trust to $2 million. The other $10 million flowed back to her family’s foundation, where the bulk of the estate went in the first place.

Lesson learned: Your dog can have a trust fund, but don’t go overboard.

Bequests for pets can be challenged – in which case it’s up to courts to determine how much they think is reasonable for the pet’s needs. In Helmsley’s case, $12 million was found to be excessive. And maybe with good reason. Trouble still had a nice life with fewer millions. The dog died in December 2010 after several years in Sarasota, Florida, at a Helmsley-owned hotel.

Other pet owners who aren’t celebrities have used pet trusts as well, such as Bill Dorris, a Nashville businessman without any human heirs. He left his dog, Lulu, $5 million.

Pet-loving celebrities who loved all the pets

Finally, there’s a lesson to be learned from British fashion designer and icon Alexander McQueen, who was worth £16 million ($21 million) when he died in 2010 at the age of 40. McQueen left £50,000 ($66,000) in a trust for his two bull terriers so that they would be well cared for during the remainder of their lives.

McQueen also included a bequest of £100,000 ($132,000) to the Battersea Dogs and Cats Home in his will to help fund the care of some of the millions of other animals out there that need the basics of food and shelter.

Animal shelters, in the U.K., the United States and other countries, help rescue and protect animals, and these animals need more help than the Choupettes and Troubles of the world.

So, my advice is that you go ahead and create a pet trust for your cat. But don’t forget to give some money in your will – and ideally while you’re alive – to help the vast majority of the millions of companion animals who need new homes every year. None of them have trust funds.

What becomes of Reggie, Keaton’s golden retriever, and her estate remains to be seen. Keaton, who starred in hit movies such as “Annie Hall,” “Reds” and “The First Wives Club,” isn’t the first celebrity to leave millions of dollars to a pet. And it’s unlikely that she will be the last.

Allison Anna Tait, Professor of Law, University of Richmond

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

Amending our Will to create a pet trust seems like a very good idea! And making sure there is money for the trust as well.

The Science of Sleep.

Exploring the connection between rest and health.

This is a one-hour video (as in a YouTube video) from an accredited US healthcare educator.

That is all for today. Just to add that sleep is so important.

‘When we turn the clocks ahead this spring, we’ll lose an hour of sleep—but for many Americans, poor sleep is a nightly problem. Lack of sleep takes a toll on physical and mental health, increasing risk for chronic conditions like high blood pressure and obesity. Our expert panelists will unpack the latest research on rest, including how social and environmental factors can impact our sleep. They’ll explore the connections between race, sleep, and health disparities. And they’ll share advice on how to improve sleep quality for better overall health.’

The Truth about Gods, part two

The concluding part of this essay by Patrice Ayme.

ooOOoo

Endowing aspects of the universe with spirituality, a mind of their own, is stupid in this day and age, only if one forgets there are natural laws underlying them. But if one wants to feel less alone and more purposeful, it is pretty smart.    

Patrice Ayme

Here is the inventor of monotheism: Nefertiti. Once a fanatic of Aten, the sun god, she turned cautious; once Pharaoh on her own, she backpedalled and re-authorized Egyptian polytheism. (The sun-God, Sol Invictus, was revived by Roman emperor Diocletian 17 centuries later, in his refounding of Romanitas and the empire. His ultra-young successor and contemporary, emperor Constantine, used the revived monotheism to impose his invention of Catholicism. Funny how small the conceptual world is.)

***

The preceding part (see Part One yesterday) contains many iconoclastic statements which made the Artificial Intelligence (AI) I consulted with try to correct me with what were conventional, but extremely erroneous, ill-informed data points. The AI also used the deranged, upside-down meta-argument that it is well known that Christianism is not like that, so I have got to be wrong. Well, no, I was raised as a Catholic child in two different Muslim countries, and also in a Pagan one; the Muslim faiths I knew as a child were as different from Sunni/Shia faiths as Christianism is, overall, from Islamism. In other words, I know the music of monotheism. So here are:

***

TECHNICAL NOTES: 

[1] To speak in philosophical lingo, we capture two civilizational “ontologies” (logic of existence):

  1. Polytheistic-personal: relational, distributed, ecological.
  2. Monotheistic-fascistic: hierarchical, authoritarian, abstracted.

[2] Paganus, in a religious sense, appears first in the Christian author Tertullian, around 202 CE, to evoke paganus in Roman military jargon for ‘civilian, incompetent soldier‘ by opposition to the competent soldiers (milites) of Christ that Tertullian was calling for.

[3] ‘FAIR OF FACE, Joyous with the Double Plume, Mistress of Happiness, Endowed with Favour, at hearing whose voice one rejoices, Lady of Grace, Great of Love, whose disposition cheers the Lord of the Two Lands.

With these felicitous epithets, inscribed in stone more than 3,300 years ago, on the monumental stelae marking the boundaries of the great new city at Tell el Amarna on the Nile in central Egypt, the Pharaoh Akhenaten extolled his Great Royal Wife, the chief queen, the beautiful Nefertiti.

Nefertiti (‘the beautiful one has come‘) co-ruled Egypt with her older (and apparently famously ugly, deformed by disease) husband. Egypt was at its wealthiest. She was considered to be a DIVINITY. Her life is far from fully known, and is revealed one fragment of DNA or old text at a time. She ruled as sole Pharaoh after her husband’s death, and seems to have offered to marry the Hittite Prince (as revealed by a recently found fragment: “I do not wish to marry one of my subjects. I am afraid…” she confessed in a letter to the amazed Hittite emperor). She apparently decided to re-allow the worship of the many Egyptian gods, and her adoptive son and successor Tutankhaten switched his name to Tutankhamen. Both she and Tutankhamen died, and they were replaced by a senior top general of Akhenaten, who both relieved the dynasty from too much inbreeding (hence the deformed Akhenaten) and from too much centralism focused on the sun-disk (‘Aten’).

[4] Those who do not know history have a small and ridiculous view of FASCISM. Pathetically they refer to simpletons, such as Hitler and Mussolini, to go philosophical on the subject. Google’s Gemini tried to pontificate that ‘labeling the structure of monotheism (especially its early forms) as fascistic’ is anachronistic and highly inflammatory. Fascism is a specific 20th-century political ideology. While the author means authoritarian and hierarchical, using ‘fascistic’ distracts from the historical and philosophical points by introducing modern political baggage. It would be clearer and less polemical to stick to ‘Hierarchical’ or ‘Authoritarian-Centralized’.

I disagree virulently with this cognitive shortsightedness of poorly programmed AI. The Romans were perfectly aware of the meaning that the fasces symbolized (they got them from the Etruscans). So were the founders of the French and American republics aware of the importance of fascism and the crucial capabilities it provided to the powerful republics which, in the end, succeeded the Roman Republic (which died slowly under the emperors until it couldn’t get up); those two republics gave the basic mentality now ruling the planet.

Fascism is actually an instinct. Its malevolent and dumb confiscation by ignorant morons such as Hitler and Mussolini ended pathetically under the blows of regimes (the democracies on one side, the fascist USSR on the other) which were capable of gathering enough, and much more, and higher quality fascism of their own to smother under a carpet of bombs the cretinism of the genocidal tyrants. It is actually comical, when reading old battle stories, to see the aghast Nazis out-Nazified by their Soviet opponents (discipline on the Soviet side was a lethal affair at any and all moments), or to see SS commanders outraged by the ferocity of their US opponents. At Bir Hakeim, a tiny French army, 3,000 strong, buried in the sands, blocked the entire Afrika Korps and the Italian army for weeks, under a deluge of bombs and shells, killing the one and only chance the Nazis had to conquer the Middle East. Hitler ordered the survivors executed; Rommel, who knew he was finished, disobeyed him.

***

Early Christianism was highly genocidal. The Nazi obsession with the Jews was inherited from Nero (who, unsatisfied with just crucifying Christians (64 CE), launched the annihilation wars against Israel) and then from the Christians themselves. There were hundreds of thousands of Samaritans, a type of Jew, with their own capital and temple (above Haifa). Warming up, after centuries of rage against civilization, the Christians under emperor Justinian, in the Sixth Century, nearly annihilated the Samaritans: a genocide by any definition.

Later, by their own count, at a time when Europe and the greater Mediterranean counted around 50 million inhabitants, the Christians, over centuries, killed no less than 5 million Cathars from Spain to Anatolia. Cathars, the pure ones in Greek (a name given to them by their genociders), were a type of Christian. In France alone, in a period of twenty years, up to a million were killed (not all Cathars, but that accentuates the homicidal character). As a commander famously said: ”Tuez-les tous, Dieu reconnaîtra les siens” (Kill them all, God will recognize his own). The anti-Cathar genocide drive in France, an aptly named ‘crusade‘, something about the cross, lasted more than a century (and boosted the power of the Inquisition and the Dominicans). The extinction of Catharism was so great that we have only a couple of texts left.

Want to know about Christianism? Just look at the torture and execution device they brandish, the cross. Christianism literally gave torture and execution a bad name, and it’s all the most cynical hypocrisy hard at work. 

And so on. To abstract it in an impactful way, one could say that much of Christianism instigated Nazism. That’s one of the dirty little secrets of history, and rather ironical, as the dumb Hitler was anti-Christian and still acted like one, unbeknownst to himself, his public, and his critics; those in doubt can consult the descriptions of the Crusades by the Franks themselves, when roasting children was found to relieve hunger.

Chroniclers like Radulph of Caen (a Norman historian writing around 1118) described it vividly: “In Ma’arra our troops boiled pagan adults in cooking-pots; they impaled children on spits and devoured them grilled.” Other sources, such as Joinville, Fulcher of Chartres and Albert of Aachen, corroborate the desperation and brutality, though they express varying degrees of horror or justification.   These acts were not systematic policy but extreme responses to the hunger and chaos of war, and they were preserved in Frankish narratives as part of the Crusade’s grim legacy. (There were also cases of cannibalism in WW2).

Christianism, when not actively genocidal, certainly instigated a mood, a mentality, of genocide; read Roman emperor Theodosius I about heresy. Here is the end of Theodosius’ famous quote: ‘According to the apostolic teaching and the doctrine of the Gospel, let us believe in the one deity of the Father and of the Son and of the Holy Spirit, in equal majesty and in a holy Trinity. We order the followers of this law to embrace the name of Catholic Christians; but as for the others, we judge them to be demented and ever more insane (dementes vesanosque iudicantes), we decree that they shall be branded with the ignominious name of heretics, and shall not presume to give to their conventicles the name of churches. They will suffer in the first place the chastisement of the divine condemnation and in the second the PUNISHMENT OF OUR AUTHORITY which in accordance with the will of Heaven WE SHALL DECIDE TO INFLICT.’

The ‘Men In Black‘ of the Fourth Century destroyed books, libraries and intellectuals, ensuring the smothering of civilization, as intended (the destruction of the Serapeum in Alexandria, the world’s largest library, around 391 CE). Contemporary writers like Eunapius and Libanius lamented the ‘rage for destruction’ of the Men In Black. Some non-Christian texts were preserved in monasteries, true, but the point is that Christianism made possible the destruction of non-Yahweh knowledge. This is the problem king David himself already had: the fascism, the power-obsessed little mind of Yahweh. Monasteries were often built with a covert anti-Christian mentality; things were complicated. When queen Bathilde outlawed slavery (657 CE), her closest allies were bishops, yet she had to execute other, slave-holding bishops. She also founded and funded four monasteries. Soon the Frankish government passed a law enforcing secular teaching by religious establishments.

The uniforms of the Men In Black were copied later by the Dominicans (‘Black Friars’), who led the genocide of the Cathars in co-operation with the Inquisition, also dressed in black, and then by the SS. Saint Louis expressed explicitly that nothing would bring him more joy than Jews suffering to death. Saint Louis was more descriptive, evoking a knife planted in the belly of the unbeliever, and great pleasure. In Joinville’s Life of Saint Louis (c. 1309), Louis recounts a story of a knight who, during a debate with Jews, stabbed one in the belly with a dagger, saying it was better to “kill him like that” than argue theology. Louis presented this approvingly as zeal for the faith, and wished he could partake; although, he warned, he wouldn’t do it himself, as that would be illegal. With a faith like that, Louis IX could only be canonized, in 1297 CE. And, following Saint Louis’ hint, the Nazis removed his legal objection by changing the law in 1933, when they got to power.

Luther gave multiple and extensive ‘sincere advices‘ on how to proceed with the genocide of the Jews in his book, ”The Jews and their Lies”. Here is a sample: “If I had to baptize a Jew, I would take him to the bridge of the Elbe, hang a stone around his neck and push him over, with the words, ‘I baptize thee in the name of Abraham.’”

But Musk’s AI, ‘Grok’, informed me that its basic axiom was that Christianism was never genocidal, but instead ‘suppressive‘. Then it thought hard for ‘nine seconds‘ to try to prove to me, with biased context, that I had exaggerated.

I had not.  

***.  

The entire church was into assassination madness, glorying in its own cruelty; the chief assassin of Hypatia, a sort of Charlie Manson to the power 1000, was made into a saint: Saint Cyril. Cyril’s minions grabbed Hypatia, who had just finished giving a lecture, dragged her through the streets, stripped her of her clothing, and then stripped her of her own skin, flaying her with oyster shells and causing her demise. She was the top intellectual of the age. Hypatia met her torturous end in 415 CE. Cyril was made into a saint 29 years later, in 444 CE! With saints like that, who needs Hitler?

Not to say Catholicism was useless; the jealous and genocidal, yet loving and all-knowing Yahweh was always a good excuse to massacre savages and extend civilization on the cheap. The Teutonic Knights, finding the Yahweh fanatics known as Muslims too hard a nut to crack, regrouped in Eastern Germany and launched a very hard fought crusade against the Prussian Natives, who were Pagans. After mass atrocities on both sides, the Teutons won.

The Franks embraced the capabilities of the cross, fully. Having converted to Catholicism, they were in a good position to subdue other Germans, who were Arians (and that they did, subduing Goths and Burgundians, Ostrogoths and Lombards). Three centuries later, Charlemagne used Christianism as an instrument to kill Saxons on an industrial scale, in the name of God, to finally subdue them, after the Saxons had terminally aggravated Romanitas for 800 years, driving Augustus crazy.

Charlemagne, in daily life, showing how relative Christianism was, and its true Jewish origins, used the nickname ‘David’, after king David, the monarch who refused to obey Yahweh, who had ordered David to genocidize a people (the petty, jealous God Yahweh then tortured David’s son to death).

Charlemagne lived the life of a hardened Pagan, with a de-facto harem, etc. More viciously, Charlemagne passed laws pushing for secular, and thus anti-Christian education. Following in these respects a well-established Frankish custom. Charlemagne knew Christianism was a weapon, and he was careful to use it only on the Saxons; internally, there was maximum tolerance: Christians could become Jews, if they so wished.

PA

ooOOoo

I found the full essay quite remarkable. Jean has heard me rattle on about it numerous times since I first read the essay on November 2nd. I sincerely hope you will read it soon.

Finally, let me reproduce what I wrote in a response to Patrice’s post:

Patrice, in your long and fascinating article, above, you have educated me in so many ways. My mother was an atheist and I was brought up likewise. But you have gone so much further in your teachings.

Your article needs a further reading. But I am going to share it with my readers on LfD so many more can appreciate what you have written. Plus, I am going to republish it over two days.

Thank you, thank you, thank you!


The Truth about Gods, part one.

A brilliant essay by Patrice Ayme.

Patrice writes amazing posts, some of which are beyond me. But this one, The Personification Of The World, PAGANISM, Gives Us Friends Everywhere, is incredible.

My own position is that my mother and father were atheists and I was brought up as one. Apart from a slip-up when I was married to my third wife, a Catholic: she left me, and I thought that by joining the Catholic church I might win her back. (My subconscious fear of rejection.)

My subconscious fear of rejection was not revealed to me until the 50th anniversary of my father’s death in 2006 when I saw a local psychotherapist. Then I met Jean in December, 2007 and she was the first woman I truly loved!

Back to the essay; it is a long essay and I am going to publish the first half today and the second half tomorrow. (And I have made some tiny changes.)

ooOOoo

The Personification Of The World, PAGANISM, Gives Us Friends Everywhere

Abstract: Personification of the world (polytheism/paganism) is more pragmatic, psychologically rewarding, and ecologically sound than the hierarchical, abstracted structure of monotheism, which the author labels “fascistic.” [4]

***

Switching to a single fascist God, Ex Pluribus Unum, a single moral order replacing the myriad spirits of the world, was presented as a great progress: now everybody could, should, line up below the emperor, God’s representative on Earth, and obey the monarch and his or her gang. The resulting organization was called civilization. Submitting to God was the only way Rome could survive, because it provided a shrinking army and tax base with more authority than it deserved.

However, peasants had to predict the world, and it was more judicious to personalize aspects of it. The resulting logico-emotional relationship had another advantage: the supportive presence of proximal Gods… all over! [1]

*** 

personification

/pəˌsɒnɪfɪˈkeɪʃn/ noun

1. the attribution of a personal nature or human characteristics to something non-human, or the representation of an abstract quality in human form.

***

Before Christianism, Gods were everywhere. When the Christians took over, they imposed their all-powerful, all-knowing Jewish God. Some present the Jewish God as a great invention, symbolizing some sort of progress of rationality that nobody had imagined before.

However, the single God concept was not that new. Even Americans in North America had it, as a chief God; the Aztecs had a similar concept, and even Zeus was a kind of chief God. Zoroastrianism had Ahura Mazda, who did not control Angra Mainyu, but still was somewhat more powerful. The Hindus had Vishnu and his many avatars.

Eighteen centuries before those great converters to Christianism, Constantine, Constantius II, and Theodosius I, there was an attempt to forcefully convert the Egyptians to a single God. Pharaoh Akhenaten’s monotheistic experiment (worship of Aten) caused turmoil and was erased by his immediate successors.

According to the latest research, it seems likely that the famous Nefertiti became a Pharaoh on her own after the death of her husband, and retreated from monotheism by re-establishing Egyptian polytheism [3]. In the fullness of time, the infernal couple got struck by what the Romans, 15 centuries later, would call damnatio memoriae: their very names and faith were erased from hundreds of monuments.

Shortly after the Aten episode, there was another confrontation between polytheism and monotheism. The colonizers of Gaza were apparently Greek, of Aegean origin, and, as such, over more than a millennium of conflict with the Jewish states in the hills, Greek Gods confronted Yahweh. The Greeks obviously did not see Yahweh as a major conceptual advance, as they did not adopt Him (until Constantine imposed Him, 15 centuries later).

While the area experienced enormous turmoil, including the permanent eclipse of Troy after the war with Greece (see Homer), and later its replacement by Phrygia, then followed by the Bronze Age collapse, then the rise of Tyre, and the Assyrian conquest, the Greeks survived everything, and their civilization kept sprawling (the early Christian writings were in Greek).

Ultimately, the lack of ideological bending, the obsession with pigs and other silliness, helped to bring devastating Judeo-Roman wars. By comparison, the much larger Gaul bent like a reed when confronted with the Greco-Romans, absorbing the good stuff. Mercury, the God of trade, preceded Roman merchants. Gaul didn’t take religion too seriously, and went on with civilizational progress.

The lack of elasticity of the single God religion of the Jews brought their quasi-eradication by Rome; Judaism was tolerated, but Jewish nationalism got outlawed. By comparison, the Greeks played the long game, and within a generation or so of Roman conquest, they had spiritually conquered their conqueror. Christianism was actually an adaptation of Judaism to make Yahweh conquer the heart and soul of fascist Rome.

***

To have Gods everywhere? Why not? Is not the Judeo-Christian God everywhere too? Doesn’t it speak through fountains, and the immensity of the desert, and the moon, and the stars, too?

***

Yahweh, the Jewish God Catholic Romans called “Deus”, was deemed to be also the ultimate love object. Yahweh had promised land to the Jews; Deus promised eternal life of the nicest sort – to all those who bought the program.

Christians were city dwellers and their power over the countryside and barbarians came from those who had imposed Christianism, the imperial powers that be (at the time, more than 90% of the people worked in agriculture). Already as a teenager, Constantine, a sort of superman born from imperial purple, terrified the court which was supposed to hold him hostage. Such a brute and excellent general could only get inspired by Yahweh’s dedication to power.

The country dwellers, the villagers, disagreed that they needed to submit to a God organized, celebrated and imposed by the all-powerful government (god-vernment?). In classical Latin paganus meant ‘villager, rustic; civilian, non-combatant’. In late imperial Latin it came to mean non-Judeo-Christian (and later non-Judeo-Christo-Islamist) [2].

Christianism found it very difficult to penetrate the countryside, where the food was produced. It never quite succeeded (Even in Muslim Albania, Pagan rituals survived until the 20th century; much of the cult of saints is barely disguised Paganism).

Peasants knew that power was distributed throughout nature, and they had to understand those powers, thus love them – That enabled them to predict phenomena.

Peasants could ponder the mood of a river, and even predict it; flooding was more of a possibility in some circumstances, and then it was no time to indulge in activities next to the river. Peasants had to guess the weather, and the earlier, the better. Peasants had to know which part of the astronomical cycle they were in, to be able to plant crops accordingly; that was not always clear just looking outside, but the stars would tell and could be trusted to tell the truth.

We can be friends to human beings, and sometimes it’s great, but sometimes we feel betrayed and abandoned. But a mountain or a sea? They will always be there, they are not running away, they are never deliberately indifferent, and they generally exhibit predictable moods. It is more pragmatic and rewarding to love them rather than an abstract Dog in Heavens. Call them Gods if you want.

ooOOoo

Part two will be published tomorrow.

I am publishing the notes, on both days, so you can look them up now rather than waiting another day.

TECHNICAL NOTES: 

[1] To speak in philosophical lingo, we capture two civilizational ‘ontologies’ (logic of existence):

  1. Polytheistic-personal: relational, distributed, ecological.
  2. Monotheistic-fascistic: hierarchical, authoritarian, abstracted.

[2] Paganus, in a religious sense, appears first in the Christian author Tertullian, around 202 CE, to evoke paganus in Roman military jargon for ‘civilian, incompetent soldier‘ by opposition to the competent soldiers (milites) of Christ that Tertullian was calling for.

[3] ‘FAIR OF FACE, Joyous with the Double Plume, Mistress of Happiness, Endowed with Favour, at hearing whose voice one rejoices, Lady of Grace, Great of Love, whose disposition cheers the Lord of the Two Lands.

With these felicitous epithets, inscribed in stone more than 3,300 years ago on the monumental stelae marking the boundaries of the great new city at Tell el Amarna on the Nile in central Egypt, the Pharaoh Akhenaten extolled his Great Royal Wife, the chief queen, the beautiful Nefertiti.

Nefertiti (‘the beautiful one has come‘) co-ruled Egypt with her older (and apparently famously ugly, deformed by disease) husband. Egypt was at its wealthiest. She was considered to be a DIVINITY. Her life is far from fully known, and is revealed one fragment of DNA or old text at a time. She ruled as sole Pharaoh after her husband’s death, and seems to have offered to marry the Hittite Prince (as revealed by a recently found fragment: “I do not wish to marry one of my subjects. I am afraid…” she confessed in a letter to the amazed Hittite emperor). She apparently decided to re-allow the worship of the many Egyptian gods, and her adoptive son and successor Tutankhaten switched his name to Tutankhamen. Both she and Tutankhamen died, and they were replaced by a senior top general of Akhenaten, who both relieved the dynasty from too much inbreeding (hence the deformed Akhenaten) and from too much centralism focused on the sun-disk (‘Aten’).

[4] Those who do not know history have a small and ridiculous view of FASCISM. Pathetically they refer to simpletons, such as Hitler and Mussolini, to go philosophical on the subject. Google’s Gemini tried to pontificate that ‘labeling the structure of monotheism (especially its early forms) as fascistic’ is anachronistic and highly inflammatory. Fascism is a specific 20th-century political ideology. While the author means authoritarian and hierarchical, using ‘fascistic’ distracts from the historical and philosophical points by introducing modern political baggage. It would be clearer and less polemical to stick to ‘Hierarchical’ or ‘Authoritarian-Centralized’.

I have Ringworm!

Luckily not badly.

Last week I went to my doctor to find out what was causing my itching, mainly in my groin and the skin of my forearms.

Dr. Mount told me that I had ringworm. I do not remember having had it before.

The Cleveland Clinic have the details online and it struck me that I should share the information with you. The source of this article is here. I hope very much that sharing this article is alright with Cleveland Clinic.

ooOOoo

Ringworm

Ringworm is an itchy, contagious fungal infection that causes a ring-shaped pattern on your skin. Over-the-counter and prescription treatments can stop the fungus from spreading to other parts of your body or to others.

Overview

Ringworm is a circular-shaped skin rash caused by a fungal infection.

What is ringworm?

You might be surprised to learn that a fungus — and not a worm — causes ringworm. Fungi thrive in warm and humid areas such as locker rooms and public showers. This common and contagious skin infection gets its name from the red, itchy, ring-shaped skin plaque (a type of scaly rash). It spreads easily through close contact.

You get ringworm from contact with an infected person, animal or object. Ringworm goes by different names depending on which body part it affects. Ringworm on your body is called tinea corporis. This type of ringworm affects your arms, legs, torso and face. Ringworm is treated with antifungal medication available either over the counter or as a prescription.

Types of ringworm

Ringworm has different names based on where it appears on your body — and it can appear just about anywhere. Ringworm infections include:

  • Athlete’s foot: Also called tinea pedis, this fungal infection causes an itchy, burning skin rash between your toes and on the soles of your feet. Your skin may become scaly and cracked or develop blisters. Sometimes, your feet smell bad.
  • Jock itch: Tinea cruris, or jock itch, causes a red, itchy rash in your groin, upper thighs or rectum. Some people get blisters.
  • Scalp ringworm (tinea capitis): This causes scaly, red, itchy bald spots on your scalp. If left untreated, the bald spots can grow bigger and become permanent.
  • Hands (tinea manuum): Signs of ringworm on your hands include dry, cracked palms and ring-like patches.
  • Beard (tinea barbae): Ringworm appears on your neck, chin and cheeks. The patches might become crusted over or filled with pus.
  • Toenails or fingernails (tinea unguium or onychomycosis): Nails become thick, discolored and deformed.

What does ringworm look like?

Ringworm typically begins as a flat, discolored patch, which may appear red in lighter complexions and brown in darker complexions. The patch has a ring-like or circular shape with a raised, scaly border.

Who gets ringworm?

Ringworm affects people of all ages. You’re more at risk for ringworm if you:

  • Have a weakened immune system or an autoimmune disease like lupus.
  • Participate in high-contact sports, such as wrestling (this ringworm is called tinea gladiatorum).
  • Sweat excessively (hyperhidrosis).
  • Use public locker rooms or public showers.
  • Work closely with animals that might have ringworm.

How common is ringworm?

Ringworm is contagious and extremely common. It can affect 20% to 25% of the world’s population at any given time.

Symptoms and Causes

What are the signs of ringworm?

Signs typically appear between four and 14 days after your skin comes in contact with the fungi that cause ringworm, including:

  • Circular, ring-shaped scales or plaques.
  • Flat patches with a raised, round border.
  • Itchy skin.
  • Hair loss or bald spots in the affected area.

What causes ringworm?

Despite its name, a fungus causes ringworm. This type of fungus naturally lives on your skin, hair and nails. However, when their environment gets hot and damp, the fungi start growing uncontrollably. You can get this infection anytime your skin comes into contact with the ringworm fungus on someone else’s skin.

How contagious is ringworm?

Ringworm is contagious. It can live on your skin, on surfaces and in soil. The main ways ringworm spreads are:

  • Skin-to-skin contact with a person who has ringworm.
  • Contact with an infected dog, cat or animal (livestock or pets).
  • Contact with a contaminated surface, such as a locker room floor or sweaty gym clothes.
  • Sharing objects with an infected person or animal such as a brush, towel or bedding.
  • Contaminated soil.

Diagnosis and Tests

How is ringworm diagnosed?

Your healthcare provider can diagnose ringworm by looking at your skin and assessing your symptoms. They may scrape the area to look at the skin cells under a microscope, too. Examining the scales typically confirms ringworm.

Management and Treatment

How is ringworm treated?

Several nonprescription (over-the-counter) and prescription antifungal medications are available to treat ringworm. Antifungals come in various forms like creams, gels or powders. Your healthcare provider can treat more widespread ringworm with oral antifungal medication.

Antifungal creams and powders

Over-the-counter (OTC) antifungal creams, gels or powders typically work well.

If your symptoms get worse or don’t clear after two weeks, you may need an oral prescription medication from your healthcare provider.

Oral medication

Your healthcare provider may write you a prescription for oral antifungal medication if you have ringworm on your scalp or on many parts of your body. Most medications are prescribed for between one and three months.

Antifungal shampoo

Antifungal shampoo, such as ketoconazole shampoo (Nizoral A-D®), may stop scalp ringworm from spreading. It won’t cure it, but it may help contain the infection. You also need to take a prescribed oral antifungal medication. Unaffected family members may benefit from using the shampoo as well.

Home remedies for ringworm

Home remedies like apple cider vinegar or tea tree oil have little to no proven benefit. Apple cider vinegar may cause open sores or inflammation. Tea tree oil has antifungal and antimicrobial properties, but its effectiveness against ringworm isn’t well established.

Your home may require treatment as well. The ringworm fungus can live on surfaces for months. Disinfectant sprays like Lysol® or bleach can remove the fungus. Wash clothes, sheets and towels in hot water and detergent to prevent ringworm from spreading.

Steroid creams

Corticosteroid creams may help reduce inflammation, but they shouldn’t be used to treat ringworm. In fact, they may worsen the infection.

What cures ringworm?

With antifungal treatment, mild cases of ringworm usually clear up within a few weeks. More serious infections may require treatment for six to 12 weeks.

Some other things you can do to promote healing:

  • Keep the affected area clean and dry.
  • Apply antifungal lotions, creams or ointments for the entire treatment period.
  • Avoid touching the area and wash your hands before touching other areas of your body.

Does ringworm go away by itself?

Ringworm can sometimes go away by itself, but that’s not common. And while ringworm is present on your skin, you’re still contagious to others.

Outlook / Prognosis

Can ringworm come back?

Yes, ringworm can come back. Ringworm will go away if you treat it appropriately. Follow your healthcare provider’s treatment plan until the infection clears completely. If you stop treatment or treatment ends too soon, the infection can come back.

What are the complications of ringworm?

If you suspect you or your child has ringworm, don’t use anti-itch creams containing corticosteroids. These creams weaken your skin’s defenses. They can allow the infection to spread and cover larger sections of skin. On rare occasions, the ringworm fungus goes deeper into your skin, making it even harder to treat.

Scalp ringworm can lead to a painful inflammation called kerion. With kerion, you may develop crusty, pus-filled sores, often with hair loss and scarring.

Prevention

How can I prevent ringworm?

Ringworm thrives in damp, warm areas. The fungus can live on towels, clothes, sheets and household surfaces for months. Preventing ringworm involves:

  • Changing your socks and underwear daily or more frequently if they become damp or soiled.
  • Showering immediately after contact sports or exercise.
  • Wearing sandals or shower shoes at the pool and in public locker rooms and showers.
  • Drying your skin thoroughly after showering, especially between your toes.
  • Avoiding sharing towels, washcloths, sheets, clothes, combs or other personal hygiene items.
  • Washing clothes, athletic gear, sheets and towels in hot water and detergent.
  • Disinfecting surfaces with bleach or sprays like Lysol®.
  • Treating pets for ringworm, if they’re infected.
  • Washing hands thoroughly after contact with animals.

A weak immune system or living in a damp, warm climate increases your risk of a fungal infection.

Living With

When should I call the doctor?

Call your healthcare provider if the ringworm infection:

  • Appears on your scalp.
  • Looks infected (redness and swelling).
  • Occurs during pregnancy.
  • Spreads to other areas of your body.
  • Doesn’t improve after using over-the-counter antifungal medication as directed.

What questions should I ask my doctor?

You’re sure to have questions if you or your child develops ringworm. You might ask your healthcare provider:

  • How did I get ringworm?
  • How long is ringworm contagious?
  • Should I (or my child) stay home from work/school until the ringworm infection is gone?
  • What steps can I take to prevent ringworm from spreading to other parts of my body?
  • What steps can I take to prevent ringworm from spreading to other people?
  • What’s the best treatment for ringworm?
  • Should I avoid any medications or treatments?
  • What steps can I take to keep from getting ringworm again?
  • How can I tell if my pet has ringworm?
  • Should I look out for signs of complications?

Additional Common Questions

Is ringworm an actual worm?

No, ringworm isn’t a worm. It’s a fungal infection that gets its name from its ring-like border.

How does ringworm affect pregnancy?

Ringworm fungus won’t affect your pregnancy. Still, you should check with your healthcare provider before using over-the-counter antifungal creams or powders, and some oral antifungal medications aren’t recommended during pregnancy. Your pregnancy care provider can discuss potential risks and benefits with you.

Can you get ringworm from dogs or cats?

Yes, you can get ringworm from dogs, cats and other animals like cows, goats or horses. You can protect yourself by always washing your hands after playing with or petting animals. If your pet has ringworm, disinfect your pet’s bedding and take extra care to clean surfaces your pet has visited in your home.

How is ringworm different from eczema?

Eczema and many other skin conditions can resemble ringworm. Both ringworm and eczema cause itchy, red skin. Unlike ringworm, eczema isn’t contagious and doesn’t spread from one area to another on your body. Ringworm has a unique, ring-like appearance. Contact a healthcare provider for an appropriate diagnosis.

A note from Cleveland Clinic

Ringworm can be unpleasant, but antifungal medications will help you get rid of the fungus that causes ringworm. The treatment may take time, but it’s important to follow your healthcare provider’s treatment plan for as long as recommended. Ending treatment too soon can cause ringworm to return and make the infection harder to treat. Ask your provider about how you can keep ringworm from spreading to other parts of your body and to other people.

ooOOoo

We were planning a small and casual party at our house celebrating Halloween. Thirteen people were coming, some of them in fancy dress. However, a close neighbour, who was coming with her husband, recommended we cancel just in case I passed the fungal infection on to someone else.

Thus I have to be patient and keep applying the cream for at least the next couple of weeks, hoping that in time I will see the infection clear.

Finally, let me just repeat this from the Cleveland Clinic article: ‘Ringworm is contagious and extremely common. It can affect 20% to 25% of the world’s population at any given time.’

The start of it all!

A wonderful documentary of the formation of Planet Earth.

From the website The Earth through time, I quote: The Earth was formed about 4.6 billion years ago. 4.6 billion is 4,600,000,000 years ago. It was formed by collisions of particles in a large cloud of material. Slowly, gravity gathered together all these particles of dust and gas and formed larger clumps. These clumps continued to collide and gradually grew bigger and bigger, eventually forming the Earth. The Earth at this time was very different to how we know it today.

I left a comment on the site: What a wonderful story. So many comments in support of this fantastic film, and rightly so.

Prehistory

We all live in the Quaternary Period. From Wikipedia I quote a small piece:

It follows the Neogene Period and spans from 2.6 million years ago to the present.

I don’t know about you, but 2.6 million years ago (Ma) seems like a very long time. But then the prior period was the Neogene, which ran from 23 Ma to 2.6 Ma.

But if one wants to think ‘old’ then try the Ordovician period:

The Ordovician spans 41.6 million years from the end of the Cambrian Period 486.85 Ma (million years ago) to the start of the Silurian Period 443.1 Ma.
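Just as a bit of illustrative arithmetic (nothing more than subtraction, using the boundary ages quoted above), here is a tiny Python sketch that turns those boundary ages into period lengths:

# Purely illustrative: lengths of the geological periods mentioned in this post,
# computed from the boundary ages quoted above (ages in Ma, millions of years ago).
periods = {
    "Quaternary": (2.6, 0.0),       # 2.6 Ma to the present
    "Neogene": (23.0, 2.6),         # 23 Ma to 2.6 Ma
    "Ordovician": (486.85, 443.1),  # end of the Cambrian to the start of the Silurian
}

for name, (start, end) in periods.items():
    print(f"{name}: about {start - end:.2f} million years long")

Run as written, that gives roughly 2.6, 20.4 and 43.75 million years. The 41.6-million-year span quoted for the Ordovician presumably reflects slightly different boundary estimates than the 486.85 Ma and 443.1 Ma figures, so treat these numbers as approximate either way.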

****

Just to put us humans into context, human evolution covers a very much shorter span: I have it beginning roughly six million years ago. But here are two videos, courtesy of YouTube. The first one is a short one:

Scientists use fossils to reconstruct the evolutionary history of hominins—the group that includes modern humans, our immediate ancestors, and other extinct relatives. Today, our closest living relatives are chimpanzees, but extinct hominins are even closer. Where and when did they live? What can we learn about their lives? Why did they go extinct? Scientists look to fossils for clues.

 The second video is a 54-minute one from PBS.

They have both been watched thousands of times.

Now on to today’s post.

ooOOoo

Giant ground sloths’ fossilized teeth reveal their unique roles in the prehistoric ecosystem

Harlan’s ground sloth fossil skeleton excavated and displayed at the La Brea Tar Pits in Los Angeles. Larisa DeSantis

Larisa R. G. DeSantis, Vanderbilt University and Aditya Reddy Kurre, University of Pennsylvania

animal hanging from a branch looks upside down at the camera
A two-toed sloth at the Nashville Zoo. Larisa R. G. DeSantis

Imagine a sloth. You probably picture a medium-size, tree-dwelling creature hanging from a branch. Today’s sloths – commonly featured on children’s backpacks, stationery and lunch boxes – are slow-moving creatures, living inconspicuously in Central American and South American rainforests.

But their gigantic extinct relatives, which inhabited the Americas as far back as 35 million years ago, were nothing like the sleepy tree huggers we know today. Giant ground sloths – some weighing thousands of pounds and standing taller than a single-story building – played vital and diverse roles in shaping ecosystems across the Americas, roles that vanished with their loss at the end of the Pleistocene.

In our new study, published in the journal Biology Letters, we aimed to reconstruct the diets of two species of giant ground sloths that lived side by side in what’s now Southern California. We analyzed remains recovered from the La Brea Tar Pits of what are colloquially termed the Shasta ground sloth (Nothrotheriops shastensis) and Harlan’s ground sloth (Paramylodon harlani). Our work sheds light on the lives of these fascinating creatures and the consequences their extinction in Southern California 13,700 years ago has had on ecosystems.

Dentin dental challenges

Studying the diets of extinct animals often feels like putting together a jigsaw puzzle with only a portion of the puzzle pieces. Stable isotope analyses have revolutionized how paleoecologists reconstruct the diets of many ancient organisms. By measuring the relative ratios of light and heavy carbon isotopes in tooth enamel, we can figure out what kinds of foods an animal ate – for instance, grasses versus trees or shrubs.
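Purely as background (this is the standard delta notation used across stable isotope studies generally, not a detail taken from this particular paper), the carbon ratio is normally reported by comparing a sample with an international reference standard:

\[
\delta^{13}\mathrm{C} = \left( \frac{\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\text{sample}}}{\bigl({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\bigr)_{\text{standard}}} - 1 \right) \times 1000
\]

expressed in parts per thousand (‰). Because grasses in warm climates mostly use the C4 photosynthetic pathway while trees and shrubs use C3, the two groups leave measurably different δ13C values in the tissues of the animals that eat them, which is how grazers can be told apart from browsers.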

dental drill in hands near an animal jawbone
Drilling teeth provides a sample for stable isotope analyses. Aditya Kurre

But the teeth of giant ground sloths lack enamel, the highly inorganic and hard outer layer on most animal teeth – including our own. Instead, sloth teeth are made primarily of dentin, a more porous and organic-rich tissue that readily changes its chemical composition with fossilization.

Stable isotope analyses are less dependable in sloths because dentin’s chemical composition can be altered postmortem, skewing the isotopic signatures.

Another technique researchers use to glean information about an animal’s diet relies on analyzing the microscopic wear patterns on its teeth. Dental microwear texture analysis can infer whether an animal mostly ate tough foods such as leaves and grass or hard foods such as seeds and fruit pits. This technique is also tricky when it comes to sloths’ fossilized teeth because signs of wear may be preserved differently in the softer dentin than in harder enamel.

Prior to studying fossil sloths, we vetted dental microwear methods in modern xenarthrans, a group of animals that includes sloths, armadillos and anteaters. This study demonstrated that dentin microwear can reveal dietary differences between leaf-eating sloths and insect-consuming armadillos, giving us confidence that these tools could reveal dietary information from ground sloth fossils.

Distinct dietary niches revealed

Previous research suggested that giant ground sloths were either grass-eating grazers or leaf-eating browsers, based on the size and shape of their teeth. However, more direct measures of diet – such as stable isotopes or dental microwear – were often lacking.

Our new analyses revealed contrasting dental wear signatures between the two co-occurring ground sloth species. The Harlan’s ground sloth, the larger of the two, had microwear patterns dominated by deep pitlike textures. This kind of wear is indicative of chewing hard, mechanically challenging foods such as tubers, seeds, fungi and fruit pits. Our new evidence aligns with skeletal adaptations that suggest powerful digging abilities, consistent with foraging foods both above and below ground.

diagram of sloth profiles, tooth outline and magnified surface of two bits of the teeth
The fossil teeth of the Harlan’s ground sloth typically showed deeper pitlike textures, bottom, while the Shasta ground sloth teeth had shallower wear patterns, top. DeSantis and Kurre, Biology Letters 2025

In contrast, the Shasta ground sloth exhibited dental microwear textures more akin to those in leaf-eating and woody plant-eating herbivores. This pattern corroborates previous studies of its fossilized dung, demonstrating a diet rich in desert plants such as yucca, agave and saltbush.

Next we compared the sloths’ microwear textures to those of ungulates such as camels, horses and bison that lived in the same region of Southern California. We confirmed that neither sloth species’ dietary behavior overlapped fully with other herbivores. Giant ground sloths didn’t perform the same ecological functions as the other herbivores that shared their landscape. Instead, both ground sloths partitioned their niches and played complementary ecological roles.

Extinctions brought ecological loss

The Harlan’s ground sloth was a megafaunal ecosystem engineer. It excavated soil and foraged underground, thereby affecting soil structure and nutrient cycling, even dispersing seeds and fungal spores over wide areas. Anecdotal evidence suggests that some anachronistic fruits – such as the weird, bumpy-textured and softball-size Osage orange – were dispersed by ancient megafauna such as giant ground sloths. When the Pleistocene megafauna went extinct, the loss contributed to the regional restriction of these plants, since nothing was left to spread their seeds.

The broader consequence is clear: Megafaunal extinctions erased critical ecosystem engineers, triggering cascading ecological changes that continue to affect habitat resilience today. Our results resonate with growing evidence that preserving today’s living large herbivores and understanding the diversity of their ecological niches is crucial for conserving functional ecosystems.

Studying the teeth of lost giant ground sloths has illuminated not only their diets but also the enduring ecological legacies of their extinction. Today’s sloths, though charming, only hint at the profound environmental influence of their prehistoric relatives – giants that shaped landscapes in ways we are only beginning to appreciate.

Larisa R. G. DeSantis, Associate Professor of Biological Sciences, Vanderbilt University and Aditya Reddy Kurre, Dental Student, University of Pennsylvania

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

I am going to finish with a link, and a small extract, from a Wikipedia article on the evolution of Homo sapiens:

The timeline of human evolution outlines the major events in the evolutionary lineage of the modern human species Homo sapiens, throughout the history of life, beginning some 4 billion years ago down to recent evolution within H. sapiens during and since the Last Glacial Period.

The beautiful moon, but …

… does it make us sleepless?

As has been mentioned previously, my dear wife has Parkinson’s, which means that we go to bed early and get up early the following morning. Thus a recent item on The Conversation fascinated me, and it is shared with you now.

ooOOoo

Does the full moon make us sleepless? A neurologist explains the science behind sleep, mood and lunar myths

How much does the moon cycle affect sleep? Probably less than your screen time at night. Muhammad Khazin Alhusni/iStock via Getty Images Plus

Joanna Fong-Isariyawongse, University of Pittsburgh

Have you ever tossed and turned under a full moon and wondered if its glow was keeping you awake? For generations, people have believed that the Moon has the power to stir up sleepless nights and strange behavior – even madness itself. The word “lunacy” comes directly from luna, Latin for Moon.

Police officers, hospital staff and emergency workers often swear that their nights get busier under a full moon. But does science back that up?

The answer is, of course, more nuanced than folklore suggests. Research shows a full moon can modestly affect sleep, but its influence on mental health is much less certain.

I’m a neurologist specializing in sleep medicine who studies how sleep affects brain health. I find it captivating that an ancient myth about moonlight and madness might trace back to something far more ordinary: our restless, moonlit sleep.

What the full moon really does to sleep

Several studies show that people really do sleep differently in the days leading up to the full moon, when moonlight shines brightest in the evening sky. During this period, people sleep about 20 minutes less, take longer to fall asleep and spend less time in deep, restorative sleep. Large population studies confirm the pattern, finding that people across different cultures tend to go to bed later and sleep for shorter periods in the nights before a full moon.

The most likely reason is light. A bright moon in the evening can delay the body’s internal clock, reduce melatonin – the hormone that signals bedtime – and keep the brain more alert.

The changes are modest. Most people lose only 15 to 30 minutes of sleep, but the effect is measurable. It is strongest in places without artificial light, such as rural areas or while camping. Some research also suggests that men and women may be affected differently. For instance, men seem to lose more sleep during the waxing phase, while women experience slightly less deep and restful sleep around the full moon.

Young adult woman lying in bed wide awake, staring out the window toward a bright light.
Sleep loss from a bright moon is modest but measurable. Yuliia Kaveshnikova/iStock via Getty Images Plus

The link with mental health

For centuries, people have blamed the full moon for stirring up madness. Folklore suggested that its glow could spark mania in bipolar disorder, provoke seizures in people with epilepsy or trigger psychosis in those with schizophrenia. The theory was simple: lose sleep under a bright moon and vulnerable minds might unravel.

Modern science adds an important twist. Research is clear that sleep loss itself is a powerful driver of mental health problems. Even one rough night can heighten anxiety and drag down mood. Ongoing sleep disruption raises the risk of depression, suicidal thoughts and flare-ups of conditions like bipolar disorder and schizophrenia.

That means even the modest sleep loss seen around a full moon could matter more for people who are already at risk. Someone with bipolar disorder, for example, may be far more sensitive to shortened or fragmented sleep than the average person.

But here’s the catch: When researchers step back and look at large groups of people, the evidence that lunar phases trigger psychiatric crises is weak. No reliable pattern has been found between the Moon and hospital admissions, discharges or lengths of stay.

But a few other studies suggest there may be small effects. In India, psychiatric hospitals recorded more use of restraints during full moons, based on data collected between 2016 and 2017. In China, researchers noted a slight rise in schizophrenia admissions around the full moon, using hospital records from 2012 to 2017. Still, these findings are not consistent worldwide and may reflect cultural factors or local hospital practices as much as biology.

In the end, the Moon may shave a little time off our sleep, and sleep loss can certainly influence mental health, especially for people who are more vulnerable. That includes those with conditions like depression, bipolar disorder, schizophrenia or epilepsy, and teenagers who are especially sensitive to sleep disruption. But the idea that the full moon directly drives waves of psychiatric illness remains more myth than reality.

The sleep/wake cycle is synchronized with lunar phases.

Other theories fall short

Over the years, scientists have explored other explanations for supposed lunar effects, from gravitational “tidal” pulls on the body to subtle geomagnetic changes and shifts in barometric pressure. Yet, none of these mechanisms hold up under scrutiny.

The gravitational forces that move oceans are far too weak to affect human physiology, and studies of geomagnetic and atmospheric changes during lunar phases have yielded inconsistent or negligible results. This makes sleep disruption from nighttime light exposure the most plausible link between the Moon and human behavior.
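For anyone curious just how weak “far too weak” is, here is a rough back-of-envelope sketch in Python. The Moon’s mass and distance are standard textbook values, and the calculation is only an order-of-magnitude illustration, not anything from the studies above:

# Rough order-of-magnitude check: the Moon's tidal (stretching) acceleration
# across a human-sized body, compared with ordinary gravity at Earth's surface.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_MOON = 7.35e22    # mass of the Moon, kg
D = 3.84e8          # average Earth-Moon distance, m
BODY_SIZE = 1.0     # rough height of a human body, m

# The tidal acceleration gradient is roughly 2GM/d^3 per metre of separation.
tidal = 2 * G * M_MOON * BODY_SIZE / D**3
g = 9.81            # everyday surface gravity, m/s^2

print(f"Tidal acceleration across a body: {tidal:.1e} m/s^2")
print(f"Fraction of everyday gravity: {tidal / g:.1e}")

That comes out near 2 × 10⁻¹³ m/s², roughly 14 orders of magnitude smaller than everyday gravity, which is why a gravitational mechanism is dismissed here.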

Why the myth lingers

If the science is so inconclusive, why do so many people believe in the “full moon effect”? Psychologists point to a concept called illusory correlation. We notice and remember the unusual nights that coincide with a full moon but forget the many nights when nothing happened.

The Moon is also highly visible. Unlike hidden sleep disruptors such as stress, caffeine or scrolling on a phone, the Moon is right there in the sky, easy to blame.

A woman staring at her cellphone while lying in the dark.
Screen-time habits are far more likely to have detrimental effects on sleep than a full moon. FanPro/Moment via Getty Images

Lessons from the Moon for modern sleep

Even if the Moon does not drive us “mad,” its small influence on sleep highlights something important: Light at night matters.

Our bodies are designed to follow the natural cycle of light and dark. Extra light in the evening, whether from moonlight, streetlights or phone screens, can delay circadian rhythms, reduce melatonin and lead to lighter, more fragmented sleep.

This same biology helps explain the health risks of daylight saving time. When clocks “spring forward,” evenings stay artificially brighter. That shift delays sleep and disrupts circadian timing on a much larger scale than the Moon, contributing to increased accidents and cardiovascular risks, as well as reduced workplace safety.

In our modern world, artificial light has a much bigger impact on sleep than the Moon ever will. That is why many sleep experts argue for permanent standard time, which better matches our biological rhythms.

So if you find yourself restless on a full moon night, you may not be imagining things – the Moon can tug at your sleep. But if sleeplessness happens often, look closer to home. The culprit is more likely the light in your hand than the one in the sky.

Joanna Fong-Isariyawongse, Associate Professor of Neurology, University of Pittsburgh

This article is republished from The Conversation under a Creative Commons license. Read the original article.

ooOOoo

Ever since I have been an adult I have wondered what the purpose was of switching between daylight saving time and standard time. The University of Colorado has the history of the time change and, as I suspected, it was brought about by war: World War I.

Here’s part of that article:

It was first introduced in Germany in 1916 during World War I as an energy saving measure, according to CU Boulder sleep researcher Kenneth Wright. The U.S. followed suit, adopting DST in 1918. Initially implemented as a wartime measure, it was repealed a year later. 

Daylight saving time was reinstituted in 1942 during World War II. The next couple decades were a free-for-all, when states and localities switched between DST and standard time (ST) at will. To put an end to the clock chaos, Congress finally passed the Uniform Time Act in 1966, which standardized daylight saving time and its start and end dates across the country — with the exception of Hawaii and Arizona, which opted to keep standard time year-round.