This is not the first time that I have wandered through this subject. Indeed, Learning from Dogs would never have seen the light of day if, all those years ago, Jon Lavin hadn’t raised the fascinating idea that dogs are integrous animals. As the quote says in the sidelink Purpose of Learning from Dogs:
There is nothing to fear except the persistent refusal to find out the truth, the persistent refusal to analyse the causes of happenings. Dorothy Thompson.
So what is it that has rocked my boat again? A number of things, to be honest. So much so, please forgive me for running these musings over to tomorrow!
Regular readers may have noticed that both Tuesday’s post Modelling the future and yesterday’s A study of man’s behaviours explored determining truth; frequently a fickle beast to track down! Then last Monday, I read the latest post from Climate Denial Crock of the Week, about Sea Ice Slowing to Minimum. It was yet another reminder that embracing the truth of what is happening to our planet is vital, I mean VITAL, for anyone who has a reasonable expectation of being alive in 20 years’ time.
Here’s how Peter opened that post (published with Peter’s kind permission):
Not there yet, but in an interview with Dr. Jennifer Francis of Rutgers last week, the message was clear – the ice has retreated so much that at this point, we will already be experiencing the impacts of a low or no-ice arctic minimum, including “very interesting” weather in the northern hemisphere this fall and winter. Wow. I can’t wait.
Peter then included an update in that post, a reference to a report published in the Australian newspaper The Sydney Morning Herald. I requested permission to republish the SMH article but that wasn’t granted; well, to be factual, it was offered at a fee of $420.75 – say no more. Here is how that report from on board the Greenpeace ice-breaker Arctic Sunrise opened:
We are a few hundred miles from the north pole. The air temperature is -3C, the sea freezing. All around us in these foggy Arctic waters at the top of the world are floes – large and small chunks of sea ice that melt and freeze again with the seasons.
Arne Sorensen, our Danish ice pilot, is 18 metres up in the crow’s nest of the Arctic Sunrise vessel. Visibility is just 200 metres and he inches the 1,000-tonne Greenpeace ice-breaker forward at two knots through narrow passages of clear water.
A few paragraphs later comes this:
More than 600,000 sq km more ice has melted in 2012 than ever recorded by satellites. Now the minimum extent has nearly been reached and the sea is starting to refreeze.
‘‘This is the new minimum extent of the ice cap,’’ [Sorensen] says – the frontline of climate change. ‘‘It is sad. I am not doubting this is related to emitting fossil fuels to a large extent. It’s sad to observe that we are capable of changing the planet to such a degree.’’
British, Italian and American scientists on the Arctic Sunrise say they are shocked at the speed and extent of the ice loss.
Over at the Guardian newspaper, their reporter John Vidal, also aboard the Arctic Sunrise, reports:
One of the world’s leading ice experts has predicted the final collapse of Arctic sea ice in summer months within four years.
In what he calls a “global disaster” now unfolding in northern latitudes as the sea area that freezes and melts each year shrinks to its lowest extent ever recorded, Prof Peter Wadhams of Cambridge University calls for “urgent” consideration of new ideas to reduce global temperatures.
In an email to the Guardian he says: “Climate change is no longer something we can aim to do something about in a few decades’ time, and that we must not only urgently reduce CO2 emissions but must urgently examine other ways of slowing global warming, such as the various geoengineering ideas that have been put forward.”
Professor Peter Wadhams is head of the Cambridge University Polar Ocean Physics Group, from which one may learn,
Sea ice covers 7% of the surface of our planet. It is one of the most important and variable components of the planetary surface and is the key to understanding many basic questions about the energy balance of the Earth. The ice-covered seas represent the cold end of the enormous heat engine that enables the Earth to have temperatures suitable for human life over most of its surface.
Just go back and re-read, “.. the enormous heat engine that enables the Earth to have temperatures suitable for human life over most of its surface.”
So determining the truth of what is happening to our planet is not some elegant academic exercise; it is about determining the likelihood of human life surviving or not!
Doesn’t that put everything else we are doing into some form of perspective? Let me rant on tomorrow!
A reflection on why living in harmony with Planet Earth seems so challenging.
John Hurlburt is the ‘mover and shaker’ behind a series of talks and discussions under the overall title of Everything Fits Together, part of the adult education umbrella of St Paul’s Episcopal Church here in Payson, AZ. John generously asked if I would lead the discussion tonight (19th) along the theme of Nature and Faith. I plan to close the session with these words and the compelling video that was on Learning from Dogs last Friday A planet worth protecting.
oooOOOooo
Man – a study in behaviours.
The relationship between Planet Earth and man goes back a very long way. But what of today?
There is little doubt that many people, even with the minimum of awareness about the world in which we live, are deeply worried. On so many fronts there are forbidding and scary views. It feels as though all the certainty of past times has gone; as if all the trusted models of society are now broken. Whether we are talking politics, economics, employment or the environment, nothing seems to be working.
Why might this be?
It would be easy to condemn man’s drive for progress and an insatiable self-centredness as the obvious causes of our society failing in widespread ways. But in my view that’s too simple an explanation. It’s much more complex.
I propose that the challenges we all face today have their roots in the dawning of our evolution. Let’s remind ourselves how far back that goes.
The earliest documented member of the genus Homo is Homo habilis, which evolved around 2.3 million years ago. Homo habilis was the first species for which we have positive evidence of the use of stone tools.
A theory known as Recent African Ancestry postulates that modern humans evolved in Africa, possibly from Homo heidelbergensis, and migrated out of the continent some 50,000 to 100,000 years ago, replacing local populations of Homo erectus and Homo neanderthalensis.
Thus for tens of thousands of years, the behaviours of humans have served our species well, by definition. Ergo, mankind has evolved as the result of mankind’s behaviours. Behaviours that may have changed little over those countless years.
So one might speculate that these behaviours have been potentially damaging to the ultimate survival of our species, perhaps hugely damaging, for a very long time. But because man’s population footprint has been so small for 99% of eternity the consequences have not impinged on the planet until now. Let’s reflect on those population figures.
Until the development of agriculture, around the 11th millennium BC, the world population was stable at around one million persons, as man lived out a subsistence hunter-gatherer existence. By about 2,000 years ago the global population of man had climbed to around 300 million. It took roughly another 1,800 years for that global population to reach the first billion, as it did in 1804.
However, just 123 years later, in 1927, the two-billionth baby was born. The three-billionth baby was born in 1960, just 33 years later! Only a further 14 years slipped by before the four-billionth baby was born in 1974. Another blink of the geological eyelid and 13 years later, in 1987, along came the five-billionth bundle of joy. Around October 1999, the six-billionth baby was born! We now live in a world of more than seven billion people; indeed, the world population clock estimated that on September 12th, a week ago, the world population was 7,039,725,283 persons.
About a billion every decade. That is the equivalent of a growth of 100 million each and every year, around 274,000 every single day, or, if you prefer, roughly 11,400 an hour. (Remember, that’s the net growth, births minus deaths, of the human population on this planet!)
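For anyone who likes to check the arithmetic, here is a minimal sketch in Python; the ‘one billion extra people per decade’ figure it starts from is, of course, only a rough approximation.

```python
# Rough check of the population growth arithmetic quoted above.
# "One billion extra people per decade" is an approximation, not a precise rate.

growth_per_decade = 1_000_000_000   # net additional people per decade (approx.)

per_year = growth_per_decade / 10   # ~100 million
per_day = per_year / 365            # ~274,000
per_hour = per_day / 24             # ~11,400

print(f"Per year: {per_year:,.0f}")
print(f"Per day:  {per_day:,.0f}")
print(f"Per hour: {per_hour:,.0f}")
```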
Combine man’s behaviours with this growth of population and we have the present situation. A totally unsustainable situation on a planet that is our only home.
The only viable solution is to amend our behaviours. To tap into the powers of integrity, self-awareness and mindfulness and change our game.
All of us, no exceptions, have to work with the fundamental, primary relationships we have with each other and with the planet upon which we all depend. We need the birth of a new level of consciousness; of our self, of each other and of the living, breathing planet. A new consciousness that will empower change. We need spiritual enlightenment. We need a spiritual bond with this beautiful planet.
Over eons of time, Planet Earth has favoured our evolution. Now, today, not tomorrow, it is time to favour our beautiful planet with our love and with our faith. It is the ultimate decision for our species.
oooOOOooo
If you need a reminder of how beautiful our planet is (and I’m sure the majority of LfD readers don’t require that reminder) then go back and watch David Attenborough’s video and voice-over to the song What a Wonderful World.
I will close by inserting into this post the video that Martin Lack included in a recent comment to my post The wind doth blow!
Can we trust the predictive output of computer modelling?
I would be the first to admit that this is not an area where I have anything more than general knowledge. However, what prompted me to think about this topic was a chance conversation with someone here in Payson. We were chatting over the phone and this person admitted to being less than fully convinced of the ’cause and effect’ of man’s influence on the global biosphere.
When I queried that, what was raised was the idea that all modelling algorithms used in climate change predictions must incorporate mathematical constants. I continued to listen as it was explained that, by definition, all constants were, to some degree, approximations. Take, for example, the obvious one of the constant π, which Wikipedia describes as: a mathematical constant that is the ratio of a circle’s circumference to its diameter. Pi, of course, would have to be rounded if it were to be used in any equation. Even taking it to thirty decimal places, as in 3.14159 26535 89793 23846 26433 83279, would mean rounding it to 3.14159 26535 89793 23846 26433 83280 (because 50288 are the 31st to 35th decimal places).
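Just to pin down what that rounding actually amounts to, here is a small Python sketch of my own (it has nothing to do with the conversation itself, and the truncation points are arbitrary):

```python
import math

# How far does a truncated value of pi sit from full double precision?
# The truncation points below are arbitrary, purely for illustration.

truncations = {
    2:  3.14,
    6:  3.141592,
    10: 3.1415926535,
}

for places, approx in truncations.items():
    rel_error = abs(math.pi - approx) / math.pi
    print(f"pi to {places:2d} decimal places -> relative error ~ {rel_error:.0e}")

# Even at six decimal places the relative error is about two parts in ten million.
```

Whether differences that small could ever matter in a climate model was, of course, the real question.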
OK, so I must admit that I was leaning towards the viewpoint that this person had a valid perspective. I then asked Martin Lack, he of Lack of Environment and a scientifically trained person, for his thoughts. The rest of this post is based on the information that Martin promptly sent me.
One of the links that Martin sent was to this post on the Skeptical Science blogsite. That post sets out the common skeptics’ view, namely:
Models are unreliable
“[Models] are full of fudge factors that are fitted to the existing climate, so the models more or less agree with the observed data. But there is no reason to believe that the same fudge factors would give the right behaviour in a world with different chemistry, for example in a world with increased CO2 in the atmosphere.” (Freeman Dyson)
The author of the Skeptical Science posting responds,
Climate models are mathematical representations of the interactions between the atmosphere, oceans, land surface, ice – and the sun. This is clearly a very complex task, so models are built to estimate trends rather than events. For example, a climate model can tell you it will be cold in winter, but it can’t tell you what the temperature will be on a specific day – that’s weather forecasting. Climate trends are weather, averaged out over time – usually 30 years. Trends are important because they eliminate – or “smooth out” – single events that may be extreme, but quite rare.
Climate models have to be tested to find out if they work. We can’t wait for 30 years to see if a model is any good or not; models are tested against the past, against what we know happened. If a model can correctly predict trends from a starting point somewhere in the past, we could expect it to predict with reasonable certainty what might happen in the future.
So all models are first tested in a process called Hindcasting. The models used to predict future global warming can accurately map past climate changes. If they get the past right, there is no reason to think their predictions would be wrong. Testing models against the existing instrumental record suggested CO2 must cause global warming, because the models could not simulate what had already happened unless the extra CO2 was added to the model. All other known forcings are adequate in explaining temperature variations prior to the rise in temperature over the last thirty years, while none of them are capable of explaining the rise in the past thirty years. CO2 does explain that rise, and explains it completely without any need for additional, as yet unknown forcings.
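To make the idea of hindcasting a little more concrete, here is a deliberately crude toy of my own, a zero-dimensional ‘energy balance’ sketch in Python. It bears no relation to the real general circulation models discussed above, and every number in it is an illustrative placeholder, but it shows the shape of the exercise: run the same model twice, once with a CO2 forcing term and once without, and ask which run looks like the recorded past.

```python
import math

# A toy zero-dimensional energy-balance model: dT/dt = (F - LAMBDA*T) / C.
# Nothing like a real GCM; every number here is an illustrative placeholder.

LAMBDA = 1.2   # climate feedback parameter, W m^-2 K^-1 (placeholder)
C = 8.0        # effective heat capacity, W yr m^-2 K^-1 (placeholder)

def run(years, with_co2):
    """Step the model forward one year at a time and return the temperature path."""
    temp, path = 0.0, []
    for yr in range(years):
        natural = 0.3 * math.sin(2 * math.pi * yr / 11.0)  # schematic solar-style cycle
        co2 = 0.03 * yr if with_co2 else 0.0               # schematic forcing ramp, W m^-2
        temp += ((natural + co2) - LAMBDA * temp) / C      # simple Euler step, dt = 1 year
        path.append(temp)
    return path

warmed = run(100, with_co2=True)
unforced = run(100, with_co2=False)
print(f"After 100 years: with CO2 {warmed[-1]:+.2f} K, without {unforced[-1]:+.2f} K")
```

In a real hindcast the forcings come from measured records and the comparison is made against observed temperatures, but the logic is just the same.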
I strongly recommend you read the full article here. But I will republish this graph that, for me at least, is a ‘slam dunk’ in favour of modelling accuracy.
Sea level change. Tide gauge data are indicated in red and satellite data in blue. The grey band shows the projections of the IPCC Third Assessment report (Copenhagen Diagnosis 2009).
Not only does this show that the data is within the range of the modelled projections; more seriously, the data is right at the top end of the models’ predictions. The article closes with this statement:
Climate models have already predicted many of the phenomena for which we now have empirical evidence. Climate models form a reliable guide to potential climate change.
There is a more detailed version of the above article available here. Do read that if you want to dig further down into this important topic. All I will do is to republish this,
There are two major questions in climate modeling – can they accurately reproduce the past (hindcasting) and can they successfully predict the future? To answer the first question, here is a summary of the IPCC model results of surface temperature from the 1800s – both with and without man-made forcings. All the models are unable to predict recent warming without taking rising CO2 levels into account. No one has created a general circulation model that can explain climate’s behaviour over the past century without CO2 warming. [my emphasis, Ed.]
Finally, back to Lack of Environment. On the 6th February, 2012, Martin wrote an essay Climate science in a nut fragment. Here’s how that essay closed:
Footnote:
If I were to attempt to go even further and summarise, in one single paragraph, why everyone on Earth should be concerned about ongoing anthropogenic climate disruption, it would read something like this:
Concern over anthropogenic climate disruption (ACD) is not based on computer modelling; it is based on the study of palaeoclimatology. Computer modelling is based on physics we have understood for over 100 years and is used to predict what will happen to the atmosphere for a range of projections for CO2 reductions. As such, the range of predictions is due to uncertainty in those projections; and not uncertainties in climate science. Furthermore, when one goes back 20 years and chooses to look at the projection scenario that most-closely reflects what has since happened to emissions, one finds that the modelled prediction matches reality very closely indeed.
In his email, Martin included these bullet points.
Concern over anthropogenic climate disruption (ACD) is not based on computer modelling.
It is based on our understanding of atmospheric physics (and how the Earth regulates its temperature).
Computer modelling is based on this physics (which we have understood for over 100 years).
Models have been used to predict temperature and sea level rise for a range of projections for CO2 emissions.
The wide range of predictions was due to uncertainty in those emissions projections not uncertainties in climate science.
This can be demonstrated by looking at predictions made over 20 years ago in light of what actually happened to emissions.
The model predictions for both temperature and sea level rise are very accurate (if not slightly under-estimating what has happened).
Sort of makes the point in spades! The sooner all human beings understand the truth of what’s happening to our planet, the sooner we can amend our behaviours. I’m going to pick up the theme of behaviours in tomorrow’s post on Learning from Dogs.
Finally, take a look at this graph and reflect! This will be the topic that I write about on Thursday.
Science may just be starting to make some sense of this cruelest of diseases.
It used to be the dreaded ‘C’ word: cancer. But now that ‘C’ word has a companion, the dreaded ‘A’ word. The incidence of Alzheimer’s disease seems to be on a terrible rise. Indeed, my wife, Jean, lost her late husband to Alzheimer’s disease. My half-sister back in England is now very ill with the disease. Just chatting to some people here in Payson a few days ago revealed many who had friends or relations suffering.
So a recent item first seen on the website of The Permaculture Research Institute of Australia really jumped off the ‘page’! It was an article by George Monbiot entitled The Mind Thieves. I dropped Mr. Monbiot a quick email requesting permission to republish the article and very promptly received a positive answer. Thank you, Sir.
So before moving to the article, first a little background on George M. From his website, one quickly reads,
George Monbiot
I had an unhappy time at university, and I now regret having gone to Oxford, even though the zoology course I took – taught, among others, by Richard Dawkins, Bill Hamilton and John Krebs – was excellent. The culture did not suit me, and when I tried to join in I fell flat on my face, sometimes in a drunken stupor. I enjoyed the holidays more: I worked on farms and as a waterkeeper on the River Kennet. I spent much of the last two years planning my escape. There was only one job I wanted, and it did not yet exist: to make investigative environmental programmes for the BBC.
I’m not going to copy the full ‘About George‘ description but do urge you to pop across to here and read it yourself; George has had, trust me, a fascinating life journey that I suspect is far from over. This is how that About description closes,
Here are some of the things I love: my family and friends, salt marshes, arguments, chalk streams, Russian literature, kayaking among dolphins, diversity of all kinds, rockpools, heritage apples, woods, fishing, swimming in the sea, gazpacho, sprinting up the pitch in ultimate frisbee, ponds and ditches, growing vegetables, insects, pruning, forgotten corners, fossils, goldfinches, etymology, Bill Hicks, ruins, Shakespeare, landscape history, palaeoecology and Father Ted.
Here are some of the things I try to fight: undemocratic power, corruption, deception of the public, environmental destruction, injustice, inequality and the misallocation of resources, waste, denial, the libertarianism which grants freedom to the powerful at the expense of the powerless, undisclosed interests, complacency.
Here is what I fear: other people’s cowardice.
I still see my life as a slightly unhinged adventure whose perpetuation is something of a mystery. I have no idea where it will take me, and no ambitions other than to keep doing what I do. So far it’s been gripping.
The article was first published in the British Guardian newspaper (there’s an online link to it here) as the article mentions below. But I am republishing, in full thanks to George, the copy that appeared on George’s website on the 10th September last, including the references.
oooOOOooo
The Mind Thieves
September 10th, 2012
The evidence linking Alzheimer’s disease to the food industry is strong and growing.
By George Monbiot, published in the Guardian, 11th September 2012
When you raise the subject of over-eating and obesity, you often see people at their worst. The comment threads discussing these issues reveal a legion of bullies, who appear to delight in other people’s problems.
When alcoholism and drug addiction are discussed, the tone tends to be sympathetic. When obesity is discussed, the conversation is dominated by mockery and blame, though the evidence suggests that it can be driven by similar forms of addiction(1,2,3,4). I suspect that much of this mockery is a coded form of snobbery: the strong association between poor diets and poverty allows people to use this issue as a cipher for something else they want to say, which is less socially acceptable.
But this problem belongs to all of us. Even if you can detach yourself from the suffering caused by diseases arising from bad diets, you will carry the cost, as a growing proportion of the health budget will be used to address them. The cost – measured in both human suffering and money – could be far greater than we imagined. A large body of evidence now suggests that Alzheimer’s is primarily a metabolic disease. Some scientists have gone so far as to rename it. They call it diabetes type 3.
New Scientist carried this story on its cover last week(5): since then I’ve been sitting in the library trying to discover whether it stands up. I’ve now read dozens of papers on the subject, testing my cognitive powers to the limit as I’ve tried to get to grips with brain chemistry. While the story is by no means complete, the evidence so far is compelling.
Around 35 million people suffer from Alzheimer’s disease worldwide(6); current projections, based on the rate at which the population ages, suggest that this will rise to 100 million by 2050(7). But if, as many scientists now believe, it is caused largely by the brain’s impaired response to insulin, the numbers could rise much further. In the US, the percentage of the population with diabetes type 2, which is strongly linked to obesity, has almost trebled in 30 years(8). If Alzheimer’s, or “diabetes type 3”, goes the same way, the potential for human suffering is incalculable.
Insulin is the hormone which prompts the liver, muscles and fat to absorb sugar from the blood. Diabetes 2 is caused by excessive blood glucose, resulting either from a deficiency of insulin produced by the pancreas, or resistance to its signals by the organs which would usually take up the glucose.
The association between Alzheimer’s and diabetes 2 is long-established: type 2 sufferers are two to three times more likely to be struck by this dementia than the general population(9). There are also associations between Alzheimer’s and obesity(10) and Alzheimer’s and metabolic syndrome (a complex of diet-related pathologies)(11).
Researchers first proposed that Alzheimer’s was another form of diabetes in 2005. The authors of the original paper investigated the brains of 54 corpses, 28 of which belonged to people who had died of the disease(12). They found that the levels of both insulin and insulin-like growth factors in the brains of Alzheimer’s patients were sharply reduced by comparison to those in the brains of people who had died of other causes. Levels were lowest in the parts of the brain most affected by the disease.
Their work led them to conclude that insulin and insulin-like growth factor are produced not only in the pancreas but also in the brain. Insulin in the brain has a host of functions: as well as glucose metabolism, it helps to regulate the transmission of signals from one nerve cell to another, and affects their growth, plasticity and survival(13,14).
Experiments conducted since then appear to support the link between diet and dementia(15,16,17,18), and researchers have begun to propose potential mechanisms. In common with all brain chemistry, these tend to be fantastically complex, involving, among other impacts, inflammation, stress caused by oxidation, the accumulation of one kind of brain protein and the transformation of another(19,20,21,22). I would need the next six pages of this paper even to begin to explain them, and would doubtless get it wrong (if you’re interested, please follow the links on my website).
Plenty of research still needs to be done. But if the current indications are correct, Alzheimer’s disease could be another catastrophic impact of the junk food industry, and the worst discovered so far. Our governments, as they are in the face of all our major crises, appear to be incapable of responding.
In this country as in many others, the government’s answer to the multiple disasters caused by the consumption of too much sugar and fat is to call on both companies and consumers to regulate themselves. Before he was replaced by someone even worse, the former health secretary, Andrew Lansley, handed much of the responsibility for improving the nation’s diet to food and drinks companies: a strategy that would work only if they volunteered to abandon much of their business(23,24).
A scarcely-regulated food industry can engineer its products – loading them with fat, salt, sugar and high fructose corn syrup – to bypass the neurological signals which would otherwise prompt people to stop eating(25). It can bombard both adults and children with advertising. It can (as we discovered yesterday) use the freedoms granted to academy schools to sell the chocolate, sweets and fizzy drinks now banned from sale in maintained schools(26). It can kill the only effective system (the traffic light label) for informing people how much fat, sugar and salt their food contains. Then it can turn to the government and blame consumers for eating the products it sells. This is class war: a war against the poor fought by the executive class in government and industry.
We cannot yet state unequivocally that poor diet is a leading cause of Alzheimer’s disease, though we can say that the evidence is strong and growing. But if ever there was a case for the precautionary principle, here it is. It’s not as if we lose anything by eating less rubbish. Averting a possible epidemic of this devastating disease means taking on the bullies: those who mock people for their pathologies and those who spread the pathologies by peddling a lethal diet.
References:
1. Caroline Davis et al, 2011. Evidence that ‘food addiction’ is a valid phenotype of obesity. Appetite Vol. 57, pp711–717. doi:10.1016/j.appet.2011.08.017
2. Paul J. Kenny, November 2011. Common cellular and molecular mechanisms in obesity and drug addiction. Nature Reviews Neuroscience, Vol. 12, pp 638-651. doi:10.1038/nrn3105
3. Joseph Frascella et al, 2010. Shared brain vulnerabilities open the way for nonsubstance addictions: Carving addiction at a new joint? Annals of the New York Academy of Sciences, Vol. 1187, pp294–315. doi: 10.1111/j.1749-6632.2009.05420.x
4. Ashley N. Gearhardt et al, 2010. Can food be addictive? Public health and policy implications. Addiction, Vol. 106, pp1208–1212. doi:10.1111/j.1360-0443.2010.03301.x
5. Bijal Trivedi, 1st September 2012. Eat Your Way to Dementia. New Scientist.
6. Sónia C. Correia et al, 2011. Insulin-resistant brain state: The culprit in sporadic Alzheimer’s disease? Ageing Research Reviews Vol. 10, 264–273. doi:10.1016/j.arr.2011.01.001
7. Fabio Coppedè et al, 2012. Nutrition and Dementia. Current Gerontology and Geriatrics Research, Vol. 2012, pp1-3. doi:10.1155/2012/926082
8. See the graph in Bijal Trivedi, 1st September 2012. Eat Your Way to Dementia. New Scientist.
9. Johanna Zemva and Markus Schubert, September 2011. Central Insulin and Insulin-Like Growth Factor-1 Signaling – Implications for Diabetes Associated Dementia. Current Diabetes Reviews, Vol.7, No.5, pp356-366. doi.org/10.2174/157339911797415594
10. Eg Weili Xu et al, 2011. Midlife overweight and obesity increase late life dementia risk: a population-based twin study. Neurology, Vol. 76, no. 18, pp.1568–1574.
11. M. Vanhanen et al, 2006. Association of metabolic syndrome with Alzheimer disease: A population-based study. Neurology, vol. 67, pp.843–847.
12. Eric Steen et al, 2005. Impaired insulin and insulin-like growth factor expression and signaling mechanisms in Alzheimer’s disease – is this type 3 diabetes?. Journal of Alzheimer’s Disease, Vol. 7, pp.63–80.
13. Konrad Talbot et al, 2012. Demonstrated brain insulin resistance in Alzheimer’s disease patients is associated with IGF-1 resistance, IRS-1 dysregulation, and cognitive decline. The Journal of Clinical Investigation, Vol.122, No.4, pp.1316–1338. doi:10.1172/JCI59903.
14. Naoki Yamamoto et al, 2012. Brain insulin resistance accelerates Aβ fibrillogenesis by inducing GM1 ganglioside clustering in the presynaptic membranes. Journal of Neurochemistry, Vol. 121, 619–628. doi: 10.1111/j.1471-4159.2012.07668.x
15. Eg: Wei-Qin Zhao and Matthew Townsend, 2009. Insulin resistance and amyloidogenesis as common molecular foundation for type 2 diabetes and Alzheimer’s disease. Biochimica et Biophysica Acta, Vol.1792, pp.482–496. doi.org/10.1016/j.bbadis.2008.10.014
16. Sónia C. Correia et al, 2011. Insulin-resistant brain state: The culprit in sporadic Alzheimer’s disease? Ageing Research Reviews Vol. 10, 264–273. doi:10.1016/j.arr.2011.01.001
17. T. Ohara et al, 2011. Glucose tolerance status and risk of dementia in the community, the Hisayama study. Neurology, Vol. 77, pp.1126–1134.
18. Karen Neumann et al, 2008. Insulin resistance and Alzheimer’s disease: molecular links & clinical implications. Current Alzheimer Research, Vol.5, no.5, pp438–447.
19. Eg: Lap Ho et al, 2012. Insulin Receptor Expression and Activity in the Brains of Nondiabetic Sporadic Alzheimer’s Disease Cases. International Journal of Alzheimer’s Disease, Volume 2012. doi:10.1155/2012/321280
20. Suzanne M. de la Monte, 2012. Contributions of Brain Insulin Resistance and Deficiency in Amyloid-Related Neurodegeneration in Alzheimer’s Disease. Drugs, Vol. 72, no.1, pp. 49-66. doi: 10.2165/11597760
21. Ying Liu et al, 2011. Deficient brain insulin signalling pathway in Alzheimer’s disease and diabetes. Journal of Pathology, Vol. 225, pp.54–62. doi: 10.1002/path.2912
22. Konrad Talbot et al, 2012. Demonstrated brain insulin resistance in Alzheimer’s disease patients is associated with IGF-1 resistance, IRS-1 dysregulation, and cognitive decline. The Journal of Clinical Investigation, Vol.122, No.4, pp.1316–1338. doi:10.1172/JCI59903.
Don’t know about you but the above is a fine example of investigative reporting. It deserves the widest circulation because if it is proved that there is a link between diet and Alzheimer’s disease then, once again, it shows how taking personal responsibility for our health has huge implications for us, our families and for society at large.
OK, that sub-heading must seem a tad bizarre! Let me explain. On Tuesday, Jean had an important visit to make down in Mesa, AZ on the outskirts of Phoenix. The first 65 miles, give or take, from Payson to Mesa are down along Highway 87.
At 11.20 we started on our return from Mesa, planning on being back home early, say by 1pm at the latest. But 31 miles up the northbound carriageway of Highway 87, we came to a halt. The road was closed due to an accident involving a tanker. As our local newspaper, the Payson Roundup, put it:
The driver of the truck was taken by ambulance to a Scottsdale hospital with non life-threatening injuries. DPS has not ruled out speed as the cause of the crash. Photo by Andy Towle.
Due to a hazardous spill, Highway 87 was closed most of Tuesday, but reopened Wednesday morning after overnight clean up efforts, according to the Arizona Department of Public Safety.
Officials initially thought the roadway could be closed as many as two days due to the amount of oil spewed across both sides of the highway.
The highway closed down after a semi truck carrying oil used for paving rolled Tuesday afternoon near milepost 228, at the bottom of Slate Creek.
That resulted in us having to take a 170 mile detour and not arriving back until 4.30pm!
So what’s that got to do with the post for today? Simply that the implications of Tuesday spilled, like the tanker’s oil cargo, across into Wednesday, and the long, thoughtful post I had in mind to write got put on hold. In its place is this republication of a recent release from the Stanford School of Engineering at Stanford University. Apologies for another republished item, but the article is relevant and interesting.
oooOOOooo
WIND COULD MEET WORLD’S TOTAL POWER DEMAND – AND THEN SOME – BY 2030
Wind turbines near Livermore, CA.
HIGH RESOLUTION MODELS
In their study, Jacobson and Archer adapted the three-dimensional, atmosphere-ocean-land computer model known as GATOR-GCMOM to calculate the theoretical maximum wind power potential on the planet taking into account wind reduction by turbines. Their model assumed wind turbines could be installed anywhere and everywhere, without regard to societal, environmental, climatic or economic considerations.
The new paper contradicts two earlier studies that said wind potential falls far short of the aggressive goal because each turbine steals too much wind energy from other turbines, and that turbines introduce harmful climate consequences that would negate some of the positive aspects of renewable wind energy.
The new model provides a more sophisticated look than previously possible by separating winds in the atmosphere into hypothetical boxes stacked atop and beside one another. Each box has its own wind speed and weather. In their model, Jacobson and Archer exposed individual turbines to winds from several boxes at once, a degree of resolution earlier global models did not match.
“Modeling the climate consequences of wind turbines is complex science,” said Jacobson. “This software allows that level of detail for the first time.”
With a single model, the researchers were able to calculate the exposure of each wind turbine in the model to winds that vary in space and time. Additionally, the model extracts the correct amount of energy from the wind that gets claimed by the turbines, reducing the wind speed accordingly while conserving energy. It then calculates the effect of these wind speed changes on global temperatures, moisture, clouds and climate.
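(A quick aside from me, not part of the Stanford release: at heart, that bookkeeping rests on the standard relation that the power carried by the wind scales with the cube of its speed. The Python sketch below is my own toy illustration of a single turbine doing exactly that, extracting power and handing on a slower wind; every parameter in it is an assumption of mine, not a number from the study.)

```python
import math

# Toy illustration only: one turbine extracts power from the wind and the
# downstream wind speed is reduced so that kinetic power is conserved.
# This is not the GATOR-GCMOM model; all parameters are my own assumptions.

RHO = 1.225             # air density at sea level, kg/m^3 (approx.)
ROTOR_DIAMETER = 126.0  # metres, typical of a 5 MW class turbine (assumption)
CP = 0.45               # fraction of the wind's power captured (assumption)

def extract(wind_speed):
    """Return (power extracted in watts, downstream wind speed in m/s)."""
    area = math.pi * (ROTOR_DIAMETER / 2) ** 2
    power_in = 0.5 * RHO * area * wind_speed ** 3        # kinetic power through the rotor
    power_out = CP * power_in                            # captured by the turbine
    downstream = ((1 - CP) * wind_speed ** 3) ** (1 / 3) # slower wind carries what is left
    return power_out, downstream

v = 10.0  # m/s
power, v_down = extract(v)
print(f"Extracted ~{power / 1e6:.1f} MW; wind slowed from {v:.1f} to {v_down:.1f} m/s")
```

Stack enough of these behind one another and the available wind, and therefore the available power, eventually runs out, which is exactly the saturation effect described next.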
POTENTIAL APLENTY
Among the most promising things the researchers learned is that there is a lot of potential in the wind—hundreds of terawatts. At some point, however, the return on building new turbines plateaus, reaching a level in which no additional energy can be extracted even with the installation of more turbines.
“Each turbine reduces the amount of energy available for others,” Archer said. The reduction, however, becomes significant only when large numbers of turbines are installed, many more than would ever be needed.
“And that’s the point that was very important for us to find,” Archer said.
The researchers have dubbed this point the saturation wind power potential. The saturation potential, they say, is more than 250 terawatts if we could place an army of 100-meter-tall wind turbines across the entire land and water of planet Earth. Alternatively, if we place them only on land (minus Antarctica) and along the coastal ocean there is still some 80 terawatts available—about seven times the total power demand of all civilization. Hypothetical turbines operating in the jet streams six miles up in the atmosphere could extract as much as an additional 380 terawatts.
“We’re not saying, ‘Put turbines everywhere,’ but we have shown that there is no fundamental barrier to obtaining half or even several times the world’s all-purpose power from wind by 2030. The potential is there, if we can build enough turbines,” said Jacobson.
Mark Z. Jacobson, professor of civil and environmental engineering. Photo: Linda Cicero / Stanford News Service
HOW MANY TURBINES?
Knowing that the potential exists, the researchers turned their attention to how many turbines would be needed to meet half the world’s power demand—about 5.75 terawatts—in a 2030 clean-energy economy. To get there, they explored various scenarios of what they call the fixed wind power potential—the maximum power that can be extracted using a specific number of wind turbines.
Archer and Jacobson showed that four million 5-megawatt turbines operating at a height of 100 meters could supply as much as 7.5 terawatts of power—well more than half the world’s all-purpose power demand—without significant negative effect on the climate.
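(Another aside from me, not part of the release: those figures imply a perfectly plausible average output per turbine. Four million turbines rated at 5 MW each amount to 20 TW of nameplate capacity, so delivering 7.5 TW corresponds to a capacity factor of a little under 40 per cent, as this trivial check shows.)

```python
# Back-of-the-envelope check of the figures quoted above (my arithmetic,
# not part of the Stanford release).

turbines = 4_000_000
rated_power_mw = 5.0
delivered_tw = 7.5

nameplate_tw = turbines * rated_power_mw / 1_000_000  # MW -> TW
capacity_factor = delivered_tw / nameplate_tw

print(f"Nameplate capacity: {nameplate_tw:.0f} TW")               # 20 TW
print(f"Implied average capacity factor: {capacity_factor:.0%}")  # ~38%
```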
“We have a long way to go. Today, we have installed a little over one percent of the wind power needed,” said Jacobson.
In terms of surface area, Jacobson and Archer would site half the four million turbines over water. The remaining two million would require a little more than one-half of one percent of the Earth’s land surface—about half the area of the State of Alaska. However, virtually none of this area would be used solely for wind, but could serve dual purposes as open space, farmland, ranchland, or wildlife preserve.
Rather than put all the turbines in a single location, Archer and Jacobson say it is best and most efficient to spread out wind farms in high-wind sites across the globe—the Gobi Desert, the American plains and the Sahara for example.
“The careful siting of wind farms will minimize costs and the overall impacts of a global wind infrastructure on the environment,” said Jacobson. “Regardless, as these results suggest, the saturation of wind power availability will not limit a clean-energy economy.”
Funding sources include National Science Foundation, U.S. Environmental Protection Agency, and National Aeronautics and Space Administration high-end computing.
Andrew Myers is associate director of communications for the Stanford University School of Engineering.
Monday, September 10, 2012
oooOOOooo
Let me close by pointing you to Mark Jacobson’s website.
An afterthought about the adventurous spirit of man.
While the focus on the manned exploration of space has declined significantly since the days of the Apollo missions, the spirit to explore has not diminished. This was underlined in spades by a recent post from the British blog Earth & Solar System, to which I have been subscribed for a few weeks.
This blog reflects the research interests of the Isotope Cosmochemistry and Geochemistry Group at the University of Manchester. In our laboratories we study samples from comets, interstellar dust, interplanetary dust, Mars, the moon and asteroids to understand how the Earth and the Solar System were formed, how they evolved and became what we see today. We study the Earth and its chemistry to understand how it works, its mantle, crust, oceans and how we change it. We want to share and discuss what we find with everyone.
The blog is for sharing science and what we and other research groups discover as we do science in real time. Discussion, questioning and enquiry are good, but politics, and opinion that can’t be backed up by published scientific work are strictly off-limits and will be removed.
Yet another example of why integrity is the only way forward.
Anyway, that recent post came into my ‘in-box’ on Monday and I wanted to share it with you, primarily because the mainstream media have moved on and there is now little ‘news’ about NASA’s Curiosity rover. That’s why this post is so fascinating; it’s reproduced on Learning from Dogs with the permission of Ashley King, the author.
The past week has seen NASA’s Curiosity rover return more amazing images of the Gale crater, fire up its DAN and SAM instruments, and take its first steps towards Mt. Sharp.
Mastcam view south-west from the Bradbury landing site. The foreground is boulder-strewn and contains the edge of an impact crater. The layered rocks in the background form the base of Mt. Sharp (NASA/JPL-Caltech/MSSS).
The new images, captured using the 100mm telephoto lens of the Mastcam, provide a glimpse of the geological treats that await scientists at the base of Mt. Sharp. Of particular interest has been the identification of an unconformity, where two rocks in contact but of different ages indicate a break in the geological record. Satellite data suggests that the rocks lying below the unconformity contain hydrous minerals whilst those above are “dry”. It appears these rock units formed under very different environmental conditions.
Unconformity (marked by white dots) at Mt. Sharp (NASA/JPL-Caltech/MSSS).
Next, Curiosity had another driving lesson, this time positioning itself over one of the scour marks created during landing. This allowed the rover to continue testing the ChemCam and turn on the Dynamic Albedo of Neutrons (DAN) instrument, which will be used to search for water below the Martian surface. The Sample Analysis at Mars (SAM) instrument, comprising a mass spectrometer, gas chromatograph and tunable laser spectrometer, was also gently woken up. SAM can measure the abundance of C compounds, H, N and O, elements associated with life, in atmospheric and powdered rock samples. A quick test of some Earth air trapped in the instrument since launch confirmed that it is working well and should soon be ready for Martian samples.
Curiosity has now completed four drives and is heading for Mt. Sharp. However, the first target is Glenelg, a rock outcrop 400m to the east of the Bradbury landing site, where it’s hoped Curiosity will start using its drill. Although the journey will take several weeks, Glenelg contains at least three different rock types that will help scientists piece together the geological history of Gale crater.
Leaving the Bradbury landing site. This Navcam image shows the tracks left in the Martian soil by Curiosity (NASA/JPL-Caltech).
oooOOOooo
Makes a nice change to forget about the goings-on here on Planet Earth!
Like millions of others on this planet, I was held spellbound by the historic and epic moment of man placing his mark on another heavenly body, the Moon. I had been so wrapped up in NASA’s space missions that I took a holiday from work (I was working at the time for ICIANZ in Sydney, Australia) for the week of July 16th, 1969.
It was, of course, on July 16th that the Apollo 11 mission launched from the Kennedy Space Center, culminating at precisely 20:17:39 UTC on July 20, 1969, the moment the Lunar Module touched down on the Moon.
But in terms of me writing my own obituary for Neil, what could I offer?
Then a couple of items changed my mind.
Neil Armstrong (August 5, 1930 – August 25, 2012)
The first was reading the obituary printed in The Economist. I have long admired the many, many beautiful obituaries that have been published by this newspaper and this one was no exception. Take this extract from the Neil Armstrong obituary,
He had an engineer’s reserve, mixed with a natural shyness. Even among the other astronauts, not renowned for their excitability, he was known as the “Ice Commander”. Mike Collins, one of his crew-mates on the moon mission, mused that “Neil never transmits anything but the surface layer, and that only sparingly.” He once lost control of an unwieldy contraption nicknamed the Flying Bedstead that was designed to help astronauts train for the lunar landing. Ejecting only seconds before his craft hit the ground and exploded, he dusted himself off and coolly went back to his office for the rest of the day. There was work to be done.
Then the beautiful words that bring the obituary to a close,
Earth’s beauty
Over half a century, the man who never admitted surprise was surprised to observe the fading of America’s space programme. The Apollo project was one of the mightiest achievements of the potent combination of big government and big science, but such enterprises came to seem alien as well as unaffordable. Mr Armstrong, who after his flight imagined bases all over the moon, sadly supposed that the public had lost interest when there was no more cold-war competition.
Yet the flights had one huge unintended consequence: they transformed attitudes towards Earth itself. He too had been astonished to see his own planet, “quite beautiful”, remote and very blue, covered with a white lace of clouds. His reserve, after all, was not limitless. One photograph showed him in the module after he and Buzz Aldrin had completed their moon-walk, kicking and jumping their way across the vast, sandy, silver surface towards the strangely close horizon. He is dressed in his spacesuit, sports a three-day beard, and is clearly exhausted. On his face is a grin of purest exhilaration.
” … they transformed attitudes towards Earth itself. He too had been astonished to see his own planet, “quite beautiful”, remote and very blue, covered with a white lace of clouds.” For that reason alone, we need to celebrate the achievement of the Apollo 11 mission for putting our own planet into perspective within the enormity of the universe.
The second item that persuaded me to write this was a wonderful historic insight into how a potential catastrophe on the surface of the Moon would have been handled by President Nixon. This historic item was published on Carl Milner’s blog the other day, the specific item being What if the Moon Landing Failed? Republished with the very kind permission of Carl.
When Richard Nixon was the President of the United States, they had a speech ready for him to deliver to the world just in case the 1969 moon landing had ended in disaster. In fact many experts believed there was a big chance that Neil Armstrong and Edwin ‘Buzz’ Aldrin could have really gotten stuck on the moon. It’s something we don’t really think about now because we all know it was such a success. American Archives have unearthed the speech that would have been delivered if the late great Armstrong and Aldrin had never made it back to earth. This is such a great piece of history that I thought I might never see.
Give it a read. It’s such a moving and well-prepared speech, and such a good thing that President Nixon never had to deliver it.
So, as with millions of others, I am delighted that this speech remained unspoken and instead we experienced: “At 5:35 p.m. (US EDT), Armstrong and Aldrin successfully docked and rejoined Collins, and at 12:56 a.m. on July 22 Apollo 11 began its journey home, safely splashing down in the Pacific Ocean at 12:50 p.m. on July 24.”
Neil Armstrong’s legacy is not only being part of the wonderful team that allowed man to make the first footprint on the Moon but also bringing into our human consciousness that this blue, wonderful planet we all live on is the only home we have.
First Full-View Photo of Earth (photograph courtesy NASA Johnson Space Center). This famous “Blue Marble” shot represents the first photograph in which Earth is in full view. The picture was taken on December 7, 1972, as the Apollo 17 crew left Earth’s orbit for the moon. With the sun at their backs, the crew had a perfectly lit view of the blue planet.
Strikes me that celebrating July 20th each year as Blue Planet Day might not be a bad idea! Any takers? Now that would be a legacy for Neil!
Regular readers will know that I subscribe to the blog Naked Capitalism, masterminded by Yves Smith. Some time ago, there was a link on NC to a story about how a tiny Chihuahua dog rescued some missing girls. It seemed like a good opportunity to take a closer look at this most magical aspect of a dog’s qualities.
First to that story.
I saw it on the Care2 website, from which I quote the following:
A 3-year-old chihuahua named Bell is an unexpected hero after finding three young girls who became lost for hours in the woods in Newnan, Georgia, on Monday.
CBS Atlanta reports that, on Monday, 8-year-old Carlie and 5-year-old Lacey Parga went for a walk with their dog Lucy down a cul-de-sac on trails near their neighborhood.
What started as a casual stroll became an unintended, and at times frightening, experience. As Carlie tells CBS, “We tried to find our way out of the woods. We kept following paths and stuff and we got lost.” Indeed, they became scared that they were only getting more and more lost.
Carlie’s father, David Parga, noted that it wasn’t characteristic of them to wander off and, after searching for them but not hearing them respond, he contacted police and firefighters. Neighbors joined the search, including Carvin Young, who thought to bring Bell, who plays with the girls every day and knew their scent. Bell was able to lead searchers to the girls.
The full story on the Care2 website is here and on the CBS website here.
So what is it about the nose of the dog? A dog has more than 220 million olfactory receptors in its nose, while humans have only about five million, and a dog’s overall sense of smell is often estimated to be thousands of times more sensitive than ours. Frankly, trying to get one’s intellect around precisely what a sense of smell thousands of times more sensitive than a human’s really means is tough! So on to another story.
17 Dogs, 3 Generations, 70 Years. There’s one constant … the family dog.
After moving to Wellesley, Massachusetts for an anchor job with a major television sports network, Kevin began taking his German Shepherd, Beverly, for walks in the surrounding neighborhoods. They developed a route that included historic Atwood Street. Beverly kept veering toward one house in particular, a house that had also caught Kevin’s eye; it looked familiar to him, though he didn’t know why.
After talking to a close family relative who had also once lived in Wellesley, Kevin was shocked to discover that the memorable house had once been a childhood home to his father, Bob Walsh, before WWII. After digging through old family photos that had been tucked away for years, Kevin uncovered a picture of his father as a toddler with his family on the house’s front porch, complete with their first family dog, Dee Dee.
Kevin’s father had been writing short stories about all of their family dogs through the years, but never knew about the photo. Its discovery was the pivotal moment that offered proof that the Walsh family’s journey with dogs had come back to the exact place where it started.
They’ve turned this story, along with other dog tales, into a book called Follow the Dog Home: How a Simple Walk Unleashed an Incredible Family Journey.
Dog’s nose leads family back to long-lost old home, sight unseen. German Shepherd Beverly is chronicled on WCVB TV’s news magazine show Chronicle. Seventy years later, the family goes back “home” for a stunning reunion and photograph.
A wonderful investment in studying America’s ecology is just starting.
I am indebted to The Economist for including in their issue of the 25th August a story about NEON, something I had previously not heard about.
It was then an easy step to locate the main website for the National Ecological Observatory Network, or NEON. (Just an aside that I can’t resist – NEON is such a fabulous acronym that one wonders how much push and shove there was to come up with the full name that also fitted the word ‘NEON’! Sorry, it’s just me!)
Anyway, back to the plot. The following video gives a very good idea of the project’s aims. When I watched it, I found it inspiring because it seemed a solid example of how the nation, that is the USA, is starting to recognise that evolving to a new, sustainable way of life has to be built on good science. NEON strikes me as excellent science. Watch the video and see if you come to the same conclusion.
There’s also a comprehensive introduction to the project from which I will republish this,
In an era of dramatic changes in land use and other human activities, we must understand how the biosphere – the living part of earth – is changing in response to human activities. Humans depend on a diverse set of biosphere services and products, including air, water, food, fiber, and fuel. Enhancements or disruptions of these services could alter the quality of human life in many parts of the world.
To help us understand how we can maintain our quality of life on this planet, we must develop a more holistic understanding of how biosphere services and products are interlinked with human impacts. This cannot be investigated using disconnected studies on individual sites or over short periods of observation. Further, existing monitoring programs that collect data to meet natural resource management objectives are not designed to address climate change and other new, complex environmental challenges.
NEON, the first continental-scale ecological observatory, will provide comprehensive data that will allow scientists to address these issues.
Later on there’s more detail, as follows,
NEON has partitioned the U.S. into 20 eco-climatic domains, each of which represents different regions of vegetation, landforms, climate, and ecosystem performance. In those domains, NEON will collect site-based data about climate and atmosphere, soils and streams and ponds, and a variety of organisms. Additionally, NEON will provide a wealth of regional and national-scale data from airborne observations and geographical data collected by Federal agencies and processed by NEON to be accessible and useful to the ecological research community. NEON will also manage a long-term multi-site stream experiment and provide a platform for future observations and experiments proposed by the scientific community.
The data collected and generated across NEON’s network – all day, every day, over a period of 30 years — will be synthesized into information products that can be used to describe changes in the nation’s ecosystem through space and time. It will be readily available in many formats to scientists, educators, students, decision makers and the general public.
For some reason I couldn’t find on the NEON website the informative map that was included in The Economist so I grabbed that one, and offer it below:
These eco-climatic domains are fully described here on the NEON website.
The benefits of this fabulous project are described thus, “The data NEON collects and provides will focus on how land use change, climate change and invasive species affect the structure and function of our ecosystems. Obtaining this kind of data over a long-term period is crucial to improving ecological forecast models. The Observatory will enable a virtual network of researchers and environmental managers to collaborate, coordinate research, and address ecological challenges at regional, national and continental scales by providing comparable information across sites and regions.”
As they say in business, if you can’t measure it, you can’t manage it! So reading the sentence above, ‘Obtaining this kind of data over a long-term period is crucial to improving ecological forecast models’, is cheering to the soul.
The United States quite rightly gets a huge bashing over its CO2 emissions, but to condemn the USA for that and not to applaud this sort of wonderful research is utterly unjustified. As I have hinted before, America has, more than any other country in the world, the energy to make things better over the coming years.
As Professor Sir Robert Watson, highlighted here recently, said: ‘… deep cuts in CO2 emissions are possible using innovative technologies without harming economic recovery.’
The mistake we all make is to look behind us and think the future will be the same.
Let me start with something that is not really news. Not news in the sense that it has been very widely reported. I’m speaking of the probability, the high probability, that this year’s summer ice area in the Arctic will be a record low, with all the implications that this carries. Let me refer to a recent BBC news item that included a stunningly powerful chart.
Scientists at the US National Snow and Ice Data Center said data showed that the sea ice extent was tracking below the previous record low, set in 2007.
Latest figures show that on 13 August ice extent was 483,000 sq km (186,000 sq miles) below the previous record low for the same date five years ago.
The ice is expected to continue melting until mid- to late September.
“A new daily record… would be likely by the end of August,” the centre’s lead scientist, Ted Scambos, told Reuters.
“Chances are it will cross the previous record while we are still in ice retreat.”
The US National Snow and Ice Data Center may be found here.
So to the piece that generated the title of this post, the future of the car.
On the 17th August, I wrote an article highlighting the fact that the U.S. leads the world in cutting CO2 emissions. That was endorsed by an item published on the U.S. Energy Information Administration’s (EIA) website, which said:
U.S. carbon dioxide (CO2) emissions resulting from energy use during the first quarter of 2012 were the lowest in two decades for any January-March period. Normally, CO2 emissions during the year are highest in the first quarter because of strong demand for heat produced by fossil fuels. However, CO2 emissions during January-March 2012 were low due to a combination of three factors:
A mild winter that reduced household heating demand and therefore energy use
A decline in coal-fired electricity generation, due largely to historically low natural gas prices
Reduced gasoline demand
It was the last item that caught my eye, because it resonated with an article on Chris Martenson’s Peak Prosperity blog just over a week ago. That article, written by Gregor Macdonald, was called The Demise of the Car.
About a third of the way into the article, Gregor writes,
But it’s not just India that has incorrectly invested in automobile transport. The other giant of Asia, China, has also placed large resources into auto-highway infrastructure.
It appears that at least a decade ago, the developing world made the same assumption about future oil prices as was made in Western countries. The now infamous 1999 Economist cover, Drowning in Oil, reflected the pervasive, status-quo view that the global adoption of the car could continue indefinitely. A decade later, however, we find that after oil’s extraordinary price revolution, the global automobile industry is now starved for growth.
Then a little further down in this interesting article there is this,
More broadly, however, global governments are captured by sunk-cost decision making as the past 60-70 years of highway infrastructure investment is now a legacy just too painful to leave behind. Interestingly, whether citizens and governments want to face this reality or not, features of the oil economy are already going away as infrastructure is increasingly stranded. Moreover, there are cultural shifts now coming into play as young people are no longer buying cars – in the first instance because they can’t afford them, and in the second instance because it’s increasingly no longer necessary to own a car to be part of one’s group. See this piece from Atlantic Cities:
Youth culture was once car culture. Teens cruised their Thunderbirds to the local drive-in, Springsteen fantasized about racing down Thunder Road, and Ferris Bueller staged a jailbreak from the ‘burbs in a red Ferrari. Cars were Friday night. Cars were Hollywood. Yet these days, they can’t even compete with an iPhone – or so car makers, and the people who analyze them for a living, seem to fear. As Bloomberg reported this morning, many in the auto industry “are concerned that financially pressed young people who connect online instead of in person could hold down peak demand by 2 million units each year.” In other words, Generation Y may be happy to give up their wheels as long as they have the web. And in the long term, that could mean Americans will buy just 15 million cars and trucks each year, instead of around 17 million.
If future car sales in the US will be limited by the loss of 2 million purchases just from young people alone, then the US can hardly expect to return to even 15 million car and truck sales per year. US sales have only recovered to 14 million. (And that looks very much like the peak for the reflationary 2009-2012 period)
Indeed, the migration from suburbs back to the cities, the resurrection of rail, and the fact that oil will never be cheap again puts economies – and culture – on a newly defined path to other forms of transport and other ways of working.
It’s a long and interesting article that demonstrates an old truth, no better put than in this quotation reputed to have been said by John F. Kennedy,
Change is the law of life. And those who look only to the past or present are certain to miss the future.