On January 28, 2014, during his State of the Union address, Barack Obama boldly declared, “The debate is settled. Climate change is a fact.” Now, of course, the debate is never settled. Science is never settled. But at the same time, ironically, and taken out of his intended context, part of Obama’s statement on Climate Change is essentially correct.
Climate Change is a fact. We have concrete geological evidence of its existence. Our climate has been changing for billions of years. And it’s a staggeringly complex system with an enormous number of inputs – some of which we likely don’t even know of. But exactly how it is changing – and how it will change – is something that is currently beyond our grasp. Anyone who tells you otherwise is either lying or is misinformed.
We absolutely believe in Climate Change. But we also don’t know if we are currently undergoing Global Warming – the current data shows a plateau in temperatures for the last twenty years. And if we are undergoing Global Warming, nobody can tell you by what degree, or how long or short the cycles are. Nobody understands or can measure the magnitude of human generated influences nor their direct impact on our climate.
And why is it that undersea volcanic activity – which likely plays a significant role in climate change – is simply ignored by all the climate models? For that matter, why does the Climate Change crowd go to such great lengths to ignore the obvious impact on our climate from changes in the activity of our sun over large timescales? The sun is obviously an enormous influence on our planet’s climate – it’s the primary driver. As the Hoover Institution points out, “The sun’s energy that reaches the Earth’s atmosphere provides 342 Wm⁻² [watts per square meter].”
To put things into perspective, the “estimated extra energy from excess CO2 – the annual anthropogenic greenhouse gas contribution – is far smaller… at 0.036 Wm⁻², or 0.01 percent of the sun’s energy. If our estimate of the sun’s energy were off by more than 0.01 percent, that error would swamp the estimated extra energy from excess CO2.”
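As a quick sanity check on that arithmetic – the figures come from the Hoover quote above, not from us – the ratio works out as follows:

```python
# Sanity check of the ratio quoted above (figures from the Hoover piece).
solar_flux = 342.0   # Wm⁻², solar energy reaching the atmosphere
co2_forcing = 0.036  # Wm⁻², estimated annual anthropogenic contribution

ratio = co2_forcing / solar_flux
print(f"{ratio:.4%}")  # ~0.0105%, i.e. roughly 0.01 percent
```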
Another area of obvious uncertainty lies in the earth’s natural cloud cover. Clouds not only reflect incoming radiation; they also trap outgoing radiation. A world blanketed in cloud cover would have dramatically different temperatures than a world devoid of clouds.
But as the Intergovernmental Panel on Climate Change, known as the IPCC, has admitted, modeling cloud cover and its effects on atmospheric temperature has proven difficult: “The simulation of clouds in climate models remains challenging. There is very high confidence that uncertainties in cloud processes explain much of the spread in modeled climate sensitivity.”
The idea that clouds are hard to model isn’t particularly surprising. As Hoover notes, “They are amorphous; they reside at different altitudes and are layered on top of each other, making them hard to discern; they aren’t solid; they come in many different types; and scientists don’t fully understand how they form. As a result, clouds are modeled poorly. This contributes an average uncertainty of ±4.0 Wm⁻² to the atmospheric thermal energy budget of a simulated atmosphere during a projection of global temperature.”
“This thermal uncertainty is 110 times as large as the estimated annual extra energy from excess CO2. If our climate model’s calculation of clouds were off by just 0.9 percent – 0.036 is 0.9 percent of 4.0 – that error would swamp the estimated extra energy from excess CO2. The total combined errors in our climate model are estimated to be about 150 Wm⁻², which is over 4,000 times as large as the estimated annual extra energy from higher CO2 concentrations.”
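Again using only the numbers quoted above, the three comparisons can be verified in a few lines:

```python
# Verify the three comparisons quoted from the Hoover piece.
cloud_uncertainty = 4.0  # Wm⁻², average cloud-modeling uncertainty
co2_forcing = 0.036      # Wm⁻², estimated annual extra energy from CO2
total_error = 150.0      # Wm⁻², estimated total combined model error

print(cloud_uncertainty / co2_forcing)  # ~111 -> "110 times as large"
print(co2_forcing / cloud_uncertainty)  # 0.009 -> 0.9 percent
print(total_error / co2_forcing)        # ~4167 -> "over 4,000 times"
```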
Where we live, saying “it will be sunny and warm today” will make you right about 80% of the time. And still, forecasters get the local weather wrong with uncomfortable frequency – and that’s just daily forecasts, let alone the weather a couple of weeks out. Yet one risks being labeled a “climate denier” for doubting that we have a realistic understanding or comprehensive grasp of our global climate’s long-term predictability.
The simple truth is this: our global climate is far too complex to successfully model with our current level of scientific understanding. For those who disagree, we’d ask a number of basic questions. Why did the claimed warming trend between 1978 and 1998 essentially stop – despite computer climate model predictions of steady warming? How sensitive is the climate to increased carbon-dioxide levels? What feedback mechanisms exist that can increase or decrease that sensitivity?
Why did periods of high carbon-dioxide levels earlier in Earth’s history result in temperature levels both above and below the average? Why do the 55 different models used by the IPCC differ in their descriptions of the past century’s global average surface temperature by more than three times the entire warming recorded during that time? For that matter, why does the IPCC use 55 different models – models that each encompass many other differing factors as well?
And why is it that the IPCC models have been consistently wrong in their predictions – and why have they always overstated predicted temperature movements? Why isn’t the IPCC forthcoming about how often its models have changed, what the changes were, why they were made, and when? And why are the IPCC reports and models based on thousands of research papers whose conclusions have never been independently verified?
Why do the models deal with the fluid dynamics of the atmosphere and oceans while ignoring the effects of biology – of vegetation and topsoil? How can we account for the ocean’s influence on climate variability when we do not know or understand the level of undersea volcanic activity and the resulting carbon dioxide (warming) and aerosol gas (cooling) emissions?
Why is it that today’s best estimate of the sensitivity induced by a hypothetical doubling of carbon-dioxide concentration (between 2.7 degrees Fahrenheit and 8.1 degrees Fahrenheit) is no different, and no more certain, than it was 30 years ago?
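For reference, the Fahrenheit range quoted above matches the IPCC’s long-cited sensitivity range of 1.5°C to 4.5°C. Since sensitivity is a temperature difference, the conversion uses only the 1.8 scaling factor, with no 32-degree offset – a minimal sketch:

```python
# Convert the sensitivity range (a temperature *difference*,
# so no 32° offset) from Celsius to Fahrenheit.
for delta_c in (1.5, 4.5):
    print(f"{delta_c}°C -> {delta_c * 1.8:.1f}°F")  # 2.7°F and 8.1°F
```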
Why do the models predict that the lower atmosphere in the tropics will absorb much of the heat of the warming atmosphere when no such area has been found – although weather stations have been quietly moved and the resulting temperatures revised? And why can no one accurately answer what rough percentage of historical warming has been generated by human activity?
The Intergovernmental Panel on Climate Change (IPCC) is the leading international body for the assessment of climate change. It was established by the United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO) in 1988, ostensibly to provide the world with a clear scientific view on the current state of knowledge in climate change and its potential environmental and socio-economic impacts.
What many people don’t realize is that the IPCC does not conduct any research nor does it monitor climate related data or parameters. We encourage you to visit the Intergovernmental Panel on Climate Change (IPCC) data section and read some of the actual reports. When you do so, please make sure to examine the footnotes. What becomes very clear while wading through these reports is that they comprise a series of assumptional layers – each built on the last.
Initial research papers are compiled and accepted. Assumptions are made and then built into highly complex and varying models. The variables at play are simply baffling in their sheer numbers – and they all stack one atop another.
The Global Warming Policy Foundation summed this up in a paper, Peer Review – Why Skepticism is Essential, noting that “Reproducibility is the backbone of sound science. If it is infeasible to independently evaluate the numerous assumptions embedded within climate model software, and if third parties lack comparable computing power, a great deal of climate science would appear to be inherently non-reproducible.
“The world is currently spending billions on measures intended to combat humanity’s allegedly significant role in climate change. The IPCC tells us this is prudent and necessary. But IPCC reports are based on thousands of research papers whose conclusions have never been independently verified.” We also have the disconcerting issue of data revision.
Back in 2015, the Wall Street Journal reported something interesting in historical National Oceanic and Atmospheric Administration – or NOAA – data. Temperatures for the years 2005 and 2010 were reported as exactly tied, ending the previously reported upward slope in temperatures.
But after a very convenient June 2015 “study” conducted by Tom Karl, director of NOAA’s National Centers for Environmental Information, the 2010 temperature was revised to a slightly warmer 0.70°C versus 0.66°C for 2005 – just enough to impart an upward slope to any graph that ignores statistical uncertainty.
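To see just how small that revision is, here is the difference and the per-year slope it implies – a sketch using only the two anomaly values quoted above; whether 0.04°C clears the measurement uncertainty is exactly the point at issue:

```python
# The revised anomalies reported above (°C), and the slope they imply.
t_2005, t_2010 = 0.66, 0.70

diff = t_2010 - t_2005
slope = diff / (2010 - 2005)
print(f"difference: {diff:.2f}°C, slope: {slope:.3f}°C per year")
# difference: 0.04°C, slope: 0.008°C per year
```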
“The study, by adjusting upward temperature readings from certain ocean buoys to match shipboard measurements, eliminated the ‘pause’ in global warming seen in most temperature studies over the past 15 years.” In a piece on the revision, Bob Tisdale and Anthony Watts noted that NOAA adjusted its use of ocean temperature readings by putting more weight on certain ocean buoys, adjusting ship-based temperature readings upward, and slightly raising land-based temperatures.
Further elaboration – and internal confirmation – was provided by Dr John Bates, a top NOAA scientist turned whistleblower. Bates said this revised increase in temperatures in 2010 was achieved by “an upwards 'adjustment' of readings from fixed and floating buoys, which are generally reliable, to bring them into line with readings from a much more doubtful source – water taken in by ships.
This, Dr Bates explained, has long been known to be questionable: ships are themselves sources of heat, readings will vary from ship to ship, and the depth of water intake will vary according to how heavily a ship is laden – so affecting temperature readings. As Dr Bates said: 'They had good data from buoys. And they threw it out and 'corrected' it by using the bad data from ships. You never change good data to agree with bad, but that's what they did – so as to make it look as if the sea was warmer.'”
There are many such examples of these revisions. And we continue to question why the data always seems to be revised in a way that strengthens the Global Warming case. One would expect that, statistically speaking, a certain portion of adjustments would have the opposite effect. This rarely, if ever, occurs. As we noted earlier, the 55 IPCC models have been consistently wrong in their predictions – and they have always overstated predicted temperature movements.
The IPCC models lack transparency, are built on assumptional layers of potentially flawed data, are subject to ongoing data revision, and have failed to demonstrate predictive ability. The IPCC, as an organization and in its results, is clearly subject to intense politicization – in their operations, in their creation and collection of data – and in the dissemination of that data through their published conclusions. Yet we spend billions – even trillions – as a result of the published results.