Climate Change is not a problem: Unless we make it one.

WUWT February 11, 2020

Guest Post by Martin Capages Jr. PhD PE


As long as humans have been on Earth, they have been adapting to changes in regional climates. A regional climate is the average of the weather over a relatively long period of time, usually 30+ years, at a particular location on the planet. The natural periodicity of prolonged regional weather variations has been documented in various ways by humans for eons. For a comparison of human civilization in the Northern Hemisphere to Greenland ice core temperatures over the last 18,000 years, see here. The means of documenting changes in long-term weather patterns, i.e. climate change, include crude prehistoric cave drawings of animals and plants, paintings of frozen rivers (see Figure 1 of ice skating on the River Thames in 1684), and archaeological digs. There are also written records of climatic conditions from as early as 5,000 years ago, perhaps even earlier. Ice, subsea, peat, and lake-bed cores are also used; for a more detailed discussion of the methods, see here and the links therein.

Figure 1. Ice skating on the River Thames in London in January 1684, during the Little Ice Age. Museum of London, link.

Most geologists agree that we are currently in an extended ice age. Technically, we are in an “icehouse” condition (see here). When ice caps exist on one or more poles year-round for an extended period of time, the Earth is said to be in an icehouse. Global temperature may decrease further if solar activity remains at its current low level (see here). But geologists deal in massive time increments of thousands, millions, even billions of years. The general public makes its observations in decades, perhaps a generation, maybe even a century, but not much more than that. Such a myopic view of the Earth’s climate can be misleading.


Climate science is a combination of many scientific specialties such as geology, geophysics, astrophysics, meteorology, and ecology, just to name a few of the larger branches. Some of these scientists are working to develop computer models of the climate using atmospheric physics, chemistry, actual data, proxy data, empirical variables, and assumed constants. The models include statistical tools to present the results in the form of projections of measurable parameters; one of these is the global mean temperature. These projections are presented in time increments that mean something to the public. Dr. Judith Curry has written a good overview of computer climate modeling that can be downloaded here.

To gain an understanding of the regional climate that preceded humankind, we have to get creative. That means using proxies to determine the average temperature and perhaps life conditions in earlier years. The two most cited proxies are ice cores and tree rings, but there are other lesser known proxies. In addition, we can also make reasonable assumptions about the prehistorical past with observations of regional geology. For example, glacier movements are revealed by the scars and strange debris fields that are left with each glacial expansion and retreat. Great boulders are left in the middle of grassy plains as glaciers melt. Gravel placed by high velocity melt water rivers can even reveal the dynamics involved, perhaps even provide a timetable for the events. These points are made just to illustrate the importance of the geological perspective in understanding why the climate changes. It is, after all, the physical record.

Many scientists, across many disciplines, have made their career goals the understanding of these worldly and sometimes outer-worldly events. Some of these scientists have developed hypotheses that they defend with great vigor which is, of course, understandable. There is peer admiration, public recognition and research funding available when one’s hypotheses prove to be correct. But there is a danger in pushing any hypothesis beyond its limits. And that may be the case of the proponents of the singular CO2 driven global warming hypothesis.


Instead of following the more traditional sequence of analyzing data acquired through research, noting some phenomenon, developing a hypothesis that might explain it, and then publishing the research and conclusions for the scrutiny of peers in that field, the CO2 warming proponents appear to have started with a hypothesis. The hypothesis was that “humankind’s accelerated use of fossil fuels has led to an increase in average global temperature by adding more CO2 to the atmosphere and enhancing the greenhouse gas effect.” This is easily seen in the stated objective of the United Nations Framework Convention on Climate Change (UNFCCC):

“UNFCCC’s ultimate objective is to achieve the stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous interference with the climate system.” (link)

In other words, they assumed that stabilizing the atmospheric greenhouse gas concentration would prevent climate change; they did not prove this assertion first. The previous hypothesis had been that aerosols would cause a cooling of the average global temperature and lead to a new massive glacial advance or “Ice Age.” The media sometimes calls a major glacial advance an “ice age,” but we are already in an ice age and have been for millions of years. Some say the new ice age predictions in the 1970s were in the minority and erroneous, claiming there was no consensus on global cooling (link). Others say there was a consensus (link). Then the impact of chlorofluorocarbons (CFCs) on the ozone layer became the new major focus. A damaged ozone layer could increase the ultraviolet radiation reaching the surface and lead to more cancer, animal blindness, and plant withering (link).

Consensus among scientists means nothing. Proposing that a consensus exists by distilling published papers means absolutely nothing. Getting scientists together for an open discussion, presenting one’s hypothesis, showing the proof, then having a robust debate followed by an open show of hands may be a better way to define a scientific consensus, but even that could be biased by the quality of the presentations and the presenters involved.


Research funding has always been the result of patronage, both private and governmental. An individual researcher must have some sort of sustenance to survive. If successful in the research, that scientist will attract more funding than the competition in the same field. The attraction to the funders of that successful research may be the public prestige received, or there may be a purely economic or even military advantage for the patron, politician, or governmental entity. Most research is performed by academia. Many, if not most, of the governmental agencies funding research are pressured by political entities to fund research that supports a political agenda. Government funding injects politics into scientific research and can make research outcome-oriented. Today, there is little research based on scientific curiosity. Most research is agenda-driven and based on the biases of the funding source, and the biggest source is the government. That has led to the climate change debacle the world now faces.

The actual climate change that will occur will be revealed at the pace that nature allows. Unfortunately, adapting to these changes takes time and resources. Understanding the causes of climate change may lead to decisions to mitigate the change or to adapt in advance of it. The underlying assumption is that the projected climate change will have a negative impact on humans or even end humankind. So, the research has been directed at mathematical models of the climate, centered on producing projections of global average temperature over time and comparing temperature to CO2 concentrations. These projections have actually been of the positive or negative deviation of the temperature above or below a selected historical baseline. While this is a valid and well-accepted manner to display projections, the selection of the historical baseline can distort the public’s perception of the change.

These dynamic mathematical models require the power of digital computing to produce temperature projections in a reasonable time frame. Many constants and variables are fed into the models. The equations, input constants, and variables can all be “tweaked” to generate projections until the models can hindcast the majority of the historical record with some accuracy. Typically, data samples are not absolute but introduce a range around some point of reference. This requires the introduction of probability and statistics to represent a range of values. Temperature varies with latitude and elevation, so temperature anomalies must be computed at as many places around the Earth as possible, and then the anomalies are averaged. Each projection consists of bands of departures from a specific reference point; the plots are not absolute temperature versus time but the “temperature anomaly” above or below a baseline. Matching history also requires controls and record keeping on the tweaks to the constants, variables, and the equations themselves.
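The anomaly idea described above can be illustrated with a minimal sketch. The station names, values, and baseline choice here are purely hypothetical, invented for illustration; real products use thousands of stations and formal 30-year baselines. The point is only the mechanics: subtract each station’s own baseline average, then average the anomalies.

```python
# Minimal sketch of a "temperature anomaly" calculation.
# All data below are hypothetical, chosen only to illustrate the method.

# Yearly mean temperatures (deg C) for two imaginary stations.
stations = {
    "high_latitude": [-5.2, -5.0, -5.1, -4.9, -4.8, -5.0, -4.7, -4.6, -4.8, -4.5],
    "low_latitude":  [24.1, 24.2, 24.0, 24.3, 24.2, 24.4, 24.3, 24.5, 24.4, 24.6],
}
baseline = slice(0, 5)  # first five years serve as the reference period

def anomalies(series, base_period):
    """Subtract the station's own baseline-period mean from every year."""
    base = sum(series[base_period]) / len(series[base_period])
    return [t - base for t in series]

# Averaging anomalies (not absolute temperatures) removes the large
# latitude and elevation differences between stations.
per_station = [anomalies(s, baseline) for s in stations.values()]
global_anomaly = [sum(vals) / len(vals) for vals in zip(*per_station)]
```

By construction, the average anomaly over the baseline period is zero; everything plotted is a departure from that reference, which is why the choice of baseline shifts the visual impression without changing the data.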

Figure 2. The upper graph shows the IPCC (Intergovernmental Panel on Climate Change) projections of temperature (red and blue lines) without any man-made CO2, just natural forces. The lower graph shows projections (again in red and blue) including man-made CO2. The black line in both graphs shows the observations. The blue and yellow very fine lines are the individual model runs that are averaged to make the blue and red lines. Source: IPCC WG1, AR5, FAQ 10.1, page 895, link.

In Figure 2 we see the result. The IPCC, the Intergovernmental Panel on Climate Change, uses models from the Coupled Model Intercomparison Project (CMIP3 in 2010 and CMIP5 in 2014). The computed uncertainty in these estimates of global temperature change since 1860 is shown in blue and yellow. As the graphs show, the uncertainty range is larger than the deviation since 1860; the lower bound in 2000 overlaps the upper bound in 1860 in the lower graph. Since 2000, the observations have been fairly flat, as shown by the black line. In the upper graph, which is supposed to show only natural influences on climate change, the projections are flat, except for large volcanic eruptions, which decrease global temperatures. The authors want us to believe that none of the global warming in the past 150 years is natural. Did they assume this, or do they know it? It is unclear. For a fuller discussion of Figure 2, see here.

The data itself must be distilled down. To then develop a projection of the results and keep it clear of bias, probabilistic techniques such as Monte Carlo methods are employed. These are computational algorithms that rely on repeated random sampling to obtain numerical results. The underlying concept is to use randomness to solve problems that might be deterministic in principle. Many climate change scientists have relied on Monte Carlo methods in the probability density function analysis of radiative forcing. Unfortunately, the actual data set adjustments and model “tweaking” have raised concerns about possible bias in the projections.
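The Monte Carlo idea can be shown in a few lines. This is a toy sketch, not an actual climate code: the linear "model" and the assumed mean and standard deviation of the input are invented for illustration. The technique itself is just what the paragraph describes: sample an uncertain input repeatedly, run each sample through the model, and summarize the resulting distribution instead of reporting a single deterministic number.

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def toy_model(forcing):
    """Placeholder response function standing in for a far larger model.
    The 0.8 'sensitivity' is an assumed value, illustration only."""
    return 0.8 * forcing

# Assume the input forcing is uncertain: normally distributed with
# mean 3.7 and standard deviation 0.5 (values chosen for illustration).
samples = [toy_model(random.gauss(3.7, 0.5)) for _ in range(100_000)]

mean = sum(samples) / len(samples)
spread = (sum((s - mean) ** 2 for s in samples) / len(samples)) ** 0.5
print(f"mean response: {mean:.2f}, spread: {spread:.2f}")
```

Note that the spread of the output is inherited directly from the assumed spread of the input, which is why the choice of input distributions, and any "tweaking" of them, shapes the uncertainty bands in the final projection.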

Furthermore, the equations used in the millions of lines of software code may contain errors. Computer simulations provide a means to test hypotheses but do not provide “proof.” That is why computer projections must never be considered “settled science” or confused with observations. It is dangerous to do so. (Curry 2017).


The problem we have today is the divisive manner used by the scientists who are proponents of the “CO2 control knob” for global mean atmospheric temperature. Their computer models yield results that show a significant increase in the average global temperature of 1.1 to 4.2 degrees C (see Figure 1, here) by the year 2100. That could be a problem, perhaps, if it actually occurs. While the actual effect of a 4-degree temperature rise is unknown, it is assumed that it would be a bad thing, and that assumption is widely believed. The “CO2 control knob” proponents (see here for an example), henceforth called “Alarmists,” have declared that a doubling of the level of CO2 in the atmosphere could cause a global temperature increase of 4.5 degrees C (link) by the end of the 21st century, 80 years from now. They have recommended reducing, or even eliminating, the use of fossil fuels, which they believe are the primary cause of the rise in atmospheric CO2 from around 300 parts per million (ppm) at the beginning of the Industrial Age to today’s level of over 400 ppm.
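The numbers above can be connected with a back-of-envelope calculation using the commonly cited simplified approximation for CO2 radiative forcing, dF = 5.35 × ln(C/C0) W/m². The sensitivity parameter `lam` (degrees C per W/m²) is an assumed input, not a value from this article; the wide range of projected warming comes largely from disagreement over it.

```python
import math

def warming(c_new_ppm, c_old_ppm, lam):
    """Simplified logarithmic CO2 forcing times an assumed sensitivity.
    lam (deg C per W/m^2) is a hypothetical parameter for illustration."""
    forcing = 5.35 * math.log(c_new_ppm / c_old_ppm)  # W/m^2
    return lam * forcing  # deg C

# Rise from ~300 ppm to ~400 ppm (the figures cited in the text),
# under a low and a high assumed sensitivity:
low  = warming(400, 300, 0.3)   # roughly 0.5 deg C
high = warming(400, 300, 1.2)   # roughly 1.8 deg C
```

With `lam` around 1.2, a full doubling of CO2 gives roughly the 4.5-degree figure quoted above; with `lam` around 0.3 it gives about 1.1 degrees, which is one way to see how the same forcing formula supports such a wide range of projections.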

Fossil fuels have always been referred to pejoratively in the media and associated with “Big Oil,” another pejorative reference. The truth is that the use of fossil fuels has exponentially improved the ability of humans to flourish, and Big Oil has been the means for that flourishing to take place. Big Oil has done some wasteful and selfish things and deserves some criticism. But Big Oil is not an evil entity; it is a business of large and smaller corporations with shareholders, executives, and employees, just like the Silicon Valley tech giants. Even the real Big Oil, the Organization of Petroleum Exporting Countries or OPEC, performs in the manner of a large corporation. The problem with Big Oil is that it has never been able to “stick up for itself.” It has needed “outsiders” to voluntarily join the battle on its behalf. Luckily, a few outsiders have decided to do that; however, it may be too late to change public perception of the fossil fuel energy industry (Epstein 2014). On the other hand, Silicon Valley has no such handicap as yet, though some negativism is building with respect to privacy concerns and the monopolistic behavior of the tech giants.


The United Nations has exploited the negative view of fossil fuels to enhance its role and power in global affairs. Others have supported the CO2 argument to enhance their opportunistic investments in alternative energy sources with the exception of nuclear and hydro-electric power. Hydro-electric is a non-carbon, reliable renewable while nuclear is non-carbon and near-renewable due to its availability and energy density. These two alternative sources have been opposed by anti-humanity environmental extremists. These combined negative forces have generated very slick UN Proposals for Policy Makers that are based on the singular premise that the global temperature is increasing at an alarming rate, the root cause is the increase in atmospheric CO2 due to the use of fossil fuels, and that the entire world should participate in reducing human-caused CO2 emissions to zero.

But what if the temperature increase is not due to increased human generated CO2 levels? What if the computer models projecting an increasing global average temperature are wrong? Are all the computer models based on the same general hypothesis? If so, are they just tweaking constants and variables to match the history? And, what exactly does a 1 degree or even 3 degree C temperature rise mean?


We need to get the answers to these questions. Who can provide these answers? There are many scientists and engineers who are knowledgeable in the physics and the chemical processes that set the boundaries for climate science. Many of the scientists are retired members of academia with years of experience in research; others are retired from large corporations that have their own research organizations. There are also scientists and engineers who have performed advanced research in government facilities, including military research. Current climate research is being performed at public and private universities, corporations, and government laboratories. In the United States alone, the GAO estimates the government spent over $107 billion on climate research from 1993 to 2014 (link). By far, most of the funding originates with governments. The government-academia research complex and its revolving door have coopted research. Projects that fit social agendas are approved while more practical research languishes. Private research is denigrated by the government-supported researchers.

Scientists in academia keep a scorecard on their performance called peer-reviewed publications. Successful publications lead to more funding for more research as well as increased faculty prestige. High performers are rewarded and protected by their employers, primarily the universities. High performers are also promoted by university alumni, since this maintains the prestige of the institution, their alma mater. These are all normal and understandable factors. Competition between universities, and even between corresponding researchers at different institutions, generally leads to an increased understanding of the science.

Unfortunately, the proponents of the “CO2 control knob” theory, the “Alarmists,” are dominant in mass media communications and on social media platforms. They have also established control of the research publications issued by various scientific organizations by serving as subject-matter-expert editors. For a discussion of these problems, see The Center for Accountability in Science here. There are even specialized websites and blogs that provide only the “Alarmists’” view and that launch attacks on questioners of the orthodoxy, the “deniers.” “Deniers” is a pejorative term that should not be used in this context; it would be better to use the term “Skeptics.” The “Skeptics” have less organized funding than the “Alarmists.” Both of these terms, Skeptics and Alarmists, carry about the same level of negative connotation, so they will be used in the following paragraphs; no offense is intended to anyone.

The nature of the current disagreement is unfortunate, and it is seriously affecting scientific discourse. Science advances through hypotheses, research and experiments to test the hypotheses, and a robust defense against the skeptics of the hypotheses. But today skeptics are attacked through insidious means, including personal attacks, limitations on publications, and media blitzes. Even the very best scientists, emeritus professors from prestigious universities, some even experts in the field of climatology, are demeaned by the Alarmists if they even comment on a particular hypothesis or question the physics in the computer models. There are also many retired scientists including geologists and geophysicists, who have questioned the hypothesis but have few resources now as they have left academia or the corporate world. Some of these skeptics have organized to counter the United Nations effort by organizing the Nongovernmental International Panel on Climate Change and publishing skeptical reports, see here.


The overall solution to this climate conundrum may be to just “wait a while.” Today we have satellites continually measuring both surface and atmospheric temperatures 24/7 all over the globe. We also have detailed records of regional weather events in many parts of the world that can be used to infer climatic change. And it is changes in regional climates that affect humans. Regional climates have been changing for eons, and we know the impact those changes had on humankind in the past. We can use common sense to determine what to do to adapt to possible future climate changes.

We should also wait until we know if additional CO2 is good or bad. There is a lot of evidence that additional CO2 is currently a benefit and surprisingly little that it is bad, see here for a discussion.


So, what causes a regional climate to change? It is likely not completely due to the amount of CO2 added to the atmosphere by burning fossil fuels. A regional climate is just that, climate that is specific to a region. A change in wind and ocean currents can change the humidity over any particular region, making it wetter or drier. Over a cold area, perhaps there will be more snow and ice for longer periods, or just the opposite. It may expand a desert or create a rain forest. We have a fairly lengthy record of regional climate changes. The causes of these changes are much more complex than the effect of a minor greenhouse gas on the average global temperature. The wind and ocean current changes are driven by uneven heating, not a single-digit global temperature increase. The uneven heating is due to clouds, the amount of water vapor, the Earth’s changing elliptical orbit around the sun, the Earth’s obliquity and rotational precession, and the Earth’s rotation itself (which creates night and day). The sun even has a variable output. For a discussion of these long-term effects on our climate, see these posts by Javier (here and here).

We also know that humanity has had, and will continue to have, an impact on the world’s environment, mostly through agriculture and development, both of which require massive sources of energy. As the population continues to increase, the production of food must also increase. This brings up the subjects of population control measures, genetically modified crops, land use, and many more. Without GMO measures, we would not be able to feed the current world population. That is just a fact. Unsound environmental policies that restrict the removal of dead shrubs and undergrowth, as well as irrational restrictions on irrigation, have contributed significantly to the wildfires in California and Australia and have reduced crop production. Continued residential and commercial development in flood plains and along coastlines is going to increase the adverse effects of any sea level rise, regardless of the amount of the rise or “apparent” rise. Sea level rise and land subsidence look the same to the casual observer, but subsidence of land due to tectonics and water mismanagement is very real. The latter may be something we can do something about.

Mitigation (reducing CO2) is not the only way to combat climate change, and it may not even work. Each community has its own climate change threats: sea level rise, changes in precipitation, storms, etc. These climate changes may be natural or man-made or both; we just don’t know. Each community can use modern technology and fossil fuels to adapt. They can build seawalls like Galveston or the Netherlands. They can store water or improve drainage. Local adaptation is easier, cheaper, and less risky than trying to change the whole world economy.


What we need to do is wait a while. Work together and stop the scientific infighting. The CO2 level in the atmosphere is going to continue to increase because China and India are burning more fossil fuels. Africa will be next. They have to in order to feed their populations. And if the temperature continues to rise a little more, it will most likely be beneficial to the planet in general, so long as China and India control the real problems of fossil fuel combustion, SO2 and NOx (and a few others, but not CO2). If it gets colder, not warmer, then we will have to burn more carbon-based fuel to stay warm and that might also raise the global temperature, or so I’ve heard.


Epstein, Alex. 2014. The Moral Case for Fossil Fuels. New York: Penguin Group. Link.

May, Andy. 2018. CLIMATE CATASTROPHE! Science or Science Fiction? The Woodlands, Texas: American Freedom Publications LLC. Link.

The Center for Accountability in Science. “Government-funded Science.” Accessed February 4, 2020. Link.