A Climate Modeller Spills the Beans

Quadrant, 23rd September 2019

Tony Thomas

There’s a top-level oceanographer and meteorologist who is prepared to cry “Nonsense!” on the “global warming crisis” evident to climate modellers but not in the real world. He’s as well qualified as, or better qualified than, the modellers he criticises — the ones whose Year 2100 forebodings of 4degC warming have set the world to spending $US1.5 trillion a year to combat CO2 emissions.

The iconoclast is Dr Mototaka Nakamura. In June he put out a small book in Japanese on “the sorry state of climate science”. It’s titled Confessions of a climate scientist: the global warming hypothesis is an unproven hypothesis, and he is very much qualified to take a stand. From 1990 to 2014 he worked on cloud dynamics and on the forces mixing atmospheric and ocean flows on medium to planetary scales. His bases were MIT (for a Doctor of Science in meteorology), the Georgia Institute of Technology, NASA’s Goddard Space Flight Center, the Jet Propulsion Laboratory, Duke and Hawaii Universities, and the Japan Agency for Marine-Earth Science and Technology. He’s published about 20 climate papers on fluid dynamics.[i]

Today’s vast panoply of “global warming science” is like an upside-down pyramid built on the work of a few score of serious climate modellers. They claim to have demonstrated that human-derived CO2 emissions are the cause of recent global warming, and they project that warming forward. Every orthodox climate researcher takes such output from the modellers’ black boxes as a given.

A fine example is from the Australian Academy of Science’s explanatory booklet of 2015. It claims, absurdly, that the models’ outputs are “compelling evidence” for human-caused warming.[ii] Specifically, it refers to model runs with and without human emissions and finds the “with” variety better matches the 150-year temperature record (which itself is a highly dubious construct). Thus satisfied, the Academy then propagates to the public and politicians the models’ forecasts for disastrous warming this century.

Now for Dr Nakamura’s expert demolition of the modelling. There was no English edition of his book in June, and only a few excerpts were translated and circulated. But last week Dr Nakamura released his own English version as a free Kindle book. It’s not a translation but a fresh essay leading back to his original conclusions.

The temperature forecasting models, trying to deal with the intractable complexities of the climate, are no better than “toys” or “Mickey Mouse mockeries” of the real world, he says. This is not actually a radical idea. The IPCC in its third report (2001) conceded (emphasis added),

In climate research and modelling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible. (Chapter 14, Section 14.2.2.2)

Somehow that official warning was deep-sixed by the alarmists. Now Nakamura has found it again, further accusing the orthodox scientists of “data falsification” by adjusting previous temperature data to increase apparent warming. “The global surface mean temperature-change data no longer have any scientific value and are nothing except a propaganda tool to the public,” he writes.

The climate models are useful tools for academic studies, he says. However, “the models just become useless pieces of junk or worse (worse in a sense that they can produce gravely misleading output) when they are used for climate forecasting.” The reason:

These models completely lack some critically important climate processes and feedbacks, and represent some other critically important climate processes and feedbacks in grossly distorted manners to the extent that makes these models totally useless for any meaningful climate prediction.

I myself used to use climate simulation models for scientific studies, not for predictions, and learned about their problems and limitations in the process.

Nakamura and colleagues even tried to patch up some of the models’ crudeness

…so I know the workings of these models very well … For better or worse I have more or less lost interest in the climate science and am not thrilled to spend so much of my time and energy in this kind of writing beyond the point that satisfies my own sense of obligation to the US and Japanese taxpayers who financially supported my higher education and spontaneous and free research activity. So please expect this to be the only writing of this sort coming from me.

I am confident that some honest and courageous, true climate scientists will continue to publicly point out the fraudulent claims made by the mainstream climate science community in English. I regret to say this but I am also confident that docile and/or incompetent Japanese climate researchers will remain silent until the ‘mainstream climate science community’ changes its tone, if ever.

He projects warming from CO2 doubling, “according to the true experts”, to be only 0.5degC. He says he doesn’t dispute the possibility of either catastrophic warming or severe glaciation, since the climate system’s myriad non-linear processes swamp “the toys” used for climate predictions. Climate forecasting is simply impossible, if only because future changes in solar energy output are unknowable. As to the impacts of human-caused CO2, they can’t be judged “with the knowledge and technology we currently possess.”
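For context, the dispute can be boiled down to a single number. The standard simplified formula puts the extra forcing from doubled CO2 at about 3.7 watts per square metre; everything disputed lives in the feedback-laden sensitivity that converts that forcing into warming. A minimal sketch (my own arithmetic, using the textbook Myhre formula, not anything from Nakamura’s book):

```python
import math

# Standard simplified expression for CO2 radiative forcing (Myhre et al., 1998):
# delta_F = 5.35 * ln(C / C0) watts per square metre.
delta_F = 5.35 * math.log(2.0)   # doubled CO2 -> ~3.7 W/m^2

# Any claimed equilibrium warming implies a sensitivity parameter
# lambda = delta_T / delta_F (kelvin per W/m^2).
for delta_T in (0.5, 3.0):       # Nakamura's figure vs a typical model estimate
    lam = delta_T / delta_F
    print(f"warming of {delta_T} K implies lambda = {lam:.2f} K per W/m^2")
```

On those numbers, Nakamura’s “true experts” are asserting a sensitivity roughly six times smaller than the models assume, and the feedbacks behind that factor are exactly what he says the models get wrong.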

Other gross model simplifications include:

# Ignorance about large- and small-scale ocean dynamics

# A complete lack of meaningful representations of aerosol changes that generate clouds

# Lack of understanding of drivers of ice-albedo (reflectivity) feedbacks: “Without a reasonably accurate representation, it is impossible to make any meaningful predictions of climate variations and changes in the middle and high latitudes and thus the entire planet.”

# Inability to deal with water vapor elements

# Arbitrary “tunings” (fudges) of key parameters that are not understood

Concerning CO2 changes, he says,

I want to point out a simple fact that it is impossible to correctly predict even the sense or direction of a change of a system when the prediction tool lacks and/or grossly distorts important non-linear processes, feedbacks in particular, that are present in the actual system …

… The real or realistically-simulated climate system is far more complex than an absurdly simple system simulated by the toys that have been used for climate predictions to date, and will be insurmountably difficult for those naïve climate researchers who have zero or very limited understanding of geophysical fluid dynamics. I understand geophysical fluid dynamics just a little, but enough to realize that the dynamics of the atmosphere and oceans are absolutely critical facets of the climate system if one hopes to ever make any meaningful prediction of climate variation.

Solar input, absurdly, is modelled as a “never changing quantity”. He says, “It has only been several decades since we acquired an ability to accurately monitor the incoming solar energy. In these several decades only, it has varied by one to two watts per square metre. Is it reasonable to assume that it will not vary any more than that in the next hundred years or longer for forecasting purposes? I would say, No.”
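For scale, a simple blackbody energy-balance sketch (my illustration, ignoring all feedbacks) shows what a one-to-two-watt swing in the solar constant is worth in equilibrium temperature:

```python
SIGMA = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
ALBEDO = 0.30      # planetary albedo, held fixed for this sketch
S0 = 1361.0        # present-day solar constant, W/m^2

def eq_temp(solar_constant):
    """Equilibrium temperature at which absorbed sunlight equals emitted longwave."""
    absorbed = solar_constant * (1.0 - ALBEDO) / 4.0   # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25

for dS in (1.0, 2.0):
    dT = eq_temp(S0 + dS) - eq_temp(S0)
    print(f"solar constant +{dS:.0f} W/m^2 -> +{dT:.3f} K before feedbacks")
```

That is only a few hundredths of a degree per watt before feedbacks, but the models’ working assumption is that the term is exactly zero for the next century, and that is the assumption Nakamura calls unreasonable.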

Good modelling of oceans is crucial, as the slow ocean currents transport vast amounts of heat around the globe, making the minor atmospheric heat-storage changes almost irrelevant. For example, the Gulf Stream has kept western Eurasia warm for centuries. On time scales of more than a few years, the ocean plays a far more important role in climate than atmospheric changes. “It is absolutely vital for any meaningful climate prediction to be made with a reasonably accurate representation of the state and actions of the oceans.” In real oceans, rather than modelled ones, just as in the atmosphere, the smaller-scale flows often tend to counteract the effects of the larger-scale flows. Nakamura spent hundreds of hours vainly trying to remedy the flaws he observed, concluding that the models “result in a grotesque distortion of the mixing and transport of momentum, heat and salt, thereby making the behaviour of the climate simulation models utterly unrealistic…”

Proper ocean modelling would require a tenfold improvement in spatial resolution and a vast increase in computing power, probably requiring quantum computers. If or when quantum computers can reproduce the small-scale interactions, the researchers will remain out of their depth because of their traditional simplifying of conditions.
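The scale of that demand is simple arithmetic (my illustration, assuming a conventional grid-based model): a tenfold-finer horizontal grid means a hundred times as many grid columns, and numerical stability (the CFL condition) then forces roughly tenfold-shorter time steps.

```python
# Rough cost multiplier for a tenfold-finer ocean grid (illustrative only).
refine = 10
columns = refine ** 2    # both horizontal dimensions are refined
steps = refine           # CFL stability: a finer grid needs shorter time steps
print(f"~{columns * steps}x the computation per simulated year")   # ~1000x
# ...and that is before any increase in vertical resolution or added physics.
```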

Key model elements are replete with “tunings”, i.e. fudges. Nakamura explains how that trick works:

The models are ‘tuned’ by tinkering around with values of various parameters until the best compromise is obtained. I used to do it myself. It is a necessary and unavoidable procedure and not a problem so long as the user is aware of its ramifications and is honest about it. But it is a serious and fatal flaw if it is used for climate forecasting/prediction purposes.
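To make the procedure concrete, here is a deliberately crude toy of my own construction, nothing like a real general circulation model: a zero-dimensional “model” with one free sensitivity parameter, swept until it best fits a synthetic “observed” record.

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic "observed" warming record (a stand-in, not real data).
years = np.arange(1950, 2021)
observed = 0.012 * (years - 1950) + rng.normal(0.0, 0.05, years.size)

def toy_model(sensitivity):
    """Zero-dimensional 'model': warming = sensitivity * an assumed forcing ramp."""
    forcing = 0.02 * (years - 1950)   # assumed forcing history, W/m^2
    return sensitivity * forcing      # kelvin

# The "tuning": sweep the free parameter, keep whatever fits the record best.
candidates = np.linspace(0.1, 1.5, 141)
errors = [np.mean((toy_model(s) - observed) ** 2) for s in candidates]
best = candidates[int(np.argmin(errors))]
print(f"tuned sensitivity = {best:.2f} K per W/m^2")
```

By construction the tuned value reproduces the record it was fitted to; it carries no guarantee whatever outside that record, which is precisely Nakamura’s objection to using tuned models for prediction.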

One set of fudges involves clouds.

Ad hoc representation of clouds may be the greatest source of uncertainty in climate prediction. A profound fact is that only a very small change, so small that it cannot be measured accurately…in the global cloud characteristics can completely offset the warming effect of the doubled atmospheric CO2.

Two such characteristics are an increase in cloud area and a decrease in the average size of cloud particles.

“Accurate simulation of cloud is simply impossible in climate models since it requires calculations of processes at scales smaller than 1mm.” Instead, the modellers put in their own cloud parameters. Anyone studying real cloud formation and then the treatment in climate models would be “flabbergasted by the perfunctory treatment of clouds in the models”.
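An order-of-magnitude check (my arithmetic, not the book’s) shows how little the clouds need to shift to cancel a CO2 doubling:

```python
S0 = 1361.0              # solar constant, W/m^2
incoming = S0 / 4.0      # ~340 W/m^2 of sunlight, averaged over the sphere
co2_forcing = 3.7        # canonical forcing from doubled CO2, W/m^2

# Extra planetary albedo needed for reflected sunlight to cancel that forcing:
delta_albedo = co2_forcing / incoming
print(f"required albedo change: {delta_albedo:.4f}")                # ~0.011
print(f"relative to an albedo of 0.30: {delta_albedo / 0.30:.1%}")  # ~3.6%
```

A shift of about one albedo point, a few per cent of a total dominated by clouds, sits comfortably inside the uncertainties of cloud observation and parameterisation, which is the sense of Nakamura’s claim.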

Nakamura describes as “moronic” the claims that “tuned” ocean models are good enough for climate predictions. That’s because, in tuning some parameters, other aspects of the model have to become extremely distorted. He says a large part of the forecast global warming is attributed to water vapor changes, not CO2 changes. “But the fact is this: all climate simulation models perform poorly in reproducing the atmospheric water vapor and its radiative forcing observed in the current climate… They have only a few parameters that can be used to ‘tune’ the performance of the models and (are) utterly unrealistic.” Positive water vapor feedbacks from CO2 increases are artificially enforced by the modellers, who neglect the countervailing negative feedbacks of the real world and hence exaggerate forecast warming.

The supposed measuring of global average temperatures from 1890 has been based on thermometer readouts barely covering 5 per cent of the globe until the satellite era began 40-50 years ago. “We do not know how global climate has changed in the past century, all we know is some limited regional climate changes, such as in Europe, North America and parts of Asia.” This makes meaningless the Paris targets of 1.5degC or 2degC above pre-industrial levels.

He is contemptuous of claims about models being “validated”, saying the modellers are merely “trying to construct narratives that justify the use of these models for climate predictions.” And he concludes,

The take-home message is (that) all climate simulation models, even those with the best parametric representation scheme for convective motions and clouds, suffer from a very large degree of arbitrariness in the representation of processes that determine the atmospheric water vapor and cloud fields. Since the climate models are tuned arbitrarily … there is no reason to trust their predictions/forecasts.

With values of parameters that are supposed to represent many complex processes being held constant, many nonlinear processes in the real climate system are absent or grossly distorted in the models. It is a delusion to believe that simulation models that lack important nonlinear processes in the real climate system can predict (even) the sense or direction of the climate change correctly.

I was distracted from his message because the mix of Japanese and English scripts in the book kept crashing my Kindle software. Still, I persevered. I recommend you do too. There’s at least $US30 trillion ($US30,000,000,000,000) hanging on this bunfight.

Tony Thomas’s new book, The West: An insider’s tale – A romping reporter in Perth’s innocent ’60s is available from Boffins Books, Perth, the Royal WA Historical Society (Nedlands) and online here

 

[i] They include (to give you the flavour):

# “Destabilisation of thermohaline circulation by atmospheric eddy transports”

# “Effects of the ice-albedo [reflectivity] and runoff feedbacks on the thermohaline circulation”

# “Diagnoses of an eddy-resolving Atlantic Ocean model simulation in the vicinity of the Gulf Stream”

# “A simulation study of the 2003 heat wave in Europe”

# “Impacts of SST [sea surface temperature] anomalies in the Agulhas Current System on the climate variations in the southern Africa and its vicinity.”

# “Greenland sea surface temperature changes and accompanying changes in the north hemispheric climate.”

[ii] “Climate models allow us to understand the causes of past climate changes, and to project climate change into the future. Together with physical principles and knowledge of past variations, models provide compelling evidence that recent changes are due to increased greenhouse gas concentrations in the atmosphere … Using climate models, it is possible to separate the effects of the natural and human-induced influences on climate. Models can successfully reproduce the observed warming over the last 150 years when both natural and human influences are included, but not when natural influences act alone.” A footnote directs to a study by 15 modellers cited in the 2015 IPCC report.