'Nic Lewis, a coauthor of O’Donnell et al 2010, has been parsing climate sensitivity calculations for some time, and with considerable frustration. He has a very important article at Judy Curry’s here.
One of the seminal sensitivity estimates is Forest et al 2006. Nic reports that he tried for over a year to get data for this study with Forest finally saying that the raw data was now “lost”.
I have been trying for over a year, without success, to obtain from Dr Forest the data used in Forest 2006…. Unfortunately, Dr Forest reports that the raw model data is now lost.
Nic was able to get data for two predecessor studies and has concluded that the calculations in Forest et al 2006 were done erroneously:
If I am right, then correct processing of the data used in Forest 2006 would lead to the conclusion that equilibrium climate sensitivity (to a doubling of CO2 in the atmosphere) is close to 1°C, not 3°C, implying that likely future warming has been grossly overestimated by the IPCC.'
Sometimes I wonder at how conveniently 'climate scientists' can 'lose' the data they use to feed their models and spreadsheets, so that their work can NOT be CHECKED, much less duplicated. Hiding the programs and methodology, losing the data, adjusting the data: it seems at times that every possible trick that can be used to obfuscate is employed.
True science is replicable. True theories can be refuted if they are wrong. Data is data, and it needs to be available for error checking and evaluation by those who don't have a 'dog in the hunt', to ensure that mistakes are not being made.
Current 'climate science' hardly deserves the name.
h/t Steve McIntyre @ ClimateAudit