They hadn't (though their efforts to prevent anyone from obtaining the data, and the blowback from that, were as much cause for concern as the way in which hackers obtained them), but some of the e-mails discussed a different problem:
a CRU employee named "Harry", who often wrote of his wrestling matches with wonky computer software. As an excerpt from Nature puts it:

"Yup, my awful programming strikes again," Harry lamented in one of his notes, as he attempted to correct code analysing weather-station data from Mexico.
Programming doesn't just impact climate science, though that field uses a lot of numerical models, and those are going to have flaws in both creation and interpretation if the researchers are not experts in good code and statistics. We've had any number of articles on computational biology and its pitfalls, and Gliese 581g, the recently discovered "Goldilocks planet", may not even exist, depending on whose model is run and whose data are used, because we are dealing with a faint distortion in a gravitational field.
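To make the pitfall concrete, here is a hypothetical sketch (not from CRU, Nature, or any real analysis) of the kind of bug that bites non-expert programmers working with station data: a sentinel value marking missing readings that silently poisons a statistic if nobody filters it out. The function names and the -9999 convention are illustrative assumptions.

```python
# Hypothetical illustration: weather-station records often use a sentinel
# value such as -9999 to mark missing readings. Forgetting to filter it out
# silently wrecks the statistics without raising any error.
MISSING = -9999.0

def mean_temperature_naive(readings):
    """Buggy: treats the -9999 sentinel as a real temperature."""
    return sum(readings) / len(readings)

def mean_temperature(readings):
    """Correct: drop sentinel values before averaging."""
    valid = [r for r in readings if r != MISSING]
    if not valid:
        raise ValueError("no valid readings")
    return sum(valid) / len(valid)

readings = [21.3, 22.1, MISSING, 20.8]
print(mean_temperature_naive(readings))  # about -2483.7, wildly wrong
print(mean_temperature(readings))        # about 21.4, plausible
```

Nothing crashes in the naive version; the only symptom is an absurd number downstream, which is exactly why bugs like this survive until someone looks hard at the code.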
So it goes with physics as well: the Higgs boson, the "God particle", as Leon Lederman dubbed it (much to his later regret), won't suddenly leap out just because the LHC gets turned on; it will take numerical models to sift it out of a lot of noise.
As Zeeya Merali reports in her Nature article, computers and programming tools have grown more complex, and scientists have hit a "steep learning curve", says James Hack, director of the US National Center for Computational Sciences at Oak Ridge National Laboratory in Tennessee. "The level of effort and skills needed to keep up aren't in the wheelhouse of the average scientist."

And scientists don't have outside testing of their code, nor do they release it to the public. Want to find a flaw in Science 2.0 code that tests perfectly? Upload it and you won't even need to ask anyone to test it; within hours, users will have found bugs and flaws none of us knew could exist.
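What outside testing buys you is easy to show with a tiny hypothetical example: a handful of sanity checks that any downloader could run against released analysis code within hours of an upload. The function and the checks below are invented for illustration, not drawn from any released scientific code.

```python
# Hypothetical sketch: the kind of quick sanity checks an outside user might
# run against released code. The "released" function converts Fahrenheit
# temperatures to Celsius; the checks probe cases the author may never
# have tried.

def f_to_c(temps_f):
    """Convert a list of Fahrenheit temperatures to Celsius."""
    return [(t - 32.0) * 5.0 / 9.0 for t in temps_f]

# Checks an outsider might contribute:
assert f_to_c([32.0]) == [0.0]                 # freezing point
assert f_to_c([212.0]) == [100.0]              # boiling point
assert f_to_c([]) == []                        # empty input shouldn't crash
assert abs(f_to_c([-40.0])[0] + 40.0) < 1e-9   # F and C agree at -40
```

None of these checks requires domain expertise, which is the point: once code is public, the cheap, obvious tests get written by someone, and the flaws they expose get found early instead of after publication.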
David Rousseau, software coordinator for the ATLAS experiment at CERN, says the code that experiment will use has been written by over 600 people over the course of a decade, spanning an entire spectrum of skill levels.
People understand that numerical models are tricky, but the rampant paranoia that detractors will be able to overturn legitimate results by selling doubt (though without question some do sell doubt, certainly in climate science) is just that: paranoia. If results can be invalidated that easily, they probably weren't very good.
Having run the business side of a software company whose products were used to verify results at billion-dollar companies, I can assure you that when it comes to their work, scientists and engineers wanted absolute transparency about how our results were calculated, and the result has been the greatest technology leap the world has ever seen. Scientists could learn a lot from business when it comes to results that stand up to scrutiny.
Read Merali's outstanding Nature article on the issues facing more and more scientists in a programming world here.