Sigh. It is a bit like pointing out the bloody, bleedin' obvious, or should be. But people are always suckered by claims that a study has shown this or that.
The first question should always be "has it?"
By that I mean a study is only as good as its methodology.
If that is wrong, suspect, or weak, then so are the study's results. But as Gittens observes, you'd never know this from the way they are used politically. All the "heroic" and "courageous" assumptions (i.e. educated guesses) underlying the results are ignored. So too are the margins of error of the results themselves.
All too often the media then lets us down by simply repeating the results as fact, never bothering to look behind the executive summary to try to determine how valid they are.
Finally, the weakness of mathematical models in producing robust results about complex and not completely understood systems has another real-world example, and that is climate change.
All the scary scenarios presented to you about how you are going to fry/freeze/die of thirst/drown in the future are based on nothing more than mathematical models that do not even come close to adequately modelling the actual climate. This is for the simple reason that we just don't know how large parts of this system operate.
How can you model what you don't know?