You know, what I'd really like to know is this: on what basis was it "assumed" that global warming would cause deserts to expand, when we already know from the geological record that deserts tend to expand during cooler periods with lower rainfall, not during warmer periods, which are usually wetter?
But this is old news in some ways.
I'm aware of another report, from a year or more ago, which said satellite imagery clearly showed the Earth getting greener over the last 30-odd years.
Given all this, can you spot the elephant trying to hide in the corner?
Carbon dioxide is, of course, food for plants, and they like higher concentrations of it in the atmosphere. I suspect our greening world is not unconnected to our carbon dioxide emissions.
Thanks to Andrew Bolt
And the tropical rain forests are expanding too, according to The New York Times.