Bloggers are fond of yelling "correlation is not causation" at any piece of research that comes to a conclusion they find distasteful, but what they almost never do is actually read the paper in question, which invariably addresses most of their concerns: research methodology; alternate explanations; potential intervening variables; results of similar studies in the past; shortcomings in the data set; etc. That's not to say that researchers always take every possible problem seriously enough, or that social science papers don't deserve heightened scrutiny. But it is to say that if, in 30 seconds, some possible problem with the research program occurs to you, it's almost a dead certainty that the person with a PhD who performed the study also thought of the same thing. And discusses it in the paper.
Amen, brother. This is a fractionally more statistically aware version of the "You haven't convinced me" argument that I've railed against before; it just lets you immediately dismiss findings without bothering to check how valid they are.
I always think there's an easy test for whether or not your objection to a scientific claim is bullshit, and that's whether or not it can also be applied to ideas you already absolutely believe. "Correlation is not causation" could be used as an argument questioning whether radioactive exposure is dangerous, for example. So if you can use your sound bite objection to call into doubt basic scientific facts, there's a strong case to be made that you're a fucking idiot.
That's not to say that no-one ever does mistake correlation for causality (I used to hammer this idea into schoolchildren back when I was still teaching); it just means that it's a mistake that needs to be found, not one that can be assumed to be there because that way you don't have to change your mind about anything.
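For what it's worth, the classic way the mistake actually arises is a confounder: a third variable driving both of the things you measured. Here's a minimal sketch with made-up numbers (the ice-cream-and-drownings example is a textbook hypothetical, not real data) showing two variables that strongly correlate even though neither causes the other:

```python
import random

random.seed(0)

# Hypothetical confounder: hot weather drives both ice-cream sales
# and drowning incidents. Neither causes the other, yet they correlate.
heat = [random.gauss(0, 1) for _ in range(10_000)]
ice_cream = [h + random.gauss(0, 0.5) for h in heat]
drownings = [h + random.gauss(0, 0.5) for h in heat]

def pearson(xs, ys):
    # Plain Pearson correlation coefficient, no libraries needed.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(ice_cream, drownings)
print(f"correlation between ice-cream sales and drownings: {r:.2f}")
```

Finding a plausible confounder like this is what "correlation is not causation" looks like when it's done properly; it's an argument you have to make, not a magic phrase that makes inconvenient findings disappear.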