[Photo caption: The researchers asked students whether this element probably linked to real news or fake news. What's the clue that tells you this story is probably not "real news"?]
Fake news is in the (real) news lately. Whether you're looking at Facebook, Buzzfeed, or your online newspaper, companies may try to clickbait you into reading a story that's false. Companies may want you to read the story so that you'll be exposed to their advertising. Or a political group may want to persuade you of an extreme opinion. In some recent cases, people have read fake news stories, believed them, and then acted according to what they thought was true (here's an example).
How often do people mistake fake news for real news?
A team at Stanford University recently attempted to measure the problem in a large sample of students, ranging from middle schoolers through college undergraduates. The Wall Street Journal summarized their results, reporting the following:
...82% of middle-schoolers couldn’t distinguish between an ad labeled “sponsored content” and a real news story on a website, according to a Stanford University study of 7,804 students from middle school through college. The study, set for release Tuesday, is the biggest so far on how teens evaluate information they find online.
The study apparently showed students several examples, asking for each one whether it was a real story or fake news. One of the study's stimuli appears in the captioned photo above. You can see the other samples in the full report on Stanford's website (scroll to p. 9).
Here are some more results, reported by the WSJ:
More than two out of three middle-schoolers couldn’t see any valid reason to mistrust a post written by a bank executive arguing that young adults need more financial-planning help. And nearly four in 10 high-school students believed, based on the headline, that a photo of deformed daisies on a photo-sharing site provided strong evidence of toxic conditions near the Fukushima Daiichi nuclear plant in Japan, even though no source or location was given for the photo.
a) What kind of claim is the statement that "82% of middle-schoolers couldn’t distinguish between an ad labeled 'sponsored content' and a real news story": frequency, association, or causal? What is (are) the variable(s) in the claim?
b) In order to claim that "82% of middle-schoolers" do something, you'd probably need to be sure that the study included a generalizable sample of middle-schoolers. What are some ways the researchers could have obtained an externally valid sample?
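As an aside that isn't part of the Stanford report or the WSJ piece: sample size and sampling method are separate issues. Here's a minimal Python sketch, using the WSJ's reported figures (82% and 7,804 students) purely for illustration, of the margin of error a truly random sample of that size would carry:

```python
import math

# Figures reported by the WSJ. Note the 7,804 covers the whole
# sample (middle school through college), so treating it as the n
# behind the 82% is a simplifying assumption for illustration only.
p_hat = 0.82   # observed proportion
n = 7804       # sample size

# 95% confidence interval for a proportion (normal approximation):
# p_hat +/- 1.96 * sqrt(p_hat * (1 - p_hat) / n)
margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
print(f"82% +/- {margin:.1%}")  # prints: 82% +/- 0.9%
```

With an n that large, the statistical margin of error is under one percentage point. But that precision says nothing about external validity: if the students were a convenience sample, a bigger n doesn't make the 82% any more generalizable.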
c) For a frequency claim like this one, construct validity is also important. The construct validity of the Stanford study seems excellent, because the researchers asked students questions about realistic-looking mockups of online content. Reading back through the quoted passages above, you'll see three different ways they measured the variable "knowing when news is fake." What are the three ways?
I can't help but point out that in your research methods class, you will learn several media literacy skills. You're learning that journalists might not always get the details of a scientific study right (they might not even read the original article!). Journalists might slap a causal claim on a correlational study. Or they might write a sensational story about a single study without reviewing the entire literature on a topic. Being a good consumer of information means you'll be able to critically evaluate media stories about science (and other topics, too).