A new misinformation quiz shows that, despite the stereotype, younger Americans have a harder time discerning fake headlines, compared with older generations
It sounds interesting, but I don’t think you can discern anything from a headline in isolation, without knowing the source, its biases, and the context. I tried taking the test but gave up because, short of actually knowing the topic, each one would be a 50-50 guess.
Actually, I don’t think you’re supposed to be judging it on the topic. I considered each in terms of whether or not I thought the title would get a rise out of a particular target demographic (whether that rise was positive or negative) and I got 20/20.
Agreed, I think gauging whether you’re being affected by the substance of an article or by the language used is crucial. Like the dangers of dihydrogen monoxide, there are a lot of ways to tell the truth while manipulating it to dishonest ends. Going further, you can speak to someone’s personal biases and not even have to bother with bending the truth.
I think that’s the point. If you looked at a headline for something you already know about, then you already know whether it’s bogus. If you already know how reliable the source is, then your risk of accepting bad information is reduced. The point is to see whether you are susceptible to new information that is bogus, and whether you can recognize when a source you haven’t seen before is unreliable.
But I wouldn’t believe or reject any of them based on the headline alone; the true answer for most of them is “I don’t know / can’t know”. They all sound equally plausible to someone with no knowledge of the topic.
That’s true, you can’t ‘know’ the answer. I think the test is designed so you have to guess based on the question alone. Some of them are obvious, some not so much. You have to base your answer on whether it passes the ‘smell test’.
I think the intent is for us to judge what would be “reasonable” or “likely”, rather than having specific knowledge of the headline.
“Tornado rearranges DC highway into giant peace sign” could happen, theoretically, but it’s very unlikely to.
“Government appoints new head of some environmental division”? Sure, that happens all the time and is pretty mundane.
That would be an example where I can apply my existing knowledge: I know enough about tornadoes, highways, and peace signs to know that’s statistically improbable.
Whereas with “Government appoints new head of some environmental division”, I don’t know; it sounds perfectly reasonable and plausible, but I couldn’t possibly say. In real life I could reason that a newspaper would have no reason to make up something so mundane (that’s why context is important), but knowing this is a test with fake answers makes it random chance.
The tornado falls into the category of “the figure of Jesus in the crust of a pizza”. It’s 100% subjective and it’s not news anyway.
What matters is who is talking to you. It’s the “about us” tab at the bottom of the website. That’s why http://ground.news is useful.
What was your score?
I gave up when I realised the test was meaningless. There are a few I could tell were almost definitely false based on existing knowledge, but the rest would be 50/50 choices.