Three steps turn Facebook conspiracy into fact

January 9, 2016 7:33 am

Facebook has become home to many conspiracy groups.

Conspiracy theories, hoaxes and other variants of baloney have become so prevalent and intractable on Facebook that we no longer bother to debunk them.
But a new study published in the Proceedings of the National Academy of Sciences provides some insight on exactly how misinformation spreads – with big implications for the fight against it.
The paper, titled “The spreading of misinformation online”, comes from researchers at Boston University and several prominent Italian institutions. It draws on five years of posts from 67 public Facebook pages, roughly half devoted to conspiracy theories and half to science, plus two unrelated pages that served as a control group.
They found that, in essence, conspiracy theories and hoaxes spread in a predictable, three-step pattern.
Step 1: An individual or page posts a piece of conspiracy news or information, introducing it to their social network.

Step 2: That conspiracy is voluntarily shared and propagated by individuals who agree with the narrative – largely within the first two hours, but again at the 20-hour mark.
Step 3: The conspiracy gradually branches throughout the network over a period of days, its speed slowing but its audience growing continuously. Within two weeks or so, the theory has been adopted by large portions of the community – and once adopted, it is “highly resistant to correction”. A toy simulation of this pattern is sketched below.
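To make that three-step pattern concrete, here is a minimal toy simulation in Python. It is not the paper’s model: the network, the share probability and every other parameter are invented placeholders. It simply seeds one post with a single user and lets it spread hour by hour through a random friendship graph, which is enough to reproduce the qualitative shape – a fast early burst, then slowing growth toward a large cumulative audience.

```python
# Toy cascade sketch (illustrative only; not the study's model or data).
# Assumptions: a random friendship graph, a fixed per-contact reshare
# probability, and hour-by-hour propagation over roughly two weeks.
import random

random.seed(42)

N = 2000            # users in the community (invented)
HALF_DEGREE = 10    # each user initiates ~10 ties, so average degree ~20
SHARE_PROB = 0.06   # chance a friend reshares within the next hour (invented)
HOURS = 14 * 24     # observe about two weeks

# Build a sparse random friendship graph.
friends = {u: set() for u in range(N)}
for u in range(N):
    for v in random.sample(range(N), HALF_DEGREE):
        if v != u:
            friends[u].add(v)
            friends[v].add(u)

# Step 1: a single user or page introduces the conspiracy post.
adopted = {0}
newly_adopted = {0}
history = []

for hour in range(1, HOURS + 1):
    # Step 2: friends of the most recent sharers reshare with some probability.
    next_wave = set()
    for u in newly_adopted:
        for v in friends[u]:
            if v not in adopted and random.random() < SHARE_PROB:
                next_wave.add(v)
    adopted |= next_wave
    newly_adopted = next_wave
    history.append((hour, len(adopted)))

# Step 3: print snapshots - growth slows over days while the cumulative
# audience keeps climbing.
for hour, total in history[:3] + history[18:21] + history[-3:]:
    print(f"hour {hour:4d}: {total:5d} users have shared the post")
```

With these made-up numbers the cascade saturates well before the two-week mark; the point is only to show the qualitative shape the researchers describe, not their measurements.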
In fact, as this group of researchers has found before, attempts to correct conspiracy theories often have the opposite effect: they make conspiracists cling to their beliefs all the more tightly.
And while this particular study looked specifically at conspiracy theories, its findings also apply to misinformation of other kinds: fake news, hoaxes, that sort of thing.
In an email to The Washington Post, Walter Quattrociocchi – the head of the Laboratory of Computational Social Science at IMT Lucca and a co-author of the paper – said that on Facebook, “attempts to correct information (not only conspiracy theories) end up producing contents that are used only in the echo chamber that produced the content” – that is, the debunk – to begin with.
There are two very interesting things going on here.
First off, these theories are not circulating willy-nilly around Facebook as a whole. They spread within specific, well-defined, ideologically homogeneous communities, or echo chambers, which might not be visible to the naked eye – but may as well be walled off.
The researchers find that “users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarisation. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumours, mistrust, and paranoia”.
Because users create these walled communities themselves – choosing to read only news that agrees with their biases, or unfriending people who challenge their socio-political views – there’s not much Facebook can do to remedy the situation.
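As a rough illustration of that segregation – again a hypothetical sketch with invented numbers, not the study’s method – the snippet below builds two densely connected communities joined by a handful of bridge friendships and spreads a post among users who are far more likely to reshare content that matches their own leaning. Nearly every sharer ends up inside the chamber where the post originated.

```python
# Echo-chamber sketch (illustrative only; all sizes and probabilities invented).
import random

random.seed(1)

COMMUNITY_SIZE = 1000     # users per community
CROSS_LINKS = 30          # friendships bridging the two communities
MATCH_PROB = 0.4          # reshare chance when the post matches a user's leaning
MISMATCH_PROB = 0.02      # reshare chance when it does not

# Users 0..999 lean "conspiracy", users 1000..1999 lean "science".
leaning = {u: ("conspiracy" if u < COMMUNITY_SIZE else "science")
           for u in range(2 * COMMUNITY_SIZE)}
friends = {u: set() for u in leaning}

def link(a, b):
    friends[a].add(b)
    friends[b].add(a)

# Dense ties inside each community.
for offset in (0, COMMUNITY_SIZE):
    for u in range(offset, offset + COMMUNITY_SIZE):
        for v in random.sample(range(offset, offset + COMMUNITY_SIZE), 10):
            if v != u:
                link(u, v)

# A few bridges between the two communities.
for _ in range(CROSS_LINKS):
    link(random.randrange(COMMUNITY_SIZE),
         random.randrange(COMMUNITY_SIZE, 2 * COMMUNITY_SIZE))

# Seed a conspiracy post inside the conspiracy community and let it spread;
# users reshare far more readily when the post matches their leaning.
adopted, frontier = {0}, {0}
while frontier:
    frontier = {v for u in frontier for v in friends[u]
                if v not in adopted
                and random.random() < (MATCH_PROB if leaning[v] == "conspiracy"
                                       else MISMATCH_PROB)}
    adopted |= frontier

inside = sum(1 for u in adopted if leaning[u] == "conspiracy")
print(f"sharers inside the echo chamber: {inside}")
print(f"sharers outside it:              {len(adopted) - inside}")
```

Raising MISMATCH_PROB, or adding more bridges, is a crude stand-in for the rare messages that do make it across chamber walls.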
Facebook itself came to that conclusion last year, when the company’s data scientists found that echo chambers were born less of algorithmic bias than of our own intolerance and illiberality.
How do you solve a problem like human nature, though?
It would be more comforting, frankly, if there were a technological solution at hand: some algorithmic measure of truth, perhaps, or some way to tag hoaxes.
This research concludes that those options probably won’t work, but that doesn’t mean the researchers have given up.
Their next steps will involve studying messages that improbably make it over the walls of different echo chambers, the better to determine how social and cognitive biases can be overcome on a network-wide scale.
For the curious, here’s a first step: If you have conspiracy theorists, fabulists or extreme ideologues in your social network, you actually shouldn’t unfriend them.
