Twitter Spreads Lies and Conspiracy Faster Than Facts, Decade-Long Study Demonstrates

Published at 2018-03-13 22:03:00

An MIT team analyzed 126,000 tweets and threads with millions of views. But Twitter isn't unique.

The shadowy side of human nature is dominating the way politics is portrayed on social media, according to an unprecedented new study in Science that confirmed suspicions that innuendo and conspiracies are outracing more humdrum facts and truth-telling on Twitter.

The Twitter study was conducted by a team at the Massachusetts Institute of Technology's Media Lab. It analyzed a decade of Twitter posts, focusing on 126,000 examples of false news spread by 2 million to 3 million people. The study noted how rumor spreads much faster than truth, and claims that human nature, abetted by algorithms fanning those reflexes, is to blame.

“Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news than for false news about terrorism, natural disasters, science, urban legends, or financial information,” the study authors found. “False news was more novel than true news, which suggests that people were more likely to share novel information. Whereas false stories inspired fear, disgust, and surprise in replies, true stories inspired anticipation, sadness, joy, and trust. Contrary to conventional wisdom, robots [fabricated online personas] accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it.”

How much faster do rumors and false news spread on Twitter?

“False news reached more people than the truth; the top 1% of false news cascades diffused to between 1,000 and 100,000 people, whereas the truth rarely diffused to more than 1,000 people,” the study said. “Falsehood also diffused faster than the truth. The degree of novelty and the emotional reactions of recipients may be responsible for the differences observed.”

The findings reported by Science are part of a growing refrain of academic expert opinion pointing out how the political arena is uniquely vulnerable to propaganda. For many reasons, the American tradition of protecting most political speech has dovetailed with the content-curating inner workings of social media platforms like Facebook and Twitter, and video platforms like YouTube, all of which rely on advertising-based business models.
A New York Times commentary published Sunday by Zeynep Tufekci, an associate professor at the School of Information and Library Science at the University of North Carolina, cited this same dynamic at YouTube, which consciously feeds a stream of increasingly extreme content. “It seems as if you are never ‘hardcore’ enough for YouTube’s recommendation algorithm,” she wrote, after observing a trail of served-up politicized content. “It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.”

Tufekci squarely attributed the radicalizing content to social media’s business model, which has pushed Silicon Valley to devise addictive devices and curate provocative content. She said Silicon Valley’s programmers weren’t seeking to roil the political world by elevating conspiratorial content. But social media has unleashed a new outbreak of propaganda, and ordinary people—human nature itself—are playing a role in accelerating its spread.

“This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff,” said Tufekci. “A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.”

“What keeps people glued to YouTube?” she asked. “Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with—or to incendiary content in general.”

The Twitter study affirmed that people are intrinsically drawn to spicier—and not always true—content.
But its finding that social media users, and not bots (fabricated online personas), are mostly driving false news feedback loops only accounts for part of what’s happening with misinformation on social media. After all, programmers created the brain-mimicking and brain-triggering algorithms that first profile users (from their keystrokes) and then serve up inflammatory media. “No matter how neutral a platform may seem, there’s always a person behind the curtain,” noted the New Yorker’s Andrew Marantz, in a Monday piece profiling the social media site Reddit and its CEO Steve Huffman, and asking how to “detoxify the Internet.”

Marantz’s observation is key. It points toward the solutions raised by the social scientists who, commenting on the MIT study in another article in Science, asked, “How can we create a news ecosystem ... that values and promotes truth?” They noted that “about 47 percent of Americans overall report getting news from social media often or sometimes, with Facebook as, by far, the dominant source. Social media are key conduits for fake news sites.”

Silicon Valley's Reactions

The attention economy’s response to the outbreak of propaganda on its platforms has not been to alter its money-making machinery—its content-curating algorithms. Instead, companies like Facebook and Google have tried to create tools for media organizations to help their readers discern more and less truthful content. But those efforts seem to be futile—apart from their public relations value—the social scientists said in Science, because people are still drawn to what’s edgy.

“Fact checking might even be counterproductive under certain circumstances,” the researchers noted. “Research on fluency—the ease of information recall—and familiarity bias in politics shows that people tend to remember information, or how they feel about it, while forgetting the context within which they encountered it. Moreover, they are more likely to accept familiar information as true. There is thus a risk that repeating false information, even in a fact-checking context, may increase an individual's likelihood of accepting it as true.”

Richard Gingras, a senior Google executive, said at a recent blue-ribbon panel at Stanford University that the problem, if there is one, is anything but anti-democratic. Rather, there’s an outbreak of political speech, Gingras said, which might be politically disruptive, yet is expressing the views of multitudes of individuals.

It may be that the political sphere is returning to where it was a century ago, during World War I, before corporate public relations emerged and national media monopolies imposed journalistic norms of objectivity and balance, the social scientists said in Science.

What to do about the rise of political propaganda on social media is becoming one of 2018’s most pressing issues. The MIT research on Twitter shows it is due to a mix of brain-tapping technology and individual responses inherent to human nature.
There is little debate that politics, domestically and globally, has become more dominated by authoritarians. Social media has a role in this change. Silicon Valley may not want to confront the political implications of what it has created; however, studies like the MIT research may underscore why it has no choice.

The United States’ founders were heirs to an intellectual tradition that worried not just about authoritarian monarchs but about the shadowy side of human nature. They created a republican form of government with deliberate checks and balances to restrain those darker impulses. The latest research suggests those restraints are what’s missing from social media’s algorithms.

Source: feedblitz.com
