Google and Facebook Are Major Outlets for Media—So Why Aren't They Held Accountable for Spreading Fake News?

Published at 2018-01-06 20:49:00

Silicon Valley companies' free rein to enable widespread misinformation is based on a thin argument.
In Part 1, we looked at the consequences of two laws—the Digital Millennium Copyright Act and the Communications Decency Act, enacted 20 years ago—that allowed Silicon Valley giants like YouTube and Facebook to act as platforms and not publishers. These laws release them from liability for copyright infringement, slander and libel, provided they follow take-down procedures within a reasonable amount of time.
The effects on legitimate copyright holders, first in music and today in video media, are clear.
Now we will explore the effect on news and journalism.
Algorithm Is Curation and Editing
Using the internet service provider "pipe" argument, tech companies like Google and Facebook claim they have little or no editorial control over the content on their platforms. Silicon Valley companies use an analogy comparing themselves to phone companies rather than media publishers, arguing that AT&T doesn't edit, censor, prioritize or sequence the content or the call participants. Like a phone company, Silicon Valley companies contend they manage content with "algorithms"—and, reluctantly and because of cost, with human moderators—and still manage to maintain ISP status.
"Algorithm" is a mysterious-sounding word that cloaks tech companies' editorial control. I think of the algorithm as analogous to Coca-Cola's secret formula—which turned out to be sugar. Algorithms are closely guarded, with tech's go-to rationale for secrecy being: we can't tell anyone, because users will game the algorithm. Translation: leave the algorithm gaming to us.
An algorithm, when used to deliver digital content, news or information, is a set of rules written by human beings and executed by a machine rapidly and repetitively. Content metadata and keywords are matched to user data and behavior to determine what content users see—and, importantly, what advertisement appears with the content.
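To make that concrete, here is a minimal, hypothetical sketch of such a rule set in Python. The field names, rules and weights are invented for illustration; they are not any platform's actual algorithm, only the kind of human-written rules a machine then executes at scale.

```python
# Illustrative only: toy rules matching content metadata to a user profile.
# Field names and weights are invented for the example.

def score_item(item, user):
    """Return a relevance score for one piece of content for one user."""
    score = 0
    # Rule 1: keyword overlap between the item's metadata and the user's interests.
    score += 2 * len(set(item["keywords"]) & set(user["interests"]))
    # Rule 2: boost topics the user engaged with recently.
    if item["topic"] in user["recent_clicks"]:
        score += 3
    # Rule 3: paid placements targeted at this user's segment jump the queue.
    if user["segment"] in item.get("paid_segments", []):
        score += 10
    return score

def build_feed(items, user, limit=10):
    """The 'editing and curation': rank everything, show only the top slice."""
    return sorted(items, key=lambda item: score_item(item, user), reverse=True)[:limit]

user = {"interests": ["politics", "health"], "recent_clicks": ["politics"], "segment": "fast-food"}
items = [
    {"title": "Election analysis", "topic": "politics", "keywords": ["politics"]},
    {"title": "Burger brand study", "topic": "health", "keywords": ["health"], "paid_segments": ["fast-food"]},
]
print([i["title"] for i in build_feed(items, user)])  # paid item ranks first
```

In this toy version, the paid placement outranks the news item for this user, which is the whole point: the "rules" decide what each person sees.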
The content each user sees on YouTube's homepage has been curated by the algorithm based on their previous behavior, or it is content an advertiser paid to reach them using their profile data. What one person sees is different from what another person sees. It is editing and curation on a massive scale by a machine, but based on rules established by human beings.
Now imagine New York Times executive editor Dean Baquet waking up tomorrow morning and saying: To hell with human editors reading and correcting every article, employing fact-checkers, crafting headlines and curating the paper's layout. It's just too damn slow and expensive. We need content at scale!
Baquet would inform Peter Baker and other New York Times White House reporters that content is now their responsibility. If Baker messes up, he indemnifies the Times and gets sued for defamation or slander. And to replace costly human editors, Baquet writes "rules" for engineers to convert into algorithms. Rule One: When ISIS attacks inside Syria, publish when the death count is 25 or more. Rule Two: When ISIS attacks in Europe, publish in the International section when there are two or more deaths. Send the rules to tech and write more rules tomorrow. If ISIS uses a new method, publish in the Science section. And so on.
It sounds crazy, but what is happening on the platforms (Facebook, etc.) many people rely on for news is worse.
Now imagine a Russian company calls Dean Baquet of the Times and ponies up some money to push ISIS stories up a few pages when the attacks are in Eastern Europe. And Wendy's will pay to move up good news on cardiovascular disease, so consumers eat more bacon-cheese burgers. And Baquet decides to call it all a newsfeed for good measure.
The fine line between editorial and advertising has been violated, and most news organizations now maintain brand-content creation divisions to survive, but it is only the platforms that also benefit from full indemnification for the content they deliver.
News Without Responsibility
The DMCA and CDA have birthed a media consumption environment where the bulk of the advertising revenue and the bulk of the content consumption funnel through Google search and Facebook, with no responsibility for the content they deliver. Their power is such that a modest algorithm change can be devastating for the publishers doing the investigative work. AlterNet.org built an audience of close to 6 million unique visitors over a 20-year period, and in one month this past June, its traffic dropped by 50 percent from what it was at the beginning of the year. Why? The algorithm, of course.
While tech companies skim advertising dollars from legitimate publishers, even modest changes to their algorithms can doom an independent website or YouTube contributor overnight. Advertisers rely on "programmatic" advertising—another algorithm-driven Silicon Valley invention—to reach audiences on thousands of sites. When bad publicity arose about advertising on so-called alt-right websites, corporate execs imagined Tide ads adjacent to "How to Keep Your KKK Hoodie White" articles. YouTube advertisers cut bait, and Google was swift in its response. Unfortunately, algorithms are clumsy at identifying context, so scores of YouTube contributors and websites presenting legitimate news content were caught in the net. The platforms that control the pipeline have no skin in the game, yet secure most of the ad revenue at scale.
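The "clumsy at identifying context" problem is easy to reproduce. The toy keyword-based brand-safety filter below (the keyword list and page titles are invented for illustration) cannot tell a hate-group recruiting page from a news report about hate groups, so both get demonetized.

```python
# Toy brand-safety filter: flag any page containing a blocked keyword.
# This context-blind rule catches legitimate reporting about the same topics.
# Keywords and page titles are invented examples.

BLOCKED_KEYWORDS = {"kkk", "terrorist", "beheading"}

def is_brand_safe(text):
    words = set(text.lower().split())
    return words.isdisjoint(BLOCKED_KEYWORDS)

pages = [
    "Join your local KKK chapter today",           # actual extremist content
    "Investigative report: how the KKK recruits",  # legitimate journalism
]
for page in pages:
    print(is_brand_safe(page), "-", page)
# Both print False: the news report is demonetized along with the extremist page.
```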
Algorithm and Moderation: Blunt and Tough to Control
There is another reason to keep algorithms secret. They can't really do the job professional editors and fact-checkers do. With each revelation, the companies add human moderators, which threaten their billion-dollar business models and inch them closer to losing the DMCA and CDA status holding up their house of cards. And they certainly can't do the job at the scale the DMCA and CDA unleashed for Facebook and Google content ingestion in the first place.
As Zeynep Tufekci pointed out in the New York Times, "Human employees are expensive, and algorithms are cheap. Facebook directly employs only about 20,658 people, roughly one employee per 100,000 users. With so little human oversight and so much automation, public relations crises like the one that surrounded ads for hate groups are inevitable."
In May 2017, when the Guardian published Facebook moderation guidelines, Facebook announced an increase in moderators from 4,500 to 7,500. Facebook had to assess over 50,000 cases of so-called revenge porn in a single month. And when announcing the increase, Mark Zuckerberg admitted Facebook reviews millions of reports every week. In August, according to CNBC, "Facebook closes more than 1 million accounts every day, with most of those created by spammers and fraudsters, security chief Alex Stamos says."
Thomas Friedman in the New York Times noted, "One reason Facebook was slow to respond is that its business model was to absorb all of the readers of the mainstream media newspapers and magazines and to absorb all their advertisers—but as few of their editors as possible. An editor is a human being you have to pay to bring editorial judgment to content on your website, to make certain things are accurate and to correct them if they're not. Social networks preferred to use algorithms instead, but these are easily gamed."
At the recent congressional judiciary subcommittee hearings, Senator John Kennedy, R-La., pressed Facebook general counsel Colin Stretch: "I'm trying to get us down from la la land here." He continued, "The truth of the matter is, you have five million advertisers that change every month, every minute. Probably every second. You don't have the ability to know who every one of those advertisers is, do you?" Stretch reluctantly admitted it was true. Facebook can't possibly evaluate the shell companies and identities of every advertiser each month—the advertising that accompanies your Newsfeed content. In its 2016 annual report, Facebook reported that only 1 percent of its monthly active users were fake. Let me put this another way: 20 million accounts are fake.
Google reports only 0.25 percent of its daily search results are false or misleading. Again, let me translate this for you: 22.5 million search results every single day are fake.
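As a back-of-the-envelope check on those translations, assume roughly 2 billion monthly active Facebook accounts and roughly 9 billion search results served per day; both base figures are assumptions chosen to match the article's percentages, not numbers reported by the companies.

```python
# Rough arithmetic behind the "let me translate" figures above.
# The base figures are assumptions, not company-reported numbers.
facebook_monthly_active = 2_000_000_000   # assumed ~2 billion accounts
google_daily_results = 9_000_000_000      # assumed ~9 billion results per day

fake_accounts = 0.01 * facebook_monthly_active       # 1 percent fake
misleading_results = 0.0025 * google_daily_results   # 0.25 percent misleading

print(f"{fake_accounts:,.0f} fake accounts")                      # 20,000,000
print(f"{misleading_results:,.0f} misleading results per day")    # 22,500,000
```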
In the Russia probe, Twitter recently reported to Congress it was removing 300 accounts identified as fake. But a study by Alessandro Bessi and Emilio Ferrara, researchers at the University of Southern California, analyzed 20.7 million tweets posted by nearly 2.8 million distinct users from Sept. 16 to Oct. 21, 2016, and estimated "the presence of at least 400 thousand bots, accounting for roughly 15 percent of the total Twitter population active in the U.S. presidential election discussion, and responsible for about 3.8 million tweets, roughly 19 percent of the total volume."
Twitter's own Public Policy statement about the Russian interference in the election noted, "On average, our automated systems catch more than 3.2 million suspicious accounts globally per week—more than double the amount we detected this time last year." They want you to believe they can solve this, but they can't.
In the Verge article about Facebook's moderation problem, Hany Farid, professor and chair of computer science at Dartmouth and senior adviser to the Counter Extremism Project, was not optimistic about machine learning saving the day any time soon. He developed the PhotoDNA technology used to detect child-exploitation images. He said, "But a better algorithm can't fix the mess Facebook's currently in. This promise is still—at best—many years away, and we can't wait until this technology progresses far enough to do something about the problems that we are seeing online."
Things are going to get worse on the misinformation and news front before they get better, as new digital face and voice technologies roll out. It is now possible to alter a video clip of President Trump speaking and have him say, "We are bombing North Korea," in perfect voice, with perfect facial movements that are impossible to detect with the naked eye.
Stop Calling It a Newsfeed
It is one thing for this massive, imperfect Silicon Valley money-making machine to stumble when we are searching for new sneakers or sharing recent baby pictures. It is quite another when they deliver news, content marketing disguised as news, or fake issue ads by Russian trolls in a presidential campaign.
Imagine if 1 percent of the articles in your local, regional or national newspaper were fake. That would be one a day for most newspapers. Newspapers print minor corrections and retractions, but as publishers, they would lose subscribers, go out of business, and be buried in lawsuits.
Ponder how far we have traveled when discussing news and journalism standards. I grew up watching All the President's Men, with Washington Post executive editor Ben Bradlee frustrating Bob Woodward and Carl Bernstein as he demanded more sources before he would publish Watergate allegations. Publishers, in part because of legal exposure, maintain rigorous editorial standards and first-person sources with back-up and verification.
Today, 40 percent of Americans are getting their news from Facebook, a company that won't legally stand behind its news content or admit it is a publisher. With Google and Facebook as the conduits, news publishers that accept responsibility for their content are now in the Silicon Valley version of the Roman Colosseum, racing to be first to deliver tabloid-headline-juiced news stories into Google search and Facebook feeds or suffer the consequences. It's a demoralizing way to run the fifth estate.
Personal Responsibility
Democrats would be wise to champion new versions of the DMCA and CDA. At a minimum they could promote one change: define news, and if you deliver news, you're a publisher.
It would go a long way toward cleaning up fake news and protecting legitimate news organizations.
Democrats could frame their argument on bedrock conservative principles of personal responsibility. It was Mitt Romney, after all, who insisted that corporations are people, mostly to ensure free speech protections and Citizens United "money is speech" corporate involvement in politics. Democrats could press Facebook and Google to agree to be publishers when providing news, and to admit they edit and curate content. Democrats could demand they stand behind content on their sites under dedicated news banners—change Facebook's Newsfeed to Feed, and isolate the news somewhere else on the page.
Silicon Valley can make changes, and accepting a designation as publishers for news and tightening the reins on intellectual property in search and social media are eminently doable. When it is in their interest, they control for child pornography and terrorist beheadings. Content ID is a system of content matching used by YouTube that primarily benefits larger media companies that can submit their content and have the staff and legal expertise to navigate the revenue sharing and take-down options, but as currently constructed, the system places too great a burden on the copyright owner. After the first notification of a specific piece of content, the burden should shift to the platform, not the copyright holder.
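As a rough illustration of the mechanics being argued over, here is a hypothetical sketch of "match once, block thereafter" content matching with the burden shifted to the platform: once a work has been the subject of a single take-down notice, later uploads that match it are blocked automatically instead of requiring a new notice. This is not YouTube's actual Content ID implementation (which uses robust perceptual fingerprints); it only sketches the shape of the idea.

```python
# Hypothetical sketch of platform-side matching after a first take-down notice.
# A plain hash over chunks stands in for real perceptual fingerprinting.
import hashlib

registry = set()  # fingerprints of works the platform must police after first notice

def fingerprint(data: bytes, chunk: int = 4096):
    """Hash fixed-size chunks so large overlaps with a known work are detectable."""
    return {hashlib.sha256(data[i:i + chunk]).hexdigest()
            for i in range(0, len(data), chunk)}

def register_after_first_notice(work: bytes):
    """After one notice, the platform carries the matching burden going forward."""
    registry.update(fingerprint(work))

def allow_upload(upload: bytes, threshold: float = 0.5):
    """Reject uploads whose chunks substantially match a registered work."""
    prints = fingerprint(upload)
    overlap = len(prints & registry) / len(prints)
    return overlap < threshold

song = b"original recording bytes " * 1000
register_after_first_notice(song)
print(allow_upload(song))                     # False: the re-upload is blocked automatically
print(allow_upload(b"unrelated home video"))  # True
```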
News and Content Straight—No Silicon Valley Chaser
Harvard political philosopher Michael Sandel recently said of tech companies, "They can't have it both ways. If they claim they are neutral pipes and wires, like the phone company or the electric company, they should be regulated as public utilities. But if, on the other hand, they want to claim the freedoms associated with news media, they can't deny responsibility for promulgating fake news."
In copyright parlance, a print newspaper is a "fixed" work, put to bed the night before by human editors, writers and fact-checkers. The names of those making the editorial decisions are listed on the masthead on page two. Newspapers are publishers that accept responsibility for what they print, and even vet advertisers. You may not like the New York Times, but when a reporter makes a mistake, the editors print a correction. When it's a big mistake, like bad Iraq war coverage, they issue a full-blown investigation.
Publishers and content creators worldwide need to band together and fight for changes to the DMCA and CDA. Consumers need to break the Silicon Valley spell and go directly to publishers that edit, curate and legally stand behind their work.
If you buy sushi in a gas station, get your taxes done at the deli or get your news from a platform that accepts no responsibility, you get what you deserve—and he's in the White House.
I get my news straight from a publisher, no Silicon Valley chaser. If you want to save journalism and democracy, maybe you should, too.
Related Stories:
- Has Anyone Spread More Fake News in 2017 Than Mark Zuckerberg?
- Facebook Is Partnering with Right-Wing Website to Police Fake News
- Glenn Greenwald: Is Facebook Operating as an Arm of the Israeli State by Removing Palestinian Posts?

Source: feedblitz.com