Ever wonder who is reviewing all of the content we put online?...

Published at 2016-04-23 00:20:12


Ever wonder who is reviewing all of the content we put online? The comments, the photos, the tweets, the snaps, the tumblrs? It’s not machines. It’s people. What they do affects them profoundly, but it also affects our free speech, how cultural norms are shaped, and who has access to what ideas. It’s a long read, but here’s the whole thing, The Secret Rules of the Internet, which the passage below is excerpted from:

Julie Mora-Blanco remembers the day, in the summer of 2006, when the reality of her new job sunk in. A recent grad of California State University, Chico, Mora-Blanco had majored in art, minored in women’s studies, and spent much of her free time making sculptures from found objects and blown glass. Struggling to make rent and working a post-production job at Current TV, she’d jumped at the chance to work at an internet startup called YouTube. Maybe, she figured, she could pull in enough money to pursue her lifelong dream: to become a hair stylist.
This article was reported in partnership with The Investigative Fund at The Nation Institute.

It was a warm, sunny morning, and she was sitting at her desk in the company’s office, located above a pizza shop in San Mateo, an idyllic and affluent suburb of San Francisco. Mora-Blanco was one of 60-odd twenty-somethings who’d come to work at the still-unprofitable website.
Mora-Blanco’s team, 10 people in total, was dubbed The SQUAD (Safety, Quality, and User Advocacy Department). They worked in teams of four to six, some doing day shifts and some night, reviewing videos around the clock. Their job? To protect YouTube’s fledgling brand by scrubbing the site of offensive or malicious content that had been flagged by users, or, as Mora-Blanco puts it, “to keep us from becoming a shock site.” The founders wanted YouTube to be something new, something better (“a space for everyone”) and not another eBaum’s World, which had already become a repository for explicit pornography and gratuitous violence.
Mora-Blanco sat next to Misty Ewing-Davis, who, having been on the job a few months, counted as an old hand. On the table before them was a single piece of paper, folded in half to present a bullet-point list of instructions: Remove videos of animal abuse. Remove videos showing blood. Remove visible nudity. Remove pornography. Mora-Blanco recalls her teammates were a “mish-mash” of men and women; gay and straight; slightly tipped toward white, but also Indian, African-American, and Filipino. Most of them were friends, or friends of friends, or family. They talked and made jokes, trying to make sense of the rules. “You have to find humor,” she remembers. “Otherwise it’s just painful.”

Videos arrived on their screens in a never-ending queue. After watching a couple seconds apiece, SQUAD members clicked one of four buttons that appeared in the upper right-hand corner of their screens: “Approve” (let the video stand); “Racy” (mark the video as 18-plus); “Reject” (remove the video without penalty); “Strike” (remove the video with a penalty to the account). Click, click, click. But that day Mora-Blanco came across something that stopped her in her tracks. “Oh, God,” she said.
Mora-Blanco won’t describe what she saw that morning. For everyone’s sake, she says, she won’t conjure the staggeringly violent images which, she recalls, involved a toddler and a dimly lit hotel room.
Ewing-Davis calmly walked Mora-Blanco through her next steps: hit “Strike,” suspend the user, and forward the person’s account details and the video to the SQUAD team’s supervisor. From there, the information would travel to the CyberTipline, a reporting system launched by the National Center for Missing and Exploited Children (NCMEC) in 1998. Footage of child exploitation was the only black-and-white zone of the job, with protocols outlined and explicitly enforced by law since the late 1990s.
The video disappeared from Mora-Blanco’s screen. The next one appeared.
Ewing-Davis said, “Let’s go for a walk.”

Okay. This is what you’re doing, Mora-Blanco remembers thinking as they paced up and down the street. You’re going to be seeing bad stuff.

…

In the summer of 2009, Iranian protesters poured into the streets, disputing the presidential victory of Mahmoud Ahmadinejad. Dubbed the Green Movement, it was one of the most significant political events in the country’s post-Revolutionary history. Mora-Blanco, soon to become a senior content specialist, and her team, now dubbed Policy and more than two dozen strong, monitored the many protest clips being uploaded to YouTube.
On June 20th, the team was confronted with a video depicting the death of a young woman named Neda Agha-Soltan. The 26-year-old had been struck by a single bullet to the chest during demonstrations against pro-government forces, and a shaky cell-phone video captured her horrific last moments: in it, blood pours from her eyes, pooling beneath her.

Within hours of the video’s upload, it became a focal point for Mora-Blanco and her team. As she recalls, the guidelines they’d developed offered no clear directives regarding what constituted newsworthiness or what, in essence, constituted ethical journalism involving graphic content and the depiction of death. But she knew the video had political significance and was aware that their decision would contribute to its relevance.
Mora-Blanco and her colleagues ultimately agreed to keep the video up. It was fueling important conversations about free speech and human rights on a global scale and was quickly turning into a viral symbol of the movement. It had tremendous political power. They had tremendous political power. And the clip was already available elsewhere, driving massive traffic to competing platforms.
The Policy team worked quickly with the legal department to relax its gratuitous violence policy, creating a newsworthiness exemption on the fly. An engineer swiftly designed a button warning that the content contained graphic violence (a content violation under normal circumstances), and her team made the video available behind it, where it still sits today. Hundreds of thousands of people, in Iran and around the world, could witness the brutal death of a pro-democracy protester at the hands of their government. The maneuvers that allowed the content to stand took less than a day.

Source: tumblr.com
