05 March 2011

And whack this schmuck, too

Rico says not only is this asshole asking people to kill us, he's a damned American himself! (Go figure.) But Scott Shane has the whole story in The New York Times:
From the shootings at Fort Hood, Texas, to the stabbing of a British member of Parliament, investigators have identified Anwar al-Awlaki’s stirring online calls to jihad as an important instigator of terrorism. So members of Congress appealed to YouTube last year to remove calls for violence by Mr. Awlaki, the militant American-born cleric now in hiding in Yemen, and, in an announcement reported last November, YouTube agreed.
End of story? Not at all. A quick search of YouTube today for “Anwar al-Awlaki” finds hundreds of his videos, most of them scriptural commentary or clerical advice, but dozens that include calls for jihad or attacks on the United States.
The story of YouTube and Mr. Awlaki is a revealing case study in the complexity of limiting controversial speech in the age of do-it-yourself media, as the House prepares for hearings next week on the radicalization of American Muslims.
In eloquent American English, or Arabic with English subtitles, Mr. Awlaki can be seen in videos decrying America’s “war on Islam”; warning Muslims that they should “never, ever trust a kuffar” (a non-Muslim); praising the attempt by his “student” to blow up a Detroit-bound airliner; and patiently explaining why American civilians are legitimate targets for killings. Such videos have been posted in multiple copies and viewed hundreds or thousands of times.
Since YouTube relies on viewers to flag objectionable material, and only a fraction of Mr. Awlaki’s videos violate its rules, it was never likely that his pronouncements would disappear from the site. Even if they did, scores of other sites without YouTube’s rules also host the written, audio, and video declarations of Mr. Awlaki, the man some have called the Osama bin Laden of the Internet.
“There’s no way as a practical matter to wipe this material off the face of the internet,” said John B. Morris, Jr., general counsel at the Center for Democracy and Technology, a nonprofit group in Washington. “It’s very unrealistic to believe that any action of any American company or American politician can keep this material off the web.”
Evan F. Kohlmann, a terrorism analyst with the consulting company Flashpoint Global Partners who has followed Mr. Awlaki for years, acknowledged the difficulties but said that YouTube should make a greater effort to curtail his pro-terrorism message. “YouTube has become a major alternative distribution point for jihadi propaganda, especially for homegrown militants who may not have the pedigree to gain access to the classic password-protected jihadi chat forums,” Mr. Kohlmann said, referring to militant sites that restrict access. “If you don’t have online friends who can sneak you in, and if you don’t speak Arabic, then YouTube may be the best available option.” Mr. Kohlmann said that, while it might not be easy or cheap, “there are ways of removing this material in a relatively expeditious manner.”
YouTube, the six-year-old, California-based powerhouse of Web video owned by Google, says that every minute, day and night, it receives an average of 35 hours of video from millions of contributors. That volume makes prescreening impractical, said Victoria Grand, YouTube’s head of communications and policy.
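To put that figure in perspective, here is a quick back-of-the-envelope calculation, assuming nothing beyond the 35-hours-per-minute number quoted above; the framing in terms of simultaneous reviewers is only an illustration, not anything YouTube has said.

# Back-of-the-envelope check of the upload figure quoted above.
hours_uploaded_per_real_minute = 35
video_minutes_per_real_minute = hours_uploaded_per_real_minute * 60
print(video_minutes_per_real_minute)  # 2100: watching everything in real time
                                      # would take roughly 2,100 people viewing
                                      # simultaneously, around the clock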
Instead, just as YouTube relies on its users to provide content, it relies on them to police the content. The site posts its “community guidelines”, which prohibit incitements to violence, hate speech, bomb-making instructions, and postings by a member of a designated terrorist organization. A signed-in YouTube user who objects to a video clicks on the “flag” beneath it and indicates the reasons for a complaint by clicking on a label: for instance, “nudity”, “child abuse”, “animal abuse”, or “mass advertising”.
In the case of terrorism-related material, objections could fall under the category of “violent or repulsive conduct”, which includes subcategories for “physical attack” or, in a label added last November after complaints about Mr. Awlaki, “promotes terrorism”. Militant messages could be flagged as “hateful or abusive content”, with a subcategory for “promotes hatred or violence”. YouTube reviewers then look at the flagged videos with the assistance of sophisticated software. Any video that violates the company’s guidelines is removed, Ms. Grand said. “We encourage our users to continue to bring this material to our attention,” she said. “We review flagged videos around the clock.”
The system has prevented YouTube from succumbing to the otherwise inevitable flood of pornography, which is directed to reviewers by software that scans uploaded videos for flesh tones. Computers also give priority to reviewing videos with a high “flag-to-view ratio”, a sign that many viewers find them objectionable. Software gives low priority to videos that have already been reviewed, as well as to those flagged by users with a record of, say, objecting to every Justin Bieber video.
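The prioritization described above amounts to a simple scoring heuristic. The Python sketch below is a minimal illustration under assumptions of my own: the Video structure, field names, thresholds, and weighting factor are invented for the example and are not YouTube's actual system.

# A minimal sketch of flag-to-view prioritization, assuming a hypothetical
# review queue; all names and numbers here are illustrative only.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    views: int
    flags: int                      # total user flags received
    already_reviewed: bool = False  # reviewers have looked at it before
    unreliable_flags: int = 0       # flags from users with a poor flagging record

def review_priority(video: Video) -> float:
    """Score a video for human review: a high flag-to-view ratio raises
    priority; prior review or flags from unreliable users lower it."""
    credible_flags = max(video.flags - video.unreliable_flags, 0)
    if video.views == 0 or credible_flags == 0:
        return 0.0
    ratio = credible_flags / video.views
    if video.already_reviewed:
        ratio *= 0.1  # bump previously reviewed videos to low priority
    return ratio

# Reviewers would work through the queue from highest score to lowest.
queue = [
    Video("a1", views=500, flags=40),
    Video("b2", views=100000, flags=12, already_reviewed=True),
    Video("c3", views=2000, flags=30, unreliable_flags=25),
]
for v in sorted(queue, key=review_priority, reverse=True):
    print(v.video_id, round(review_priority(v), 4))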
YouTube explained this system, but declined to say how many employees review videos, what percentage of videos are reviewed, or how many are removed, either overall or specifically relating to Mr. Awlaki. But Ms. Grand, the company official, explained the importance of context: a video that shows bullying (a banned category) might be permitted if it is intended to educate the public about the hazards of such behavior.
The variety and volume of Mr. Awlaki’s YouTube material make it more difficult than might be supposed to decide its fate. Should his sermon on what makes a good marriage come down? His account of the final moments of the Prophet Muhammad? His counsel on the proper diet for a good Muslim? Such material does not violate any YouTube standard. But there is evidence that those inspired by Mr. Awlaki to plot violence usually were first drawn in by his engaging lectures; among them are Major Nidal Malik Hasan, the Army psychiatrist charged in the Fort Hood shootings; the young men who planned to attack Fort Dix, New Jersey; and the 21-year-old British student who told the police she stabbed a member of Parliament last May after watching a hundred hours of Awlaki videos.
Even Mr. Awlaki’s most incendiary material appears in widely varying contexts on YouTube. A long interview he gave last year justifying violence against Americans, for instance, appears in some videos with the logo of al-Qaeda’s media wing, but in others as excerpted in newscasts by CNN and al-Jazeera.
Representative Anthony Weiner, a Democrat from New York and a prominent Congressional voice in calling for YouTube to remove Mr. Awlaki’s material (he can be seen doing so on YouTube), said he recognized that the company is “wrestling with a difficult issue” and opposed any government ban, which would be likely to violate constitutional protections for free speech. Still, Mr. Weiner said, he thinks YouTube “could do a better job,” adding, “I’d give them a C, with an opportunity to improve.”
It may be that the crowdsourcing that drives YouTube, its reliance on the masses, will prove a better answer to violent messages on the site than company censors. Anti-jihad activists with names like the YouTube Smackdown Corps patrol the site constantly, flagging what they consider to be offensive material.
At a site called Jihadi Smackdown of the Day (“Countering the cyber-jihad, one video at a time”), the links for past YouTube videos of Mr. Awlaki now usually lead to a standard message: “This video has been removed as a violation of YouTube’s policy.”
Rico says people are undoubtedly loading them faster than YouTube can hunt 'em down...
