This post was last edited by kaleege on 2013-9-11 13:48
The study making the biggest splash in the social sciences lately is Harvard professor Gary King's research on Chinese internet censorship. He set up a website in mainland China and used it to post all kinds of messages, then used statistical methods to study what kinds of content get censored. The rough conclusion: the standard for censorship is not criticizing the government, but anything that could mobilize people to take to the streets.
http://blogs.wsj.com/chinarealtime/2013/08/30/an-inside-look-at-chinas-censorship-tools/?mod=WSJBlog
An Inside Look at China’s Censorship Tools
If you can’t beat China’s censors, why not join them?
That’s what a Harvard University professor decided to do, in a creative effort to learn firsthand just how censorship in China works.
To get inside the system, professor Gary King and two Ph.D. students started their own fake social network over the past year, which—while it never formally went online—allowed them to reach out to some of China’s many companies offering censorship software. Their results, published this week, show the wide array of tools that social media companies like Sina Corp. and Tencent Holdings Ltd. can harness to control information as required by authorities.
Thanks to their software acquisition—purchased from a company that Mr. King declined to name—the Harvard team found a diverse array of tools at their disposal, which allowed them to screen and delete posts according to different keywords and categories, as well as block posts based on user, length of post or time of day.
The team’s research also helps shed light on a persistent question in China—namely, just how many censors are employed in the country. In Mr. King’s experience, the company recommended that his team hire two to three full-time censors for every 50,000 users. If that same formula were applied at Sina Weibo, China’s most popular microblogging platform, the company would employ somewhere between 2,160 and 3,240 censors to cover its 54 million daily active users.
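The staffing estimate above is simple proportional arithmetic; a minimal sketch, using only the figures quoted in the article (2–3 censors per 50,000 users, 54 million daily active users):

```python
# Back-of-the-envelope check of the censor-staffing ratio reported above.
# All numbers come from the article; the function name is just illustrative.

def censors_needed(users, low=2, high=3, per=50_000):
    """Return the (low, high) range of censors implied by the ratio."""
    blocks = users / per          # how many 50,000-user blocks
    return int(blocks * low), int(blocks * high)

low, high = censors_needed(54_000_000)
print(low, high)  # 2160 3240, matching the range quoted in the article
```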
In an interview, Mr. King said his exposure to the broad range of censorship software on the market suggests that China’s government is “allowing competition and innovation in censorship technology.”
“They have top-down control of outcomes, but not of the particular process. It’s not a bad strategy,” he said.
Mr. King’s findings closely mirror those of another study this year, which found that Sina’s censors are able to scrub sensitive content within just minutes of most postings. Still, he said, his research also illuminated the limitations of such technology.
Specifically, while software provided to Mr. King’s group allowed them to flag posts with sensitive keywords for further inspection before permitting them to go live, even many pro-government posts they tested would still get caught in the dragnet. For example, he said, a post praising the government’s “anti-corruption” policies might be flagged because it contained the word “corruption.”
Despite such flaws, based on conversations with the software provider, Mr. King said such a keyword system remains popular, in part because it works as a buffer and helps give censors time to catch up on the flood of potentially sensitive posts hitting China’s social media sites at any given time.
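The behavior described in the two paragraphs above—substring keyword matching that buffers matching posts for human review, and therefore snags pro-government posts too—can be sketched in a few lines. The keyword list, function name, and sample posts here are hypothetical illustrations, not the actual vendor's configuration:

```python
# Minimal sketch of keyword-based pre-publication flagging: any post whose
# text contains a sensitive keyword is held in a review queue for a human
# censor instead of going live immediately. Keywords are hypothetical.

SENSITIVE_KEYWORDS = {"corruption", "protest"}

def review_queue(post: str) -> str:
    """Return 'hold' if the post matches any keyword, else 'publish'."""
    text = post.lower()
    if any(kw in text for kw in SENSITIVE_KEYWORDS):
        return "hold"      # buffered for manual inspection before going live
    return "publish"

# A pro-government post is still caught, as the article notes, because
# "anti-corruption" contains the substring "corruption":
print(review_queue("We praise the government's anti-corruption drive"))  # hold
print(review_queue("Nice weather in Beijing today"))                     # publish
```

The false positive is the point: crude substring matching over-flags, but as the article explains, that over-flagging doubles as a buffer that slows the flood of posts down to a rate human censors can keep up with.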
Mr. King said this week that the study’s findings also reinforce his previous work on Chinese censorship, which found that posts with the potential to stir collective action—for example, those related to protests or Tibetan self-immolations—tend to be most heavily targeted, not the posts that are simply critical of the Chinese government.
Still, Mr. King is careful to note that a number of the functions provided by the so-called “censorship” software his team tried aren’t unique to China. For example, in places such as the U.S., he says, many websites also use software that requires all comments to go through a moderator before appearing online. “We think of it as avoiding spam, but you could use the software for censoring,” he said. “It’s just that in China the software is more sophisticated.”
–Paul Mozur