It has always been obvious that although the government has been successful in outlawing a number of printed publications, the fact that each of these can be downloaded from multiple places on the internet makes the bans more or less redundant. So it is not surprising that the Daily Telegraph is reporting today that the federal government intends to take its war against “hate literature” to the internet.
THE Federal Government is considering “screening” technology to stop terrorist groups from recruiting vulnerable young members in Australia over the internet.
Attorney-General Philip Ruddock told The Daily Telegraph yesterday the software plan – still in its infancy – was just one option in the escalating online war against terrorists.
“At the moment the internet is the biggest problem in this war and we are only going after people we can get our hands on, but that is changing,” Mr Ruddock said.
“We are looking at ways and means of using technology that detects hate publications and removes them.
“To do it effectively we will need the help of law enforcement agencies in the US and Europe.”
So how could they achieve this? The government could create a national filter, as China has done, and require all web traffic to pass through it. Sites previously identified as ‘extremist’ would be blocked, as would sites containing terms or phrases identified as markers of ‘hate speech’. A probabilistic classifier, such as the naive Bayes classifiers used in spam filtering, might work, provided the government has a suitable corpus of ‘hate publications’ to train it on, and provided people are willing to accept being blocked from a broad range of legitimate material that is misclassified because of common word occurrences.
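To make that misclassification problem concrete, here is a minimal sketch of the naive Bayes approach in Python. The toy corpus, the labels ‘banned’ and ‘legit’, and the test sentence are all invented for illustration; a real system would be trained on the government’s corpus of ‘hate publications’, but would fail in exactly the same way:

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label) pairs -> per-label word counts and doc counts."""
    counts = {"banned": Counter(), "legit": Counter()}
    totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def p_banned(text, counts, totals):
    """Posterior probability that `text` is 'banned', with Laplace smoothing
    so that unseen words do not zero out the whole product."""
    vocab = len(set(counts["banned"]) | set(counts["legit"]))
    n_banned = sum(counts["banned"].values())
    n_legit = sum(counts["legit"].values())
    log_odds = math.log(totals["banned"] / totals["legit"])  # class prior
    for word in text.lower().split():
        p_w_banned = (counts["banned"][word] + 1) / (n_banned + vocab)
        p_w_legit = (counts["legit"][word] + 1) / (n_legit + vocab)
        log_odds += math.log(p_w_banned / p_w_legit)
    return 1 / (1 + math.exp(-log_odds))  # log-odds back to a probability

# Toy training corpus -- entirely made up for illustration.
corpus = [
    ("attack the unbelievers", "banned"),
    ("join our holy struggle", "banned"),
    ("academic study of extremist movements", "legit"),
    ("news report on terrorism arrests", "legit"),
]
counts, totals = train(corpus)

# A page that quotes extremist language in order to refute it shares its
# vocabulary with the 'banned' class, so it scores on the banned side
# anyway -- the misclassification problem described above.
print(p_banned("a critical study of the holy struggle", counts, totals))
```

The classifier only sees word frequencies, not intent, which is why scholarship, journalism and rebuttals that quote the banned material end up on the wrong side of the threshold.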
However, despite all this, it remains technically impossible to prevent people from accessing whatever they want on the internet. For example, freely available technology such as the popular Onion Router (Tor) allows people to pass through government and ISP filtering proxies with ease. The government may be able to have sites taken down, though I suspect that would be hard in the United States, where free speech is taken quite seriously, but other sites would simply reappear with the same content. At worst, determined wannabe jihadis would face the minor inconvenience of installing Tor, the Freenet client, or whatever tool appears next in response to the new filtering technologies.
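To see just how little effort circumvention takes, here is a minimal sketch assuming Tor is installed and running with its default local SOCKS proxy on port 9050. It uses Python’s requests library (with the PySocks extra, installed via `pip install requests[socks]`), and the URL is a placeholder:

```python
import requests

# Point any SOCKS-aware client at Tor's default local proxy and the
# traffic exits the Tor network somewhere else entirely, outside the
# reach of a national filter.
proxies = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}
resp = requests.get("https://example.com/", proxies=proxies)
print(resp.status_code)
```

The `socks5h` scheme matters here: it has DNS lookups performed inside the Tor circuit as well, so even a filter that works by poisoning or blocking DNS is bypassed.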
As Boing Boing’s Cory Doctorow put it in a Guardian op-ed earlier this month:
But every filtering enterprise to date is a failure and a disaster, and it’s my belief that every filtering effort we will ever field will be no less a failure and a disaster. These systems are failures because they continue to allow the bad stuff through. They’re disasters because they block mountains of good stuff. Their proponents acknowledge both these facts, but treat them as secondary to the importance of trying to do something, or being seen to be trying to do something. Secondary to the theatrical and PR value of pretending to be solving the problem.
Rather than waste time and money on pointless technical ‘solutions’ to an insoluble ‘problem’, the government should just accept that this sort of literature exists and that some people will read it. Some will read it because they believe in it, and others will read it because they want to refute it. Forcing it underground or out of public view won’t make it go away, but it will make it much harder to debunk. In the meantime, a lot of legitimate information will be misclassified as ‘hateful’ and ‘extremist’ and, to use the word in the article, “screened”.