10 comments

  1. I think you’re being a bit kind to Burnham. He isn’t just wrong about this, he’s wrong about everything – ID cards, copyright, libel etc. His natural New Labour centralising model of command and control just doesn’t work so far as Internet content is concerned. But what’s wrong with the alternative? Why shouldn’t parents (eg, me) be responsible? I have parental controls turned on for the machines in my house and my kids are not allowed to use their PCs in their bedrooms, and these seem like reasonable ways to deal with the problem. If someone has a web site with child porn on it, then prosecute them, don’t persecute everyone else.

    I feel quite strongly about this, because I can see a real slippery slope here and before you know it we’ll have to apply for some sort of licence in order to have our blogs!

  2. I’m probably less of a knee-jerk libertarian than the rest of my geek peer group. I agree with you that Burnham may have expressed an important thought rather poorly.

    Your response seems most sane; the problem isn’t that “all censorship is bad,” but rather that “attempts at censorship are becoming futile.” What we need to do, I think, is agree a new set of behaviours, protocols and tools.

    One of my favourite historical anecdotes has it that when James I’s courtiers wanted to punish him for some political transgression, they did so by refusing to censor his incoming mail. As king he received a large amount of what we would now call spam.

    This tells us something about censorship, I think; that it is – as you say – just another kind of filter. We already filter our email. We allow news editors (or the Digg crowd, or Google) to filter our news. Like millions, I let music labels filter my musical taste.

    Censorship is an emotive term. Almost no-one would stand up in support of such a thing (although it’s worth bearing in mind that censorship has been the norm for far longer than “free speech”, and still exists in all nations where “free speech” rights exist.)

    But everyone would support “editorial control” or “filters,” I think.

  3. The discussion is a good one, and one that needs to be had. To my mind, though, there are two important assumptions being made here (wrongly, imho):

    1. The assumption that the internet makes “nastiness” available where it wasn’t before. I don’t remember childhood as being particularly “nice” – simply because the world isn’t a nice place. Death and violence aren’t just in the domain of films, or adult websites. They happen in the playground, on the news, at our feet. (One of the most shocking moments I can think of from my youth was, on my way to school, coming across a dead cat that had been run over.)

    Of course, that’s not to say we should throw all the nasty stuff at our kids 🙂 What I mean is that being exposed to the nasty stuff is part of growing up. Our job is not necessarily simply to filter stuff out so that kids don’t see it. Our job is to prepare them for it and deal with it when it happens. This “progression”, I think, is *more* important than the simple filtering debate.

    2. The assumption that kids are “innocent”. In my day, I seem to recall that my peers and I were actually quite good at going out and finding the “nasty” stuff by ourselves if we weren’t otherwise shown how. I’m pretty sure most of my friends had seen Robocop way before the 18 mark, for instance. By the time we were 8, we were vying for social status, symbols of “maturity” and to be trendsetters in recognition of the power that adults held. We routed around parental control *ourselves*, because we wanted to, not because somebody told us to. Why? Because we could. Because nobody really explained to us *why* things were good or bad. We discovered our own ethics based on practicality and, above all, social status. Hacking a filter bestowed more status than anything else – which is funny considering the various attempts to open up the Great Firewall of China these days.

    Have another look at this sentence:

    “genuinely useful filters for use by parents, schools and even consenting adults”

    Here’s my question – where are the genuinely useful filters for children, for teenagers, even for young adults? Or, more basically, *what* are they? Kids have wants that aren’t illegal or immoral. Maybe if we showed them how to capitalise on those genuine wants, instead of trying to shove more restrictions on to them, we’d respect them more and they us.

  4. Is this only about “kids”? I think it’s about me, as well.

    Have I become desensitized to all sorts of content from years of looking at things like rotten.org?

    I’m sure others would say that I “can choose not to look.” This is (in my opinion) a weak argument; placing all responsibility on the reader and none on the publisher runs counter to all our real-world experience.

    Addressing the age issue:

    Putting an 18-cert on a video/DVD places a duty on the person who sells or rents you that object.

    Putting an 18-cert on a website is (as we all know) meaningless.

    We wouldn’t condone (I think) selling cigarettes to a minor.

    Why do we all pretend that information is somehow LESS harmful than (say) cigarettes?

  5. @Dave Birch I’m happy to do the filtering myself but I’m going to need some decent software. Where are the good filtering tools, by the way? As far as I can tell there are three or four monster Windows apps that will cleverly exclude references to ‘Scunthorpe’ and nod through the neo-Nazis. The industry needs to come up with half a dozen viable filtering schemes for me to choose from. I guess the economics is difficult… OpenDNS is cool, though, and could form the basis of a good filtering scheme.
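
    The ‘Scunthorpe’ jibe above refers to what’s usually called the Scunthorpe problem: naive substring blocklists flag innocent text while letting genuinely nasty content through if it happens to use no blocked words. A minimal sketch (the blocklist and sample strings are invented for illustration, not taken from any real product):

    ```python
    # Toy illustration of the "Scunthorpe problem": a crude substring
    # blocklist produces both false positives and false negatives.

    BLOCKLIST = {"cunt", "sex"}  # hypothetical blocklist for this sketch

    def naive_filter(text: str) -> bool:
        """Return True if the text would be blocked by substring matching."""
        lowered = text.lower()
        return any(word in lowered for word in BLOCKLIST)

    # False positive: an innocent place name contains a blocked substring.
    print(naive_filter("Scunthorpe United won on Saturday"))  # True

    # False negative: extremist recruitment text with no blocked words passes.
    print(naive_filter("Join our glorious movement against outsiders"))  # False
    ```

    This is exactly why the ‘monster Windows apps’ fail in both directions at once: matching strings is cheap, but it is a poor proxy for meaning.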

  6. @Mat Morrison Censorship will probably fail because it’s based on the old central model but the politicians will keep trying if we don’t provide some usable tools to put users in control of the stream of content they’re exposed to. Logically these tools will live out at the edge somewhere (like the fascinating OpenDNS). What’s needed is a business model (or a public value justification) for operating these filters. With really slick, adaptive, accountable filters we’ll never need censorship…
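
    To make the ‘edge filtering’ idea concrete: an OpenDNS-style service works roughly by refusing to resolve domains whose category a household has opted to block, so the control point sits in name resolution rather than in a central censor. A minimal sketch, with an entirely invented category database (real services maintain their own classifications, which I’m not reproducing here):

    ```python
    # Sketch of DNS-edge filtering: resolve a hostname only if its
    # registered domain isn't in a category this user has blocked.

    BLOCKED_CATEGORIES = {"adult", "violence"}  # user-chosen policy

    # Hypothetical category database keyed by registered domain.
    DOMAIN_CATEGORIES = {
        "example-adult-site.com": "adult",
        "news.example.org": "news",
    }

    def resolve_allowed(hostname: str) -> bool:
        """Return True if this household's resolver would answer the query."""
        # Naive reduction to a registered domain: keep the last two labels.
        domain = ".".join(hostname.split(".")[-2:])
        category = DOMAIN_CATEGORIES.get(domain)
        return category not in BLOCKED_CATEGORIES

    print(resolve_allowed("www.example-adult-site.com"))  # False
    print(resolve_allowed("news.example.org"))            # True
    ```

    The point of the sketch is the architecture, not the lookup: the policy lives with the user at the edge, while the category data is a service someone has to be paid (or publicly funded) to maintain – which is the business-model gap described above.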

  7. @Graham I don’t want to be glib but there’s no reason to believe that everything you describe doesn’t still apply. For every instance of ‘filter-seeking behaviour’ there’ll be plenty of ‘filter-avoiding behaviour’ from the teens. The important thing is that the filters at least exist and that an effort has been made to provide a business case. I just want it to be possible for me to protect my kids from being frightened or exploited while I can!

  8. Steve, I’ve been using NetNanny on the kids’ PC for years – works perfectly for me. The kids can ask for overrides when it blocks something innocent, and I can tinker with the categories that get blocked. Also, it emails me with every ‘block’ it makes, so that I can keep a vague eye on whether the kids are actively trying to see things they shouldn’t – i.e. it allows me to fulfil my proper parental responsibility and talk to them about it. The only downside is that it runs on PC only, so the poor old kids get a PC to use. But they don’t seem to mind …
    I’d rather not have the govt set up my filters for me, thanks very much.
