Geeks and Internet industry types like to say that Andy Burnham, our Minister for Culture, doesn’t get the Internet. They’re wrong. He gets the Internet all right. He just doesn’t like it. He doesn’t like its pretensions to autonomy and ungovernability and in particular he doesn’t like its inability to protect kids from stuff they shouldn’t be exposed to.
How should the net respond to Burnham’s increasingly pointed attacks? Should we gleefully point out how ‘clueless’ he is? Should we celebrate his irrelevance or the inevitability of his eventual enlightenment by the unstoppable, unarguable net?
No. We should listen to him and recognise that he speaks for millions of people – parents in particular – for whom the net is a frightening thing: a place where it’s difficult to control your exposure to content and experience. A place that contrasts badly with the parts of the media where you can exert control (selecting a movie to watch from an age-rated list, for instance). We should acknowledge that these concerns are real and can be addressed.
And why not? Control of our experience of content is vital – you might almost call it a right. Can we reasonably promote increased access to the earth’s ultimate information resource when we can’t offer users anything better than crude control over the experience? Should we really say: “Hey, here’s all of human knowledge, culture and experience. Some of it will freak you out but there’s nothing we can do about that. Get used to it”?
As a parent, I should be able to send my kids (ten, nine and five) onto the net in the reasonable expectation that they won’t be frightened or exploited or upset. It really is not enough to say that the only way to guarantee that is to sit at their shoulders as they go online, ready to jump in and curtail the experience if I think it’s going wrong (especially once they’re online from multiple connections at home, from an iPhone on the school bus or in the school’s ICT suite).
It’s possible to dismiss Burnham’s concerns as those of a nervous n00b or an instinctive authoritarian. We could just say “get over yourself: plunge in, you’ll love it! The good stuff always exceeds the bad and most of the time you’ll never see anything that upsets you” (which is roughly what I say to novices) but that’s not enough. It fails to acknowledge the reality of a wide-open net governed not by historic scarcity but by rip-roaring plenty. The net’s structure – the structure we love and celebrate: distributed, flat, open and permissive – virtually guarantees that it will contain content that will upset many users.
The idea that we should just grin and bear it (or, for instance, require parents to ride shotgun at all times) is ridiculous. It’s especially ridiculous when you consider the sheer amount of time and energy we net professionals put into filtering and sorting and discriminating in our own net lives. We love the range and accessibility of the net but hate the unordered and unproductive soup of content that makes it hard to get things done and prioritise our lives. For a decade now, a significant proportion of start-ups have been in exactly this filtering business: providing tools to control the unmediated rush of content.
In fact many of us are excitedly contributing to a revision of the net’s early indiscriminate structure called the semantic web. We engage in (I’m going to give it a name) filter-seeking behaviour and we actively create filters every time we tag a blog post or a photo.
What we should do in response to Burnham’s reflex rejection of the net’s openness and permissiveness is get on and provide the filters people need. The net’s made of computers, after all. If we can build filters as powerful and useful as the DNS, Facebook, Google, del.icio.us, Twitter or RSS feeds (they’re all filters of one kind or another), why can’t we shield kids from scary or upsetting content flexibly, adaptively and automatically? If we can constantly improve the relevance and usefulness of search results, why can’t we filter out nastiness and offence for our kids in an intelligent way?
If we as an industry can’t hook together metadata, algorithms, user experience and human editorial effort to provide genuinely useful filters for use by parents, schools and even consenting adults, we won’t long be able to resist the arguments of Burnham and others for restrictions on the supply-side: the content itself. We need to recognise the legitimacy of human filter-seeking behaviour and acknowledge that the continued existence of the wide-open net depends in large part on our ability to filter its experience for vulnerable users.
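To make that less abstract, here’s a minimal sketch in Python of the kind of layered filter I have in mind – human editorial lists, publisher-declared metadata and a crude algorithmic signal, combined against a per-child policy. Every domain, label and threshold here is hypothetical; it’s an illustration of the shape of the thing, not a working product.

```python
# Illustrative sketch only: every domain, label and threshold is made up.

EDITORIAL_BLOCKLIST = {"example-nasty.invalid"}   # human editorial effort
EDITORIAL_ALLOWLIST = {"example-kids.invalid"}    # known-good for children

def keyword_score(text, watch_words):
    """Crude algorithmic signal: fraction of words on a watch list."""
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(w in watch_words for w in words) / len(words)

def allow(domain, page_metadata, text, child_age):
    """Combine editorial lists, publisher metadata and the keyword score."""
    if domain in EDITORIAL_ALLOWLIST:
        return True
    if domain in EDITORIAL_BLOCKLIST:
        return False
    # Publisher-declared metadata, e.g. a self-applied minimum-age label.
    min_age = page_metadata.get("min-age")
    if min_age is not None and child_age < int(min_age):
        return False
    # Fall back to the crude algorithmic signal for unlabelled pages.
    return keyword_score(text, {"gore", "torture"}) < 0.05

# e.g. allow("example.org", {"min-age": "18"}, "harmless page text", 9) -> False
```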
I think you’re being a bit kind to Burnham. He isn’t just wrong about this, he’s wrong about everything – ID cards, copyright, libel, etc. His natural New Labour centralising model of command and control just doesn’t work where Internet content is concerned. But what’s wrong with the alternative? Why shouldn’t parents (e.g. me) be responsible? I have parental controls enabled on the machines in my house and my kids are not allowed to use their PCs in their bedrooms; these seem like reasonable ways to deal with the problem. If someone has a website with child porn on it, then prosecute them – don’t persecute everyone else.
I feel quite strongly about this, because I can see a real slippery slope here and before you know it we’ll have to apply for some sort of licence in order to have our blogs!
I’m probably less of a knee-jerk libertarian than the rest of my geek peer group. I agree with you that Burnham may have expressed an important thought rather poorly.
Your response seems most sane; the problem isn’t that “all censorship is bad,” but rather that “attempts at censorship are becoming futile.” What we need to do, I think, is agree a new set of behaviours, protocols and tools.
One of my favourite historical anecdotes has it that when James I’s courtiers wanted to punish him for some political transgression, they did so by refusing to censor his incoming mail. As king he received a large amount of what we would now call spam.
This tells us something about censorship, I think; that it is – as you say – just another kind of filter. We already filter our email. We allow news editors (or the Digg crowd, or Google) to filter our news. Like millions, I let music labels filter my musical taste.
Censorship is an emotive term. Almost no-one would stand up in support of such a thing (although it’s worth bearing in mind that censorship has been the norm for far longer than “free speech”, and still exists in all nations where “free speech” rights exist).
But everyone would support “editorial control” or “filters,” I think.
The discussion is a good one, and one that needs to be had. To my mind, though, there are two important assumptions being made here (wrongly, imho):
1. The assumption that the internet makes “nastiness” available where it wasn’t before. I don’t remember childhood as being particularly “nice” – simply because the world isn’t a nice place. Death and violence aren’t just in the domain of films, or adult websites. They happen in the playground, on the news, at our feet. (One of the most shocking moments I can think of from my youth was, on my way to school, coming across a dead cat that had been run over.)
Of course, that’s not to say we should throw all the nasty stuff at our kids 🙂 What I mean is that being exposed to the nasty stuff is part of growing up. Our job is not simply to filter stuff out so that kids don’t see it; it’s to prepare them for it and to help them deal with it when it happens. This “progression”, I think, is *more* important than the simple filtering debate.
2. The assumption that kids are “innocent”. In my day, I seem to recall that my peers and I were actually quite good at going out and finding the “nasty” stuff by ourselves if we weren’t otherwise shown how. I’m pretty sure most of my friends had seen Robocop way before the 18 mark, for instance. By the time we were 8, we were vying for social status, symbols of “maturity” and to be trendsetters in recognition of the power that adults held. We routed around parental control *ourselves*, because we wanted to, not because somebody told us to. Why? Because we could. Because nobody really explained to us *why* things were good or bad. We discovered our own ethics based on practicality and, above all, social status. Hacking a filter bestowed more status than anything else – which is funny considering the various attempts to open up the Great Firewall of China these days.
Have another look at this sentence:
“genuinely useful filters for use by parents, schools and even consenting adults”
Here’s my question – where are the genuinely useful filters for children, for teenagers, even for young adults? Or, more basically, *what* are they? Kids have wants that aren’t illegal or immoral. Maybe if we showed them how to capitalise on those genuine wants, instead of trying to shove more restrictions on to them, we’d respect them more and they us.
Is this only about “kids”? I think it’s about me, as well.
Have I become desensitized to all sorts of content from years of looking at things like rotten.org?
I’m sure others would say that I “can choose not to look.” This is (in my opinion) a weak argument; placing all responsibility on the reader and none on the publisher runs counter to all our real-world experience.
Addressing the age issue:
Putting an 18-cert on a video/DVD places a duty on the person who sells or rents you that object.
Putting an 18-cert on a website is (as we all know) meaningless.
We wouldn’t condone (I think) selling cigarettes to a minor.
Why do we all pretend that information is somehow LESS harmful than (say) cigarettes?
@Dave Birch I’m happy to do the filtering myself but I’m going to need some decent software. Where are the good filtering tools, by the way? As far as I can tell there are three or four monster Windows apps that will cleverly exclude references to ‘Scunthorpe’ and nod through the neo-Nazis. The industry needs to come up with half a dozen viable filtering schemes for me to choose from. I guess the economics is difficult… OpenDNS is cool, though, and could form the basis of a good filtering scheme.
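(For anyone who hasn’t met the ‘Scunthorpe problem’: naive substring filters block the town’s name because of the letters buried inside it. A quick sketch in Python, with a deliberately mild banned list, shows both the false positive and the obvious word-boundary fix – which only moves the problem rather than solving it.)

```python
import re

# Deliberately mild illustration of the 'Scunthorpe problem': a naive
# substring check flags innocent words because they happen to contain
# a banned string.
BANNED = ["ass"]

def naive_block(text):
    """Substring matching: cheap, and wrong in exactly this way."""
    return any(b in text.lower() for b in BANNED)

def word_boundary_block(text):
    # Matching whole words only removes this class of false positive,
    # though it does nothing about genuinely nasty pages that simply
    # avoid the listed words; keyword filters alone will always fail.
    return any(re.search(rf"\b{re.escape(b)}\b", text.lower()) for b in BANNED)

print(naive_block("a class on assassinations"))          # True: false positive
print(word_boundary_block("a class on assassinations"))  # False: fixed
```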
@Mat Morrison Censorship will probably fail because it’s based on the old central model but the politicians will keep trying if we don’t provide some usable tools to put users in control of the stream of content they’re exposed to. Logically these tools will live out at the edge somewhere (like the fascinating OpenDNS). What’s needed is a business model (or a public value justification) for operating these filters. With really slick, adaptive, accountable filters we’ll never need censorship…
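By way of a sketch of what ‘living out at the edge’ might mean – assuming a hypothetical category feed and household policy, and not claiming this is how OpenDNS actually works – a category-aware resolver could be as simple as this:

```python
# Toy model of DNS-level filtering at the edge. The category feed,
# policy and addresses are all hypothetical; a real service maintains
# the categories centrally and answers blocked queries with the
# address of a block page.
import socket

DOMAIN_CATEGORIES = {
    "example-gore.invalid": "violence",
    "example-casino.invalid": "gambling",
}

HOUSEHOLD_POLICY = {"violence", "gambling"}  # categories this family blocks
BLOCK_PAGE_IP = "192.0.2.1"                  # RFC 5737 documentation address

def resolve(domain, real_lookup=socket.gethostbyname):
    """Answer a lookup, redirecting blocked categories to the block page."""
    if DOMAIN_CATEGORIES.get(domain) in HOUSEHOLD_POLICY:
        return BLOCK_PAGE_IP
    return real_lookup(domain)

print(resolve("example-gore.invalid"))  # "192.0.2.1": blocked at the edge
```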
@Graham I don’t want to be glib, but everything you describe surely still applies. For every instance of ‘filter-seeking behaviour’ there’ll be plenty of ‘filter-avoiding behaviour’ from the teens. The important thing is that the filters at least exist and that an effort has been made to provide a business case. I just want it to be possible for me to protect my kids from being frightened or exploited while I still can!
Steve, I’ve been using NetNanny on the kids’ PC for years – it works perfectly for me. The kids can ask for overrides when it blocks something innocent, and I can tinker with the categories that get blocked. It also emails me with every ‘block’ it makes, so that I can keep a vague eye on whether the kids are actively trying to see things they shouldn’t – i.e. it allows me to fulfil my proper parental responsibility and talk to them about it. The only downside is that it runs on PC only, so the poor old kids get a PC to use. But they don’t seem to mind…
I’d rather not have the govt set up my filters for me, thanks very much.