Tag Archives: history

What’s wrong with atheists?

I’m an atheist. Just getting that out of the way. Because this is about a problem that I have with atheists. Not all atheists. Just the strident ones, the humourless ones who form and join clubs, who campaign and complain and object. The ones who picket shopping malls when they provide prayer rooms but not ‘rational contemplation rooms’. Those ones.

The source of my problem is simple enough. Atheists are wrong. To be clear: they’re not especially wrong. They’re just roughly as wrong as everyone else. And, like everyone else, from far enough away they’re almost completely wrong. I can say this with certainty. We’ve got plenty of evidence. Thousands of years of it. Neolithic astronomers could line up the stones for the equinox but were wrong about everything else. Copernicus knew the planets orbited the sun but, we can see, got practically everything else wrong. The Papal inquisition was wrong. But so was Galileo. Newton was wrong. Darwin was wrong. Even the mighty Darwin. The splendid edifice of his scholarship is intact and still uniquely influential but, across the decades, large parts of it have been revised, replaced, dropped – as they should. The flat-earthers and the ether/phlogiston merchants – they were all wrong. But then, later on, so was Einstein. Being wrong is more-or-less universal (everyone’s wrong) and more-or-less eternal (all the time). And the more time passes, the more wrong we all are.

To make it more obvious, go back a bit further. Go back ten thousand years, in fact. To the time of the first big settlements and the beginning of farming and the origin of written language and inquiry into the world. What did we know then that isn’t now known to be wrong? Clue: almost nothing. See what I mean?

Now wind forward ten thousand years from the present day: from out there, from as far into the future as we’ve come since the last ice age, almost everything we take for granted now is going to be wrong. Horribly, fundamentally wrong. Wrong in ways that will ripple through human knowledge and force us to revise even our most basic assumptions about the world. Wrong in ways that will make our future selves laugh as they look back and wonder how any of us – believers or non-believers – managed to dress ourselves in the morning.

But, you’ll protest, it’s not about being right or wrong, it’s about the method. Rational inquiry – the scientific method – actually depends on being regularly, consistently wrong. And, of course, you’ll be right. The big difference between the scientific method and the invisible fairies crowd is the tolerance for being wrong, the constant readiness to check your thought against reality and revise it. The religious folk have a fixed worldview. In fact, their worldview depends on nothing changing: on invariant laws handed down by Gods. Case closed, surely?

But no. Not at all. Rewind again (go the whole ten thousand if you want). Examine the thought of an earlier era – the myths and laws and creation stories of that time. See where I’m going with this? Are they really invariant? Are they even, in fact, recognisable? Do the beliefs that animated the irrational folk of earlier eras still apply? No, they don’t. They’ve been overturned, thrown out and replaced – dozens, hundreds, thousands of times. Objects of worship, origin stories, social and ritual elements: are any the same now as they were in earlier periods? Hardly any. It turns out that just because religious people say their beliefs are eternal and unvarying, it doesn’t mean they actually are. They shift and change constantly. The Vatican, which persecuted and executed astronomers, now operates an important observatory. Muslims, Jews, Buddhists – they change their minds all the time, constantly (when looked at from the right distance) revising and updating their beliefs, quietly dropping the stuff that’s incompatible with current models.

So, rational folk (like me) are as wrong as everyone else and – more than that – have no monopoly on a readiness to update their thought as they acquire new knowledge. And this is what upsets me about the assertive hard-core of atheists/secularists/rationalists – the ones who put ‘atheist’ in their Twitter bios, do stand-up comedy about the silly believers, sue the council for putting on carol concerts and all the rest. Being slightly less wrong than the God botherers doesn’t make you right. We should have the humility to recognise that – over the long run – we’re all gloriously, irredeemably wrong.

Update 30/04: James O’Malley has posted an interesting response to this post called, naturally, ‘What’s Right with Atheists’!

Tim Berners-Lee’s most important decision

Of the dozens of design decisions that TBL made during 1989, all of which continue to shape the way we build and use the web twenty-five years later, the most important was not requiring permission to link. Seems obvious now – a non-feature in fact – but it’s the reason you’re not reading this on Xanadu (or Prestel or on an X500 terminal or something). The logic of the times – embedded in those other systems – was that documents and data sources had owners and that you couldn’t just link to them without some kind of formal permission. Permission was defined as a system-to-system, technical handshake kind of thing or a person-to-person interaction, like a phone call, or, God forbid, a contract and some kind of payment. TBL’s judgement was that the community he was building his system for – the academics and engineers – wouldn’t want that and that the spontaneity of the hyperlink would trump the formality of permission. And, of course, he was right. It’s the spontaneously-created hyperlink that triggered the marvellous, unstoppable promiscuity of the World Wide Web. It explains the web’s insane rate of growth and the profusion of web services. It’s the root of all this.
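The contrast between the two linking models can be sketched in code. This is purely illustrative – the class and method names are invented, and the ‘permissioned’ model is a loose caricature of systems like Xanadu rather than a faithful reconstruction of any real protocol:

```python
# A toy contrast between the two linking models described above.
# All names here are invented for illustration.

class PermissionedWeb:
    """Xanadu-style: a link is only valid once the target's owner grants it."""

    def __init__(self):
        self.grants = set()  # (source, target) pairs the owner has approved

    def request_link(self, source, target):
        # Stands in for the out-of-band negotiation: a technical
        # handshake, a phone call, a contract and payment...
        self.grants.add((source, target))

    def link(self, source, target):
        if (source, target) not in self.grants:
            raise PermissionError("no permission to link to " + target)
        return f'<a href="{target}">'


class OpenWeb:
    """TBL's web: anyone can link to anything, no handshake required."""

    def link(self, source, target):
        return f'<a href="{target}">'


web = OpenWeb()
print(web.link("myblog.example", "cern.ch"))  # -> <a href="cern.ch">
```

The asymmetry is the point: in the open model a link is created unilaterally by the linker, which is what makes it spontaneous and cheap, and what let the web grow without anyone’s coordination.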

Seven things it’s worth remembering about Wikileaks

Before its inglorious founder takes it down with him or before it’s chased off the Internet by enraged governments, it’s worth remembering what Wikileaks was before it became a cause célèbre:

  1. It used to be a wiki. It stopped being a wiki in 2010.
  2. It was an anonymous drop-box. Whistleblowers could deposit documents without fear of being identified. This was the radical core of Wikileaks. They say that submissions are still accepted but the drop-box was switched off in 2010 too.
  3. It was about using the Internet’s open, peer-to-peer, symmetric-in-all-directions architecture to return power to ordinary people inside dumb corporations and repressive regimes. The kind of thing we always said the Internet was for. But it was also anarchic and unaccountable. It made free speech advocates and netheads queasy.
  4. There was something glamorous and edgy about it. It was morally complicated, like a le Carré plot. All those secrets and their forced disclosure, the chaos and panic that their untimely release caused, the attacks from government black-hats, the comicbook torrent of documents fired in its direction. And whatever you think of Assange – hero, demagogue, victim, criminal – he’ll be an important figure when the histories of the first decades of the Internet era are written.
  5. It became home to documents removed from the public record by courts or governments. It claimed a status above national law and essentially demolished the super-injunction and the cosy media blackout. This was bound to make it of interest to lawyers and governments right from the start.
  6. It was run by a maverick and his mates, so governance and accountability looked weak. Wikileaks contained the seeds of the Assange meltdown from the beginning. We could have anticipated all this (maybe not the Ecuadorian embassy balcony bit).
  7. It was a trial-run for a full-on infowar, for authority’s fight-back against the unruly net. Payment processors, service providers, media partners and sponsors all came under huge pressure and mostly buckled. The net’s apparently ungovernable, distributed, supra-national structure turned out to provide hardly any protection at all. We learn that a determined state supported by compliant corporations can damage or destroy an outlaw entity like Wikileaks. That’s an important lesson for you cyberpunks. You’re gonna need a bigger boat.

(update, 22 August, I collapsed the eight things into seven.)

NTK: “exactly the same thing, 15 years late”

Fifteen years ago, when it was all fields round here, Danny O’Brien and Dave Green – who were well-known in underground gaming/comedy/tech/confectionery circles – began to publish an email newsletter for and about the British tech community. It was a joy from the beginning – authentic, funny, playful, insightful… Geek storytelling that is probably already consulted as a primary source for the arrival of this whole network/digital/computer thing… NTK is on the newsstands again, in a clever time-shifted form. Sign up here.

Bowblog is ten years old today

I adapted the name from Kevin Werbach’s Werblog, which seemed like a cool thing to do at the time. I’d been blogging for a few years before that (since the twentieth century) but only fragments survive. The archive suggests I used to update it a lot more too – back before Twitter. Anyway, I’ll tweet a few anniversary posts. Here’s the first one, from 21 May 2002, which includes a link to a column I used to have in The Guardian’s Online section (then under the legendary Vic Keegan’s editorship).

Igor Stravinsky, Tupac Shakur and the uncanny

The Player Piano was the Tupac Hologram of its day.

The most thrilling of our inventions are the ones that return to us a person we’ve lost or that recall a scene from the past that we couldn’t have experienced or a place we couldn’t have known. There’s a rush, a kind of zipwire effect. WOOSH. BANG. You’re there. And sometimes these experiences are so vivid they cross over into the uncanny and the hairs on the back of your neck stand up. A list of these moments would be a long one, but try this ultra-vivid portrait of the Carusos in 1920. The rush here is a compound effect of a fabulous technology, as-yet unmatched in the digital era – a large, glass negative – plus the amazing light on that New York terrace and those eyes. Blimey. Or this: the first view of the earth from the moon. Tell me you didn’t shiver (and note, also, that in order to qualify as ‘uncanny’ it doesn’t need to be a hyper-real simulation of a human).

The player piano is another piece of nineteenth century tech that’s highly productive of the uncanny. The knowledge that the sound you’re hearing, when the paper roll begins to turn, reproduces in detail the actual playing of a long-dead musician – not the acoustic effect but an actual mechanical trace, recorded as holes punched in paper – changes the effect startlingly. The fact that sometimes that musician was the composer – Gershwin or Rachmaninov or Stravinsky – makes it more uncanny still. I was lucky enough to be standing next to one of these player pianos – a kind of half-mechanical, half-human steampunk cyborg – ten days ago in a Broadcasting House studio, as its owner Rex Lawson brought to life one of Stravinsky’s amazing piano rolls. It was a remarkable experience: Stravinsky was very much in the room. Here’s a video I made of that strange encounter of machine, memory and music:

And a few days later, Tupac ‘appeared’ at Coachella, turning the uncanny dial up a few notches but instantly reminding me of that Stravinsky experience. I wish I’d been there. Everyone who was says that it was amazing – and some were so freaked out by Tupac’s ‘appearance’ they declared that they disapproved, that it was somehow disrespectful. And the Tupac hologram, which was a synthesis of historic appearances and some mindblowing 3D simulation, is from the same family of technologies as the player piano. Spooky.

You know, actual curation

Patrick Keiller's 'Robinson Institute' at Tate Britain

Everyone’s going on about curation these days. We’re all curators now. But yesterday I witnessed some of the old-fashioned variety, the kind they do in art galleries, and I was blown away.

I took two of my kids to Tate Britain (four different modes of transport: train, tube, boat and bus – I suspect that’s what they’ll remember about the day). First I dragged them round Patrick Keiller’s ‘Robinson Institute’ which, in truth, was my main reason for schlepping across London (like I said, four modes of transport…). I love Keiller’s films (although I haven’t seen Robinson in Ruins yet) and I was really excited to see what he’d come up with in an art gallery. It’s a stunning exhibit – works from the Tate’s collection are brought together with passages from Keiller’s films, books, film stills and artefacts of his own (over 120 works in all).

This is curation as storytelling as art. The connections Keiller makes are provocative, funny, illuminating. Nineteenth century romantic and picturesque imagery (landscapes, landowner portraits, animal pictures) interleaved with documents of resistance to enclosure, maps, signposts and other inscriptions made by humans on the landscape. Also those Keiller signature images of mysterious and desolate scientific and military establishments and quite a lot of post-war conceptual art. And the persistent Robinson cosmic entrainment stuff is here: meteors, geological patterns, ley lines and other psycho-geo tropes. It’s magically done. A situationist people’s history. A visual poem.

And the designers have done simple things to parenthesise the content – the works are offset from the gallery walls in a kind of linear zig-zag that gives the choice a kind of scrapbook-feel. The Tumblr/Pinterest generation will get this. It’s a cheeky, delirious intellectual walkabout.

Next (after the compulsory visit to the cafe for cake, obviously) we walked through to the Clore Gallery and caught what I learn was the second-to-last day of another beautiful specimen of the curator’s art. David Blayney Brown is the man behind the wonderful ‘Romantics’, a show that mashes up the work of the Clore’s anchor tenant, JMW Turner, with that of his contemporaries to tell the story of Romanticism in a way that was hugely and pleasurably engaging for an art history pygmy like myself (I notice that the broadsheet reviews for the show when it opened nearly two years ago were pretty snooty about the accessible format – I think this kind of curation with a personality will put critics’ noses out of joint – it seems to be straying onto their territory).

This is (was, sorry!) a highly-visible kind of curation – opinionated and full of information about the period and the context. Big, assertive statements about the context and the work are printed in huge type alongside pictures grouped together in ‘pods’. It’s a really vigorous narrative, full of energy and ideas. I came away with a sense of the flow of events and the interaction of personalities that I’d never have got from the mute curation of the old school. Gripping storytelling about art.

And the whole experience (not the cake, obviously, or the boat) was a quite bracing reminder that this curation business is really not about pointing, in a sort of dilatory way, at stuff we like the look of (I called it ‘the curatorial twitch’ in an earlier post), but about the hard graft of assembling artefacts, information, context and inspiration to tell really important stories (see the previous post about Radio 3’s awe-inspiring week of Schubert output for an example of how to do this on the radio).

The second-best book about twentieth century music

'Thus, from the birth of radio circa 1922 to its death by TV and reruns in the mid-1940s, there was almost enough work for all the talent in a ballooning country, and all bets were off concerning the incidence of genius.' Quote from 'The House that George Built' by Wilfrid Sheed

Everybody knows the best book about Twentieth Century music is Alex Ross’s The Rest is Noise but there’s another brilliant book set in the same period – Wilfrid Sheed’s The House That George Built, a history of the golden age of American popular music. It’s about the generations of American songwriters, starting at the turn of the twentieth century in what Sheed calls ‘the piano era’, who essentially invented what we now know as popular music.

It’s sub-titled ‘with a little help from Irving, Cole and a crew of about fifty’ and it’s told through the abbreviated life stories of the dozens of lyricists and composers who grafted on Broadway, on Tin Pan Alley and in Hollywood to make us all song addicts. It’s warm and entertaining and full of mad insights into the psychology and economics and aesthetics of pop music.

It’s also a catalogue of amazing songs – from Basin Street Blues to Body and Soul to Baby it’s Cold Outside to April in Paris. I’ve created a Spotify playlist for each section. The artists are a bit variable – performers from the other end of the Twentieth Century aren’t as well-represented as they ought to be on Spotify – and there are a few gaps but it’s an amazing mosaic of song. Let me know if you’ve found better versions.

Is that it for the PC?

A vintage IBM PC

The latest Mac OS is the first that can only be bought from an app store, from a tightly-integrated, locked-down, official source. I reckon that’s pretty much it for the free-range, open platform we call the PC.

Googling myself the other day, I found this article from The Guardian nine years ago.

It’s about the unexpected persistence of the Personal Computer. My point was that the general purpose lump on your desk was already then a dinosaur, overdue for replacement by:

a swarm of gadgets variously attached to your person, colonising your home (at about ankle height), discreetly re-stocking your fridge or representing your interests on the net while you sleep.

The anti-PC forces were then strong, or at least numerous. You had thin-client efforts from Oracle, Sun and various startups, a bunch of clunky ‘Internet-on-your-TV’ products (got some of those in my loft), WAP stuff and the first generation of web apps that offloaded your PC’s functions to the net. So, even then, things didn’t look great for the PC.

But, I proposed, the PC persisted because it offered us a kind of autonomy and control that was profoundly liberating. The PC (in its various flavours) had, after all, entirely changed the world for a lot of us about twenty years previously, precisely because it was a blank slate, an autonomous zone. Do you remember the rather daunting feeling of powering up a new computer in those days? The “What do I do now?” feeling that forced you to a) learn a programming language, b) publish a fanzine or c) write a screenplay – because there simply wasn’t anything else to do. I said, back then:

…for the young, the PC is a liberated zone, a place of permission, autonomy, creativity and of almost unlimited possibilities. Very few man-made things can ever have carried so much meaning, condensed so much value and potential for action.

But now, nearly a decade after that article and thirty years after the revolution began, it looks like the PC may finally have reached its sell-by date. The whole complicated, liberating architecture is collapsing. Steve Jobs used the phrase “the post-PC era” in his keynote on Monday. A BBC manager told me the other day: “we’ve done some research, the PC’s finished as a platform”.

And its replacement doesn’t look anything like as liberating. That ‘swarm’ of devices has arrived but without the messy, unfinished, frankly out-of-control software/hardware ecosystem that produced the generations of iconoclastic hackers and creators busy remaking the world for us in business, politics and culture in 2011.

It drove us all crazy while we fought with it to install printers and format newsletters and debug compilers but we’ll remember the stack of hardware and software that makes up the PC as a place of enormous freedom – to tinker, to modify, to fix, to build and invent.

And will the new, closed platforms evolve into sophisticated tools for creation and invention? Probably. But will they also limit our access to the hardware, close off the OS and force us to add new functionality and content via monopoly commercial gateways? Yes they will.

And what kind of creative culture will emerge from the next thirty years of gorgeous, integrated, properly-finished but utterly closed platforms? Will these post-PC platforms diminish the impatient, inventive hacker mindset that the old platforms produced? Or will the geeks just invent a new one and move on?

Really suffering for your art

Everyone says music is getting more physical again. We continue to get our daily sounds from ever more insubstantial sources, floating above us like those glittering landscapes in Neuromancer, but we’re going to more concerts and festivals than ever and buying more stuff while we’re at it (merch, fancy limited editions; even musical instruments are booming).

Turns out we love schlepping around for some actual, physical experience of music in an actual physical place as much as we love the disembodied bits. But there’s twenty-first Century physical and there’s eighteenth Century physical.

I’m reading a terrific book called 1791: Mozart’s Last Year, by H.C. Robbins Landon (who died last year). And it’s essentially a catalogue of grim physical trials – of epic journeys (in horse-drawn carriages quite often bought specially for the trip), of intolerable living conditions and diabolical food provided by hateful grandees who never paid their bills, of mysterious debilitating illnesses and (of course) of lives cut short by service to art (and to miserable patrons). The book’s full of enervating phrases like the one at the top (which is from an account of a dinner performance by Mozart) and:

The mail-coach with four horses left Vienna at eight o’clock in the morning and took three days, with twenty-one post stations, to arrive at Prague in the morning

(a trip to Prague to perform at a coronation). And here’s a job ad from Vienna in the period:

A musician is wanted, who plays the piano well and can sing too, and is able to give lessons in both. The musician must also perform the duties of a valet-de-chambre…

(My italics). And then, of course, there was the final, ghastly physicality of his early death:

Suddenly he began to vomit – it spat out of him in an arch – it was brown, and he was dead.

(and that’s from a book based on his wife’s recollections, quoted by Landon).

What I’m left with is an image of the musician as grafter, as under-appreciated, barely-recognised labourer in the fields of art. Sacrifice, privation, hunger, physical collapse – evidently the necessary preconditions for creation in that golden age.