
Seven things I learnt from the British Library’s Magna Carta show

The British Library has a terrific, totally absorbing show about Magna Carta – which is the cornerstone of world democracy or a sort of baronial shopping list weirdly granted in a field by a King who didn’t mean it – depending on your perspective. It includes two original 1215 manuscripts and dozens of other beautiful documents. It’s not enormous but there is a lot of reading so the audio guide is worth the money. I’m not a historian – or even very bright – so I learnt a lot, like for instance:

1. Magna Carta’s actual connection to the present day is unbelievably tenuous. The whole thing was annulled a couple of months after it was agreed – the Pope (who was technically in charge at the time) rubbished the enterprise completely, which is what the reluctant King John wanted him to do all along – and hardly any of the charter’s provisions survive in law. That it has any influence at all should be a surprise. That it’s the central text of representative democracy and the rule of law all over the place is mind-blowing. This is how pieces of paper (parchment) become totems, people.

2. The first one isn’t the important one. Later ‘editions’ of Magna Carta, copied out by monarchs, bishops, lawyers, barons – each introducing their own variations, glosses, limitations, expansions – have been more important in the formation of law and practice. Henry III’s 1225 version is probably the most influential and the nearest to a definitive Magna Carta.

3. Magna Carta didn’t make it into print for nearly 300 years. The first printed edition was published in London in 1508 (Caxton got going in 1473) and the first English translation wasn’t printed until 1534. That’s when its influence exploded. Hardly anyone knew it existed before that – only the constitution nerds and rule-of-law geeks of their day. Once it could be passed around, though, in compact printed form, its language began to be used in laws, cited in disputes with overbearing monarchs, quoted in the popular prints. So – you guessed this already – the long-term influence of Magna Carta is actually all about advances in content distribution technology.

Part of the 1689 Bill of Rights
4. The Bill of Rights of 1689 is a much more important document. It’s an actual act of Parliament to begin with, using recognisable legal language, and most of its provisions actually survive in law. It’s the Bill of Rights that we have to thank for the modern idea of ‘civil rights’. Many later documents owe a lot to the 1689 Bill of Rights – not least its American namesake (if you Google ‘Bill of Rights’ the English one doesn’t show up until page two) and the European Convention on Human Rights. I’m happy to learn that the resonant phrase “certain ancient rights and liberties” is from the Bill of Rights. It’s also, incidentally, unbelievably beautiful. Whoever wrote out the original document had the most exquisite roundhand. It makes Magna Carta look shabby.

5. The Cato Street conspiracy is one intense story. And it’s got the lot: a government spy, a honey trap, a ridiculous, hopelessly bodged plan straight out of a Tarantino movie and a brutal response from the state, including the last judicial beheading to take place in England. The conspirators set out not to assassinate a statesman; they set out to assassinate all of them – the whole cabinet anyway. Their beef was, er, vague, but hinged on the domestic repression triggered by fear of the European revolutions of the preceding decades. And Magna Carta was cited in the defence when the case came to trial.

Poster for Chartist meeting, Carlisle, 1839, from the National Archives
6. The Chartists knew how to design a poster. As I said, I’m no historian but the orthodoxy is that the Chartists achieved almost nothing. They were after the vote for working men but it was decades before suffrage was extended meaningfully (and did you know that it was 1918 before all men over 21 could vote?). Fear of dissent and revolution meant the Chartists were harried out of existence before they could produce any change. But, while they were active, they were great communicators and the first movement to make really smart use of mass protest, of what we’d now call ‘the street’. This poster, which is in the National Archives, is absolutely beautiful. A vernacular letterpress masterpiece. We should all aspire to such clarity (there are others, like this one, for a meeting at Merthyr Tydvil in 1848 and this one, for a meeting in Birmingham in the same year. All lovely).

7. 1935 was the 720th anniversary of the sealing of Magna Carta so, unaccountably, a year before that, a great pageant was held at Runnymede, site of the sealing.

Advertised as a celebration of English democracy, the pageant engaged some 5000 actors, 200 horses and 4 elephants, who over eight days performed eight historical scenes, the centrepiece being a recreation of the sealing of Magna Carta. (Apparently the elephants were withdrawn at the last minute.)

The pictures and this Pathé newsreel suggest a very English blend of eccentric and noble, camp and dignified. I’d love to have been there. This BL blog post suggests something rather splendid and rousing: ‘It’s a Knockout’ meets a BBC Four history doc.

What’s wrong with atheists?

I’m an atheist. Just getting that out of the way. Because this is about a problem that I have with atheists. Not all atheists. Just the strident ones, the humourless ones who form and join clubs, who campaign and complain and object. The ones who picket shopping malls when they provide prayer rooms but not ‘rational contemplation rooms’. Those ones.

The source of my problem is simple enough. Atheists are wrong. To be clear: they’re not especially wrong. They’re just roughly as wrong as everyone else. And, like everyone else, from far enough away they’re almost completely wrong. I can say this with certainty. We’ve got plenty of evidence. Thousands of years of it. Neolithic astronomers could line up the stones for the equinox but were wrong about everything else. Copernicus knew the planets orbited the sun but, we can see, got practically everything else wrong. The Papal inquisition was wrong. But so was Galileo. Newton was wrong. Darwin was wrong. Even the mighty Darwin. The splendid edifice of his scholarship is intact and still uniquely influential but, across the decades, large parts of it have been revised, replaced, dropped – as they should. The flat-earthers and the ether/phlogiston merchants – they were all wrong. But then, later on, so was Einstein. Being wrong is more-or-less universal (everyone’s wrong) and more-or-less eternal (all the time). And the more time passes, the more wrong we all are.

To make it more obvious, go back a bit further. Go back ten thousand years, in fact. To the time of the first big settlements and the beginning of farming and the origin of written language and inquiry into the world. What did we know then that isn’t now known to be wrong? Clue: almost nothing. See what I mean?

Now wind forward ten thousand years from the present day: from out there, from as far into the future as we’ve come since the last ice age, almost everything we take for granted now is going to be wrong. Horribly, fundamentally wrong. Wrong in ways that will ripple through human knowledge and force us to revise even our most basic assumptions about the world. Wrong in ways that will make our future selves laugh as they look back and wonder how any of us – believers or non-believers – managed to dress ourselves in the morning.

But, you’ll protest, it’s not about being right or wrong, it’s about the method. Rational inquiry – the scientific method – actually depends on being regularly, consistently wrong. And, of course, you’ll be right. The big difference between the scientific method and the invisible fairies crowd is the tolerance for being wrong, the constant readiness to check your thought against reality and revise it. The religious folk have a fixed worldview. In fact, their worldview depends on nothing changing: on invariant laws handed down by Gods. Case closed, surely?

But no. Not at all. Rewind again (go the whole ten thousand if you want). Examine the thought of an earlier era – the myths and laws and creation stories of that time. See where I’m going with this? Are they really invariant? Are they even, in fact, recognisable? Do the beliefs that animated the irrational folk of earlier eras still apply? No, they don’t. They’ve been overturned, thrown out and replaced – dozens, hundreds, thousands of times. Objects of worship, origin stories, social and ritual elements: are any the same now as they were in earlier periods? Hardly any. It turns out that just because religious people say their beliefs are eternal and unvarying, it doesn’t mean they actually are. They shift and change constantly. The Vatican, which persecuted and executed astronomers, now operates an important observatory. Muslims, Jews, Buddhists – they change their minds all the time, constantly (when looked at from the right distance) revising and updating their beliefs, quietly dropping the stuff that’s incompatible with current models.

So, rational folk (like me) are as wrong as everyone else and – more than that – have no monopoly on a readiness to update their thought as they acquire new knowledge. And this is what upsets me about the assertive hard-core of atheists/secularists/rationalists – the ones who put ‘atheist’ in their Twitter bios, do stand-up comedy about the silly believers, sue the council for putting on carol concerts and all the rest. Being slightly less wrong than the God botherers doesn’t make you right. We should have the humility to recognise that – over the long run – we’re all gloriously, irredeemably wrong.

Update 30/04: James O’Malley has posted an interesting response to this post called, naturally, ‘What’s Right with Atheists’!

Tim Berners-Lee’s most important decision

Of the dozens of design decisions that TBL made during 1989, all of which continue to shape the way we build and use the web twenty-five years later, the most important was not requiring permission to link. Seems obvious now – a non-feature in fact – but it’s the reason you’re not reading this on Xanadu (or Prestel or on an X500 terminal or something). The logic of the times – embedded in those other systems – was that documents and data sources had owners and that you couldn’t just link to them without some kind of formal permission. Permission was defined as a system-to-system, technical handshake kind of thing or a person-to-person interaction, like a phone call, or, God forbid, a contract and some kind of payment. TBL’s judgement was that the community he was building his system for – the academics and engineers – wouldn’t want that and that the spontaneity of the hyperlink would trump the formality of permission. And, of course, he was right. It’s the spontaneously-created hyperlink that triggered the marvellous, unstoppable promiscuity of the World Wide Web. It explains the web’s insane rate of growth and the profusion of web services. It’s the root of all this.
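To make the non-feature concrete, here’s a minimal sketch (mine, not anything TBL wrote): following a link is just a one-sided HTTP GET. Nothing in the protocol registers the link with the target or asks its owner for consent – the URL below is made up.

```python
# A sketch of why permissionless linking works: following a hyperlink
# is just a one-sided HTTP GET. Nothing in the protocol registers the
# link with the target or asks its owner for permission.
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

link = "https://example.com/some-page"  # hypothetical target; nobody was asked

try:
    with urlopen(link, timeout=10) as response:
        print(response.status)   # the target answers if it's there...
except HTTPError as err:
    print(err.code)              # ...or 404s; the link itself needed no permission
except URLError as err:
    print(err.reason)            # ...or has vanished entirely; the referrer survives
```

The worst case is a 404, not a refused contract – broken links were the price TBL accepted for permissionless ones.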

Seven things it’s worth remembering about Wikileaks

Before its inglorious founder takes it down with him or before it’s chased off the Internet by enraged governments, it’s worth remembering what Wikileaks was before it became a cause célèbre:

  1. It used to be a wiki. It stopped being a wiki in 2010.
  2. It was an anonymous drop-box. Whistleblowers could deposit documents without fear of being identified. This was the radical core of Wikileaks. They say that submissions are still accepted but the drop-box was switched off in 2010 too.
  3. It was about using the Internet’s open, peer-to-peer, symmetric-in-all-directions architecture to return power to ordinary people inside dumb corporations and repressive regimes. The kind of thing we always said the Internet was for. But it was also anarchic and unaccountable. It made free speech advocates and netheads queasy.
  4. There was something glamorous and edgy about it. It was morally complicated, like a le Carré plot. All those secrets and their forced disclosure, the chaos and panic that their untimely release caused, the attacks from government black-hats, the comic-book torrent of documents fired in its direction. And whatever you think of Assange – hero, demagogue, victim, criminal – he’ll be an important figure when the histories of the first decades of the Internet era are written.
  5. It became home to documents removed from the public record by courts or governments. It claimed a status above national law and essentially demolished the super-injunction and the cosy media blackout. This was bound to make it of interest to lawyers and governments right from the start.
  6. It was run by a maverick and his mates, so governance and accountability looked weak. Wikileaks contained the seeds of the Assange meltdown from the beginning. We could have anticipated all this (maybe not the Ecuadorian embassy balcony bit).
  7. It was a trial-run for a full-on infowar, for authority’s fight-back against the unruly net. Payment processors, service providers, media partners and sponsors all came under huge pressure and mostly buckled. The net’s apparently ungovernable, distributed, supra-national structure turned out to provide hardly any protection at all. We learn that a determined state supported by compliant corporations can damage or destroy an outlaw entity like Wikileaks. That’s an important lesson for you cyberpunks. You’re gonna need a bigger boat.

(update, 22 August, I collapsed the eight things into seven.)

NTK: “exactly the same thing, 15 years late”

Fifteen years ago, when it was all fields round here, Danny O’Brien and Dave Green – who were well-known in underground gaming/comedy/tech/confectionery circles – began to publish an email newsletter for and about the British tech community. It was a joy from the beginning – authentic, funny, playful, insightful… Geek storytelling that is probably already consulted as a primary source for the arrival of this whole network/digital/computer thing… NTK is on the newsstands again, in a clever time-shifted form. Sign up here.

Bowblog is ten years old today

I adapted the name from Kevin Werbach’s Werblog, which seemed like a cool thing to do at the time. I’d been blogging for a few years before that (since the twentieth century) but only fragments survive. The archive suggests I used to update it a lot more too – back before Twitter. Anyway, I’ll tweet a few anniversary posts. Here’s the first one, from 21 May 2002, which includes a link to a column I used to have in The Guardian’s Online section (then under the legendary Vic Keegan’s editorship).

Igor Stravinsky, Tupac Shakur and the uncanny

The Player Piano was the Tupac Hologram of its day.

The most thrilling of our inventions are the ones that return to us a person we’ve lost or that recall a scene from the past that we couldn’t have experienced or a place we couldn’t have known. There’s a rush, a kind of zipwire effect. WHOOSH. BANG. You’re there. And sometimes these experiences are so vivid they cross over into the uncanny and the hairs on the back of your neck stand up. A list of these moments would be a long one, but try this ultra-vivid portrait of the Carusos in 1920. The rush here is a compound effect of a fabulous technology, as-yet unmatched in the digital era – a large, glass negative – plus the amazing light on that New York terrace and those eyes. Blimey. Or this: the first view of the earth from the moon. Tell me you didn’t shiver (and note, also, that in order to qualify as ‘uncanny’ it doesn’t need to be a hyper-real simulation of a human).

The player piano is another piece of nineteenth-century tech that’s highly productive of the uncanny. The knowledge that the sound you’re hearing, when the paper roll begins to turn, reproduces in detail the actual playing of a long-dead musician – not the acoustic effect but an actual mechanical trace, recorded as holes punched in paper – changes the effect startlingly. The fact that sometimes that musician was the composer – Gershwin or Rachmaninov or Stravinsky – makes it more uncanny still. I was lucky enough to be standing next to one of these player pianos – a kind of half-mechanical, half-human steampunk cyborg – ten days ago in a Broadcasting House studio, as its owner Rex Lawson brought to life one of Stravinsky’s amazing piano rolls. It was a remarkable experience: Stravinsky was very much in the room. Here’s a video I made of that strange encounter of machine, memory and music:

And a few days later, Tupac ‘appeared’ at Coachella, turning the uncanny dial up a few notches but instantly reminding me of that Stravinsky experience. I wish I’d been there. Everyone who was says that it was amazing – and some were so freaked out by Tupac’s ‘appearance’ they declared that they disapproved, that it was somehow disrespectful. And the Tupac hologram, which was a synthesis of historic appearances and some mind-blowing 3D simulation, is from the same family of technologies as the player piano. Spooky.

You know, actual curation

Patrick Keiller's 'Robinson Institute' at Tate Britain

Everyone’s going on about curation these days. We’re all curators now. But yesterday I witnessed some of the old-fashioned variety, the kind they do in art galleries, and I was blown away.

I took two of my kids to Tate Britain (four different modes of transport: train, tube, boat and bus – I suspect that’s what they’ll remember about the day). First I dragged them round Patrick Keiller’s ‘Robinson Institute’ which, in truth, was my main reason for schlepping across London (like I said, four modes of transport…). I love Keiller’s films (although I haven’t seen Robinson in Ruins yet) and I was really excited to see what he’d come up with in an art gallery. It’s a stunning exhibit – works from the Tate’s collection are brought together with passages from Keiller’s films, books, film stills and artefacts of his own (over 120 works in all).

This is curation as storytelling as art. The connections Keiller makes are provocative, funny, illuminating. Nineteenth-century romantic and picturesque imagery (landscapes, landowner portraits, animal pictures) interleaved with documents of resistance to enclosure, maps, signposts and other inscriptions made by humans on the landscape. Also those Keiller signature images of mysterious and desolate scientific and military establishments and quite a lot of post-war conceptual art. And the persistent Robinson cosmic entrainment stuff is here: meteors, geological patterns, ley lines and other psycho-geo tropes. It’s magically done. A situationist people’s history. A visual poem.

And the designers have done simple things to parenthesise the content – the works are offset from the gallery walls in a linear zig-zag that gives the selection a scrapbook feel. The Tumblr/Pinterest generation will get this. It’s a cheeky, delirious intellectual walkabout.

Next (after the compulsory visit to the cafe for cake, obviously) we walked through to the Clore Gallery and caught what I learn was the second-to-last day of another beautiful specimen of the curator’s art. David Blayney Brown is the man behind the wonderful ‘Romantics’, a show that mashes up the work of the Clore’s anchor tenant, JMW Turner, with that of his contemporaries to tell the story of Romanticism in a way that was hugely and pleasurably engaging for an art history pygmy like myself (I notice that the broadsheet reviews for the show when it opened nearly two years ago were pretty snooty about the accessible format – I think this kind of curation with a personality will put critics’ noses out of joint – it seems to be straying onto their territory).

This is (was, sorry!) a highly-visible kind of curation – opinionated and full of information about the period and the context. Big, assertive statements about the context and the work are printed in huge type alongside pictures grouped together in ‘pods’. It’s a really vigorous narrative, full of energy and ideas. I came away with a sense of the flow of events and the interaction of personalities that I’d never have got from the mute curation of the old school. Gripping storytelling about art.

And the whole experience (not the cake, obviously, or the boat) was a quite bracing reminder that this curation business is really not about pointing, in a sort of desultory way, at stuff we like the look of (I called it ‘the curatorial twitch’ in an earlier post), but about the hard graft of assembling artefacts, information, context and inspiration to tell really important stories (see the previous post about Radio 3’s awe-inspiring week of Schubert output for an example of how to do this on the radio).

The second-best book about twentieth century music

'Thus, from the birth of radio circa 1922 to its death by TV and reruns in the mid-1940s, there was almost enough work for all the talent in a ballooning country, and all bets were off concerning the incidence of genius.' Quote from 'The House that George Built' by Wilfrid Sheed

Everybody knows the best book about twentieth century music is Alex Ross’s The Rest is Noise but there’s another brilliant book set in the same period – Wilfrid Sheed’s The House That George Built, a history of the golden age of American popular music. It’s about the generations of American songwriters, starting at the turn of the twentieth century in what Sheed calls ‘the piano era’, who essentially invented what we now know as popular music.

It’s sub-titled ‘with a little help from Irving, Cole and a crew of about fifty’ and it’s told through the abbreviated life stories of the dozens of lyricists and composers who grafted on Broadway, on Tin Pan Alley and in Hollywood to make us all song addicts. It’s warm and entertaining and full of mad insights into the psychology and economics and aesthetics of pop music.

It’s also a catalogue of amazing songs – from Basin Street Blues to Body and Soul to Baby It’s Cold Outside to April in Paris. I’ve created a Spotify playlist for each section. The artists are a bit variable – performers from the other end of the twentieth century aren’t as well-represented as they ought to be on Spotify – and there are a few gaps but it’s an amazing mosaic of song. Let me know if you’ve found better versions.

Is that it for the PC?

A vintage IBM PC

The latest Mac OS is the first that can only be bought from an app store, from a tightly-integrated, locked-down, official source. I reckon that’s pretty much it for the free-range, open platform we call the PC.

Googling myself the other day, I found this article from The Guardian nine years ago.

It’s about the unexpected persistence of the Personal Computer. My point was that the general-purpose lump on your desk was already then a dinosaur, overdue for replacement by:

a swarm of gadgets variously attached to your person, colonising your home (at about ankle height), discreetly re-stocking your fridge or representing your interests on the net while you sleep.

The anti-PC forces were then strong, or at least numerous. You had thin-client efforts from Oracle, Sun and various startups, a bunch of clunky ‘Internet-on-your-TV’ products (got some of those in my loft), WAP stuff and the first generation of web apps that offloaded your PC’s functions to the net. So, even then, things didn’t look great for the PC.

But, I proposed, the PC persisted because it offered us a kind of autonomy and control that was profoundly liberating. The PC (in its various flavours) had, after all, entirely changed the world for a lot of us about twenty years previously, precisely because it was a blank slate, an autonomous zone. Do you remember the rather daunting feeling of powering up a new computer in those days? The “What do I do now?” feeling that forced you to a) learn a programming language, b) publish a fanzine or c) write a screenplay – because there simply wasn’t anything else to do. I said, back then:

…for the young, the PC is a liberated zone, a place of permission, autonomy, creativity and of almost unlimited possibilities. Very few man-made things can ever have carried so much meaning, condensed so much value and potential for action.

But now, nearly a decade after that article and thirty years after the revolution began, it looks like the PC may finally have reached its sell-by date. The whole complicated, liberating architecture is collapsing. Steve Jobs used the phrase “the post-PC era” in his keynote on Monday. A BBC manager told me the other day: “we’ve done some research, the PC’s finished as a platform”.

And its replacement doesn’t look anything like as liberating. That ‘swarm’ of devices has arrived but without the messy, unfinished, frankly out-of-control software/hardware ecosystem that produced the generations of iconoclastic hackers and creators busy remaking the world for us in business, politics and culture in 2011.

It drove us all crazy while we fought with it to install printers and format newsletters and debug compilers but we’ll remember the stack of hardware and software that makes up the PC as a place of enormous freedom – to tinker, to modify, to fix, to build and invent.

And will the new, closed platforms evolve into sophisticated tools for creation and invention? Probably. But will they also limit our access to the hardware, close off the OS and force us to add new functionality and content via monopoly commercial gateways? Yes they will.

And what kind of creative culture will emerge from the next thirty years of gorgeous, integrated, properly-finished but utterly closed platforms? Will these post-PC platforms diminish the impatient, inventive hacker mindset that the old platforms produced? Or will the geeks just invent a new one and move on?