Monarchy: to be bothered or not to be bothered?

The late Queen entering London via the Westway

Is everyone on the British left anti-monarchist? Does anyone even care?

What does the left think about monarchy? Seems obvious, right? Off with their heads! But no, there’s some complexity here and it connects closely with the weird (almost unique on planet earth) constitutional arrangements that persist here in the archipelago.

On the left there are basically two positions on the monarchy. Not on monarchy in general — only one there really — but on Britain’s actually existing monarchy, the Crown-Constitutional Parliamentary state that’s been locked in here since the 17th Century.

In position one, the monarchy is an unequivocally, catastrophically bad thing—a major impediment to meaningful popular sovereignty and an aspect of Britain’s backward machinery of state. Britain’s monarchy, in this view, is a vital contributor to the country’s long-term decline, solidified in the retreat from empire and the disastrous deindustrialisation of the post-war period.

In this perspective, identified with the British ‘New Left’ since the 1950s and developed in great detail in the pages of New Left Review by brilliant writers like Perry Anderson and Tom Nairn, the settlement that secured the monarchy in 1688 locked in the dominance of a sclerotic aristocracy, a land-owning elite and the compliant institutions that sustain them. The result is a country that, paradoxically, became a nation-state first and industrialised first but failed fully to make the transition from ancien régime to modernity, whose progress from feudalism to capitalism is still incomplete. In the New Left worldview, ceremony, deference and acceptance of hierarchy have naturalised and hardened aristocratic dominance and neutralised the popular radicalism that has expanded democracy elsewhere in the capitalist world.

The other left anti-monarchist perspective, embodied these days by lefties belonging to populist or ‘realist’ strands of the tradition, is much less bothered. These anti-monarchists oppose the unelected power of the monarchy (obvs) but would probably be quite happy to leave the Windsors where they are and get on with the class war. In fact, for left-populists, the fervent opposition to the crown and its institutions embodied by that earlier generation of left-wingers is actually damaging to the cause. For them, the critics and theorists who developed the declinist narrative of the New Left—whose animating idea was that Britain is stuck in a deferential mire and can aspire only to a steady loss of status, relevance and prosperity—are a bit FBPE, a bit ‘metropolitan liberal elite’ – not least because opposing the monarchy puts them at odds these days with a clear majority of the British working class.

It’s not just the left who oppose the monarchy, of course. There’s an indignant centrist/liberal republican movement too – in fact they’ve been at it for longer and they’re more organised than the left. In the Crown’s 19th Century slump it was largely Whig/Liberal radicals like Charles Dilke who opposed the monarchy and, in the present day, the Liberal party itself still has a robust republican strand. The young Liz Truss was not alone in her vituperative opposition to the royals.

The rehabilitation of the monarchy that’s taken place since the end of the 19th Century, described by Tom Nairn in his brilliant and thoroughly New Left book The Enchanted Glass, has, to put it bluntly, worked. The Crown-Parliamentary state won. In the middle ages, monarchs cowered in their palaces, derided publicly. When they ventured into the streets, kings and queens were often booed; some were so often away crusading (or hunting, or carousing) they might not be known to their subjects at all.

Labour Conference 2022: national anthem at one end and Red Flag at the other

Working people who, until the early 19th Century, were suspicious of or even hostile towards royalty are, especially in the most recent decades of the Queen’s reign, mostly in favour (the numbers might be a bit different among fans of The Crown and the young, of course). Principled opposition to monarchy will just put the left in conflict with the working class and leave lefties in a very familiar position—thrashing about trying to explain why ordinary people don’t agree with them. It all begins to look a bit Remoaner/People’s Vote/Russian money.

So now, a left wing party or movement that put much energy into opposing the monarchy, that set out policies aiming at an elected head of state or even a slightly less anti-democratic settlement—one that, for instance, removed the monarch’s powers of consent to new laws—could only damage its prospects with working people. Leave the monarchs alone, lefties, they’re not worth it.


Over on Goodreads, I reviewed Nairn’s beautifully-written and highly-entertaining page-turner The Enchanted Glass, which dates from the period before the annus horribilis but has two excellent forewords by the author bringing the story well into the 21st Century.

I’d go so far as to say that this is a beautiful book. Funny, angry, imaginative – an unforgiving demolition of the fantasies and self-deceptions of Britain’s backward, complacent, destructive tolerance of the invented rituals of modern royalty – the damage done to our democracy, the permanent strangulation of popular sovereignty, the narrowing of our national potential, the bleak prospect of unstoppable decline made inevitable by our unthinking acceptance of the Crown-constitutional status quo.

Published in 1988, this book is in no way dated or irrelevant – in fact one of its most fascinating aspects is the constant, startling correspondences we find between Britain towards the end of the Thatcher revolution and Britain after the Brexit revolt.

The scope of the book is not limited in any way to a critique of monarchy or monarchism. My understanding of Britain’s constitutional weirdness (Nairn’s wonderful name for Crown-constitutional Britain is ‘Ukania’) and of the powerful parasitic grip of Britain’s social and economic elites has been hugely enhanced by his excoriating, wide-ranging critique of City, Crown, complacent Parliament, self-interested administrative class, complicit media elite (and so on). I feel I have a new super-power.

Nairn is a brilliant writer, his language sparkles and surprises – you’ll find yourself stopping to look up words you’ve never heard that always turn out to be perfect for the job. It’s an absolute joy – and it took me a long time to read, mainly because I couldn’t stop myself highlighting brilliant passage after brilliant passage on the Kindle (it’s a cause of immense Bezos-directed resentment that I can’t share my highlights with you because I read the book in a non-Amazon format! Sort it out Jeff!).

Nairn, who, as of this review, is still with us, developed his powerful argument for the backwardness of Britain’s constitutional arrangements and for the inevitable decline of the state and the polity across his many decades as writer (and editor) at New Left Review. If you have a subscription you can read the entire archive of his writing (and of his pal Perry Anderson – also still with us – another brilliant writer with whom he worked closely) online.

Review of The Enchanted Glass, from Goodreads

I’ve been bingeing on texts about monarchy lately, for obvious reasons. I also recently reviewed David Cannadine’s influential paper ‘The Context, Performance and Meaning of Ritual: The British Monarchy and the ‘Invention of Tradition’ c. 1820–1977’ which is in a terrific book of essays edited by Eric Hobsbawm and Terence Ranger. The essay, published in 1983, explains how the rituals and customs invented or updated in the 19th Century rescued Britain’s Crown from irrelevance or even obliteration in a revolution like those that wiped out the monarchies of Europe one after the other (Hobsbawm’s essay in the same book, a wider-ranging survey of the twilight of monarchy across Europe, is also worth a read).

Another good read in this context is David Edgerton’s 20th Century history The Rise and Fall of the British Nation, which aims to dismantle the whole declinist New Left narrative. Perry Anderson, owner but not quite sole proprietor of the declinist story, predictably enough dismantled Edgerton’s dismantling in the pages of New Left Review (you might need a subscription to read that one).

Listen to this episode of Bungacast, the podcast from the people who brought you ‘The end of the end of history’, for a good illustration of the modern, populist left perspective on monarchy—dismissive, derisory but definitely not bothered.

Ten times the Labour party stood behind striking workers in Britain

Actually, there aren’t any. Sorry.

The story is that Labour is the only major socialist party in the world that emerged directly from organised labour—every other important party—from the DSA to the SPD to the PS and the JSP—was the product of an actual revolution or of a popular socialist movement. Labour founders Keir Hardie and Arthur Henderson had both been union leaders and many early Labour parliamentarians were well-known workplace leaders or campaigners for workers’ rights.

(Note: labour and Labour are used throughout, for obvious reasons.)

RMT members on a picket line in 2022 – photo from the RMT

So there’s a logic to the statement that Labour is ‘the party of organised labour’ or ‘the Parliamentary wing of the trade union movement’. And to the reminders that it’s the unions who still largely fund the party. And to the shock and upset amongst supporters when Labour’s parliamentary leadership fails to support union action or even opposes it.

His Majesty’s loyal opposition

It turns out, though, that the will of those early Labour leaders – and of their comrades at the top of the union movement for that matter – was not to win a victory for workers, to challenge or overthrow the parties of power at the time, to replace or diminish the landowner and business elites, or even to offer a pro-worker counterweight in the Commons. The will of those leaders—as of the current generation—was always to gain access, to join the club, to get their bums on the green benches and to form a polite left-hand hump to the Crown-Parliamentary camel, supplanting the previous occupants of the less-favoured benches and becoming ‘His Majesty’s Loyal Opposition’.

This sounds cynical. I don’t mean for a moment to discount the contribution of those pioneer socialists to the pushing back of the multi-century stasis of Tory (and Whig) domination, the epochal introduction into an ancient elite legislature of working people. And, of course, individual Labour members have provided the backbone to countless labour disputes over the years—but it is vital to be clear-eyed about this. Labour in Parliament, from its very beginnings, was not a workers’ party. In the present day it’s a progressive party, a party of the Parliamentary centre-left, but it is not a workers’ party.

So there’s nothing new or surprising in the Labour party distancing itself from the interests of working people—do you remember the grim spectacle of Neil Kinnock making a flying visit to a miners’ strike picket, right at the end of the strike and at 5 a.m. so as to miss the reporters? (See if you can find a photo. There are none). During the long strike Kinnock the miner’s son called for a national ballot and didn’t once ask workers to respect NUM picket lines. In fairness, the strike was perhaps the greatest challenge that a modernising Labour leader could possibly face—and we know that Kinnock was conflicted and unhappy about the position he had to take. That position became the most iconic—and relevant—statement of Labour’s labour ambivalence of the post-Thatcher era.

Going back further, almost to the origins of the party, during the First World War, Labour and the unions agreed an ‘industrial truce’ in the national interest (Labour ministers joined the coalition government). After the war, Labour continued to oppose all instances of labour militancy and, in the build-up to the 1926 general strike, as the climate worsened and employers tried to force through wage cuts, the Labour leadership mediated ineffectively. When the strike came, though, they opposed it.

Leave it to the Rotary Club

When the Jarrow crusaders marched to London ten years later they had to depend on a strange alliance of Quakers, rogue trade unionists and the Rotary Club for food and support along the way—the Labour party didn’t turn up (local MP Ellen Wilkinson was a charismatic exception). 20th Century history is studded with examples like this. Even earlier, when Churchill moored a battleship in the Mersey to bring a little jeopardy to the 1911 Liverpool dockers’ strike, the Labour party, already a force in Parliament, was nowhere to be seen.

In the rough years between the wars there was an explosion of labour activism and confidence—in the face of the great depression, active government repression, blacklisting and a hostile judiciary. Wal Hannington’s National Unemployed Workers Movement moved mountains—organising big marches and actions all over the country. Its leadership was convicted under ancient mutiny laws and imprisoned—right at the sharp end of the workers’ struggle—but for Labour it was a bit too Communist. The party stood back. Likewise, the local councils who defied ancient, repressive laws to hold down the rates and to protect the poor did so without Labour support. In Poplar dozens of councillors—mostly Labour of course, including future leader George Lansbury—were imprisoned for their defiance. The Parliamentary Party leadership opposed their action (in the nineties, you won’t be surprised to learn, Neil Kinnock scolded Labour councillors prosecuted and surcharged for not paying the poll tax).

You’re on your own, ladies

Even when in government the party failed to support strikers. The Grunwick workers were defeated and humiliated while the party withheld support, although in scenes familiar to us now, individual MPs, including cabinet ministers, showed up at the picket line (the record shows that Shirley Williams et al waited until the strike was 40 weeks old and essentially already crushed to offer their calculated solidarity, though). The underpaid women at Ford’s Dagenham plant were left high and dry by a serving Labour government, winning only partial parity, with the half-hearted support of then Secretary of State Barbara Castle.

Castle’s own contribution to labour relations was to lay the foundations for 1974 legislation that withdrew important rights. It was this law that first introduced the requirement for strike ballots – and when the Tories introduced their own anti-union legislation in 1992 it essentially just consolidated Labour’s (the title of the act artfully just adds the word ‘consolidation’ to the name of Labour’s 1974 law). When Blair came to power, of course, he moderated but did not remove the Thatcher ‘reforms’ and actually introduced new limits on legitimate action to meet the requirements of his new backers in business and the media.

One of the biggest strikes of the entire period, taking place right at the heart of the state—in its very guts you might say—was the now mostly-forgotten 1971 postal workers’ strike. It lasted for seven weeks and had overwhelming support from Post Office workers, who had been almost uniquely badly-treated in the post-war period. The strike became a template for Tory government opposition to industrial action: Royal Mail’s monopoly on delivering letters was suspended in an effort to circumvent the strike’s enormous impact. The strike ended without agreement—a dispiriting defeat. The workers were awarded a backdated 9% pay increase and some changes to working patterns after an inquiry but this didn’t even match what they’d been offered before the strike. No one was happy. Individual Labour MPs, including Tony Benn, who’d been Postmaster General under Harold Wilson in the sixties but by this point was on the back benches, supported the strike. Wilson himself, from the opposition front bench, walked a familiar line, saying that the union’s demands were not unreasonable but advocating independent arbitration by a court of inquiry. He opposed the strike.

The one big win

The extraordinary sequence of slow-downs and strikes that brought about the three-day week and the infamous power cuts in the early seventies is still the only industrial action that has ever brought down a UK government. Heath’s battle with miners and power workers was surely the high-water mark for labour activism in Britain—bringing together workers, party members and movement in a way not seen before or since. It was a highly-effective action, using modern communications to coordinate the strikes and winning significant public support for the cause. The workers won and so did Labour. The Parliamentary Labour Party, while in opposition under Harold Wilson, actually supported the pay claims of the miners (often in House of Commons debates) and, once in office, agreed two 35% pay rises for the miners in the space of two years. But did Labour support the strikes that brought about their 1974 election victory? What do you think?

The series of strikes we now know as the Winter of Discontent was triggered by a Labour government’s imposition of wage control: a 5% cap on pay increases. The subsequent industrial action took the form of a battle between state and workers. Fascinating, of course, that the present round of disputes is, at least in principle, more diffuse, pitting workers against dozens of individual employers—many in categories that did not even exist in the last era of union militancy—but that, even in the absence of a government-imposed wage cap, the state is still profoundly present.

The next generation

In the era of apps and zero-hour contracts, the strikes, walkouts and protests by gig workers, outsourced workforces and workers resisting ‘fire and rehire’ policies would seem to offer a useful opportunity for Labour to remake its association with a new generation of workers and with an updated labour activism for the social media era—with vivid new causes that have revived support for workers in Britain, especially amongst the young. No chance.

I’m not a historian (no shit, Steve) but it’s been an instructive exercise this, searching for Labour support for striking workers over the years of its existence. For me, it’s a fascinating and quite urgent reminder that Labour’s role across the modern period has been much more about achieving and sustaining a position in the Westminster constitutional fabric—holding on to what still feels like a wobbly foothold in the institutions at all costs—than about actually transferring power to working people, or even improving their conditions of work or their pay. The choice was pretty simple: take a polite role in the ancient theatre of the Parliamentary system or work for emancipation, popular sovereignty and worker control. You know the rest.

(Can you think of a time that Labour officially supported an industrial action, in or out of office, in the party’s entire history? Leave a comment).

And my scan of the party’s history suggests that it would really be wrong to expect more from the current leadership while in opposition or in government. For Starmer to even acknowledge what looks to many like an important shift in the terms of the national argument in favour of working people and organised labour would be not only to risk a monstering from the Tory press but also to defy literally the entire institutional history of his party. He leads a centrist party that must, almost as a condition of its existence, retain an even and unsupportive distance from its own organised labour wing.

So it seems obvious that Starmer, Reeves et al will not have any difficulty finding good, sensible, tactically-savvy reasons for withholding support from organised labour once they’re in power too. The difficult truth for the leadership of a progressive party in Britain is that there is literally no circumstance in which it is tactically correct to support a strike.

Let’s face it, if the Tolpuddle Martyrs were to come back to life and join the party tomorrow morning, Starmer would have issued a statement, suspended their memberships, conducted a disciplinary and kicked them out by lunchtime.

Competence can fuck off

I learn that Photoshop is thirty. The small revelation that goes with this information is that I’ve been using Photoshop for thirty years.

A screenshot from a very early version of Photoshop

That’s more than half of my life so far. I began using it in my twenties, at the other end of the 1990s, under Margaret Thatcher, under George H.W. Bush, before the first Gulf War, before the Internet had escaped from the Universities (before the web had escaped from that cave under Geneva).

The other revelation, the bigger one tbh, is that it’s possible for a person to spend three decades using a tool fairly regularly without ever acquiring more than the most elementary competence. I’m still a total amateur. I have no idea how to do any but the most basic tasks. Most of the tools and functions are mysterious to me. It’s a huge, deep, layered artefact — like one of those infinitely recursive mind-toys in Borges (or maybe one of Tim Morton’s hyperobjects).

But there follows another revelation. That maybe there’s nothing wrong with this. That using an important tool—a vital set of practices, a complex cultural gadget—without actually mastering it, is okay. Or at least okay for me. That the constant, low-grade anxiety produced by not being very good at things—or being okay at lots of things—might be wrong, self-destructive, stupid.

Even that, for me, this might be the right way to do things: a workable strategy, an appropriate response to the complexity of the tool-world, the contemporary mess of shit that I’m supposed to learn. Maybe I should just leave perfection, competence and mastery to the deep-but-narrow types. It obviously makes some people happy to know what all the modes on this sodding thing do. Good for them. I’ll be over here, fiddling ineffectively.

Seven things I learnt from the British Library’s Magna Carta show

The British Library has a terrific, totally absorbing show about Magna Carta – which is the cornerstone of world democracy or a sort of baronial shopping list weirdly granted in a field by a King who didn’t mean it – depending on your perspective. It includes two original 1215 manuscripts and dozens of other beautiful documents. It’s not enormous but there is a lot of reading so the audio guide is worth the money. I’m not a historian – or even very bright – so I learnt a lot, like for instance:

1. Magna Carta’s actual connection to the present day is unbelievably tenuous. The whole thing was repealed a couple of months after it was agreed, the Pope (who was technically in charge at the time) rubbished the enterprise completely (which is what reluctant signatory King John wanted him to do all along) and hardly any of the charter’s provisions survive in law. That it has any influence at all should be a surprise. That it’s the central text of representative democracy and the rule of law all over the place is mind-blowing. This is how pieces of paper (parchment) become totems, people.

2. The first one isn’t the important one. Later ‘editions’ of Magna Carta, copied out by monarchs, bishops, lawyers, barons – each introducing their own variations, glosses, limitations, expansions – have been more important in the formation of law and practice. Henry III’s 1225 version is probably the most influential and the nearest to a definitive Magna Carta.

3. Magna Carta didn’t make it into print for nearly 300 years. The first printed edition was published in London in 1508 (Caxton got going in 1473) and the first English translation wasn’t printed until 1534. That’s when its influence exploded. Hardly anyone knew it existed before that – the constitution nerds and rule-of-law geeks of their day. Once it could be passed around, though, in compact printed form, its language began to be used in laws, cited in disputes with overbearing monarchs, quoted in the popular prints. So – you guessed this already – the long-term influence of Magna Carta is actually all about advances in content distribution technology.

Part of the 1689 Bill of Rights
4. The Bill of Rights of 1689 is a much more important document. It’s an actual act of Parliament to begin with, using recognisable legal language, and most of its provisions actually survive in law. It’s the Bill of Rights that we have to thank for the modern idea of ‘civil rights’. Many later documents owe a lot to the 1689 Bill of Rights – not least its American namesake (if you Google ‘Bill of Rights’ the English one doesn’t show up until page two) and the European Convention on Human Rights (PDF). I’m happy to learn that the resonant phrase “certain ancient rights and liberties” is from the Bill of Rights. It’s also, incidentally, unbelievably beautiful. Whoever wrote out the original document had the most exquisite roundhand. It makes Magna Carta look shabby.

5. The Cato Street conspiracy is one intense story. And it’s got the lot: a government spy, a honey trap, a ridiculous, hopelessly bodged plan straight out of a Tarantino movie and a brutal response from the state, including the last judicial beheading to take place in England. The conspirators set out not to assassinate a statesman; they set out to assassinate all of them – the whole cabinet anyway. Their beef was, er, vague, but hinged on the oppression triggered by the wave of European revolutions that preceded it. And Magna Carta was cited in the defence when the case came to trial.

Poster for Chartist meeting, Carlisle, 1839, from the National Archives
6. The Chartists knew how to design a poster. As I said, I’m no historian but the orthodoxy is that the Chartists achieved almost nothing. They were after the vote for working men but it was decades before suffrage was extended meaningfully (and did you know that it was 1918 before all men over 21 could vote?). Fear of dissent and revolution meant the Chartists were harried out of existence before they could produce any change. But, while they were active, they were great communicators and the first movement to make really smart use of mass protest, of what we’d now call ‘the street’. This poster, which is in the National Archives, is absolutely beautiful. A vernacular letterpress masterpiece. We should all aspire to such clarity (there are others, like this one, for a meeting at Merthyr Tydvil in 1848 and this one, for a meeting in Birmingham in the same year. All lovely).

7. 1935 was the 720th anniversary of the signing of Magna Carta so, unaccountably, a year before that, a great pageant was held at Runnymede, site of the signing.

Advertised as a celebration of English democracy, the pageant engaged some 5000 actors, 200 horses and 4 elephants, who over eight days performed eight historical scenes, the centrepiece being a recreation of the sealing of Magna Carta. (Apparently the elephants were withdrawn at the last minute.)

The pictures and this Pathé newsreel suggest a very English blend of eccentric and noble, camp and dignified. I’d love to have been there. This BL blog post suggests something rather splendid and rousing: ‘It’s a Knockout’ meets a BBC Four history doc.

What’s wrong with atheists?

Mel Brooks as Moses receives the 15 Commandments from God (you thought there were ten, right?)

I’m an atheist. Just getting that out of the way. Because this is about a problem that I have with atheists. Not all atheists. Just the strident ones, the humourless ones who form and join clubs, who campaign and complain and object. The ones who picket shopping malls when they provide prayer rooms but not ‘rational contemplation rooms’. Those ones.

The source of my problem is simple enough. Atheists are wrong. To be clear: they’re not especially wrong. They’re just roughly as wrong as everyone else. And, like everyone else, from far enough away they’re almost completely wrong. I can say this with certainty. We’ve got plenty of evidence. Thousands of years of it. Neolithic astronomers could line up the stones for the equinox but were wrong about everything else. Copernicus knew the planets orbited the sun but, we can see, got practically everything else wrong. The Papal inquisition was wrong. But so was Galileo. Newton was wrong. Darwin was wrong. Even the mighty Darwin. The splendid edifice of his scholarship is intact and still uniquely influential but, across the decades, large parts of it have been revised, replaced, dropped – as they should. The flat-earthers and the ether/phlogiston merchants – they were all wrong. But then, later on, so was Einstein. Being wrong is more-or-less universal (everyone’s wrong) and more-or-less eternal (all the time). And the more time passes, the more wrong we all are.

To make it more obvious, go back a bit further. Go back ten thousand years, in fact. To the time of the first big settlements and the beginning of farming and the origin of written language and inquiry into the world. What did we know then that isn’t now known to be wrong? Clue: almost nothing. See what I mean?

Now wind forward ten thousand years from the present day: from out there, from as far into the future as we’ve come since the last ice age, almost everything we take for granted now is going to be wrong. Horribly, fundamentally wrong. Wrong in ways that will ripple through human knowledge and force us to revise even our most basic assumptions about the world. Wrong in ways that will make our future selves laugh as they look back and wonder how any of us – believers or non-believers – managed to dress ourselves in the morning.

But, you’ll protest, it’s not about being right or wrong, it’s about the method. Rational inquiry – the scientific method – actually depends on being regularly, consistently wrong. And, of course, you’ll be right. The big difference between the scientific method and the invisible fairies crowd is the tolerance for being wrong, the constant readiness to check your thought against reality and revise it. The religious folk have a fixed worldview. In fact, their worldview depends on nothing changing: on invariant laws handed down by Gods. Case closed, surely?

But no. Not at all. Rewind again (go the whole ten thousand if you want). Examine the thought of an earlier era – the myths and laws and creation stories of that time. See where I’m going with this? Are they really invariant? Are they even, in fact, recognisable? Do the beliefs that animated the irrational folk of earlier eras still apply? No, they don’t. They’ve been overturned, thrown out and replaced – dozens, hundreds, thousands of times. Objects of worship, origin stories, social and ritual elements: are any the same now as they were in earlier periods? Hardly any. It turns out that just because religious people say their beliefs are eternal and unvarying, it doesn’t mean they actually are. They shift and change constantly. The Vatican, which persecuted and executed astronomers, now operates an important observatory. Muslims, Jews, Buddhists – they change their minds all the time, constantly (when looked at from the right distance) revising and updating their beliefs, quietly dropping the stuff that’s incompatible with current models.

So, rational folk (like me) are as wrong as everyone else and – more than that – have no monopoly on a readiness to update their thought as they acquire new knowledge. And this is what upsets me about the assertive hard-core of atheists/secularists/rationalists – the ones who put ‘atheist’ in their Twitter bios, do stand-up comedy about the silly believers, sue the council for putting on carol concerts and all the rest. Being slightly less wrong than the God botherers doesn’t make you right. We should have the humility to recognise that – over the long run – we’re all gloriously, irredeemably wrong.

Update 30/04: James O’Malley has posted an interesting response to this post called, naturally, ‘What’s Right with Atheists’!

Tim Berners-Lee’s most important decision

British Library digitised image from page 161 of '1763. Combined History of Shelby and Moultrie Counties ... With illustrations, etc'
A handshake, from the British Library on the Flickr Commons

Of the dozens of design decisions that TBL made during 1989, all of which continue to shape the way we build and use the web twenty-five years later, the most important was not requiring permission to link. Seems obvious now – a non-feature in fact – but it’s the reason you’re not reading this on Xanadu (or Prestel or on an X500 terminal or something). The logic of the times – embedded in those other systems – was that documents and data sources had owners and that you couldn’t just link to them without some kind of formal permission. Permission was defined as a system-to-system, technical handshake kind of thing or a person-to-person interaction, like a phone call, or, God forbid, a contract and some kind of payment. TBL’s judgement was that the community he was building his system for – the academics and engineers – wouldn’t want that and that the spontaneity of the hyperlink would trump the formality of permission. And, of course, he was right. It’s the spontaneously created hyperlink that triggered the marvellous, unstoppable promiscuity of the World Wide Web. It explains the web’s insane rate of growth and the profusion of web services. It’s the root of all this.
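You can see the whole of that decision in the markup itself: a hyperlink is just a URL sitting in a document, and nothing in the format involves the target’s knowledge or consent. Here’s a minimal sketch in Python (using only the standard library’s `html.parser`; the URLs are illustrative) that pulls the outbound links from a fragment of HTML – note that there’s no handshake, registry or permission step anywhere:

```python
# A hyperlink is just an href attribute on an <a> tag. Extracting every
# link from a page is pure text processing - no contact with the target.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href attribute of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

html = """
<p>See <a href="https://info.cern.ch/">the first website</a> or
<a href="https://example.org/anything/at/all">any page anywhere</a>.</p>
"""

collector = LinkCollector()
collector.feed(html)
print(collector.links)
# → ['https://info.cern.ch/', 'https://example.org/anything/at/all']
```

Contrast that with the permission-first systems of the era, where creating a reference to someone else’s document would have meant a registration or negotiation step before the link could exist at all.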

Seven things it’s worth remembering about Wikileaks

Before its inglorious founder takes it down with him or before it’s chased off the Internet by enraged governments, it’s worth remembering what Wikileaks was before it became a cause célèbre:

  1. It used to be a wiki. It stopped being a wiki in 2010.
  2. It was an anonymous drop-box. Whistleblowers could deposit documents without fear of being identified. This was the radical core of Wikileaks. They say that submissions are still accepted but the drop-box was switched off in 2010 too.
  3. It was about using the Internet’s open, peer-to-peer, symmetric-in-all-directions architecture to return power to ordinary people inside dumb corporations and repressive regimes. The kind of thing we always said the Internet was for. But it was also anarchic and unaccountable. It made free speech advocates and netheads queasy.
  4. There was something glamorous and edgy about it. It was morally complicated, like a le Carré plot. All those secrets and their forced disclosure, the chaos and panic that their untimely release caused, the attacks from government black-hats, the comicbook torrent of documents fired in its direction. And whatever you think of Assange – hero, demagogue, victim, criminal – he’ll be an important figure when the histories of the first decades of the Internet era are written.
  5. It became home to documents removed from the public record by courts or governments. It claimed a status above national law and essentially demolished the super-injunction and the cosy media blackout. This was bound to make it of interest to lawyers and governments right from the start.
  6. It was run by a maverick and his mates, so governance and accountability looked weak. Wikileaks contained the seeds of the Assange meltdown from the beginning. We could have anticipated all this (maybe not the Ecuadorian embassy balcony bit).
  7. It was a trial-run for a full-on infowar, for authority’s fight-back against the unruly net. Payment processors, service providers, media partners and sponsors all came under huge pressure and mostly buckled. The net’s apparently ungovernable, distributed, supra-national structure turned out to provide hardly any protection at all. We learn that a determined state supported by compliant corporations can damage or destroy an outlaw entity like Wikileaks. That’s an important lesson for you cyberpunks. You’re gonna need a bigger boat.

(update, 22 August, I collapsed the eight things into seven.)

NTK: “exactly the same thing, 15 years late”

Fifteen years ago, when it was all fields round here, Danny O’Brien and Dave Green – who were well-known in underground gaming/comedy/tech/confectionery circles – began to publish an email newsletter for and about the British tech community. It was a joy from the beginning – authentic, funny, playful, insightful… Geek storytelling that is probably already consulted as a primary source for the arrival of this whole network/digital/computer thing… NTK is on the newsstands again, in a clever time-shifted format. Sign up here.

Bowblog is ten years old today

I adapted the name from Kevin Werbach’s Werblog, which seemed like a cool thing to do at the time. I’d been blogging for a few years before that (since the twentieth century) but only fragments survive. The archive suggests I used to update it a lot more too – back before Twitter. Anyway, I’ll tweet a few anniversary posts. Here’s the first one, from 21 May 2002, which includes a link to a column I used to have in The Guardian’s Online section (then under the legendary Vic Keegan’s editorship).

Igor Stravinsky, Tupac Shakur and the uncanny

(a post from 2012, which is pretty uncanny in itself)

The Player Piano was the Tupac Hologram of its day.

The most thrilling of our inventions are the ones that return to us a person we’ve lost or that recall a scene from the past that we couldn’t have experienced or a place we couldn’t have known. There’s a rush, a kind of zipwire effect. WOOSH. BANG. You’re there. And sometimes these experiences are so vivid they cross over into the uncanny and the hairs on the back of your neck stand up. A list of these moments would be a long one, but try this ultra-vivid portrait of the Carusos in 1920. The rush here is a compound effect of a fabulous technology, as-yet unmatched in the digital era – a large, glass negative – plus the amazing light on that New York terrace and those eyes (those eyes are what Barthes would have called this photograph’s ‘punctum’). Or this: the first view of the earth from the moon. Tell me you didn’t shiver (and note, also, that in order to qualify as ‘uncanny’ it doesn’t need to be a hyper-real simulation of a human).

The player piano is another piece of nineteenth century tech that’s highly productive of the uncanny. The knowledge that the sound you’re hearing, when the paper roll begins to turn, faithfully reproduces the actual playing of a long-dead musician – not an acoustic recording but an actual mechanical trace, punched into a paper tape by the actual force of the player’s fingers – changes the effect startlingly.

The fact that sometimes that musician was the composer – Gershwin or Rachmaninov or Stravinsky – makes it more uncanny still. I was lucky enough to be standing next to one of these player pianos – a kind of half human-mechanical hybrid steampunk cyborg – ten days ago in a Broadcasting House studio. Its owner Rex Lawson rolled it up to the studio Steinway and attached it like a grabbing symbiont to the keyboard and then brought to life one of Stravinsky’s amazing piano rolls (and acting as much more than an operator – more of a second player). It was a remarkable experience: Stravinsky was very much in the room. Here’s a video I made of that strange encounter of machine, memory and music:

Rex Lawson ‘playing’ Stravinsky on a Pianola player piano in 2012

And, as if in confirmation that we live in strange times, a few days later, Tupac ‘appeared’ at Coachella, turning the uncanny dial up a few notches but instantly reminding me of that Stravinsky experience. I wish I’d been there, of course. Everyone who was says that it was amazing – and some were so freaked out by Tupac’s ‘appearance’ they declared that they disapproved, that it was somehow disrespectful. And the Tupac hologram, which wasn’t actually a hologram, but a projected synthesis of historic appearances and some clever 3D simulation, is from the same family of technologies – a direct descendant, in fact, of Rex Lawson’s rattling, mechanical Pleyel time machine. Spooky.

Tupac’s hologram ‘appears’ at Coachella 2012