Losing your soul

It’s been said that vampire mythology has remained compelling for generation after generation because the myths lend themselves to the preoccupations or anxieties of each age. In Bram Stoker’s time this might have been an anxiety about industrialisation, or perhaps the alien East; more recently we’ve seen vampires transformed into romantic outsiders (the Twilight twaddle) or, away from the mainstream, into parables of capitalist excess, AIDS, or homosexuality more generally (Perfect Creature, Daybreakers, True Blood and so on).

I caught up with film director Neil Jordan’s most recent excursion into the vampire world, Byzantium. It’s good to have a violent edge restored to the myth after the teenage goo of the Twilight saga, though it’s a much smaller film than Jordan’s previous Interview with the Vampire, and has had less success. That’s not surprising: despite a strong cast it was disappointing in many ways, needing more time and space to explore the ideas it threw up.

All the same, along the way it raised, perhaps inadvertently, an interesting question about one of the most common features of horror myths: the idea that you might gain immortality in return for your soul.

In the more religious ages of Christopher Marlowe, Goethe, or even Stoker, it’s apparent that the loss of the soul would be more straightforward (and frightening) than it is to us now. In those times it was (a little) clearer what might be meant by the soul, the part of us destined in any case for immortality either in heaven or hell. For Doctor Faustus it was the pact with the devil, the choice between an extraordinary life here on earth and the possibility of eternal bliss in the afterlife. This idea spilt over into vampire myths. In most 20th century versions, while some victims are simply turned into monsters by other monsters, there are often those who will willingly choose the vampire fate rather than die (this idea persists strongly in True Blood).

In Byzantium the loss of the soul was clearly presented as the price you’d pay for immortality, and yet it was not apparent in the film, nor in the broader context of our secular lives, what exactly it is that we would lose through this bargain. In earlier versions of the vampire myth, certainly in Stoker’s Dracula, and Christopher Lee’s version for Hammer, the vampire is reduced to a kind of bestiality, with no real emotional attachments and not much more than an instinct to survive. In this “monster” version the soul becomes the thing that makes us distinctly human, our capacity to care for others, the value we give to emotions and (arguably) through those emotions to ethical behaviour.

Religious belief is rooted in this idea, but it doesn’t depend on religious belief. In his contribution to the RSA’s Spirituality project Iain McGilchrist asked whether it still made any sense to talk of the soul in a secular world, and came up with a positive answer: in his account the concept of soul is a way of understanding the fullness of how we are in the world, the fullness of our experience, without necessarily invoking metaphysical beliefs about the divine or life beyond death.

In this view we might say that the idea of the soul is an aspect of consciousness, or even a generous concept of consciousness (it’s hard to say what that distinction really means, but that’s part of the problem of discussing consciousness). We could even say that caring about the question of what it means to have a soul is itself part of what it means to have one.

Thinking in this way means trying, through epistemology, to achieve a better understanding of what might be possible, of what we can and cannot know.

Thinking this way has to acknowledge the relatively recent realisation that our sense of identity is subject to the physical reality of our brains, that certain kinds of brain damage may alter our personalities radically, or destroy the memories on which our sense of identity depends. It seems to me that this inconvenient truth makes it very difficult to maintain the idea, common to pretty well all religions, that life on this earth is (in Keats’ phrase) a vale of soul making, with the fruits of those labours only fully realised in a life beyond this one.

This concept of the persisting soul itself depends on a concept of some kind of persisting identity, as well as the belief that this identity, this self or soul, is responsible for its own development. But if a bang on the head can send all that development onto a different track, then it’s hard to see how we can be held responsible for it; or, to put it less judgementally, which soul/identity is going to persist into another kind of life?

On the other hand, and this is part of McGilchrist’s wider thinking, it’s naïve to say the least to equate electrical or chemical activity in the brain with “thought” or indeed consciousness. The equation is made because we can correlate the two, but we shouldn’t confuse correlation with equation (or identity). We need a better account of embodied consciousness, without naive materialism but also without resorting to the tangles of metaphysics (let alone religious metaphysics). It’s plausible that a secular concept of soul offers a way of doing this.

This is all very well for debate in a philosophy seminar. It doesn’t make for particularly gripping drama. One of the problems with Byzantium, and indeed with many modern vampire stories, is that they want to put some weight on the loss of soul, the price to be paid, without having any way of taking seriously what this could mean. So our lead characters, far from being monstrous, appear persistently human, with a full range of emotional and moral concerns, apart from the fact that they routinely have to cut into others’ veins and consume all their blood. The stories depend on this persistence of humanity to command our interest and sympathy.

Byzantium’s vampires are not even excluded from the daylight. Theirs is a subtler burden, the pain of living secretly among humans and knowing yourself to be different (hence the ready analogy with queerness). This doesn’t really seem like soullessness, more a fairly common aspect of human experience.

There is another sense in which choosing immortality will immediately estrange us from our humanity, our soul. Uncomfortable though it might sometimes be, our sense of what it is to be human really might depend on our mortality, on the fact that we age and die. This doesn’t make the prospect of death any more welcome in itself, but it does make it more acceptable. As Tennyson’s Tithonus complains, “me only cruel immortality/Consumes”. The force of that “me only” falls on “consumes”: Tithonus is not the only immortal, but the immortal gods are made of different stuff, their ageless eternity quite unlike the withered Tithonus who persists only as a “white-hair’d shadow” in the world. We might reasonably wish for more time with better health, but in the end even the futility of wanting more is part of what it is to be human.

The bitter irony is that for fundamentalists of all stripes, the promise of an afterlife, of an existence more important than life on earth, or even a cause they think might live on through their action, is enough to make them forget their humanity and destroy what life we do know is real. There be monsters.

The mind of the reader

Sensitivity to context has long been a cardinal virtue in writing, a prerequisite in both creative and business work. In this second part of a short essay on good writing I want to look at how creative writers address context, and how business communication frequently gets it wrong.

Business writing always starts with three (deceptively) simple questions about context: who are your audience, what are they thinking, and what would you like them to think?

Creative writers in contrast cannot really know who might be reading their work. You could say that one of the things distinguishing creative from commercial writing is that the former must to a large extent create its own context, set its own rules and expectations.

This hasn’t always been so. Limited literacy and publishing meant that pre-Romantic writers (and poets in particular) could assume that their readers were drawn from a much smaller group in society, whose tastes and expectations were easier to anticipate. Intelligent writers might still play with ideas of the familiar or unfamiliar reader, with the very specialised manner of speaking that is poetry, but the Romantic movement changed things. Romanticism might have preceded mass literacy, but it marked the beginning of a conscious attempt to find a language held in common with ordinary people, reflecting a belief in the possibility of a shared experience that transcended class and circumstance. The post-Romantic Tennyson, certainly the first and still one of the few poets ever to become rich on the sales of his work alone, spent his literary career worrying about the relationship between the personal and the common, about what he could mean when he wrote the word “I” (not so much a philosophical problem as an imaginative one).

This concern stayed with post-Victorian literature. TS Eliot wrote of “the escape from personality” offered by poetry. Towards the end of the 20th century a large movement within literary criticism went down a blind alley attempting to uncouple the idea of a writer-as-creating-intelligence from the (culturally conditioned) responses of the reader, apparently finding the cultural conditioning, the context, more interesting than the creativity (it is not). At the same time, outside these rarefied and stultifying critical airs, another cultural current elevated the status of the creative mind to an untouchable position, as if creative earnestness was enough in itself to command respect (it is not).

Good writers have had to think through and with these uncertainties.

Business writing faces what looks like a simpler task. It has a given context, however broadly that may be defined. Indeed the definition will often be very broad, including diverse interest groups. It’s been said that if this definition starts to divide into primary and secondary audiences you’ve already doomed your communication to failure. That might be an overstatement, but it underlines the importance of understanding exactly who you’re talking to and what you’re trying to do, which includes an understanding of what success would look like.

It’s fair to say that the poor quality of business communication all around us is rooted in a lack of proper thought about context.

It’s not too difficult to find examples of blatant bad practice, the times when a business seems to have decided it wasn’t worth worrying about polish or professionalism. It never seems particularly surprising when something like a railway station announcement is wrapped in mealy-mouthed verbiage (no surprise even if the announcement has been pre-recorded using a professional voiceover), so you’ll hear that you’re being watched by cameras “for the purposes of safety and security” when “for” would have meant exactly the same thing (why use one word when you can use four).

It’s not too hard to find examples of corporate jargon slipped unreflectively into inappropriate contexts. Ironically, sales and marketing people, who should be more alert to jargon than most, are often the worst offenders. I don’t know whether to laugh or cry when in my local multiplex cinema I’m told I can buy popcorn or ice cream in the foyer “from the concession stand”. It’s true that, being in the business I am, I can work out what a concession stand is, but I don’t really know why I’m being asked to consider the cinema’s sub-letting arrangements rather than being told how most easily I can spend some more money.

This misuse of jargon reflects a thoughtlessness about context. Other misuses around the idea of brand outside marketing audiences can have far more serious consequences than my irritation, but that’s a bigger topic.

These are stupid, blatant mistakes, but I’m more interested in subtler errors of judgement, which reveal something about the pressures on business-speak and why they need to be resisted.

Here’s a statement from Microsoft’s new CEO Satya Nadella, talking about the acquisition of Nokia’s mobile phone business, as reported on The Register IT news site.

“The mobile capabilities, hardware design expertise, and world-class manufacturing and supply chain operations they bring will help us drive innovations in devices to delight our customers,” Nadella said of the acquisition.

I wish Nadella well (and I use Microsoft products all the time), but one of the things he needs to do is find his authentic voice. I appreciate that as he took over he needed to reassure as much as excite, but his first letter to staff and the film released to accompany it had clumsy PR hands all over them: a story that sounded as if it had been derived from true things about him, but which didn’t sound like him saying it; or at least it sounded like whatever he thought he ought to say, rather than what he might have said left to himself.

“Authenticity” is an idea that gets knocked around quite a lot and amusingly enough in business minds can often prompt the question “how do I achieve authenticity?” The quote about the Nokia acquisition starts off reasonably enough, summing up the benefits, but then hits a jarring false note when he says this is all about bringing “delight” to Microsoft’s customers.

Why is this a false note? It’s because it sounds like it’s come straight out of a management book, and that’s a problem because it’s being used out of context. While the creation of products that “delight our customers” might be a reasonable imperative for product developers (not least because few Microsoft products lately have offered that delight), it can only sound presumptuous as a more general statement. It makes it sound as though all is so well in the Microsoft world that the company can now focus on higher-level pleasures rather than reworking the basics to get them right. Nadella knows that this needs to be done, and since in the real world he is overseeing changes to put things right, he should be letting that real world guide what he says. It’s the old PR fantasy that repeating something for long enough will make it so, but this can only be remotely feasible (and I’d argue even then is not desirable) when the people you’re talking to are not faced with directly contrary evidence.

Speaking as if readily available truths were not real is one of the most extreme examples of loss of context, a retreat into a kind of self-regarding bubble which pays insufficient attention to whatever lies beyond the self.

This imperative to think through context raises another important point about authenticity. In the world of sales, being a brash loudmouth might constitute a certain kind of authenticity, but it may not be the most effective way of getting people to listen to you. We need to find ways of being true to our own voices (and all of us have more than one) which at the same time respect the expectations and concerns of our audiences.

For instance any experienced copywriter will have been faced with briefs that demand “punchy” copy. There’s a time and a place for brevity, or even for words that hit hard, but there are as many contexts in which a gentler tone (which doesn’t preclude brevity) is likely to be better received: by and large people don’t like being punched.

A real concern is that someone who could be so unreflective and insensitive about the meaning of his or her words in a brief is not going to be able to distinguish good from bad copy written in response to that brief, and here must lie part of the explanation of why we all must contend with daily showers of corporate bullshit: there are too many unskilled people in the communication business.

This is not to suggest that business people need to be better artists. It wouldn’t be a bad thing perhaps, but it’s not realistic, and nor is it necessary. We need rather to remove business communication from the bubble it’s blown round itself, to connect it again to the worlds in which business people themselves live when they’re not at work.