Posts Tagged ‘Language’

“But the word feminist, it doesn’t sit with me, it doesn’t add up. I want to talk about my problem that I have with it. First of all, on a very base level, just to listen to it. We start with fem. That’s good… Ist. I hate it. I hate it. Fail on ist. It’s just this little dark, black, it must be hissed. Ist! It’s Germanic but not in the romantic way. It’s just this terrible ending with this wonderful beginning…

Let’s rise up a little bit from my obsession with sound to the meaning. Ist in its meaning is also a problem for me. Because you can’t be born an ist. It’s not natural… So feminist includes the idea that believing men and women to be equal, believing all people to be people, is not a natural state…

And so unless somebody comes up with a better one – and please do – my pitch is this word. Genderist. I would like this word to become the new racist. I would like a word that says there was a shameful past before we realized that all people were created equal.”

– Joss Whedon, during this hot mess of a speech

When you posit that two of the main problems with the word feminist are the offputting phonetics and unnatural implications of its final syllable, then promptly suggest a replacement word that uses the exact same fucking syllable in the exact same fucking placement while changing the part you claimed was great – which backflip you manage to perform in the space of a single, pre-prepared speech – it’s probably time to sit all the way down and shut the fuck up about feminism.

Listen, Joss Whedon: you’ve made some cool, transformative, feminist shit, plus a bunch of other stuff – or sometimes the same stuff! – which is awesome despite being problematic on multiple fronts, though as always, YMMV. That much is undeniable. But you’ve also done some truly fucked-up things, like firing Charisma Carpenter for being pregnant, planning to have Inara gang raped in order to make Mal Reynolds a hero, and repeatedly racefailing your representations of POC, especially the women; and now you’ve got the gall to stand there and proclaim the ineffectiveness of feminism at a conceptual level – to agree, in effect, with Elle UK’s recent attempt to rebrand the movement – because you don’t like the word?

Before we proceed any further, let’s get one thing straight: there are times and places for changing our language on the basis of what a particular term originally implied, or of what it continues to imply. Language is important and sneaky; it changes our thinking without our even realising it, and when we make a conscious effort to reclaim that process – to be clear and unambiguous, to avoid causing hurt, and to set aside long-standing biases better left as historical footnotes – that is an important, a powerful thing. But this is not the case with the many successive attempts to rebrand feminism; to replace it with words like equalist or genderist, which invariably involve the removal of that disquietingly feminine prefix. Rather than redressing a lexicographical wrong, it’s a way of downplaying the role and relevance of women within their own movement in order to make others feel more comfortable with the concept of equality, a form of taxological silencing derived from the same logic which recently saw a female speaker ejected from the Michigan House of Representatives for saying ‘vagina’ while talking about abortion. For as long as the word feminism is deemed both radical and confrontational for its use of the feminine prefix, it will remain a necessary word precisely because of how perfectly our cultural uneasiness with women’s rights is reflected in our uneasiness with a term that dares to make them its focus.

Because linguistically, feminism is a word rooted firmly in the female quest for equality, an origin story which speaks of combat against oppression, not its perpetuation. Which isn’t to say that the movement has never been oppressive, either then or now. Early white feminists routinely threw women of colour under the same bus Rosa Parks and Claudette Colvin before her were forced to the back of, openly spouting racist views and stealing the foundations of modern feminism from the women of the Iroquois Confederacy, a practice all too often continued today by the erasure of the feminist contributions of WOC, the endorsement of men like Hugo Schwyzer, the aggressive Islamophobia of Femen (an organisation, coincidentally, which is run by men), and Caitlin Moran’s assertion that she “literally couldn’t give a shit about” the representation of WOC in media, to say nothing of the repeated transphobic abuse and cissexist attitudes of radical feminists towards trans women and their inclusion in feminist spaces. Which is why womanism has arisen as a separate institution to feminism – as a way for black women especially, but WOC generally, to discuss their rights and needs without being spoken over, condescended to, misappropriated, elided or otherwise ignored by white feminists too oblivious to their own privilege to realise that, as per the words of Flavia Dzodan, feminism will be intersectional or it will be bullshit.

All of which is a way of saying: there are many good reasons to discuss the future of feminism, its relationship with oppression and the way this intersects with our use of language. The failures of the movement – and there are many – are not derived from its nomenclature, but are rather a disappointment to all that it should encompass, but doesn’t. With so much toxic history bound up in exclusionary feminist thinking, it may well be that the best answer, long term, is to find ourselves a new title and start afresh. But when Joss Whedon comes out, completely ignores the existence of such conversations, suggests that race is a comparable side-issue to gender rather than a major intersection with it and says that, no, the way to move feminism forward is to rebrand it using a word he invented all by himself, because apparently the true spirit of feminism is best encapsulated by our uncritical capitulation to a powerful white guy who cracks jokes about the Taliban and publicly shames Katy Perry while telling the rest of us what we’re doing wrong? FUCKING NO.

In Whedon’s recent adaptation of Much Ado About Nothing – a film I otherwise loved – there’s a single ugly moment that perfectly encapsulates the nature of his fail. Brought to the altar to wed a woman he thinks is an unknown substitute for his beloved Hero, whom he presumes dead, the guilt-wracked Claudio declares his intent to marry her “even were she an Ethiope” – which is to say, even if she were ugly or otherwise socially unacceptable. Being as how this is 2013, rather than 1599, when the play was first written, Whedon could easily have changed this line, removing or altering it without any loss of drama. Instead, he chose to emphasise it, cutting quickly to the disapproving face of a nearby black woman – someone he might well have hired just for that single purpose, given the otherwise lilywhite casting – for a comic beat as Claudio speaks the line. It was jarring and awful and needless, and more than anything else of Whedon’s I’ve seen recently, it reminded me that here is someone who needs to have his shit called out, and loudly. Because if you can put that much conscious thought and planning into making a joke about the ugliness of black women and still get up and call yourself a feminist, then something in your view of the world is seriously wrong.

Recently, several writers I respect have been blogging about backstory, exposition and simplicity. The first of those posts, by Patrick O’Duffy, got me thinking about what backstory really means. Heading into a novel, it’s quite usual for me to have dedicated reams of wordage to figuring out who my characters are, what they’re like, what major events (if any) have defined them, how they relate to everyone else in the story, and where they might end up. Depending on the narrative, anything from all to none of this information might prove to be plot-critical; even so, there’s a decent chance that a reasonable portion of it will get used. Once upon a time, I’d have been happy calling that backstory, but having read O’Duffy’s piece, the term no longer feels applicable. Or, more accurately, it doesn’t seem to apply in quite the same way. As a word, backstory is suggestive of information that has already been superseded by the coming narrative – the sort of character-blurb you might write into an obliging box on a D&D character sheet in the sure and certain knowledge that anything you say, no matter how personally relevant, will have no bearing whatsoever on the coming adventure. At least, that’s my memory of high school level RPGing, anyway; whatever personality I gave my character would be as detached from the main narrative as if I’d bothered to try and impose a fictitious history on my avatar in Neverwinter Nights. In such gaming scenarios, the importance of backstory is reduced to a fairly binary set of good/evil questions designed to shape your personal morality, such as: will my character kick this puppy? Should I steal the gold from the old lady, or give her more to buy medicine? Will I help the druids defend the trees, or shall I fight their preachy asses? (Note: I am probably the only person in the entire world who helps the druids at that point. Some NPCs just ask to be eaten by bears.)

But writing a novel, it seems to me, is a markedly different endeavour. If the story is analogous to the gaming campaign, then the characters – and their histories – have ceased to be detached from the main quest arc: there are no more NPCs, because every character is a potential party member. RPG campaigns constrain the narrative in that certain characters exist only to help the protagonists forward. The helpful tavern wench cannot suddenly join the quest, no matter how resourceful, brave and clever her backstory might prove her to be. But then, why would you give an NPC backstory beyond what’s necessary to explain the aid they give the protagonist? The answer highlights a significant, crucial difference between pantsers and plotters, viz: for pantsers, the wench can always join the party. Backstory grows organically, so that any random secondary character might suddenly leap into the limelight and refuse to leave without being granted six soliloquies and a curtain call. For plotters, however, such things are fixed from the outset: the relevant leads have already been chosen, and the wench is not among them. Which might go a long way towards explaining why some plotter-writers are leery of backstory – any details they include must, of necessity, be plot-relevant; and if it’s plot-relevant, then it’s not backstory, which instead becomes a label for all the information that had no place in the main narrative. In this context, therefore, suggesting that writers should keep backstory out of their writing doesn’t mean their characters shouldn’t have history; only that said history should be relevant.

But for some of us, to paraphrase Faith from Buffy the Vampire Slayer, there is no such animal as irrelevant history. Pantser or plotter, if you’re in it for the characters, then the nitty-gritty of their lives – past or present, regardless of the degree of plot-importance – will always be meaningful. Which is where we come to Chuck Wendig’s post on exposition, because this is not, contrary to how it might appear, an excuse to dump any old crap about the protagonist into the story and call it plot-critical. Exposition is a question of structure, not content: if you’re going to flesh out your characters, then it shouldn’t be at the expense of readability. Relevant to the plot and relevant to the character aren’t mutually exclusive conditionals – in fact, they ought to overlap. But if we were to render the story as a Venn diagram, it shouldn’t be mandatory for the two circles to appear as one: there’s plenty of room for play. As Aliette de Bodard’s piece on simplicity points out, economical stories aren’t necessarily better than expansive ones; in fact, there’s a lot to be said for sprawl.

A slight aside, at this point: the other day, I was mulling over the sameness of mainstream Hollywood films. Specifically: why is the stereotypical Five Man Band so ubiquitous, and why do so many movies keep failing the Bechdel Test? Trying to tease out the cause of the problem – using, as my case study, the appalling Captain America – it suddenly struck me that backstory might be the missing element, with narrative oversimplification a major contributing factor. Consider the following premise: that Hollywood films will usually focus on the exploits of a single protagonist, with any secondary characters set to orbit the lead like satellites. Because of the time constraints inherent to cinema as a medium, this creates a strong impetus to make every interaction count, and if the story is meant to focus on the protagonist, then the natural default, script-wise, is to ensure that the vast majority of conversations are held either with or about the lead. If, as is so often the case, the protagonist is male, this sets the film up for near-guaranteed failure of the Bechdel test, for the simple reason that the secondary characters – regardless of gender – aren’t allowed to have superfluous conversations. This also means that the secondary characters don’t matter in and of themselves. It’s the difference between writing about a hero and his gang, and writing an ensemble cast: the two stories might have the same number of characters in identical roles, but the distinction is one of emphasis. A Five Man Band is there to support a single leader, whose personal struggles dominate the narrative – but in an ensemble, everyone matters equally.

Hollywood is not good at ensembles.

This is particularly evident when existing stories are adapted to the big screen. It’s generally assumed that any adaptation must, of necessity, pare back the secondary character development in order to allow a sharper focus on the Main Plot. Though done in the name of time-sensitivity, what this actually means is that, far too often, all the nuance which attracted people to the story in the first place – the worldbuilding, the detail and the cast as a whole – gets butchered in translation. Audiences react badly to such treatment because they can see what’s missing: there are holes where better characterisation (among other things) should be. But here’s the kicker – this is just as true of original feature films. All scripts go through multiple drafts, and if you assume that relevant information isn’t being lost in those cuts, I’d invite you to think again. Right now, the Hollywood default is to pick a protagonist, deny them backstory, throw them into an adventure with a bunch of NPC Pokemon sans the evolutionary moonstone, and hope that events are strong enough to carry them forwards. This is what happens when we demand utility from every conversation while simultaneously acting under time constraints and focusing exclusively on immediate, rather than past, events; and it is not my favourite thing.

Which is why, to return to the earlier point, worldbuilding and backstory are two of the qualities I look for most in a narrative. Stories without sprawl, while nonetheless capable of being utterly awesome, tend to feel like closed ecosystems. Combine Ebert’s Law of Economy of Characters with The Law of Conservation of Detail, add a dash of Chekhov’s Gun, and you can start to see what I mean. Such stories aren’t predictable, per se – though this can definitely be a problem – but are rather defined by absolute catharsis. They’re murder mysteries without the red herrings, worlds where you can’t go off-mission and explore the map, meals without any delicious leftovers to be used for future cookery and consumption. Speaking of his Discworld novels, Terry Pratchett has said that he created the city of Ankh-Morpork as somewhere that would keep going once the book is closed; the sort of place where the characters have lives to be getting on with even after the story ends. The Discworld might well exist on the back of four elephants stuck to a giant turtle flying through space, but it feels real, because its many stories, inhabitants and cities are – just like our own world – awash in irrelevant detail. To wankily quote myself, I’ve said before that:

The stock premise of epic fantasy – defeating the Dark Lord to save the kingdom – has always sat awkwardly with me, if only because it so often comes to feel as though the world in question only exists as a setting for that one battle, and not as a realm in its own right… Ultimate confrontations with ancient evil are fine, to be sure, but they don’t lend much to the idea of a world which, left to its own devices, will just be a world: one where good and evil are intermingled in everyday human activity, rather than being the sole province of warring gods and their acolytes.

It’s a view I stand by, and something I think it’s important to remember. More and more often, it feels like arguments about writing in the SFF community – such as the recent Mary Sue debate – hinge on a fundamental failure to distinguish between bad writing and narrative tropes and decisions exacerbated by bad writing, as though the inclusion of specific ideas, character traits or story-forms is the real problem, and not, as might actually be the case, the quality of their execution. Point being, I think we’ve started to become a bit too deeply invested in streamlined narratives. We talk about trimming the dead weight from stories the same way one might imagine some shark-smiled management consultant talking about axing the creative department over budgetary concerns; as though the story is a high-profile office in which can be found no room for cheerful, eccentric sentences who wear colourful shirts on Friday and eat all the biscuits at meetings. Stories without foible, indulgence or quirk, but where everything must arrive at 9am sharp in a business suit with a briefcase. In fact, it strikes me as telling that much of the language we use to discuss the improvement of books is simultaneously fat-phobic, sports-centric and corporate. Bad books are flabby, soft and bloated; good books are lean, raw and hard-hitting. Or maybe I’m just projecting.

In my own writing, I tend to sit somewhere in the middle of the pantser/plotter continuum, which isn’t particularly unusual. Though I almost always start with a single protagonist as a narrative focal point, my casts invariably grow in the worldbuilding process, and while I do write out copious backstory for my original characters, I’m still frequently surprised when bit-players queen themselves, or when planned protagonists turn out to be happy in the background. I chart my main plot points and narrative arc, but leave everything else to chance – often with unexpected results. Some writers are far more rigid; others are far more lax. But if this blog had a point, it was the realisation that the reason my stories tend to end up with so many main characters is because I inevitably become involved with their backstories. As has been pointed out by innumerable people, every character is the hero of their own adventure – and as I’m now nearly 40,000 words into a new novel, jumping between POVs while wrangling multiple events, this felt like a good time to stop and discuss what that actually means. Thanks to O’Duffy, I’ve come away with a much stronger concept of what backstory is – to me, to others and in general. Thanks to Wendig, I’ve got a sharper idea of how to apply it without turning my story into a swamp of boring detail. And thanks to Bodard, I’ve realised the importance of sprawl – not just in the worlds I already love, but in the creation of my own.

I’ve just been reading this interesting post over at Katharine Kerr’s blog about trying to define what constitutes a work of literary fiction, as opposed to a work of genre fiction. She also talks about the dangers of arguing against litfic and the literary establishment by way of derogatory strawman arguments, not only because this is exactly the kind of negative pigeonholing SFF fans and writers have always railed against when it’s turned our way, but also because it’s unhelpful in trying to understand what literary fiction actually is. It’s an interesting question, but in trying to answer it, I ended up in quite a different place to where I started. Rather than hijack the conversation, therefore, I’m going to take the comment I left as a starting point for answering a slightly different question: how might a lack of named literary subgenres be impeding the success of women literary writers?

As a casual glance at the blogosphere will reveal, there’s been considerable recent debate in SFF quarters about the feminisation of epic fantasy and the nihilism of gritty fantasy, conversations that have been in no small part facilitated by our ability to distinguish between the different SFF subgenres. We know that Tolkien is the Mitochondrial Eve of fantasy writers: one way or another, all our works descend from his. But as with the human race, things have grown more and more diverse with successive generations, such that trying to use fantasy as an exclusive, catch-all terminology has become, except in the most general sense, both useless and inaccurate. Without a language of subgenre terms with which to discuss these differences, it seems inevitable that SFF writing would automatically default to using Tolkien as a blueprint for all new stories – and indeed, up until very recently, we can see that this was the case. Only when writers started trying to break new ground did an alternate language spring up to try and describe their efforts. Partly, it must be admitted, this happened as a means of distancing such heretical works from their canonical predecessors, but also because it was suddenly necessary to look back over everything that had happened since Tolkien and ask whether fantasy could – or should – be more than just the same old Eurocentric, medieval vision of elves, dwarves, men and halflings fighting a succession of ultimate Dark Lords over and over again.

Suddenly, fantasy ceased to be a universal term, and once we started talking and thinking in subgenres, it became easier to understand why new types of story were arising; to pinpoint the tropes their authors wanted to address or change, and for what reasons. True, it also became harder to classify individual works, and the need to fit each and every book into a particular box is something we’re all still arguing about. But the fact is that language is important. Naming a thing allows us greater control over it, and that’s true regardless of whether we’re talking about the magic of Earthsea or the politics of Earth. Consider, for instance, this article by feminist philosopher Jennifer Saul, wherein she talks about the significance of language in feminism. To quote:

“Languages may also lack words for things that matter a great deal to women. This sort of gap is another way that a language can be seen as encoding a male worldview. The term ‘sexual harassment’, for example, is a recent feminist innovation. Women’s discussion of their experiences led them to see a certain common element to many of their problems, and as a result they invented the term ‘sexual harassment’. Once the problem was named, it became much easier to fight sexual harassment, both legally and by educating people about it.”

Which brings me to the matter of the Orange Prize – or rather, to the recent suggestion that an equivalent award is necessary to combat sexism in the Australian literary scene. It’s none too encouraging a sign when women take steps to set themselves apart from men, not because they want or ought to, but because discrimination has left them with no other means of achieving success. For an intelligent and comprehensive rundown on the issue, I highly recommend this excellent piece by writer Benjamin Law, wherein he says, among other things:

“If you take Brookner’s insistence that a meritocracy exists, what are you supposed to make of the raw figures? Books written by women are reviewed less. Women win fewer literary prizes. If that is a meritocracy, then you have to buy into the argument that books by women must be inherently inferior. I can’t accept that. The danger on relying on meritocracy is assuming one actually exists.”

But what, I hear you cry, does any of this have to do with SFF subgenres? Only that women SFF writers seem to have a stronger platform from which to argue their case for equality, simply because their dialogue about content, bias and narrative is so much more linguistically robust than in the literary community. This is not to say that the problems outlined by the recent VIDA statistics on the representation of women vs men in literary reviews are absent in SFF; indeed, it has been demonstrably proven that they aren’t. But when it comes to the question of awards, it doesn’t seem unduly optimistic to say that change is in the air. The Hugo Award nominees for Best Novel in 2011, for instance, are all, with one exception, women, and the situation is identical when it comes to the Nebula. The 2010 Campbell Award was won by a woman, Seanan McGuire, and for the purposes of my argument about subgenre, it doesn’t strike me as irrelevant to note that McGuire’s debut novel, Rosemary and Rue, was the first work of urban fantasy to win its author the Campbell, nor that it did so in competition with another female-authored UF novel: Gail Carriger’s Soulless.

So much of the debate I’ve seen about the disenfranchisement of women literary writers centres on anger at the perception of women’s novels as being “domestic” (where such a label is pejorative) compared to those written by men, which naturally deal with Big Themes and Important Issues. What has always struck me about this complaint – aside from the fact that it appears to be correct, both intuitively and in terms of critical perception – is the difficulty these writers seem to have articulating the problem. They talk about literature and literary fiction as a single entity, grasping after a magical phrase that will allow them to explain simultaneously why women might indeed be more prone to writing about domestic topics, why this is not a bad thing, and why it still counts as literature. Because such amorphous justifications are exactly what subgenre terminology exists to prevent, allowing us to acknowledge that two different types of storytelling are related to one another, that they share a common ancestry and ultimately a common genre, but that their conventions and approaches may nonetheless be very, very different. As in the case of last year’s Hugo Award nominees for Best Novel, it allows us to put vastly different works like China Mieville’s The City & The City, Paolo Bacigalupi’s The Windup Girl and Catherynne M. Valente’s Palimpsest on the same ballot, despite the fact that the first is heavily noir/New Weird, the second is dystopian SF, and the third is urban fantasy/mythpunk.

It also puts the SFF community in an excellent position to discuss tropes and archetypes. A communal, cultural resource like TV Tropes provides a go-to lexicon for discussing narrative structure in shorthand, with many such terms finding their way into the mainstream dialogue as a matter of course. Perhaps it’s because the origins and inspirations of SFF are so heavily rooted in jargon-heavy areas like mythology, science, linguistics, pop culture and folklore that the community has taken so readily to isolating and naming its parts; alternatively, it seems reasonable to assume that any group of people who spend a significant proportion of their intellectual lives reading made-up languages, interpreting new cultures and memorising invented systems of magic will inevitably come to appreciate the deep precision and relevance of language. Whatever it is, the literary community doesn’t seem to share it – or if it does, then to nowhere near the same extent.

As more than one would-be inventor of slanguage has come to realise, there’s no telling which new terms will enter our collective vocabularies or die a series of quick deaths. But as corny and New Age as it might seem, it strikes me that the writers most deeply concerned with the state of literary fiction and its biases against women could do a lot worse than trying to coin some terms of their own: to name the archetypes they wish to invert or criticise and thereby open up the discussion. If authors can be thought of as magicians in any sense, then the root of our power has always rested with words: choosing them, arranging them and – most powerfully – inventing them. Sexism won’t go away overnight, and nor will literary bias. But until then, if we’re determined to invest ourselves in bringing about those changes, it only makes sense to arm ourselves with a language that we, and not our enemies, have chosen.

So, OK. As those of you who’ve known me for any length of time can attest – and as I have once or twice admitted in the writing of this blog – I am a zeusdamn stubborn, conservative person. It is actually very irksome! Because stubbornness and conservatism are not behaviours I consciously cultivate; are in fact the very antithesis of the behaviours I like, let alone try to cultivate; and yet they are apparently innate enough that I am constantly forced to suspect myself of them, to press the ever-present bruise of my own laziness in order to determine whether I am being honest and discerning as opposed to reactionary and biased at any given time. As I am simultaneously the kind of person who goes around recommending books and films (for instance) to all and sundry with the expectation that they start to adopt my tastes, this makes me very close to belonging to two categories of person with whom I am otherwise deeply uncomfortable: hypocrites and preachers.

My only saving grace is the fact that I recognise this at least some of the time, and am actively struggling to change. But for most of my life, that hasn’t been true, with the end result that now, slightly less than a month out from my 25th birthday, I’m starting to wonder exactly how many awesome things I’ve been missing out on for no greater reason than my own intransigence. Which is, itself, a conceit, because I mean, come on: twenty-freaking-five. It’s not like I’m Citizen Kane crying out for Rosebud on my deathbed, here. Despite the fact that I’ve been married for three and a bit years, and in serious relationships for five-odd years before that, and in the midst of becoming a published author for about two years, and have finished a Bachelors degree, and have moved first states and now countries, and held down a frankly surprising variety of the sort of jobs I never really knew existed until I started applying for them, and all the sort of gunk that seems to fill up your late teens and early twenties if you’re lucky enough to live in a first world nation where you speak the national language and have been relatively well-off your whole life and have never had to contend with poverty or civil war or persecution or any major trauma; despite all that, I am, by the standards of both my own culture and the scientific community, barely out of adolescence. I am young.

But I am also much less young than I was even a year ago, or the year before that, or the year before that; and even though as a teenager it would never have occurred to me that I could sit here and be almost 25 and¬†so very different now to how I was then, I can still – just – stretch to remembering my teenage self, her views and preoccupations and ignorances, without universally cringing at how utterly infantile and stupid they were, so that any sense I used to have that I was already grown up must only ever have been wrong. I feel torn: can I deny that I’ve grown since then, and that those changes have been increasingly positive? No, I can’t: but does that automatically mean that whatever I used to be is therefore rendered incorrect, reprehensible? Psychologists say that one of the key stages of childhood development is the tendency to first disdain and then throw away those trappings of whatever age we have just outgrown, like a fledgeling tweenager tossing out her toys. I must still be a child, then, because more and more, I feel like every step I take to change myself is simultaneously a battle to refrain from mocking, not plastic horses and skipping games, but previous ideologies.

Once, as a first year university student, I wrote an angry letter to a Sydney newspaper about its inflammatory coverage of a series of car crashes involving adolescent drivers. It was terrible, yes, and those people had been stupid, but their reactionary condemnation of all youthful drivers – the suggestion that driving curfews be implemented, limitations imposed on the ability of teens to carry passengers – was out of line. No matter how much they raised the age limit for acquiring a driving license, I argued, and even taking into account whatever risk-taking predispositions we could all agree were more likely in the young, a significant part of the problem would still be inexperience behind the wheel. Some things you simply cannot learn through shortcuts, or any way but the hard way: sooner or later, we all make mistakes, because suffering their consequences is how humans learn, and even if nobody was ever allowed in a car before the age of 27, new drivers would still account for their fair share of accidents. Not because of their age: because they were new. And in the mean time, given that adult drivers would continue to account for the other eighty-something percent of accidents, what would happen if we broke the statistics down into age brackets? Would we find that the most elderly drivers were the least accident-prone, or that the probability of accidents would regularly decrease with age? Does getting older always make you better?

Turning five did not make me morally superior to my two-year-old self; just older and physically different. Turning fifteen did not make me morally superior to my twelve-year-old self; just older and physically different. The same will be true again when I turn twenty-five, and thirty-five, and every age after that. In so many of these blogs, I’ve written about the frustrations I felt as a teenager, how it was hard to get adults to take me seriously and how they all appeared to have gone through a brainwashing machine at some point or emerged fully formed from alien pod-plants. Even though I could understand things at fourteen that were incomprehensible to my four-year-old self, that greater proximity to the adult world made it seem as though adulthood was a static realm towards which I was both inexorably travelling and closer to reaching than ever, so that any suggestion of considering how much I’d already changed as a way of anticipating how much farther I had yet to go would have seemed futile, insulting; as though, on the cusp of adulthood, I still deserved to be reminded of – judged by – those things I’d outgrown; as though I hadn’t really grown up at all.

Which, of course, I hadn’t, because the whole idea was a lie. Nobody ever grows up. We just grow. But our language, which betrays so much of culture, suggests otherwise: hierarchies are linear, top to bottom: growing up means growing better. Nobody grows down. And yet up connotes even more than that. It makes us think of a fixed destination when there is none; it makes us want to not only cast off who we were, but disparage it as unnecessary, as though the very notion of ever being someone else is embarrassing, taboo; as though that prior person were utterly unrelated to every single subsequent incarnation.

Tonight, I have been reading Lilith’s Brood by Octavia E Butler, a single novel made from the collection of a trilogy of novels: Dawn, Adulthood Rites, and Imago. Having only just reached the start of the second of these, I came across a particularly beautiful quote. It is the reason I stopped to write this post; to consider why I had never read Butler before now, despite having heard of her, and to wonder if perhaps the reason I find her so moving, so compelling, is because I am reading her now. Would any of my earlier selves have understood?

Butler asks:

“Trade means change. Bodies change. Ways of living must change. Did you think your children would only look different?”

And I answer:

Not any more.

For some time now, I’ve been a serial language learner. In primary school, my Year 3 teacher spoke Japanese and taught some of it to my class, which we dutifully learned. Hearing of this, my grandmother, who taught Japanese immigrants to speak English after World War II, gave me the books she’d used to study the language herself. In this context, I started taking extra-curricular Japanese lessons. I was not, however, a dedicated student: detesting repetitive practice in the manner of children who otherwise learn so quickly as to find it tiresome, I made no effort to learn my katakana or kanji, and despite the fact that I enjoyed counting and making origami figures, the lessons eventually stopped. My occasional childhood encounters with Japanese culture, however, continued: first in the form of Miyoko Kyushu, an exchange student who stayed with my family for several weeks, and then in the guise of new neighbours, who, though Norwegian by descent, had lived in Japan for many years. All three sons learned the language, while both parents spoke it fluently. Like Miyoko, they kept the Japanese tradition of bringing gifts to one’s hosts, so that when we first met the Johansens at a welcoming-the-neighbours barbeque, the wooden geisha doll, Japanese picture book and hand-sewn juggling balls Miyoko had given me found company with a puzzlingly-shaped Japanese bag and several boxes of sticky (but delicious) Hello Kitty candies. With the exception of these last edible items, I still have everything else. Like my knowledge of Japanese numbers, it seems, they’ve never quite slipped away.

In high school, I learned French and German as part of the school curriculum. Some words from each have stuck with me, such as counting sequences, greetings and a handful of random nouns, although somewhat inexplicably, I’ve also retained a teaching song in French detailing the birthday gifts received by a fictitious singer from his various relatives. Around the same time, I decided that archaeology was my destined career, and was advised that the best languages to learn for this were Latin (for the antiquity) and German (for reasons which now seem both dubious and odd). Given that I went to a public school, such a decision was problematic: with seventeen interested students deemed not enough to sustain a full class, I ended up taking German after school, while for Latin, I was forced to resort to a correspondence course.

When I changed schools the following year, the German didn’t last; but Latin did. I kept it up through all of high school, even taking advanced Latin units for the HSC despite my appalling grasp of grammar. Once again, my lack of enthusiasm for rote learning saw any chance at fluency well and truly shot, although my pronunciation skills and stock vocabulary were generally on par. By the time university rolled around, my interests had swung from archaeology to the history of the Middle East, such that, rather than continuing Latin, I started learning Arabic instead. I stuck it out for one year, but was still, ultimately, a lazy student: I simply couldn’t (or wouldn’t) motivate myself to do the required homework and memorisation necessary for learning a spoken language, despite the fact that learning a new script had proved a cinch – after all, I used to invent alphabets in class when I was bored, memorise them in that hour, then write to myself in that cypher for a day, or a week, or however long it took me to lose interest or start again. But vocal fluency is different. Historically, I’ve been unjustly apathetic in this regard, perhaps because I find it frustrating to have to actually work at acquiring a new language, when in almost every other discipline – the exception being maths, which I’ve never liked – I’m able to osmose comprehension with a comparative lack of effort, especially when interested in the subject. That’s the irony of native intelligence: without a competitive drive, learning becomes purely a matter of convenience. And I’m not a competitive person.

For a while, then, I stopped learning languages – until a few weeks ago, when a friend offered to teach a beginner’s course in Mandarin Chinese. I went to four or five of his classes, and had a good time: if nothing else, I can now count to ten in Mandarin, and at least in the short term, I can recognise certain words and written characters. As with Arabic, however, there’s a strong chance I’ll forget most, if not all of it, although my track record suggests that if anything stays, it will be the numbers. This might seem paradoxical, given my dislike of maths, but remembering things in sequence is always easier than remembering them individually, at least for me.

Subsequently, since stopping the Mandarin classes, I’ve been thinking about my history with trying and failing to acquire new languages. I like the idea of being bilingual, but short of actually moving to a non-English-speaking country, could I ever convince myself to put in the required effort? Certainly, I’m more dedicated now than I was then, and more patient; this time around, it was time constraints which caused the change of heart, not lack of interest. Which brings me back to Japanese, the first language I ever tried to learn, and the one which, oddly, I still have the most to do with. Although my foray into learning karate ended several years ago, I still remain extremely interested in anime. Since discovering anime and manga through a friend at the start of high school, I’ve never wavered in my affection for the genre, and although at times it’s been a secondary interest, I’m currently undergoing a surge of renewed fandom. Which makes me realise that, far from having forgotten the little Japanese I learned as a child, I’ve actually built upon it, albeit in a highly specialised area. Thanks to the catchy themes of shows like Cowboy Bebop and Evangelion, I’ve taken the time to write down and memorise the written-English phonetics to several Japanese songs, learning them by heart. Through comics, interested Googling and contextual exposure, I’ve picked up the various Japanese terms of address, the rules governing their usage, and a smattering of vocab. Cumulatively, this represents the greatest interest I’ve ever directed towards learning a language, despite having nothing to do with academics. And it’s been fun.

All of which leads me to conclude that, if I were to sit down as an adult and properly attempt a language, in my own time and of my own volition, I’d be well advised to try Japanese, coming full circle. And all for the geekiest, laziest possible reason. Which makes me grin.

Ah, irony!

I was in a fey mood last night, but ‘fey’ didn’t quite seem to cover it. Burdened with the need to update my Facebook status accurately and appropriately, I scanned my knowledge of the English language for a suitable adjective – fruitlessly. Finally, after many minutes of struggle, I put on my thinking boots and invented a new word: mnemencholy, derived from mneme (memory) and melancholy (sadness). Content at last, I slept.

On waking, I discovered that the illustrious Nick Harkaway, that well-known Englishman and little-known lexicographer, had already found my word and proceeded to blog a better definition for mnemencholia than I could possibly articulate. I am therefore stealing it; or rather, approving it for future usage. So, for those who are interested, mnemencholia (from mnemencholy) now officially means:

“Nostalgic sorrow brought on by recollection; melancholia triggered by an object, phrase, or scent and its associated memories; the wide sense of understanding and regret rising from the apprehension of one’s own history.”

Awesome.

I love the idea of neologisms. Above any other quirk, I cherish the malleability of the English language. It rewards linguistic creativity, and, indeed, encourages it. There’s something profoundly satisfying in creating or stumbling on a new term, particularly if we find it clever, or funny, or apt, or (especially) all three. I love that crazy, screwball, onomatopoeic slang like woot and clusterfuck can breed successfully in darkness, like forest mushrooms. I love that Shakespeare has left us with Shylock and seachange; that A. A. Milne gave us heffalump, tigger and wol; that crazy British aristocrats gave us sandwich, sundowner and pukka while equally crazy Londoners gave us yob and Cockney rhyming slang. I love that tactile imagery like whale tail, muffin top and bridezilla made their way to the dictionary, while gribblies, grock and meme are increasingly of the now.

What I don’t like, however, is corporate jargon. I shudder at every mention of swings and roundabouts, blue sky thinking, synergistics, action items or actioning tasks. Some people might call that hypocritical (and, indeed, some have), but the difference is one of joy and functionality. Corporate jargon doesn’t delight in itself. It isn’t clever, nor do buzzwords become popular because people enjoy their use. Rather, they become awkward, mechanical mainstays, often more cumbersome and less helpful than the plain language they replace. Technical jargon, in its proper sense, means words that are part of a specialised vocabulary, as in the medical, legal and IT professions, but this is not true of corporate jargon. It obfuscates, generalises, hinders. Many terms grow, not from playful creativity, but from uncorrected malapropisms. Whereas slang is viral in the digital sense, passing rapidly by word of mouth through a series of enthusiastic adapters, corporate jargon is a virus in the medical sense, infiltrating healthy cells and using them to manufacture new infections, which then spread through a mixture of force, proximity and submission. Cliches, at least, began as sturdy concepts: their very effectiveness led to overfamiliarity, like playing a favourite song so frequently that it becomes unbearable. The best mutate into aphorisms. Not so corporate jargon, which is propagated purely on the basis of necessity, and not effectiveness.

In short, good language is just another way of thinking clearly, or creatively, or at all. Like all new things, neologisms need to be tested, experimented with, tried on – our choice of slang is just as relevant to our personalities as our taste in clothes, films or music, and yet, quite often, we fail to even make a conscious decision about the words we use, or the circumstances under which we use them. Language, it’s been said, is the most singular achievement of our species, and even without an alphabet, it’s still something unbelievably special.

So don’t take your speech for granted. Read up on collective nouns (they’re pretty awesome); put old words into new contexts; watch Joss Whedon shows; read Scott Westerfeld or Shakespeare or Kaz Cooke or Geoffrey McSkimming or anyone at all; think. But more than that, have fun.

It’s what words are for.

I’m thoroughly fed up with the deluge of patriotic, nationalistic advertising during the Olympics coverage. Top offenders include Telstra, with their motifs of manufacturedly-diverse Australians clustered around mobile phones to watch the Games; Qantas, with their children’s choir singing in the shape of a kangaroo about which island continent they call home; and Panasonic, who have shamelessly co-opted almost the entire swim team in order to sell more cameras. The Commonwealth Bank also rates a mention, not so much due to patriotism, but because their bizarre series of forcedly-post-modern, let’s-mock-American-marketeers-while-simultaneously-selling-home-loans commercials are currently broadcast on Channel 7 at the rate of approximately ten to the half hour.

When it comes to bafflement, however, Red Rooster takes the cake. Their most recent campaign slogan, ‘it’s gotta be red’, has been frotting around the airwaves for most of 2008, but has been quixotically altered in honour of the Olympics. ‘Notice how well red goes with China?’ their ads ask – and for the life of me, I cannot tell whether irony is intended, or if the fact that red is traditionally synonymous with communism has managed to completely escape the marketing gurus of a giant American – that is to say, capitalist – corporation. Surely, a part of me thinks, this can’t be the case. Someone, somewhere must have pointed out that China’s flag is red for a reason. But if that be so, then the irony is unintended, and therefore equally perturbing in its implications: that a capitalist company has, on the one hand, publicly commented on how well communism suits China; and on the other, is now using this fact to sell chicken.

Truly, the mind boggles.