Posts Tagged ‘Politics’

When I think about the state of global politics, I often imagine how it’s going to be viewed in the future.  My reflex is to think in terms of high school history textbooks, but that phrase evokes a specific type of educational setup that already feels anachronistic – that of overpriced, physical volumes written specifically for teaching teenagers a set curriculum, rather than because they represent good historical summaries in their own right. I think about our penchant for breaking the past down into neatly labelled epochs, and wonder how long it will take for some sharp-tongued future historian to look at the self-professed Information Age, as we once optimistically termed it, examine its trajectory through the first two decades of the new millennium, and conclude that it should be more fittingly known as the Disinformation Age.

With that in mind, here’s my hot take on what a sample chapter from such a historical summary might look like:

Chapter 9: Perprofial Media, Propaganda and Power

Perprofial, adj: something which is simultaneously personal, professional and political. 

When Twitter, the first widespread micro-blogging platform, was launched in 2006, no one could have predicted that, barely eleven years later, this new perprofial medium would have irrevocably changed the political landscape. Earlier social media sites, such as Facebook, were foremost a digital extension of existing personal networks, with aspirational connections an afterthought; traditional blogging, by contrast, began as a form of mass broadcast diarising which steadily – though not without hiccups – osmosed the digital remnants of print-era journalism. But from the outset, Twitter was a platform whose users could both listen and be listened to, a sea of Janus-headed audience-performers whose fame might as easily precede that particular medium as be enabled by it, unless it was both or neither. The draw of enabling the unknown, the upcoming, the newly-minted and the long-established to all rub shoulders at the same party – or at least, to shout around each other from the variegated levels of an infinite, Escheresque ballroom – was considered just that: a draw, instead of – as it more properly was – a Brownian mob-theory engine running in 24/7 real time without anything like a Chinese wall, a fact-checker or a control group to filter the variables.

The true point at which Twitter stopped being a social media outlet and became a Trojan horse at the gates of the Fifth Estate is now a Sorites paradox. We might not be able to pinpoint the exact time and date of the transition, but such coordinates are vastly less important than the fact that the switch itself happened. What we can identify, however, is the moment when the extrajudicial nature of the power wielded by perprofial platforms became clear at a global level.

Though Donald Trump’s provocative online statements long preceded his tenure as president, and while they had consistently drawn commentary from all corners, the point at which his tweets were publicly categorised as a declaration of war by North Korean authorities was a definite Rubicon crossing. As Twitter could – and did – ban users for issuing threats of violence in violation of its Terms of Service, it was argued, then why should it allow a world leader to openly threaten war? If the “draw” of the platform was truly a democratising of the powerful and the powerless, then surely powerful figures should be held to the same standards as everyone else – or even, potentially, to more rigorous ones, given the far greater scope of the consequences afforded them by their fame.

But first, some context. At a time of resurgent global fascism and with educational institutions increasingly hampered by the anti-intellectual siege begun some sixty years earlier, when the theory of “creationism” was first pitched as a scientific alternative in American public schools, the zeitgeist was saturated with the steady repositioning of expertise as toxic to democracy. Early experiments in perprofial media, then called “reality television,” had steadily acclimated the public to the idea that personal narratives, no matter how uninformed, could be a professional undertaking – provided, of course, that they fit within an accepted sociopolitical framework, such as radical weight loss or the quest for fame. At the same time, the rise of the internet as a lawless space where anyone could create and promote their own content, regardless of its quality, created an explosion of self-serving informational feedback loops which, both intentionally and by accident, preyed on the uncritical fact-absorption of generations taught to accept that anything written down in an approved book – of which the screen was seemingly just a faster, more convenient extension – was necessarily true.

The commensurate decline of print-based journalism was the final nail in the coffin. To combat the sharp loss of revenue caused by the jump from an industry financed by a cornered market, lavish advertising and a locked-in pay-per-issue model to the still-nebulous vagaries of digital journalism, where paid professional content existed on the same apparent footing as free amateur blogging, corners were cut. Specialists and sub-editors were let go, journalists were alternately asked or forced to become jacks of all trades, and content was recycled across multiple outlets. All of these changes were drastic enough to be noticeable even to the uninitiated; even so, the situation might still have been salvageable if not for the fact that, in looking to compete in this new environment, the bulk of traditional outlets made the mistake of assuming that the many digital amateurs of the blogosphere were, in aggregate, equivalent to their old nemesis, the tabloid press.

Scandal-sheets are a tradition as old as print journalism, with plenty of historical overlap between the one and the other. At some time or another, even the most reputable papers had all resorted to sensationalism – or at least, to real journalism layered with editorial steering – in an effort to wrest their readerships back from the tabloids, but always on the understanding that their legacy, their trustworthiness as institutions, was established enough to take the moral hit. But when this same tactic was tried again in digital environs, the effect was vastly different. Still struggling with web layouts and paywalls, most traditional papers were demonstrably harder and less intuitive to navigate than upstart blogs, and with not much more to boast in the way of originality (since they’d sacked so many writers) or technical accuracy (since they’d sacked so many editors), the decision to switch to tabloid, clickbait content – often by hiring from the same pool of amateur bloggers they were ultimately competing with, leveraging their decaying reputations as compensation for no or meagre pay in a job market newly seething with desperate, unemployed writers – backfired badly. Rather than reclaimed readerships, the effect was to cement the idea that the only real difference between professional news and amateur opinion wasn’t facts, or training, or integrity, but a simple matter of where you preferred to shop.

The internet had become an information marketplace – quite literally, in the case of Russia bulk-purchasing ads on Facebook in the lead-up to the 2016 US presidential election. In Britain, the success of the Leave vote in the Brexit referendum was attributed in part to voters having “had enough of experts” – the implication being that, contrary to the famous assertion of Isaac Asimov, many people really did think their ignorance was just as good as someone else’s knowledge. Though Asimov was speaking specifically of American anti-intellectualism and a false perception of democracy in the 1980s, his fears were just as applicable some forty years later, and arguably more so, given the rise of perprofial media.

In the months prior to his careless declaration of war, then-president Trump made a point of lambasting what he called the “fake news media”, which label eventually came to encompass every and any publication, whether traditional or digital, which dared to criticise him; even his former ally, Fox News, was not exempt. In the immediate, messy aftermath of the collapse of print journalism, this claim rang just nebulously true enough to many that, with so many trusted papers having perjured themselves with tabloid tactics, Trump was able to situate himself as the One True Authority his followers could trust.

It’s important to note, however, that not just any politician, no matter how sociopathic or self-serving, could have pulled off the same trick. The ace in Trump’s sleeve was his pre-existing status as a king in the perprofial arena of reality television, which had already helped to re-contextualise democracy – or the baseline concept of a democratic institution, rather – as something in which expertise was only to be trusted if supported by success, where “success” meant “celebrity”. Under this doctrine, those who preached expertise, but whom the listener had never heard of, were considered suspect: true success meant fame, and if you weren’t famous for what you knew, then you must not really be knowledgeable. By the same token, celebrities who claimed expertise in fields beyond those for which they were famous were also criticised: it was fine to play football or act, for instance, but as neither skill was seen to have anything to do with politics, the act of speaking “out of turn” on such topics was dismissed as mere self-aggrandising. Actual facts had nothing to do with the matter, because “actual facts” as a concept was rendered temporarily liminal by the struggle between amateur and professional media.

With such “logic” to support him, Trump couldn’t lose. What did his lack of political qualifications matter? He’d still succeeded at getting into politics, which meant he must have learned by doing, which meant in turn that his fame, unlike that of other celebrities, made him an inviolate authority on political matters. Despite how fiercely he was opposed and resisted, his repeated, defensive cries of “fake news!” rang just true enough to sow doubt among those who might otherwise have opposed him.

And so to Twitter, and a declaration of war. By historical assumption, Trump as president ought to have been the most powerful man in the world, but by investing so much of that power in a perprofial platform – one to whose rules of conduct he was personally bound, without any exemption or extenuation on account of his office – he had, quite unthinkingly, agreed to let an international corporation place extrajudicial sanctions, not only on the office of the presidency, but through Trump as an individual and his investiture as the head of state, on a declaration of war.

In the next chapter: racism, dogwhistles and spinning the Final Solution.

*

History is, of course, what we make of it. Right now, I just wish we weren’t making quite so much.

Warning: spoilers for Shin Godzilla.

I’ve been wanting to see Shin Godzilla since it came out last year, and now that it’s available on iTunes, I’ve finally had the chance. Aside from the obvious draw inherent to any Godzilla movie, I’d been keen to see a new Japanese interpretation of an originally Japanese concept, given the fact that every other recent take has been American. As I loaded up the film, I acknowledged the irony in watching a disaster flick as a break from dealing with real-world disasters, but even so, I didn’t expect the film itself to be quite so bitingly apropos.

While Shin Godzilla pokes some fun at the foibles of Japanese bureaucracy, it also reads as an unsubtle fuck you to American disaster films in general and their Godzilla films in particular. From the opening scenes where the creature appears, the contrast with American tropes is pronounced. In so many natural disaster films – 2012, The Day After Tomorrow, Deep Impact, Armageddon, San Andreas – the Western narrative style centres by default on a small, usually ragtag band of outsiders collaborating to survive and, on occasion, figure things out, all while being thwarted by or acting beyond the government. There’s frequently a capitalist element where rich survivors try to edge out the poor, sequestering themselves in their own elite shelters: chaos and looting are depicted up close, as are their consequences. While you’ll occasionally see a helpful local authority figure, like a random policeman, trying to do good (however misguidedly), it’s always at a remove from any higher, more coordinated relief effort, and particularly in more SFFnal films, a belligerent army command is shown to pose nearly as much of a threat as the danger itself.

To an extent, this latter trope appears in Shin Godzilla, but to a much more moderated effect. When Japanese command initially tries to use force, the strike is aborted because of a handful of civilians in range of the blast, and even when a new attempt is made, there’s still an emphasis on chain of command, on minimising collateral damage and keeping the public safe. At the same time, there are almost no on-the-ground civilian elements to the story: we see the public in flashes, their online commentary and mass evacuations, a few glimpses of individual suffering, but otherwise, the story stays with the people in charge of managing the disaster. Yes, the team brought together to work out a solution – which is ultimately scientific rather than military – are described as “pains-in-the-bureaucracy,” but they’re never in the position of having to hammer, bloody-fisted, on the doors of power in order to rate an audience. Rather, their assemblage is expedited and authorised the minute the established experts are proven inadequate.

When the Japanese troops mobilise to attack, we view them largely at a distance: as a group being addressed and following orders, not as individuals liable to jump the chain of command on a whim. As such, the contrast with American films is stark: there’s no hotshot awesome commander and his crack marine team to save the day, no sneering at the red tape that gets in the way of shooting stuff, no casual acceptance of casualties as a necessary evil, no yahooing about how the Big Bad is going to get its ass kicked, no casual discussion of nuking from the army. There’s just a lot of people working tirelessly in difficult conditions to save as many people as possible – and, once America and the UN sign a resolution to drop a nuclear bomb on Godzilla, and therefore Tokyo, if the Japanese can’t defeat it within a set timeframe, a bleak and furious terror at their country once more being subject to the evils of radiation.

In real life, Japan is a nation with extensive and well-practised disaster protocols; America is not. In real life, Japan has a wrenchingly personal history with nuclear warfare; America, despite being the cause of that history, does not.

Perhaps my take on Shin Godzilla would be different if I’d managed to watch it last year, but in the immediate wake of Hurricane Harvey, with Hurricane Irma already wreaking unprecedented damage in the Caribbean, and huge tracts of Washington, Portland and Los Angeles now on fire, I find myself unable to detach my viewing from the current political context. Because what the film brought home to me – what I couldn’t help but notice by comparison – is the deep American conviction that, when disaster strikes, the people are on their own. The rich will be prioritised, local services will be overwhelmed, and even when there’s ample scientific evidence to support an imminent threat, the political right will try to suppress it as dangerous, partisan nonsense.

In The Day After Tomorrow, which came out in 2004, an early plea to announce what’s happening and evacuate those in danger is summarily waved off by the Vice President, who’s more concerned about what might happen to the economy, and who thinks the scientists are being unnecessarily alarmist. This week, in the real America of 2017, Republican Rush Limbaugh told reporters that the threat of Hurricane Irma, now the largest storm ever recorded over the Atlantic Ocean, was being exaggerated by the “corrupted and politicised” media so that they and other businesses could profit from the “panic”.

In my latest Foz Rants piece for the Geek Girl Riot podcast, which I recorded weeks ago, I talk about how we’re so acclimated to certain political threats and plotlines appearing in blockbuster movies that, when they start to happen in real life, we’re conditioned to think of them as being fictional first, which leads us to view the truth as hyperbolic. Now that I’ve watched Shin Godzilla, which flash-cuts to a famous black-and-white photo of the aftermath of Hiroshima when the spectre of a nuclear strike is raised, I’m more convinced than ever of the vital, two-way link between narrative on the one hand and our collective cultural, historical consciousness on the other. I can’t imagine any Japanese equivalent to the moment in Independence Day when cheering American soldiers nuke the alien ship over Los Angeles, the consequences never discussed again despite the strike’s failure, because the pain of that legacy is too fully, too personally understood to be taken lightly.

At a cultural level, Japan is a nation that knows how to prepare for and respond to natural disasters. Right now, a frightening number of Americans – and an even more frightening number of American politicians – are still convinced that climate change is a hoax, that scientists are biased, and that only God is responsible for the weather. How can a nation prepare for a threat it won’t admit exists? How can it rebuild from the aftermath if it doubts there’ll be a next time?

Watching Shin Godzilla, I was most strongly reminded, not of any of the recent American versions, but of The Martian. While the science in Shin Godzilla is clearly more handwavium than hard, it’s nonetheless a film in which scientific collaboration, teamwork and international cooperation save the day. The last, despite a denouement that pits Japan against an internationally imposed deadline, is of particular importance, as global networking still takes place across scientific and diplomatic back-channels. It’s a rare American disaster movie that acknowledges the existence or utility of other countries, especially non-Western ones, beyond shots of collapsing monuments, and even then, it’s usually in the context of the US naturally taking the global lead once they figure out a plan. The fact that the US routinely receives international aid in the wake of its own disasters is seemingly little-known in the country itself; that Texas’s Secretary of State recently appeared to turn down Canadian aid in the wake of Harvey, while now being called a misunderstanding, is nonetheless suggestive of confusion over this point.

As a film, Shin Godzilla isn’t without its weaknesses: the monster design is a clear homage to the original Japanese films, which means it occasionally looks more stop-motion comical than is ideal; there’s a bit too much cutting dramatically between office scenes at times; and the few sections of English-language dialogue are hilariously awkward in the mouths of American actors, because the word-choice and use of idiom remains purely Japanese. Even so, these are ultimately small complaints: there’s a dry, understated sense of humour evident throughout, even during some of the heavier moments, and while it’s not an action film in the American sense, I still found it both engaging and satisfying.

But above all, at this point in time – as I spend each morning worriedly checking the safety of various friends endangered by hurricane and flood and fire; as my mother calls to worry about the lack of rain as our own useless government dithers on climate science – what I found most refreshing was a film in which the authorities, despite their faults and foibles, were assumed and proven competent, even in the throes of crisis, and in which scientists were trusted rather than dismissed. Earlier this year, in response to an article we both read, my mother bought me a newly-released collection of the works of children’s poet Misuzu Kaneko, whose poem “Are You An Echo?” was used to buoy the Japanese public in the aftermath of the 2011 tsunami. Watching Shin Godzilla, it came back to me, and so I feel moved to end with it here.

May we all build better futures; may we all write better stories.

Are You An Echo?

If I say, “Let’s play?”
you say, “Let’s play!”

If I say, “Stupid!”
you say, “Stupid!”

If I say, “I don’t want to play anymore,”
you say, “I don’t want to play anymore.”

And then, after a while,
becoming lonely

I say, “Sorry.”
You say, “Sorry.”

Are you just an echo?
No, you are everyone.

A poem by me, with apologies to Dylan Thomas:

Nevertheless, She Persisted

Nevertheless, she persisted.

Live women fighting we shall be one

With la Liberté and the French Joan;

When their hearts are picked clean and the clean hearts gone,

She shall wear laws at elbow and foot;

Though she go mad she will be sane,

Though she flees through the sea she shall rise again;

Though justice be lost the just shall not;

For nevertheless, she persisted.

.

Nevertheless, she persisted.

Over the whinings of their greed

Men lying long have now lied windily;

Changing their tacks when stories give way,

Stacking their courts, yet we shall not break;

Faith in our hands shall snap in two

And the unicorn evils run them through;

Split all ends up she shan’t crack;

And nevertheless, she persisted.

.

Nevertheless, she persisted.

No more may Foxes cry in decline

Or news break loud to a silenced room;

Where fawned a follower may a follower no more

Bow his head to the blows of this reign;

Though she be mad and tough as nails,

Her headlines in characters hammer the dailies;

Break in the Sun ‘till the Sun breaks down,

As nevertheless, she persisted.

The last few weeks or so, I’ve seen the same video endlessly going around on Facebook: a snippet of an interview with Simon Sinek, who lays out what he believes to be the key problems with millennials in the workplace. Every time I see it shared, my blood pressure rises slightly, until today – joy of joys! – I finally saw and shared a piece rebutting it. As often happens on Facebook, a friend asked me why I disagreed with Sinek’s piece, as he’d enjoyed his TED talks. This is my response.

In his talk, Sinek touches on what he believes to be the four core issues handicapping millennials: internet addiction, bad parenting, an unfulfilled desire for meaningful work and a desire to have everything instantly. Now: demonstrably, some people are products of bad parenting, and the pernicious, lingering consequences of helicopter parenting, wherein overzealous, overprotective adults so rob their children of autonomy and instil in them such a fear of failure that they can’t healthily function as adults, are very real. Specifically in reference to Sinek’s claims about millennials all getting participation awards in school (which, ugh: not all of us fucking did, I don’t know a single person for whom that’s true, shut up with this goddamn trope), the psychological impact of praising children equally regardless of their actual achievements, such that they come to view all praise as meaningless and lose self-confidence as a result, is a well-documented phenomenon. But the idea that you can successfully accuse an entire global generation of suffering from the same hang-ups as a result of the same bad parenting stratagems, such that all millennials can be reasonably assumed to have this problem? That, right there, is some Grade-A bullshit.

Bad parenting isn’t a new thing. Plenty of baby boomers and members of older generations have been impacted by the various terrible fads and era-accepted practices their own parents fell prey to (like trying to electrocute the gay out of teenagers, for fucking instance), but while that might be a salient point to make in individual cases or in the specific context of tracking said parenting fads, it doesn’t actually set millennials apart in any meaningful way. Helicopter parenting might be comparatively new, but other forms of damage are not, and to act as though we’re the only generation to have ever dealt with the handicap of bad parenting, whether collectively or individually, is fucking absurd. But more to the point, the very specific phenomenon of helicopter parenting? Is, overwhelmingly, a product of white, well-off, middle- and upper-class America, developed specifically in response to educational environments where standardised testing rules all futures and there isn’t really a viable social safety net if you fuck up, which leads to increased anxiety for children and parents both. While it undeniably appears in other countries and local contexts, and while it’s still a thing that happens to kids now, trying to erase its origins does no favours to anyone.

Similarly, the idea that millennials have all been ruined by the internet and don’t know how to have patience because we grew up with smartphones and social media is – you guessed it – bullshit. This is really a two-pronged point, tying into two of Sinek’s arguments: that we’re internet addicts who don’t know how to socialise properly, and that we’re obsessed with instant gratification, and as such, I’m going to address them together.

Yes, internet addiction is a problem for some, but it’s crucial to note it can and does affect people of all ages rather than being a millennial-only issue, just as it’s equally salient to point out that millennials aren’t the only ones using smartphones. I shouldn’t have to make such an obvious qualification, but apparently, I fucking do. That being said, the real problem here is that Sinek has seemingly no awareness of what social media actually is. I mean, the key word is right there in the title: social media, and yet he’s acting like it involves no human interaction whatsoever – as though we’re just playing with digital robots or complete strangers all the time instead of texting our parents about dinner or FaceTiming with friends or building professional networks on Twitter or interacting with our readerships on AO3 (for instance).

The idea, too, that millennials have their own social conventions different to his own, many of which reference a rich culture of online narratives, memes, debates and communities, does not seem to have occurred to him, because – so his argument goes – we’re not learning to do it face to face. Except that, uh, we fucking are, on account of how we still inhabit physical bodies and go to physical places every fucking day of our goddamn lives, do I really have to explain that this is a thing? Do I really have to explain the appeal of maintaining friendships where you’re emotionally close but the person lives hundreds or thousands of kilometres away? Do I really have to spell out the fact that proximal connections aren’t always meaningful ones, and that it actually makes a great deal of human sense to want to socialise with people we care about and who share our interests where possible rather than relying solely on the random admixture of people who share our schools and workplaces for fun?

The fact that Sinek talks blithely about how all millennials grew up with the internet and social media, as though those of us now in our fucking thirties don’t remember a time before home PCs were common (I first learned to type on an actual typewriter), is just ridiculous: Facebook started in 2004, YouTube in 2005, Twitter in 2006, tumblr in 2007 and Instagram in 2010. Meaning, most millennials – who, recall, were born between 1980 and 1995, which makes the youngest of us 21/22 and the eldest nearly forty – didn’t grow up with what is now considered social media throughout our teenage years, as Sinek asserts, because it didn’t really get started until we were out of high school. Before that, we had internet messageboards that were as likely to die overnight as to flourish, IRC chat, and the wild west of MSN forums, which was a whole different thing altogether. (Remember the joys of being hit on by adults as an underage teen in your first chatroom and realising only years later that those people were fucking paedophiles? Because I DO.)

And then he pulls out the big guns, talking about how we get a dopamine rush when we post about ourselves online, and how this is the same brain chemical responsible for addiction, and this is why young people are glued to their phones and civilisation is ending. Which, again, yes: dopamine does what he says it does, but that is some fucking misleading bullshit, Simon Says, and do you know why? Because you also get a goddamn dopamine rush from talking about yourself in real life, too, Jesus fucking Christ, the internet is not the culprit here, to say nothing of the fact that smartphones do more than one goddamn thing. Sinek lambasts the idea of using your phone in bed, for instance, but I doubt he holds a similar grudge against reading in bed, which – surprise! – is what quite a lot of us are doing when we have our phones out of an evening, whether in the form of blogs or books or essays. If I was using a paperback book or a physical Kindle rather than the Kindle app on my iPhone, would he give a fuck? I suspect not.

Likewise, I doubt he has any particular grudge against watching movies (or TED talks, for that matter) in bed, which phones can also be used for. Would he care if I brought in my Nintendo DS or any other handheld system to bed and caught a few Pokemon before lights out? Would he care if I played Scrabble with a physical board instead of using Words With Friends? Would he care if I used the phone as a phone to call my mother and say goodnight instead of checking her Facebook and maybe posting a link to something I know will make her laugh? I don’t know, but unless you view a smartphone as something that’s wholly disconnected from people – which, uh, is kind of the literal antithesis of what a smartphone is and does – I don’t honestly see how you can claim that they’re tools for disconnection. Again, yes: some people can get addicted or overuse their phones, but that is not a millennial-exclusive problem, and fuck you very much for suggesting it magically is Because Reasons.

And do not even get me started on the total fuckery of millennials being accustomed to instant gratification because of the internet. Never mind the fact that, once again, people of any age are equally likely to become accustomed to fast internet as a thing and to update their expectations accordingly – bitch, do you know how long it used to take to download music with Kazaa using a 56k modem? Do you know how long it still takes to download entire games, or patches for games, or – for that matter – drive through fucking peak-hour traffic to get to and from work, or negotiate your toddler into not screaming because he can’t have a third juicebox? Because – oh, yeah – remember that thing where millennials stopped being teenagers quite a fucking while ago, and a fair few of us are now parents ourselves? Yeah. Apparently our interpersonal skills aren’t so completely terrible as to prevent us all from finding spouses and partners and co-parents for our tiny, screaming offspring, and if Mr Sinek would like to argue that learning patience is incompatible with being a millennial, I would like to cordially invite him to listen to a video, on loop, of my nearly four-year-old saying, “Mummy, look! A lizard! Mummy, there’s a lizard! Come look!” and see what it does for his temperament. (We live in Brisbane, Australia. There are geckos everywhere.)

But what really pisses me off about Sinek’s millennial-blaming is the idea that we’re all willing to quit our jobs because we don’t find meaning in them. Listen to me, Simon Sinek. Listen to me closely. You are, once again, confusing the very particular context of middle-class, predominantly white Americans from affluent backgrounds – which is to say, the kind of people who can afford to fucking quit in this economy – for a universal phenomenon. Ignore the fact that the global economy collapsed in 2008 without ever fully recovering: Brexit just happened in the UK, Australia is run by a coalition of racist dickheads and you’ve just elected a talking Cheeto who’s hellbent on stripping away your very meagre social safety nets as his first order of business – oh, and none of us can afford to buy houses and we’re the first generation not to earn more than our predecessors in quite a while, university costs in the States are an actual goddamn crime and most of us can’t make a living wage or even get a job in the fields we trained in.

But yeah, sure: let’s talk about the wealthy few who can afford to quit their corporate jobs because they feel unfulfilled. What do they have to feel unhappy about, really? It’s not like they’re working for corporations whose idea of HR is to hire oblivious white dudes like you to figure out why their younger employees, working longer hours for less pay in tightly monitored environments that strip their individuality and hate on unions as a sin against capitalism, in a context where the glass ceiling and wage gaps remain a goddamn issue, in a first world country that still doesn’t have guaranteed maternity leave and where quite literally nobody working minimum wage can afford to pay rent, which is fucking terrifying to consider if you’re worried about being fired, aren’t fitting in. Nah, bro – must be the fucking internet’s fault.

Not that long ago, Gen X was the one getting pilloried as a bunch of ambitionless slackers who didn’t know the meaning of hard work, but time is linear and complaining about the failures of younger generations is a habit as old as humanity, so now it’s apparently our turn. Bottom line: there’s a huge fucking difference between saying “there’s value in turning your phone off sometimes” and “millennials don’t know how to people because TECHNOLOGY”, and until Simon Sinek knows what it is, I’m frankly not interested in whatever it is he thinks he has to say.

[gif: mic drop]

And lo, in the leadup to Christmas, because it has been A Year and 2016 is evidently not content to go quietly into that good night, there has come the requisite twitter shitshow about diversity in YA. Specifically: user @queen_of_pages (hereinafter referred to as QOP) recently took great exception to teenage YouTube reviewer Whitney Atkinson acknowledging the fact that white and straight characters are more widely represented in SFF/YA than POC and queer characters, with bonus ad hominem attacks on Atkinson herself. As far as I can make out, the brunt of QOP’s ire hinges on the fact that Atkinson discusses characters with specific reference to various aspects of their identity – calling a straight character straight or a brown character brown, for instance – while advocating for greater diversity. To quote QOP:

[Atkinson] is separating races, sexuality and showing off her white privilege… she wants diversity so ppl need to be titled by their race, disability or sexuality. I want them to be titled PEOPLE… I’m Irish. I’ve been oppressed but I don’t let it separate me from other humans.

*sighs deeply and pinches bridge of nose*

Listen. I could rant, at length, about the grossness of a thirtysomething woman, as QOP appears to be, insulting a nineteen year old girl about her appearance and lovelife for any reason, let alone because of something she said about YA books on the internet. I could point out the internalised misogyny which invariably underlies such insults – the idea that a woman’s appearance is somehow inherently tied to her value, such that calling her ugly is a reasonable way to shut down her opinions at any given time – or go into lengthy detail about the hypocrisy of using the term “white privilege” (without, evidently, understanding what it means) while complaining in the very same breath about “separating races”. I could, potentially, say a lot of things.

But what I want to focus on here – the reason I’m bothering to say anything at all – is QOP’s conflation of mentioning race with being racist, and why that particular attitude is both so harmful and so widespread.

Like QOP, I’m a thirtysomething person, which means that she and I grew up in the same period, albeit on different continents. And what I remember from my own teenage years is a genuine, quiet anxiety about ever raising the topic of race, because of the particular way my generation was taught about multiculturalism on the one hand and integration on the other. Migrant cultures were to be celebrated, we were told, because Australian culture was informed by their meaningful contributions to the character of our great nation. At the same time, we were taught to view Australian culture as a monoculture, though it was seldom expressed as baldly as that; instead, we were taught about the positive aspects of cultural assimilation. Australia might benefit from the foods and traditions migrants brought with them, this logic went, but our adoption of those things was part of a social exchange: in return for our absorption of some aspects of migrant culture, migrants were expected to give up any identity beyond Australian and integrate into a (vaguely homogeneous) populace. Multiculturalism was a drum to beat when you wanted to praise the component parts that made Australia great, but suggesting those parts were great in their own right, or in combinations reflective of more complex identities? That was how you made a country weaker.

Denying my own complicity in racism at that age would be a lie. I was surrounded by it in much the same way that I was surrounded by car fumes, a toxic thing taken into the body unquestioning without any real understanding of what it meant or was doing to me internally. At my first high school, two of my first “boyfriends” (in the tweenage sense) were POC, as were various friends, but because race was never really discussed, I had no idea of the ways in which it mattered: to them, to others, to how they were judged and treated. The first time I learned anything about Chinese languages was when one of those early boyfriends explained it in class. I remember being fascinated to learn that Chinese – not Mandarin or Cantonese: the distinction wasn’t referenced – was a tonal language, but I also recall that the boy himself didn’t volunteer this information. Instead, our white teacher had singled him out as the only Chinese Australian present and asked him to explain his heritage: she assumed he spoke Chinese, and he had to explain that he didn’t, not fluently, though he still knew enough to satisfy her question. That exchange didn’t strike me as problematic at the time, but now? Now, it bothers me.

At my second high school, I was exposed to more overt racism, not least because it was a predominantly white, Anglican private school, as opposed to the more diversely populated public school I’d come from. As an adult, I’m ashamed to think how much of it I let pass simply because I didn’t know what to say, or because I didn’t realise at the time how noxious it was. Which isn’t to say I never successfully identified racism and called it out – I was widely perceived as the token argumentative lefty in my white male, familially right-wing friend group, which meant I spent a lot of time excoriating them for their views about refugees – but it wasn’t a social dealbreaker the way it would be now. The fact that I had another friend group that was predominantly POC – and where, again, I was the only girl – meant that I also saw people discussing their own race for the first time, forcing me to examine the question more openly than before.

Even so, it never struck me as anomalous back then that whereas the POC kids discussed their own identities in terms of race and racism, the white kids had no concept of their whiteness as an identity: that race, as a concept, informed their treatment of others, but not how they saw themselves. The same boys who joked about my biracial crush being a half-caste and who dressed up as “terrorists” in tea robes and tea towels for our final year scavenger hunt never once talked about whiteness, or about being white, unless it was in specific relation to white South African students or staff members, of which the school historically had a large number. (The fact that we had no POC South African students didn’t stop anyone from viewing “white” as a necessary qualifier: vocally, the point was always to make clear that, when you were talking about South Africans, you didn’t mean anyone black.)

Which is why, for a long time, the topic of race always felt fraught to me. I had no frame of reference for white people discussing race in a way that wasn’t saturated with racism, which made it easy to conflate the one with the other. More than that, it had the paradoxical effect of making any reference to race seem irrelevant: if race was only ever brought up by racists, why mention it at all? Why not just treat everyone equally, without mentioning what made them different? I never committed fully to that perspective, but it still tempted me – because despite all the racism I’d witnessed, I had no real understanding of how its prevalence impacted individuals or groups, either internally or in terms of their wider treatment.

My outrage about the discriminatory treatment of refugees ought to have given me some perspective on it, but I wasn’t insightful enough to make the leap on my own. At the time, detention centres and boat people were the subject of constant political discourse: it was easy to make the connection between things politicians and their supporters said about refugees and how those refugees were treated, because that particular form of cause and effect wasn’t in question. The real debate, such as it was, revolved around whether it mattered: what refugees deserved, or didn’t deserve, and whether that fact should change how we treated them. But there were no political debates about the visceral upset another boyfriend, who was Indian, felt at knowing how many classmates thought it was logical for him to date the only Indian girl in our grade, “because we both have melanin in our skins”. (I’ve never forgotten him saying that, nor have I forgotten the guilt I felt at knowing he was right. The two of them ran in completely different social circles, had wildly different personalities and barely ever interacted, and yet the expectation that they’d end up dating was still there, still discussed.) I knew it was upsetting to him, and I knew vaguely that the assumption was racist in origin, but my own privilege prevented me from understanding it as a microaggression that was neither unique to him nor the only one of its kind that he had to deal with. I didn’t see the pattern.

One day, I will sit down and write an essay about how the failure of white Australians and Americans in particular to view our post-colonial whiteness as an active cultural and racial identity unless we’re being super fucking racist about other groups is a key factor in our penchant for cultural appropriation. In viewing particular aspects of our shared experiences, not as cultural identifiers, but as normal, unspecial things that don’t really have any meaning, we fail to connect with them personally: we’re raised to view them as something that everyone does, not as something we do, and while we still construct other identities from different sources – the regions we’re from, the various flavours of Christianity we prefer – it leaves us prone to viewing other traditions as exciting, new things with no equivalent in our own milieu while simultaneously failing to see to their deeper cultural meaning. This is why so many white people get pissed off at jokes about suburban dads who can’t barbecue or soccer moms with Can I Speak To The Manager haircuts: far too many of us have never bothered to introspect on our own sociocultural peculiarities, and so get uppity the second anyone else identifies them for us. At base, we’re just not used to considering whiteness as an identity in its own right unless we’re really saying not-black or acting like white supremacists – which means, in turn, that many of us conflate any open acknowledgement of whiteness with some truly ugly shit. In that context, whiteness is either an invisible, neutral default or a racist call to arms: there is no in between.

Which is why, returning to the matter of QOP and Whitney Atkinson, pro-diversity advocates are so often forced to contend with people who think that “separating races” and like identifiers – talking specifically about white people or disabled people or queer people, instead of just people – is equivalent to racism and bigotry. Whether they recognise it or not, they’re coming from a perspective that values diverse perspectives for what they bring to the melting pot – for how they help improve the dominant culture via successful assimilation – but not in their own right, as distinct and special and non-homogenised. In that context, race isn’t something you talk about unless you’re being racist: it’s rude to point out people’s differences, because those differences shouldn’t matter to their personhood. The problem with this perspective is that it doesn’t allow for the celebration of difference: instead, it codes “difference” as inequality, because deep down, the logic of cultural assimilation is predicated on the idea of Western cultural superiority. A failure or refusal to assimilate is therefore tantamount to a declaration of inequality: I’m not the same as you is understood as I don’t want to be as good as you, and if someone doesn’t want to be the best they can be (this logic goes) then either they’re stupid, or they don’t deserve the offer of equality they’ve been so generously extended in the first place.

Talking about race isn’t the same as racism. Asking for more diversity in YA and SFF isn’t the same as saying personhood matters less than the jargon of identity, but is rather an acknowledgement of the fact that, for many people, personhood is materially informed by their experience of identity, both in terms of self-perception and in how they’re treated by others at the individual, familial and collective levels. And thanks to various studies into the social impact of colour-blindness as an ideology, we already know that claiming not to see race doesn’t undo the problem of racism; it just means adherents fail to understand what racism actually is and what it looks like, even – or perhaps especially – when they’re the ones perpetuating it.

So, no, QOP: you can’t effectively advocate for diversity without talking in specifics about issues like race and sexual orientation. Want the tl;dr reason? Because saying I want more stories with PEOPLE in them isn’t actually asking for more than what we already have, and the whole point of advocating for change is that what we have isn’t enough. You might as well try and work to decrease the overall number of accidental deaths in the population without putting any focus on the specific ways in which people are dying. Generalities are inclusive at the macro level, but it’s specificity that gets shit done at the micro – and ultimately, that’s what we’re aiming for.

Let me tell you what I wish I’d known
When I was young and dreamed of glory:
You have no control
Who lives, who dies, who tells your story.
– Lin-Manuel Miranda, “History Has Its Eyes On You”, Hamilton
.
As the Brexit vote and its consequences reverberate through the internet, I listen to Hamilton’s “History Has Its Eyes On You”, and of all possible things, I find myself remembering the morning of 9/11. I was fifteen years old, and as I stumbled through my parents’ bedroom to their en-suite to get ready for school, despite my habitual bleariness, I was conscious that they were both unnaturally still, frozen in bed as they listened to the radio. I remember my mother saying, unprompted, “Something terrible has happened in the world,” my stomach lurching at the graveness of her tone. I remember how, at school that day, the attacks were all anyone could talk about; how even the most diffident students begged our history teacher for permission to watch George Bush’s address on the TV in our classroom. Above all else, I remember the sense, not of fear, but of irrevocable change beyond my control: the knowledge that something material to all our futures had happened – was in the process of happening still – and yet we were expected to carry on as usual.
 .
I remember thinking about documentaries we’d watched in history or which I’d seen at home, segments where various adults were asked to give their eyewitness accounts of events that happened in their youth, and imagined being one day called on to do likewise. Where were you when it happened? How did you feel? What did you say? Did you know, then, what stretched out before you? What were the details? I was years away from wanting children, but I still wondered what I might say to my own hypothetical offspring, if some future history teacher asked them to interview a parent about the momentous events which they (meaning I) had lived through. And I thought of the propaganda posters I’d so recently studied for my own modern history class – that classic image of the beslippered pater familias sitting in his armchair, two cherubic children at his feet, and a pained, distant expression on his face as his daughter asked, “Daddy, what did YOU do in the Great War?”
.
Neither one is a comparable situation to the Brexit vote, of course. (I hope.) But that feeling of history having its eyes on me – on all of us – is one of which I’ve felt increasingly conscious ever since the neo-fascist Golden Dawn party gained unprecedented power in Greece in 2015. I find myself thinking again of those high school history lessons, where Edmund Burke’s adage that those who don’t know history are doomed to repeat it first became a part of my awareness, a tritely profound observation that nonetheless remains relevant, and of all the early warning signs that preceded both world wars. Perhaps it’s just the consequence of having grown to adulthood in the spectre of 9/11, American politics the long shadow cast constantly over my Australian shoulder, but since then, I’ve never lost the awareness that my local world is only an engine part in a bigger and more complex machine.
 .
I have plenty of scathing things to say about the current state of secondary education in Australia, but it was a modern history unit on the Israel-Palestine conflict that inspired the core of my early university studies: Arabic as a language, the Arab World, Islam and the Middle East, Biblical Studies. Then as now, I understood that, whatever my personal atheism, it was the religious, political and cultural schisms developed over centuries between Judaism, Christianity and Islam that had ultimately shaped the modern world, and in light of that fateful day in September 2001 – in light of the history assignment for which I subsequently won a school award, cutting endless newspaper clippings on conflict in the Middle East to explain how each event went back to what we’d studied about pogroms and Zionism, the UN and oil and the Sykes-Picot Agreement – I wanted to try and understand the foundations of the present.
 .
I am, by nature, a storyteller. Narrative has patterns, and though we construct them knowingly in fiction, still they echo naturally in life, whose permutations are often far stranger than anything we can dream up. I look at where the world is poised, on the brink of men like Donald Trump and Nigel Farage, Malcolm Turnbull and Boris Johnson, and for an instant the very air is textbook paper, a blurring of time and distance and a whisper of darkest timelines. I am a mater familias in an armchair as yet unbuilt, and as my son looks up from whatever device goes on to replace the common iPad, I hear him ask, “Who did YOU vote for in 2016, mummy?” I imagine a homework sheet that asks him to list the date of Jo Cox’s death the same way I once listed Emily Davison’s, and with as much bland dispassion. I wonder if his history module will cover the Orlando massacre, assuming it isn’t deemed too volatile for junior study, the same way I never learned in school that the queer prisoners in Nazi concentration camps, once freed, were immediately rearrested – homosexuality was still considered a crime, you see, even by the Allies.
 .
I wonder how many teenagers throughout the UK and Europe checked the results of the Brexit vote on their phones today, on laptops, in class, and watched it all with that same spectre of wrenching helplessness that I once did, as their future was altered without their say-so. Overwhelmingly, it was young people who voted Remain, and older folks who voted Leave, and while the result is a tragedy for all of them alike – regardless of how or whether they voted, older Britons have just been screwed out of their pensions as the pound falls to a 31 year low, a span longer than my lifetime – it’s the young whose futures have just been overwritten. Scotland might yet break from the UK, just as other countries might yet break from the EU; Canada is a lone bastion of Western political sanity right now, but everything else is disintegrating. To quote another poet:
 .
Things fall apart; the centre cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity. 
.
I don’t know where the world is headed; only that it scares me. We’ve fought through so much to achieve even the fragments of unity and fellowship we now possess, and yet the backlash against it has been so violent, so continuous, as to defy belief. It’s barely been two months since I left the UK, and already it appears unrecognisable, a distorted funhouse reflection trying desperately to possess the body that cast it. Perhaps the only way out is through, but at this point, I can’t imagine that we’re going to get there painlessly.
 .
I wonder who’ll live, who’ll die. Who’ll tell this story.

On the phone from the Middle East, where he is currently deployed, Torgersen lamented what he called “the cognitive dissonance of people saying, ‘No, the Hugos are about quality,’ and then at the same time they’re like: ‘Ooh, we can vote for this author because they’re gay, or for this story because it’s got gay characters,’ or, ‘Ooh, we’re going to vote for this author because they’re not white.’ As soon as that becomes the criteria, well, quality goes out the window.”

Who Won Science Fiction’s Hugo Awards, and Why It Matters, by Amy Wallace

In light of this year’s Hugo Award results, and with particular reference to Amy Wallace’s excellent rundown on the Puppies affair, I feel moved to address the Sad, rather than the Rabid, contingent. Per Torgersen’s remarks above, and setting aside every other aspect of the debate that renders me alternately queasy or enraged, I can’t shake the feeling that there’s something fairly fundamental to the problem that’s consistently misunderstood by the Puppies, and which, when laid out, might go a long way towards explaining the dissonance between what they think is happening and what is actually happening. Not that I particularly expect Torgersen or Correia to listen to me at this point; if they did, I’d be greatly surprised. Even so, the point seems worth stating, if only for the sake of greater clarity.

When it comes to debating strangers with radically different perspectives, you sometimes encounter what I refer to as Onion Arguments: seemingly simple questions that can’t possibly be answered to either your satisfaction or your interlocutor’s, because their understanding of the concepts vital to whatever you might say is so lacking, so fundamentally incorrect, that there’s no way to answer the first point without first explaining eight other things in detail. There are layers to what’s being misunderstood, to what’s missing from the conversation, and unless you’ve got the time and inclination to dig down to the onion-core of where your perspectives ultimately diverge, there’s precious little chance of the conversation progressing peacefully. After all, if your interlocutor thinks they’ve asked a reasonable, easy question, your inability to answer it plainly is likely to make them think they’ve scored a point. It’s like a cocky first-year student asking a 101 question and feeling smug when their professor can’t condense the four years of study needed to understand why it’s a 101 question into a three-sentence answer. The problem is one as much of attitude as ignorance: having anticipated a quick response, your interlocutor has to be both willing and interested enough to want to hear what might, at least initially, sound like an explanation of a wholly unrelated issue – and that’s assuming you’re able to intuit the real sticking point straight off the bat.

So: inasmuch as any of the Puppies can be said to have a reasonable concern at the bottom of all their rhetoric, which often comes off as little more than “we think books about people who aren’t straight white dudes are boring”, it’s the worry that certain stories are being rewarded because they contain X character or are written by Y author rather than because they’re actually good. And given the way such books are often discussed and lauded by those who love them, where these aspects are openly stated as pros, you can see where the concern comes from. After all, the difference between saying “this book is great because it had a queer protagonist” and “this book is great because it had a well-written protagonist” seems, on the surface, pretty obvious: one is concerned with a single aspect of characterisation regardless of its execution, and the other is concerned with execution alone. So clearly, if you’re vaunting queerness (for instance) as though it’s a synonym for quality, you’re at risk of recommending mediocre stories on a tokenistic, uninformed basis.

Right?

Wrong.

But in order to explain why this is so, there are six onion layers we need to unravel: context, experience, awareness, representation, language and taste.

Let’s start with layer one: context. While there’s always been an element of diversity in SFF – you can’t ignore the contributions of writers like Ursula K. Le Guin or Octavia Butler, or pretend that the Golden Age greats never wrote about politics – as the Puppies themselves agree, it’s only comparatively recently that a movement in favour of promoting diversity has flourished. Setting aside the question of whether this is a good or a bad thing, or merely just a thing, at a practical level, increased diversity in narrative means you’re adding a new set of variables to any critical equation, which in turn requires a new way to discuss them. For example: if the vast majority of protagonists in a given genre are straight, white men, then critically speaking, there’s little need to mention their straightness/whiteness/maleness when making reviews or recommendations, because none of these details are relevant in distinguishing Story A from Story B, or Character A from Character B. Instead, you talk about other things – the quality of the characterisation, for instance – and consider it a job well done.

Which, contextually, it is. And somewhat understandably, if this is what you’re used to, it can be easy to assume that ever mentioning race or gender or sexuality in a review is irrelevant – even when the characters are more diverse – because these details, whatever else they might indicate, have no bearing on the quality of the story.

Except, of course, they do, as per the evidence of layer two: experience. Who we are and where we’ve come from shapes who we become: our beliefs and our personalities. Returning to a situation where straight white male characters are the default, a reviewer would be within their rights – would, indeed, be doing a good job – to discuss how Character A’s working class upbringing informs his personality, especially when compared with Character B’s more aristocratic heritage and attitudes. A veteran soldier will have a different perspective on combat to someone who’s only ever studied tactics at a remove, just as an old man who’s recently outlived the love of his life will think differently about romance to a teenager in the throes of his first infatuation. These details are critically pertinent because they demonstrate how and why two characters can experience the same story in radically different ways, and if we as readers happen to have some points in common with Character A or Character B, we’re always going to compare our own experiences with theirs, no matter how fantastical or futuristic the setting, because it helps us gauge whether, in our opinion, the writer has done a good job of portraying their thoughts and feelings realistically.

And so it is with details like race and gender and sexuality. A queer character will have different experiences to a straight one, particularly if they live in a homophobic culture; someone who’s religious will have a different outlook on life to someone who’s an atheist; a person from a racial and cultural minority will experience their surroundings differently to someone from the racial and cultural majority; someone who grows up poor will approach wealth differently to someone who’s always had it. How relevant these details are to individual characterisation and worldbuilding – and how successfully they’re executed within a given story – will, of course, vary on a case by case basis; but of necessity, they matter more often than not, and therefore deserve to be mentioned.

Which means that, if the straight white man is no longer the default character, but is rather just one of a number of options, his straightness, whiteness and maleness will be subject to new scrutiny, both in the present and as a retroactive phenomenon. This is layer three: awareness. All stories, no matter how fantastic or futuristic, are ultimately the product of their times, because their writers are the product of their times, too. We might envisage new worlds, but what we consider new depends as much on what we think is old as what we think is possible; our taboos change with the decade or century or according to cultural context; particular writing styles go in and out of vogue; and audiences, depending on their tastes and when they’re raised, expect a range of different things from narrative.

The retroactive criticism and analysis of old works has always been part of literary tradition; what changes is the present-day yardstick against which we measure them. Right now, we’re in the middle of a cultural shift spanning multiple fronts, both political and creative: we’re aware that stories are being told now which, for various reasons, haven’t often been told before, or which didn’t receive much prominence when they were, and which are consequently being told by a wider range of people. Depending on your personal political stance, and as with the question of diversity in the context layer, you might view this as a good thing, a bad thing, or merely a thing – but regardless of your beliefs, you can’t deny that it’s happening, and that it’s having an impact. As a direct result of this, many of us are now looking at old stories – and at old defaults – in a new light, which means that certain narratives and narrative elements which, by dint of once being so common as to void discussion, were considered thematically neutral, are now being treated as political. (Which, really, they always were – but more on that later.)

As our cultural taboos have shifted – as queerness has become decriminalised (if not always accepted) and rights extended to everyone regardless of race and gender (though still often enacted with prejudice) – the types of stories it’s acceptable to tell have changed, just as it’s now possible for a wider range of storytellers to be heard. We’re all aware of these changes, and whether we like them or not, their visibility makes us question our stories in ways we haven’t before. Thus: while there is nothing noteworthy in choosing to write a straight, white male protagonist in a cultural milieu where almost all protagonists share these qualities, the same act carries more meaning when the combination is understood to be just one of a number of possible choices; and especially where, of all those choices, it’s the one we’ve seen most often, and is therefore, in one sense, the least original. Which doesn’t make such characters inherently bad, or boring, or anything like that; nor does the presence of such characters – or the success of such writers – preclude the simultaneous presence of diversity. It simply means we have an increased awareness of the fact that, up until very recently, a certain type of character was the narrative default, and now that he’s not – or at least, now that he’s not to some people – it’s worth asking whether his presence is a sign that the writer, whether consciously or unconsciously, is perpetuating that default, and what that says about the story in either case.

Which brings us to the fourth layer: representation. Following on from the issue of awareness, consider that, as a wider variety of stories are now being told by a wider variety of people, a wider range of protagonists has consequently entered the narrative market. As with context and awareness, you might think this is a good thing, a bad thing, or merely a thing: regardless, it is happening, and will doubtless continue to happen. As such, a wider percentage of the audience is now having stories written both by and about them – or at least, about people like them – than in previous years; which means that, in response to the former dearth of such narratives, there’s been a corresponding rise in people requesting or recommending them primarily or prominently on the basis of their representational elements.

Ignoring for the moment all questions of quality – which, yes; I’m aware that’s the discussion we’re ultimately having, but bear with me – it should be a point of basic human empathy to understand why this is important; or at the very least, why representation matters to so many people. Despite our ability to empathise and connect with characters whose lives and experiences are utterly different to our own, we still like to see ourselves represented in fiction from time to time, not only as a form of validation – you’re worth telling stories about – but because, amidst so much difference, it’s a point of connection, affirmation, identity. Yet because straight white male characters were so long the default – and because that default, by virtue of its ubiquity, was considered politically neutral – changing the recipe, as it were, is still a visibly deliberate act: it makes the reader aware that the author chose for the character to be male or female, queer or straight, black or white (to give the simplest binary permutations), which awareness refutes the mythical idea of characters as the immaculate, fully-fledged gifts of some inviolable Muse, beyond the writer’s ability to pick or alter; and as such, there’s a reflexive tendency to conflate deliberate with forced, where the latter term carries implications of artificial, false, arbitrary, tokenistic. When these attributes don’t describe us, it’s easy to forget that actually, people like that do exist in the real world, and in considerable numbers; they’re not just something the author has made up out of whole cloth, and the fact that we might be surprised to see them in a given story doesn’t mean, ipso facto, that they’re incongruous within it.

As such, there’s a developing trend towards recommending stories which feature traditionally under-represented groups, not just as some arbitrary exercise, but because we’re aware that members of those groups might actually want to read those stories, and will, as a consequence, have a material interest in that aspect of the contents. But for precisely this reason, such recommendations are seldom indiscriminate, based, as Torgersen and the Puppies fear, solely on the presence of Character A regardless of execution or context – because even though protagonists have long defaulted to being straight, white and male, there’s an equally long tradition of other groups being portrayed badly. The fact that a book contains multiple female characters is no guarantee that those characters are written well, let alone inoffensively, just as the presence of POC within a classic text doesn’t mean their portrayal and treatment isn’t screamingly racist – which is why, when you see diversity advocates recommending books on the basis that Character A is queer (for instance), the implication is that the filtering for quality has already taken place; that Character A both exists in a well-written narrative and isn’t a walking stereotype. The entire point of the exercise is to promote stories which portray diversity well, not stories which include it as a token or forced element. And because an author writing from their personal, in-depth experience is likely to have an extensive understanding of the topic, this support naturally extends to mentioning if, for instance, the author of a story starring multiple queer characters is queer themselves – not because there’s an assumption that straight people can’t write excellent stories about queer individuals, but because within any field or group, there’s always going to be a degree of insight or insider knowledge that can only be understood through personal experience, and it’s worth recognising which books are likely to replicate it, especially if we’re insiders, too, and are therefore more likely to notice if those perspectives are missing.

Consider, for instance, the probable insights contained in a military SF novel written by a serving soldier, as distinct from one written by a military historian, as distinct again from one whose author’s knowledge of combat, tactics and fighting comes primarily from what they’ve read or seen in other fictional stories. The different backgrounds and knowledge-bases of these hypothetical authors say nothing about how well they write fiction, or how skilled they might be at other aspects of storytelling; they might have wildly different narrative styles and work within very different worlds, such that comparing their books, for all that they ostensibly share a genre, is a tricky proposition. All three books could be excellent in different ways, and all three books could be poor. But if someone you knew to be both a good judge of fiction and possessed of actual combat experience – let’s call them Sam – handed you the first writer’s book and said, “Read this! The author actually served overseas!”, you’d probably deduce from context that, having served themselves, Sam was telling you that this writer gets it; their experience is my experience, or close enough to mine to be recognisable, and they know what they’re talking about.

Similarly, if Sam praised either of the other two books for the military content, you’d understand that they were speaking from a position of personal experience: that, to someone with firsthand knowledge of fighting, the tactical/combat elements didn’t feel unrealistic or forced. By the same token, if Sam disliked the first book, you might take the criticism seriously while considering that, as the author was writing from their own first-hand perspective, too, a lack of realism wasn’t necessarily at fault, so much as a clash of opinions. But if Sam told you categorically that the third writer had no idea what they were talking about – that, regardless of any other qualities the book might have, the military aspect was hopeless – you’d be inclined to take that criticism more seriously than if a civilian friend with no grasp of tactics recommended it wholeheartedly; but depending on your own status as civilian, historian or soldier – and how badly you wanted to read the book for other reasons – your own reaction could be different again.

What I mean to say is this: seen from the outside, it’s easy to look at the members of a community recommending stories on what seems to you a superficial basis, and to conclude that, actually, nobody in that conversation is concerned with quality at all. But as per the fifth layer – language – what you’re really witnessing is a collectively understood shorthand: a way of signalling quickly that this book or that is worthy of attention based on a deeper awareness of commonly-held priorities, with respect accorded to those whose recommendations are supported by their personal experiences. Particularly on Twitter, where conversations between small groups are visible to non-participants and where character limitations make exposition difficult, it makes sense that bloggers, writers and critics alike try to be as succinct and powerful in their advocacy as possible. Just as I would accord a greater critical weight to the judgement of a soldier recommending a military SF novel, if a person of colour praises a book for its positive racial representation – or, conversely, criticises its lack thereof – I’m going to consider that relevant.

Which all ties in neatly to the final layer: taste. I’ve said before, and will say again, that I’m a firm believer in the value of negative reviews. Not only do they serve an important critical function, but as another person’s taste is seldom identical to our own, they help us construct a more useful idea of where our interests overlap with the critic’s, and where they diverge. Demonstrably, there’s an audience right now for diverse fiction: for stories which reject the old defaults and showcase a wider range of people, themes and places. The fact that some people enjoy such works does not, in and of itself, make them good works, just as popularity is no guarantee of goodness, either. The Venn diagram of why we love something is seldom a perfect circle with its objective strengths, inasmuch as such strengths can be reasonably said to exist: creative endeavours are funny like that. There’s always going to be a sort of tension between technique and passion, skill and heart, not because those things are in any way diametric opposites, but because we can never quite agree on whether one is more important than the other, or if you can really have one without the other, or where the distinction between them lies if, for instance, the most heartfelt aspects of a story are only so because of their technically sound expression.

As such, creative awards are contentious creatures – have always been so; will always be so – inasmuch as presenting one represents the imposition of an objective judgement into a fundamentally subjective medium; and because all claims to objectivity are inherently political, so must awards be political, too. This isn’t new information, though some people, like the Puppies, have become mightily outraged at the revelation that what they’ve historically perceived as a lack of politics was, in fact, merely a political bias towards their own comfort. That they are no longer predominantly catered to, they perceive as being under attack; what they call the forced introduction of politics into a formerly neutral space is rather the revelation of existing politics through a natural process of change. A sandbar might be solid for years, but when it shifts with the ocean and so makes new waves, it hasn’t betrayed the people standing on it – though possibly, it might have collapsed sooner beneath their weight, especially if they mistook it for solid and made it the foundation of an improbable edifice.

I guess what I want to say is this: despite what the Puppies think, the rest of us aren’t interested in diversity without quality, and as we’re all acutely aware, the failure mode of diversity is stereotype, which concept isn’t exactly on handshake terms with quality in the first place. That we want to celebrate historically silenced voices and perspectives doesn’t mean we’re doing so purely to spite you, or that we’ve lost all sense of judgement: if our tastes extend to seeing in fiction those versions of ourselves you’re disinclined to write, then who are you to tell us we aren’t entitled to our preferences? Nobody is saying you can’t tell your stories; we just might not want to read them, the same as you evidently have no desire to read ours. That’s not the genre being attacked – it’s the genre changing, and whether you change with it or not, we’re still going to like what we like.

Stop fighting the riptide, Puppies. As any Australian could tell you, it’s the surest way to drown.