Posts Tagged ‘Society’

Despite the vehement protestations of my formerly nine-year-old self, chances are that I’ll have kids of my own at some point in the future. Even were that not the case, I’m still the kind of gal who routinely plunges her head into the ice-cold waters of the blogosphere, and am therefore reasonably up to date on the current furore vis-à-vis motherhood. Specifically, the fact that nobody seems to know what to make of it. As Lynn Harris points out, a lot of hate for the feminine side of parenting is being bandied about by non-parents; Emma Gilby Keller is making the case for women who haven’t heard the ticking of their biological clocks and refuse to see this as a personal failing; Gen Y mum Nicole Madigan is, not unreasonably, fed up with being treated as though mothers as a demographic are still entrenched in the 1950s; and more than one person is wondering about how children should (or shouldn’t) fit into the public sphere. No matter whose side you’re on, any discussion of modern motherhood seems to imply a certain amount of outrage, anxiety and general handwringing, which, given that the prospect of giving birth is already terrifying, let alone being responsible for a tiny helpless being encoded with an unspecified, potentially lethal mix of your own and your partner’s DNA, is about as close to notions of ‘helpful’ or ‘comforting’ as the Oort Cloud is to Earth. Which is to say, very fucking distant.

I’ll admit to being fascinated by the whole malarkey – not just because I’m an opinionated snark, or because the entire business reeks very faintly of rubbernecking, but because it’s something in which my future self is, presumably, invested. Like everyone else, I want to know how to do this right, but despite my long-held belief that moral/social absolutes are arbitrary, if necessary, human constructs rather than universal fixtures, it is still something of a rank shock to discover that there is no inviolable Way of the Parent, let alone Way of the Responsible Adult. Except for that part about not sticking forks in electrical sockets, which, really, is only common sense.

But I digress.

The point being, there’s a lot of parenting turmoil to wade through, most of it directed towards or inflicted upon mothers themselves. And while I’m hardly about to cut in on the stroller-bashing queue, I think I’ve finally pinned down what makes me, personally, uncomfortable about the whole business. It’s not the idea of the Yummy Mummy that stings, although I dislike the emphasis it puts on what are frequently unrealistic standards of beauty. It’s not the helicopter, cotton-wool parenting, either, although it makes both my inner sixteen-year-old and my outer twentysomething roll their eyes. It’s not even the obnoxious, ignore-the-kids-as-they-go-on-a-public-rampage non-approach to parenthood, or the designer stroller brigades. I might lament each one in turn, but they’re not trends I feel personally threatened by: call it crazy, madcap optimism, but I’d like to think that whatever neuroses I develop as a consequence of motherhood will have less to do with social ephemera than with the quirks and peculiarities of my own offspring. No: what makes me edgy in all of this is the idea that motherhood has once again become a lifestyle.

It’s a thought which simultaneously intrigues and repulses. On the one hand, everyone has the right to choose their own life. Who am I to criticise anyone for wanting the best for their children, or for taking pride in the process? Feminism has failed, and failed roundly, if it says that a woman ceases to be a feminist the moment she decides to be a stay-at-home mother, or if she cares about the type of stroller in which she perambulates her child. But on the other hand, it feels as though the current argument that children should comfortably pervade every facet of adult life – pubs, restaurants, movies – is a rebuke to the notion that parenthood is something adults might want to take a break from. That’s not to say that it shouldn’t be easy for parents to take their children places, but even within the realms of shared public space, some areas – like parks – are more intuitively child-friendly habitats than cramped pubs. Children aren’t a disease or a nuisance, some squalid facet of humanity to be sequestered from polite society until their debutante ball: they are people, they are important, and every adult, no matter how vociferous on the subject of ‘breeders’, was one once. But neither are children accessories, undetachable scions that can’t be left off the parental radar without risk of permanent personality failure.

It’s a mess, in short, one we all have to sort through in accordance with our individual beliefs and intuitions, which goes some way towards accounting for all the different types of motherhood on offer. Sometimes, in the absence of absolute moral certainty, you just have to agree to disagree. But it’s the lifestyle element of modern mothering I baulk at: because lifestyles are all about appearances, and if there’s one thing I think childhood and parenting – and life in general, for that matter – shouldn’t boil down to, it’s an emphasis on how things look to other people, as opposed to how they actually work. And yet, this is exactly what I end up doing: looking at other mothers, who are after all the only rubric available, and judging, via their appearance, how likely they are to be engaged in the pursuit of motherhood-as-a-lifestyle as opposed to motherhood-in-general. If I mistrust designer prams, Yummy Mummies and kids on parade, it’s because I worry that these are the trappings of motherhood-as-a-lifestyle, and while they certainly can be, particularly in conjunction, they are not definitive indicators. They are the accessories of stereotype, not its core. But with mothers and motherhood now so visible in public – which is a different part of the debate in and of itself – it is frequently the case that these external signs are all we have to go by.

We are, in short, trying to find a definition for modern motherhood that suits. Women are juggling children and careers, personal lives and dedicated play schedules, the desire to spend time in adult company vs the practical difficulties of foisting one’s offspring off onto anyone else, even for an afternoon, in a climate where childcare costs approximately nine zillion squared to the power of sod off. We are having children at older ages, where an increased amount of disposable income to spend on the trappings of childhood – clothes, strollers, toys – often equates to time poverty, resulting in guilt and the desire to take the kids out wherever possible, even where that means sandwiching adult social time into a playdate at the local pub. And, as was ever the case, there is no easy answer. Society has changed, and mothers, intentionally or not, are changing with it. There is value in trying to stick up for what we think parenting should be, but if all that means is talking about the Good Old Days and judging by appearances, it won’t get us very far.

Here’s an uncontroversial statement: different people find different things sexy, just as different people find different things repulsive, outrageous, risqué or tawdry. This is why so much of the porn industry nowadays is devoted to kink and specialisation. People are weird, and so, quite often, are our fantasies. It’s a thing.

When I walk into a newsagency and glance at the lads’ magazine section – Zoo and Maxim and so on – I’m usually blinded by a sea of very large bosoms in very small bikinis, hoisted proudly on the torsos of half a dozen tanned and pouting women. These mags are sold over the counter, but while I’m not grossly offended by the sight of mostly bare women, I tend to think the content is more pornographic than not. That’s less a moral judgement than it is a statement of fact: no matter how much skin they may or may not be showing compared to their hardcore counterparts, the models are there to be looked at in a lustful context.

When trying to determine whether something is pornographic, it’s certainly logical to consider why it was created in the first place, and for what audience. In many respects, I’d argue, this is actually more important than what is (or isn’t) on display, but there’s always going to be dissonance between the reaction an image is intended to provoke and the reactions it actually provokes. Because people, as has been mentioned, are weird. We get turned on by weird and unexpected and – sometimes – terrible things. And that’s what throws a spanner in the works when it comes to the current debate on child pornography.

Paedophilia is an awful thing, one that leads to awful crimes and ruined lives. It is a violation of trust and a sexual circumstance in which it is actually impossible for one of the parties to consent, meaning that it should never be condoned or legitimised. We have a social responsibility to protect children from sexual predators. And yet, in trying to do this, we have managed to paint ourselves into a legislative corner, one in which any image of a child becomes pornographic, regardless of the context in which it was taken.

Because children – and children’s bodies – aren’t the problem. Taking a photo of a child is no more synonymous with making child pornography than being a child is synonymous with being a sexual creature. This is an instance where only two things are capable of making an image pornographic: the perspective of the viewer, which is entirely removed from the original context of the photo, and those disgusting occasions on which an abuser has recorded images of their crime. The latter instance is both vile and undeniably sexualised. But the former is where we hit a snag: because it forces people to be concerned, not with the content of a given picture, but with the likelihood that someone will view it in a sexual context.

At the moment, in our zeal to protect children, we are dangerously close to smothering them. It is no longer acceptable to show up to your child’s school sports day and take photos: parents are concerned with how the images might be viewed later. But do we stop the sports day entirely for fear of what perverts on the sidelines might take away in their memories? No: and yet, this is exactly the same logic used to justify the current stance on photographing children. The more we behave as though the general populace cannot be trusted to be in the same room with our children for fear of what they might be thinking, the more we buy into the mindset that children need to be locked up, protected, sheltered, kept from the public eye.

On the surface, that might not sound so bad. But take that last sentence and replace the word ‘children’ with the word ‘women’, and you have a viable description of the logic behind societies whose female populations are required to stay covered up at all times. Men cannot be trusted in the presence of women, this argument goes: it is futile to pretend otherwise, and much easier to make the women invisible than it is to change the attitudes of the men. This is a mentality which ultimately punishes those whom it claims to protect, by restricting their actions and, by default, assuming that they exist in a constant sexual context. For many reasons, this is not a perfect analogy, but given our current social struggle to decide how much freedom children should have online, outside the home and in their decision-making, it strikes me that our debate over the definition of child pornography stands as a parallel issue.

Ultimately, we live in a changing world. We worry about online predators grooming or luring children away; we worry about the digital distribution of photos of children, and how our knowledge of their possible misuse might taint our perception of their contents; we worry about stranger danger, and whether it’s better to let our kids walk home by themselves and gain a bit of independence, or whether we should constantly be holding their hand. We are making decisions with the best of intentions, but I also worry that we are approaching things the wrong way. Life will always hold dangers, no matter how effectively we seek to curb them: nothing will ever be entirely safe. With new technology opening up the world in an unprecedented way, our instinct has been to clutch tightly at what we hold most dear, trying to protect it from these new, expanded threats. But the more we grip and shelter, the harder it eventually becomes to let go, and the more difficult it is for children to grow up into confident, capable adults. There is both nobility and necessity in our desire to preserve the sanctity of childhood, but in so doing, we should never forget that childhood is something to eventually be outgrown. The real world never goes away, and the more fearful we are of its dangers, the closer we come to never understanding it at all.

The following was written as a comment in response to this article in Trespass Magazine, wherein Lyrian Fleming postulates that the number of gaffes made by public figures on Twitter will eventually prove to be its undoing.


While I agree that celebrities and other public figures are currently struggling to walk a fine line with emergent digital media like Twitter, I don’t think their turmoil will kill the medium. On the contrary, there are few things our rubbernecking media machine enjoys more than a good old-fashioned gaffe, and in a culture where the cult of celebrity requires an almost non-stop stream of updates about its beloved stars, there’s nothing quite like Twitter for providing insight into the daily lives of the rich and famous. Those are both fairly cynical examples, but in broader terms, I’d contend that all the current spotlighting of public figures who dare to express a personal opinion is part of a bigger, currently unanswered question, viz: where do we draw the line between public and private in an age of instant media, and under what circumstances?

It’s not just about celebrities and Twitter; it’s about employees being fired because of content on their social networking pages, cyberbullying in schools, videos on YouTube – even the debate over the street-level images of private homes in Google Maps. These are all disparate examples, each of which has different quirks, different potential solutions, but what they all have in common is our need to establish etiquette for the use of technology whose rate of developmental progress has far outstripped the speed at which we are constructing rules around it. Twitter will eventually be superseded by something new, yes, but only because the next leap forward in virtual communication will replace it as a matter of course, and not because its existence has contributed to an already ongoing debate about public vs. private in the digital landscape.

Quite simply, I’d be extraordinarily worried if we, as a society, saw Twitter collapse simply because it forced us to reexamine our behaviour. The issues it’s raised – or rather, which have arisen as a result of its use – aren’t bad questions to be asking. With or without Twitter, we need to know how to live with technology. The Freedman case is a good example of this, whereas I’d argue that Kyle Sandilands acting like Kyle Sandilands is less an issue of his choice of media than it is a question of his being an obnoxious tool who perhaps shouldn’t be paid obscene amounts of money to abuse people on air.

Should public figures have the luxury of private opinions in a public forum, or not – that’s the real question underlying these examples. Conventional wisdom seems to say ‘Yes, but only for so long as they don’t say something offensive.’ However, given the level of media scrutiny currently attached to any gaffe, regardless of its objective severity – Freedman deserves no heat compared to Sandilands – I’d suggest a case-by-case policy of caveat orator. Let the speaker, whoever they are, beware. Because if they weren’t before, the world and his wife are certainly watching now.

This week, it seems, I am pretty much incapable of not ranting. I’ve ranted to the Sydney Morning Herald about education in NSW (scroll down for my letter); I’ve ranted about paranormal romance – and now, it seems, I’m ranting about tweens.

Not being a parent, let alone an American, I’m probably ill-placed to judge how crazy this article on tweenage freedoms may or may not be. For starters, it’s about tweens in New York City, which would seem to be a fairly unrepresentative slice of Americana, but that doesn’t stop it from raising alarm bells. I’ve long since accustomed myself to thinking of helicopter parenting and cocooning as repugnant (if apparently widespread) symptoms of the modern age, and yet somehow, I’ve never really sat down and thought about the age bracket in question. Most often, I rant about teenagers being downtrodden by foolish adults, and while I’m certainly familiar with tweens as a concept, it hadn’t actually occurred to me that they might be copping an even worse end of the stick.

Of all the lines in the article – the sentiment of which, for the record, I wholeheartedly agree with – there’s one which made me pull up short and sit down, once again, to rant. It’s this:

‘”Kids like to feel that they are doing something of value,” explains Michael Thompson. “Boys who like organized sports like them because it feels like they’re doing something valuable, and by that I don’t mean getting good at soccer. I mean entertaining adults.”’

On the one hand, Michael Thompson clearly means well. He’s identified a problem facing tweens – not being allowed out of parental sight for fear of cataclysmic life failure – and is trying to suggest ways of fixing it. On the other hand, it would seem to be a fairly self-evident statement, when removed from an ageist context, that people generally – and not just ‘kids’ – like to ‘feel that they are doing something of value’. Actually, scrap that. People like to actually do things of value, and not just be given the illusion of same. Which is where I start to get angry – not at the article, or even (necessarily) Michael Thompson, but at this damnable habit we seem to have fallen into of treating everyone as a separate demographic. Has it become completely alien to our sense of being that some things, regardless of whether one is nine, nineteen, forty-nine or ninety – or, for that matter, male, female, religious, agnostic, atheistic, a ufologist, black, white, Hispanic or Chinese – might be universal? I’m not talking about complex moral truths, for heaven’s sake: just a simple recognition of the fact that we are all human beings, and therefore hold a certain type of base need in common.

Must everything be looked at in terms of marketing? Sure, it might be a comparatively slender percentage of likes which bind us together, but I’d wager they hold a pretty deep significance for the same reason. People want a purpose. Why is that such a difficult notion for society to understand? Kids might be less emotionally mature than adults, but that doesn’t make them stupid, and it sure as hell doesn’t make them any less human. Children like to entertain their parents, but past a certain age, they also want to feel like they’re getting older, a quality which, up until about age 19, is most readily identifiable by the grade they’re in at school and how they’re treated by adults. And if the latter isn’t there, the former doesn’t matter a jot, because one of the most pivotal reasons students recognise their school years as a valid progressive hierarchy is that those years lead to the adult world. Nobody goes to school for the sake of school itself, ‘school’ here being distinct from a concept of learning. What grade you’re in is based almost solely on age, not any kind of meritocratic policy. If each successive birthday from six to sixteen brings no increase in social respect, parentally granted autonomy or actual real-world power, why shouldn’t tweens be sullen – indeed, why shouldn’t they become disrespectful, disobedient teens in turn?

During a recent conversational rant about the failings of education, another adult asked me why I still gave a damn. After all, I’ve been out of school for five years, and despite my complaints, I did well enough while there; it’s been over a decade since I was a tween, and almost five years since I ceased to be a teenager. Why was I still ranting about problems which no longer concerned me?

But the thing is, they do still concern me. Part of what bothered me then – what still bothers me now – is the extent to which, despite every study telling us that children are learning more each year and from younger ages; despite the leaps in technology which are picked up most readily by the young; despite the fact that tweens and teenagers are the future, adults are still persistently talking over their heads, treating tweens and teens as if they don’t matter, when everything about our new society is screaming that yes, they do. Even worse, this realisation of increased child-knowledge compared with their relative lack of emotional experience has spawned a rash of parenting techniques designed expressly to prolong the gaining of wisdom by wrapping one’s offspring in cotton wool, as though emotional experience can be achieved without any kind of learning-through-error. I keep ranting about things that no longer concern me directly because they do concern me, and everyone, indirectly. The current social system with regard to youth is predicated largely on the assumption that nobody under the age of 18 is worth listening to, while everyone over the age of 18 can no longer be bothered arguing, having managed to escape the conditions they were previously so animated about. It’s stupid, and irritating, and more than anything else about growing up, I am terrified that one day the Adult Brainwashing Machines will get me, too, ensuring that not only will I forget what it was like to be young, but, in losing all interest in youth beyond self-perpetuation, I’ll forget that there is more than one kind of youth; that the Youth of Today are just as human, just as bright and gawky and volatile as I was, but that they nonetheless are not me, and that this is not automatically a cause for concern.

So, parents: let your tweens go down to the shops or pick up the laundry or – horror of horrors – take the train alone, but don’t act as though you’re doing them a favour. Don’t be condescending in your granting of freedoms, because if you are, then they’re not really freedoms at all. The difference between extending a privilege and acknowledging a right is the most profound difference in the world, once you’re aware of it – and with the current rate of information absorption among tweens, it’s a safe bet that most of them are.

Recently, I’ve been struggling to comprehend the social ramifications of defamation, censorship and privacy laws in government and industry. While the scenario of a verbally abusive co-worker or boss is undeniably awful, and while nobody should have to put up with insults about their character, religion, race, competency, sexuality and/or personal hygiene, I can’t help but feel that restrictions designed to enforce polite behaviour are increasingly infringing on freedom of speech. Prior to the rise of the internet, I imagine there was a fairly intuitive rule of thumb when it came to bitching about colleagues, viz: don’t write anything down. Trash talk was for the pub and other such friendly gatherings, or at the very least somewhere courteously beyond earshot of the person in question. Email led to a new caveat: keep it off the company servers. Personal accounts are personal accounts, but you never know when someone might have legitimate cause to flip through your business correspondence. Even in this instance, however, there was still a veil of privacy, in that barring an authorised, dedicated search or deliberate hacking, there was no way for the subject of the conversation to accidentally ‘overhear’ and thereby take offence.

But sites like Facebook and Twitter have changed all that. Now, employees are able to form online groups and discuss the foibles of their jobs en masse, or tweet about the demands of annoying co-workers – with troubling consequences. The blogosphere, too, has created workplace turmoil, with some employers sacking staff for mentioning their jobs online. While companies are well within their rights to worry about the release of actual business information, especially where a premature or unauthorised mention of same could cause genuine loss or damage, the notion of bringing a company’s reputation into disrepute simply by admitting to personal foibles and opinions is deeply troubling. Satirising a job is not the same as maligning it, and criticising management should not be a sackable offence. Nonetheless, such things are currently happening.

As a student, I never liked the idea, put about at assemblies and other such spirit-building occasions, that I was moving through life as a ‘representative’ of my school, nor that my behaviour at all times, regardless of whether I wore the uniform, was correlated to some nebulous, anachronistic notion of school pride or reputation. Now that I’m a grown worker, the sentiment still holds. First and foremost, we should belong to ourselves: all other affiliations, be they professional or academic, are secondary. There’s an ugly paternalism to schools and businesses laying claim to the morality and opinions of their attendees, and this is what rankles: the notion that our individual humanity is permissible only insofar as it doesn’t contradict the party line. It’s a big, messy, multifaceted issue – slandering colleagues is different to releasing confidential data is different to criticising management is different to having a sense of humour is different to daily blogging – but it is, ultimately, the same issue. Namely: how should we act online?

In a perfect world, people wouldn’t insult each other, nor would certain personality types be incompatible. But this is not a perfect world. In an age when instantaneous, public communication has dropped the veil of privacy from personal complaint, we need to grow thicker skins and get used to living with other people’s opinions. Because what’s really throwing us for a loop isn’t the fact that people have opinions or even that they’re different from ours: it’s that, all of a sudden, we know what they are, and feel moved to respond. Companies are kidding themselves if they think that the vast majority of their employees would still work if they didn’t have to. Work is a necessary evil: get over it. Employees are kidding themselves if they think that bitching about co-workers in cyberspace is the same as bitching at the pub. If you wouldn’t say it to their face, don’t type it where they can see it: simple. The law is kidding itself if it proves systematically incapable of distinguishing between serious, ongoing abuse and satire. People make jokes, and every exchange is nuanced: take it into account. Authority figures are kidding themselves if they think their position should put them beyond mockery or scrutiny. As in politics, you will be teased, disliked; your decisions will be questioned. It’s the price of being in power: live with it or step down.

But most importantly, we as a society are kidding ourselves if we think the solution to socio-digital omnipresence is to segregate our personalities. Our jobs and lives are bleeding together exactly because the two should be compatible; because people want to enjoy their work while still retaining the freedom to speak their minds. Communication should be used as a tool for social improvement, not restriction, which means compromise on both sides. And historically speaking, compromise has never involved the building of walls between different groups or ways of life.

Instead, it knocks them down.

Consider the following story: the refusal of a Christian school to train a Muslim teaching student. Rachida Dahlal, of Victoria University, was knocked back on her application to undergo work experience at Heathdale Christian College on the grounds of her faith. The university’s acting vice-chancellor pointed out that Mrs Dahlal, a devout Muslim who wears the hijab, had already been ‘counselled’ about Heathdale’s policy of ‘taking those whose values aligned to its own’, while school principal Reynald Tibben rather contradictorily stated that the school’s position was not that they had ‘anything against her or her beliefs’, but rather that their education policy was ‘nominal, it’s actually what parents want for their kids’, and that hiring a Muslim teacher would have been both ‘inappropriate’ and ‘confusing’ for students.

For those who might question Mrs Dahlal’s choice of Heathdale to begin with, her decision was based on the proximity of the school to her home, and its position as one of the few institutions offering both French and mathematics, her specialty subjects. Given also that it was her choice, and made in full knowledge of the school’s denomination, Principal Tibben’s guff about her likely discomfort during morning prayers seems frankly condescending. Would he have been so concerned about hiring an atheist? Would a Jewish applicant have been equally off-limits? In Mr Tibben’s eyes, would the presence of such people have proven similarly ‘confusing’ to students? Or is it just the fact that Mrs Dahlal’s faith is visible through her hijab, and not merely an internal ideology? More and more, it seems, society is struggling with the notion of discrimination; but what this case exemplifies – and yet what few people are willing to acknowledge – is that any set of beliefs associated with a specific ideal is, by definition, discriminatory.

This is not something we can legislate away. The vast majority of human interactions are predicated on conflict: disagreements over a favourite film, the appropriate price of food, who has the greatest claim to which resources, which is the best way to discipline children, how the universe began. At the far end of the scale are grandiose religious and philosophical abstractions, while at the other are trivial matters, debates that no sane person would try to legislate. But the middle regions are often indistinct, a blend of all such concerns, and it is here we live our lives. Politically, socially, sexually and legally, we have moved forwards in recent decades, making headway against racism, sexism, homophobia, exploitation of children and religio-cultural discrimination; and yet despite its presence at the forefront of many such debates – if not all of them – the discrimination inherent in religious systems has remained the elephant in the room.

Put simply: if a person believes that their own religion is unshakeably correct to the exclusion of all other systems, and then refuses to hire a worker on the grounds that they are living outside of God’s rule and will set a bad example to other employees, passing a law to prevent them from doing so becomes tantamount to declaring that the logic which underpins their faith is wrong. The same thing lies at the heart of all the legislative drama over gay marriage: how do you allow someone freedom of religion while simultaneously declaring that certain of their religious or ideological tenets constitute a violation of human rights? There’s not an easy answer. But to anyone who believes in the separation of church and state, different religious beliefs should be equally accommodated – or refused – under the law, be they derived from shari’a, the Talmud or the Bible. Defending the values of one faith on the basis of its historical relationship to the nation is neither objective nor helpful: instead, it only serves to embed a lopsided definition of discrimination and entitlement in our cultural identity.

Which brings us back to Heathdale Christian College, and the reason why, in our secular state, Reynald Tibben should be found to have acted wrongly: because although a fair state must allow the existence of both secular and denominational schools, it should have no vested interest in preventing overlap between the two. Just as state schools hire teachers of all faiths, so too should their denominational equivalents. The difference between such institutions should be purely a matter of extra religious instruction, not the individual disposition of their teachers. Because if things are otherwise – if we state that a school has the right to hire or fire teachers on the basis of their personal values – then we may as well say that other Christian principals are equally within their rights to fire teachers for apostasy, for expressing agnosticism or for religious conversion. The fact that Mrs Dahlal practises Islam does not affect her ability to speak French or teach mathematics, just as the Christianity of her students should not affect their ability to learn. As the saying goes, it’s impossible to please everyone. At the most basic level, discrimination simply means choice: to differentiate between one thing and another. We load the word with negative connotations, conflating it with prejudice in all instances, but saying that our society discriminates against racism is just as valid a usage as complimenting someone on their discriminating taste. Because discrimination, be it deemed neutral, positive or negative, figures equally in choice, legislation and religion alike. And the sooner we start to confront that fact, the better for all of us.

It’s fair to say I think about elves more than the average person; that is to say, firstly, that I think of them at all, and secondly, that a sizeable chunk of this time is dedicated to theorising what elves really are. Among other things, this makes me slightly crazy. But I’ve come up with a theory. And now, rejuvenated by the illustrious Harkaway’s recent musings on cryonics, I’m ready to show and tell. Or maybe just tell, in this case. Whatever.

Anyway.

Elves, according to a wide range of fantastic and mythological sources, are essentially very pretty people who live damn near forever in beautiful cities considerably superior to those of other races by the grace of their higher intellect, magic, advanced technology or a combination of all three. Outside of cities, they dwell in forests or natural areas, usually in a deeply symbiotic relationship with their surroundings, but in either instance, elven society is usually lauded as being progressive, or at least very successful. They are highly culturally advanced, but despite professing a preference for peace, tend, when roused, to be lethal in war. Outsiders often know little about them, as they prefer not to mingle with humankind, and their settlements are often isolated; typically, they also exhibit a low birth-rate in compensation for their incredible longevity. There is also a strong tendency to posit relationships between elves and dragons, or elves and white horses of superior stamina and intellect, both of which species are, in such instances, rarely if ever found elsewhere, granting their elvish masters the exclusive advantage of swift transport in largely medieval settings. Finally, elves are frequently described as placing a dual emphasis on learning, academic or otherwise, and on leisurely, creative pastimes.

Got all that?

Good.

Now, if we take the above hallmarks of elfness, remove the fantasy connotations, and render them as a set of socio-cultural markers, we end up with the following list of real-world characteristics:

1. Longer than average lifespans;

2. Objectively exceptional but culturally normative looks;

3. Technological superiority at an everyday level;

4. An outward preference for pacifism underwritten by extreme martial capabilities;

5. A preference for isolation from less advanced societies;

6. Largely urban lifestyles balanced against deeply held environmental convictions;

7. Access to superior modes of transportation and information relay;

8. A low average birthrate; and

9. A largely functional societal model extolling the virtues of both learning and leisure.

Sound familiar?

I find it both amusing and ironic that the mythical beings of early European culture are starting to look like the end point of modern Western society. True, we don’t live hundreds of years, but our lifespans are ever-increasing thanks to the ongoing advance of medical science. Give it another couple of decades, and who knows where we’ll be? And true, we’re not universally beautiful, but there is an increasing emphasis on physical perfection and achieving a set body type. With the advent of plastic surgery, many people now choose to alter their own appearance, and consider, too, the unveiling of the first ‘designer baby’ clinic in LA, where the new practice of cosmetic medicine allows parents to select the appearance of their future children.

Technological superiority? While it’s true that most of the world is now online, there’s certainly accuracy to the statement that affluent western, eastern and northern European nations have access to more and better gadgets than their counterparts in Africa, South-East Asia and South America. Similarly, technological prowess confers the advantage of both swift, secret information relay and rapid transportation worldwide. The notion of espousing pacifism but practising violence is, traditionally, a hallmark of nations throughout history; nonetheless, it seems particularly apt in a day and age when countries can initiate wars or engage in battles so geographically removed from their own turf that no risk of invasion is run, and where stockpiling WMDs has become routine practice. As for isolation, one need only look at the recent global tightening of immigration laws, particularly in the west: we might praise the notion of living in multicultural societies, but still remain fearfully recalcitrant when it comes to the very process which allows them to take shape.

The recent passion for reducing our carbon footprint while retaining an urban lifestyle is, to me, a particularly elvish dualism, and one which is sweeping most of the developed world. Similarly, while it’s difficult to try and argue for a lowered birthrate on such an enormous and diverse scale (although China’s One Child Policy is an intriguing counter-example), anecdotally, there seems to be a trend of affluent, educated women giving birth later and to fewer children, while our childhoods – or, more particularly, the time we spend at school and under the parental roof – are growing longer. Our current social model promotes a minimum of thirteen years’ schooling, while more and more people are attending university as a matter of course. At the same time, we deeply value labour-saving devices, the creation of entertainment and the right to leisure time, which is arguably a kind of social symbiosis: we work hard at learning how to do less in one sphere of daily life in order to create more time for learning in another, which in turn leads to more time, and also to the necessity for each generation to learn enough to keep up.

In short, we are growing into elves: not the fey creatures of our early imaginings, but into long-lived, scientific, face-selecting humans of a new technological era. Whether for good or ill, I’m not prepared to judge, but in either case, the comparison seems warranted. Which leaves only the question of magic, that elusive quality so associated with mythological elfhood; and yet even here, we might find a real-world comparison, in the words of Arthur C. Clarke, who wrote that “any sufficiently advanced technology is indistinguishable from magic,” a sentiment with which I wholeheartedly agree.

Because if any one of us went back in time to the genesis of elven myths; if we stood before our ancestors, iPhone-wielding, smooth-skinned, nylon-wearing, bedecked in even the cheapest, meanest jewellery and spoke of our world, what else could they name us – what else could they think us – but elves?

Oh noes – politicians have been caught Twittering ‘like bored schoolchildren’ throughout an address to Congress! Damn those evil youths and their seductive brainwasters for corrupting the attention of America’s finest! Calamity! Outrage! Way to lay it on thick, Dana Milbank: truly, anyone caught interacting with technology in such a vile fashion must belong to ‘a support group for adults with attention deficit disorder,’ thereby invalidating the notion of ‘a new age of transparency’ in favour of ‘Twittering while Rome burns.’

Or, like, not.

Don’t get me wrong: I’d much prefer our (or rather, America’s) politicians paid attention. That is the ideal scenario. But they are still human, and humans – funnily enough – get bored at inappropriate moments. Our brains are cluttered with odd little thoughts and observations crying to get out. We’re a social species. We can’t help ourselves. Thus, while Twitter undeniably constitutes a newfangled outlet for such internal deviance, it is not the source, and scary though we might find the thought, politicians have always been like this: picking their nose in the gallery, wondering what’s on TV tonight, wishing a hated opponent would get off the podium, watching the clock, perving on their colleagues and generally – gasp! – acting like people.

When, exactly, did we start expecting otherwise normal human beings to stop being human just because the cameras (or teh internets) were rolling? Here’s a wacky theory: maybe the only reason we’ve maintained this crazy notion of political pomp and dignity for so long is because we’ve had no intimate windows into the mindset of our leaders. And in this instance, it’s worth remembering that windows work both ways: just as we can now poke our heads in, metaphorically speaking, so can those on the inside stick an arm out and wave.

So, Mr Milbank, repeat after me: Technology Is My Friend. By the grace of what other agency does your irksome perspective reach Melbourne from Washington with such speed? Through what other medium do I now type this reply? Each new invention changes us, yes, but in most respects, it must first build on what is already there, be it a hitherto unrealised ideal, an untapped market, or the as-yet unvoiced musings of our leaders. If, as per your inflationary grumblings, this new global digital society of ours constitutes a kind of Rome, it doesn’t belong to Nero, but to Augustus.

Because while Nero merely fiddled, Augustus found a world of brick and left it clad in marble.

Consider the following four articles on the dangers of youth exposure to too much digital culture:

iPod Safety: Preventing Hearing Loss in Teens;

Twittering brains withering, expert warns;

Teens flaunt sex, drugs on MySpace; and

Too much PlayStation causes painful lumps,

all of which cropped up in today’s online news. Together, they pretty much exemplify the fears of the Builders, the Baby Boomers and, to a certain extent, the elder members of Generation X, not just as regards their own offspring, but concerning all of modern society. Loud and clear, they’ve been wringing their hands for the past few years over the perils of digitisation, and every time, I’ve experienced a disquieting lurch of anger. It’s taken today’s media quartet for me to understand why this is: after all, cynic though I may be, I still put a certain base faith in the opinions of scientists and sociologists, especially when backed up by established studies. As a member of Generation Y, I’m hardly an impartial observer, and to a large extent, my negative reactions stem from a sense of being personally maligned, even where certain behaviours or criticisms don’t apply either to me as I am now, or to my historic teenage self. Rather, I feel outraged on behalf of my generation and those younger: that we are, in some sense, being fundamentally misunderstood. I can hack criticism, yes; but the sheer weight of professional authorities whose time has been purposefully devoted to proving that almost everyone under the age of 25 is steering themselves on a course towards social oblivion has begun to seem less like the amalgamated findings of unbiased research and more like an unconscious desire to demonise technology.

When it comes to growing up, it’s human nature to get fixed in whatever era raised us. Modern society is shaped, by and large, to ensure this happens – advertising and television timeslots, for instance, aren’t chosen at random, but painstakingly catered to particular demographics. Thus, once we lose interest in the trappings of a given age and progress to playing with a new kind of gadget or watching a different kind of film, we effectively graduate from one type of newsfeed to another. Not watching weekend and afterschool cartoons, for example, means that we no longer learn which shows are cancelled and which will replace them, and that certain products, like the latest toys and games, will no longer form part of our daily media experience. Because our interest in such things has waned, we don’t notice the dissonance: rather, we assume that things have remained static in our absence, and are often startled in a moment of later nostalgia when, flipping on the TV at 3pm, we recognise none of the cartoon characters, none of the hosts, and none of the merchandise. Such disorientation provokes outrage: who are these strangers, and what have they done with our childhood? This biases our opinion of the new product towards hostility and skepticism from the outset; and even when we take the time to watch these new shows, the magic is missing, because we are no longer children. Wrongheadedly, however, we don’t immediately identify this as the problem, and tend to believe, rather, that the product itself is at fault. In fact, it becomes difficult to fathom what kind of person such programmes are catered to, and so, by extension and quite unselfconsciously, we have already taken the first steps towards discrediting the intelligence and taste of the next generation. This outrage slumbers in us, omnipresent but quiescent, until we have children of our own, or are forced to deal with someone else’s. Nonetheless, it is there.

Consider, then, that the technological advances of the past few decades have leapt ahead at unprecedented speeds. In the space of twenty years, we have moved from cassette tapes and walkmans to CDs and discmans to the now-ubiquitous mp3s and iPods of the new millennium. For a generation who started out buying their albums on LP, this is triply disconcerting, while for the generation who thought themselves blessed by the miracle of radio, it seems like a kind of magic. This is all common knowledge, of course, and therefore glossed with the shiny patina of frequent repetition: by itself, the comparison doesn’t provide an explanation for the hostility of older generations. Until, that is, we combine it with the above example about treasured childhood cartoons, because in this instance, not only are the new characters unrecognisable, but they don’t even appear on the same device.

And adults struggle with this. They are disconnected from their offspring, from their students; more important than connectivity and reminiscence, however, is the loss of firsthand advice. They simply cannot guide today’s teenagers through the digital world, which leads most youth to discover it on their own. Most of us who grew up with computers and videogames are either several years away from reproducing or blessed with children still in early primary school: in other words, we are yet to witness what happens when a generation of adolescents is reared by a generation of adults anywhere near as technologically literate as their teenage progeny – adults who remember what it was like to hang out on Trillian or MSN chat all night, to experiment with cybersex, to write achingly of school crushes in their LiveJournal or to download music at home. Members of Generations Y and Z, in other words, in addition to being burgeoning iFolk, are also a social anomaly: a group whose own adolescence is so far removed from the experience of their caretakers as to prevent their parents and teachers, in many instances, from adequately preparing them for the real (digital) world.

But the gap will close. Already there are children in the world whose parents own game consoles, who will guide them online from a young age, and whose joint mannerisms both in real and virtual company will be drawn from a single parental source. Navigating away from certain parts of the internet will be taught in the same way as stranger danger and the implicit lesson to avoid dangerous parts of the local neighbourhood. We teach what we know, after all, and yet a large number of commentators seem not to have realised this – which is why I react badly to their writings. They never purport to be talking about teenagers now so much as teenagers always, or from this point on, a frustrating alarmism that takes no account of what will happen when such adolescents leave home, stumble into the bright sunlight, go to university, get jobs, fall in love and maybe have children of their own. In short, they have no sense of the future, or if so, they picture a world populated by antisocial digital natives, uprooting the fruits of their hard labour out of ignorance, apathy and poor management. Either they can’t imagine us growing up, or they fear what we’ll turn into.

I’m speaking here in broad-brush terms. Obviously, the distinction between those who are technologically literate and those who aren’t can’t simply be reduced to their year of birth. Every generation has its Luddites (and, if we remember the political motivations of those original iconoclasts, this is often a good thing) as well as its innovators, its geeks and scientists. And many such worried articles, irksome though I may find their tone, are still correct: listening to your iPod on full volume will probably damage your hearing, just as it’s not a wise idea to post intimate details of your sex life on MySpace. The latter concern is markedly new, and something teens certainly need to be made aware of – indeed, professionals new to Facebook are still themselves figuring out whether to friend coworkers or employers, thereby allowing them to view the results of various drunken nights out, or to keep a low digital profile. Such wisdom is new all round, and deeply appreciated. On the other hand, parents have been telling their kids to turn down their damn music in one form or another ever since Elvis Presley first picked up a guitar, and while the technology might’ve become more powerful in the intervening decades and the studies into auditory damage more accurate, the warning remains identical (as does the inter-generational eye-roll with which it tends to be received).

In short, the world is changing, and people are changing with it, teachers, teens and parents alike. And I cannot help, in my own curious optimism, but see this as a positive thing: that in a world where technology moves so swiftly, older generations must constantly remain open to the idea of learning from their younger counterparts, while those in the know must become teachers earlier. There is so much to be gained in the coming years, and so many problems, both great and small, to be solved. The gap between adults and adolescents has never been so large, but while it always behoves those in the former category to teach and aid the latter, this should never be at the expense of at least trying to understand their point of view. And this, ultimately, is what causes me to bristle: whether playing videogames can hurt your hands or spending too much time online can damage your real-world social skills, such pastimes aren’t going anywhere. Rather than condemning or sneering at such things outright or tutting sadly, the more productive path is to consider how best they can be incorporated into modern life without causing harm, or to study how they work in confluence with real-world interactions, and not just fret about what happens if they’re used exclusively.

Because technology – and future generations – aren’t going anywhere. We might not look like Inspector Gadget, but baby, we’re his heirs. Or rather, Penny’s. You get the idea.

Recently, I was drawn to this article by feminist writer Monica Dux, in which she discusses the phenomenon of little girls dressing as fairy princesses. As I read, I found myself nodding: there’s truth to the idea that garbing small girls exclusively in pink and lauding their beauty above all else can lead to problematic behaviour in adolescence – a bona fide Barbie mentality. And, like the writer, I was a tomboy at school: at seven, I was deeply obsessed with dinosaurs, loved soccer, could hold my own in a handball game with boys three years my senior, burned ants with a magnifying glass, built forts in the bush and played video games whenever possible. I wasn’t Pretty In Pink. 

But for all that, I can’t help feeling that Dux has cottoned on to a genuine concern and drawn a flawed conclusion – specifically, that forbidding pink and fairies is the answer. Like other parents mentioned in her article, mine certainly never encouraged the Fairy Fixation, but neither did they actively forbid it. As a consequence, My Little Ponies jostled in my schoolbag alongside Starscream of the Decepticons; I dressed up as the Man from Snowy River for my book day parade, but also had a tutu in my wardrobe. (I’ll give you one guess what colour.) Diversity isn’t just about forcibly steering a child away from the norm; it’s about actively offering them a choice. And if you stint the dominant side for long enough, sooner or later, you end up creating a different kind of imbalance.

There’s nothing inherently sinful about the colour pink: refusing it on grounds of its association with princess-type deviance makes as much sense as declaring that left-handedness is evil, a phase I’d like to think this part of the world has grown out of. The problem isn’t the concept of fairies as loved by children, but how adults react to their use. Dux herself makes note of this – parents who praise their daughters as beautiful, pretty, sugar and spice when princessed up – and yet her solution is not for adults to change their own behaviour. Rather, she advocates that they regulate costume use in children. As an approach, this is virtually identical to telling teenage girls not to dress provocatively if they don’t want to be wolf-whistled, instead of, as makes more sense, trying to raise boys who don’t judge women by their clothes. Human weakness and pragmatism allows for some middle ground, and there’s a case to be made that dolls like Bratz and Barbie capitalise on the colour pink to sell an unrealistic standard of beauty, but ultimately, girls should be free, in the gender-biased sense, to be girls. A truck-hungry tomboy does not lurk within every prepubescent glamour queen – nor should it.

Minus the adult overzealousness, there’s still a distinct bias in the way toys are offered to children. Underneath all the gendered marketing, the fact is (and Dux agrees) that boys and girls are different. What needs to be encouraged is the idea that different isn’t automatically bad – not just between boys and girls, but girls and girls, boys and boys – and that it’s OK to pick’n’mix your interests. Girls who want to play rugby should still be able to frock up in pink, just as boys who’re happy to play with dolls should still be allowed to like cars. It’s also a fact that children are cruel, and police difference within their small communities with a rigour and bias that would be difficult to sustain in the politics-conscious adult world. That can’t be changed entirely, but I suspect it can be mitigated by parental behaviour.

Unless we’re talking about the singer, pink’s not my cup of tea (and even then, I have to be in the right mood). There’s a long road yet to travel before society stops marketing towards the biases children have for themselves and starts venturing into new territory; in video games, at least, there’s been some headway. Parental coddling has a lot to answer for, and given the kind of adult I’m turning out to be, I’m glad I never felt pressured to cling to pink and fairydust to win approval. Perhaps, to take a backwards leap, I’m turning into the adult I am precisely because I never felt that pressure. There are also girls who’d feel similarly uncomfortable if forced towards tomboyishness – not that Dux advocates this, but it’s one potential consequence of her solution.

And the moral of this story? That girls (and boys) can be pretty in pink, or not. The important thing is choice.