Archive for February, 2009

Oh noes – politicians have been caught Twittering ‘like bored schoolchildren’ throughout an address to Congress! Damn those evil youths and their seductive brainwasters for corrupting the attention of America’s finest! Calamity! Outrage! Way to lay it on thick, Dana Milbank: truly, anyone caught interacting with technology in such a vile fashion must belong to ‘a support group for adults with attention deficit disorder,’ thereby invalidating the notion of ‘a new age of transparency’ in favour of ‘Twittering while Rome burns.’

Or, like, not.

Don’t get me wrong: I’d much prefer our (or rather, America’s) politicians paid attention. That is the ideal scenario. But they are still human, and humans – funnily enough – get bored at inappropriate moments. Our brains are cluttered with odd little thoughts and observations crying to get out. We’re a social species. We can’t help ourselves. Thus, while Twitter undeniably constitutes a newfangled outlet for such internal deviance, it is not the source, and scary though we might find the thought, politicians have always been like this: picking their noses in the gallery, wondering what’s on TV tonight, wishing a hated opponent would get off the podium, watching the clock, perving on their colleagues and generally – gasp! – acting like people.

When, exactly, did we start expecting otherwise normal human beings to stop being human just because the cameras (or teh internets) were rolling? Here’s a wacky theory: maybe the only reason we’ve maintained this crazy notion of political pomp and dignity for so long is because we’ve had no intimate windows into the mindset of our leaders. And in this instance, it’s worth remembering that windows work both ways: just as we can now poke our heads in, metaphorically speaking, so can those on the inside stick an arm out and wave.

So, Mr Milbank, repeat after me: Technology Is My Friend. By the grace of what other agency does your irksome perspective reach Melbourne from Washington with such speed? Through what other medium do I now type this reply? Each new invention changes us, yes, but in most respects, it must first build on what is already there, be it a hitherto unrealised ideal, an untapped market, or even the unvoiced musings of our leaders. If, as per your inflationary grumblings, this new global digital society of ours constitutes a kind of Rome, it doesn’t belong to Nero, but to Augustus.

Because while Nero merely fiddled, Augustus found a world of brick and left it clad in marble.

Consider the following four articles on the dangers of youth exposure to too much digital culture:

iPod Safety: Preventing Hearing Loss in Teens;

Twittering brains withering, expert warns;

Teens flaunt sex, drugs on MySpace; and

Too much PlayStation causes painful lumps,

all of which cropped up in today’s online news. Together, they pretty much exemplify the fears of the Builders, the Baby Boomers and, to a certain extent, the elder members of Generation X, not just as regards their own offspring, but concerning all of modern society. Loud and clear, they’ve been wringing their hands for the past few years over the perils of digitisation, and every time, I’ve experienced a disquieting lurch of anger. It’s taken today’s media quartet for me to understand why this is: after all, cynic though I may be, I still put a certain base faith in the opinions of scientists and sociologists, especially when backed up by established studies. As a member of Generation Y, I’m hardly an impartial observer, and to a large extent, my negative reactions stem from a sense of being personally maligned, even where certain behaviours or criticisms don’t apply either to me as I am now, or to my historic teenage self. Rather, I feel outraged on behalf of my generation and those younger: that we are, in some sense, being fundamentally misunderstood. I can hack criticism, yes; but the sheer weight of professional authorities whose time has been purposefully devoted to proving that almost everyone under the age of 25 is steering themselves on a course towards social oblivion has begun to seem less like the amalgamated findings of unbiased research and more like an unconscious desire to demonise technology.

When it comes to growing up, it’s human nature to get fixed in whatever era raised us. Modern society is shaped, by and large, to ensure this happens – advertisements and television timeslots, for instance, aren’t scheduled at random, but painstakingly tailored to particular demographics. Thus, once we lose interest in the trappings of a given age and progress to playing with a new kind of gadget or watching a different kind of film, we effectively graduate from one type of newsfeed to another. Not watching weekend and afterschool cartoons, for example, means that we no longer learn which shows are cancelled and which will replace them, and that certain products, like the latest toys and games, will no longer form part of our daily media experience. Because our interest in such things has waned, we don’t notice the dissonance: rather, we assume that things have remained static in our absence, and are often startled in a moment of later nostalgia when, flipping on the TV at 3pm, we recognise none of the cartoon characters, none of the hosts, and none of the merchandise. Such disorientation provokes outrage: who are these strangers, and what have they done with our childhood? This biases our opinion of the new product towards hostility and skepticism from the outset; and even when we take the time to watch these new shows, the magic is missing, because we are no longer children. Wrongheadedly, however, we don’t immediately identify this as the problem, and tend to believe, rather, that the product itself is at fault. In fact, it becomes difficult to fathom what kind of person such programmes are aimed at, and so, by extension and quite unselfconsciously, we have already taken the first steps towards discrediting the intelligence and taste of the next generation. This outrage slumbers in us, omnipresent but quiescent, until we have children of our own, or are forced to deal with someone else’s. Nonetheless, it is there.

Consider, then, that the technological advances of the past few decades have leapt ahead at unprecedented speeds. In the space of twenty years, we have moved from cassette tapes and walkmans to CDs and discmans to the now-ubiquitous mp3s and iPods of the new millennium. For a generation who started out buying their albums on LP, this is triply disconcerting, while for the generation who thought themselves blessed by the miracle of radio, it seems like a kind of magic. This is all common knowledge, of course, and therefore glossed with the shiny patina of frequent repetition: by itself, the comparison doesn’t provide an explanation for the hostility of older generations. Until, that is, we combine it with the above example about treasured childhood cartoons, because in this instance, not only are the new characters unrecognisable, but they don’t even appear on the same device.

And adults struggle with this. They are disconnected from their offspring, from their students; more important than connectivity and reminiscence, however, is the loss of firsthand advice. They simply cannot guide today’s teenagers through the digital world, which leads most youth to discover it on their own. Most of us who grew up with computers and videogames are either several years away from reproducing or blessed with children still in early primary school: in other words, we are yet to witness what happens when a generation of adolescents is reared by a generation of adults anywhere near as technologically literate as their teenage progeny: adults who remember what it was like to hang out on Trillian or MSN chat all night, to experiment with cybersex, to write achingly of school crushes in their LiveJournal or to download music at home. Members of Generations Y and Z, in other words, in addition to being burgeoning iFolk, are also a social anomaly: a group whose own adolescence is so far removed from the experience of their caretakers as to prevent their parents and teachers, in many instances, from adequately preparing them for the real (digital) world.

But the gap will close. Already there are children in the world whose parents own game consoles, who will guide them online from a young age, and whose joint mannerisms both in real and virtual company will be drawn from a single parental source. Navigating away from certain parts of the internet will be taught in the same way as stranger danger and the implicit lesson to avoid dangerous parts of the local neighbourhood. We teach what we know, after all, and yet a large number of commentators seem not to have realised this – which is why I react badly to their writings. They never purport to be talking about teenagers now so much as teenagers always, or from this point on, a frustrating alarmism that takes no account of what will happen when such adolescents leave home, stumble into the bright sunlight, go to university, get jobs, fall in love and maybe have children of their own. In short, they have no sense of the future, or if they do, they picture a world populated by antisocial digital natives, uprooting the fruits of their hard labour out of ignorance, apathy and poor management. Either they can’t imagine us growing up, or they fear what we’ll turn into.

I’m speaking here in broad-brush terms. Obviously, the distinction between those who are technologically literate and those who aren’t can’t simply be reduced to their year of birth. Every generation has its Luddites (and, if we remember the political motivations of those original iconoclasts, this is often a good thing) as well as its innovators, its geeks and scientists. And many such worried articles, irksome though I may find their tone, are still correct: listening to your iPod on full volume will probably damage your hearing, just as it’s not a wise idea to post intimate details of your sex life on MySpace. The latter concern is markedly new, and something teens certainly need to be made aware of – indeed, professionals new to Facebook are still themselves figuring out whether to friend coworkers or employers, thereby allowing them to view the results of various drunken nights out, or to keep a low digital profile. Such wisdom is new all round, and deeply appreciated. On the other hand, parents have been telling their kids to turn down their damn music in one form or another ever since Elvis Presley first picked up a guitar, and while the technology might’ve become more powerful in the intervening decades and the studies into auditory damage more accurate, the warning remains identical (as does the inter-generational eye-roll with which it tends to be received).

In short, the world is changing, and people are changing with it, teachers, teens and parents alike. And I cannot help, in my own curious optimism, but see this as a positive thing: that in a world where technology moves so swiftly, older generations must constantly remain open to the idea of learning from their younger counterparts, while those in the know must become teachers earlier. There is so much to be gained in the coming years, and so many problems, both great and small, to be solved. The gap between adults and adolescents has never been so large, but while it always behooves those in the former category to teach and aid the latter, this should never be at the expense of at least trying to understand their point of view. And this, ultimately, is what causes me to bristle: whether playing videogames can hurt your hands or spending too much time online can damage your real-world social skills, such pastimes aren’t going anywhere. Rather than condemning or sneering at such things outright or tutting sadly, the more productive path is to consider how best they can be incorporated into modern life without causing harm, or to study how they work in confluence with real-world interactions, and not just fret about what happens if they’re used exclusively.

Because technology – and future generations – aren’t going anywhere. We might not look like Inspector Gadget, but baby, we’re his heirs. Or rather, Penny’s. You get the idea.

Gleeful Tidings

Posted: February 25, 2009 in Ink & Feather

Currently, it looks like Solace and Grief will come out circa March 2010, which announcement I consider to be a thing of extreme shininess. Also, my copy of the contract has been returned in the illustrious company of my first ever advance cheque, which I banked today at lunch. Behold! – I have been paid for my words. I am officially Writery. Authory, even.

Muahaha. Roll on, 2009!

Attention anyone who, like me, compulsively watched Buffy the Vampire Slayer but has only ever glimpsed How I Met Your Mother through random channel-surfing, last-two-minutes-catching or individual episode spruiks. Despite the fact that Alyson Hannigan appears on both shows, I would like to clarify the following:

Adam Busch (Warren on Buffy) has never appeared on How I Met Your Mother. He does, however, bear a striking resemblance to Josh Radnor, who does. So, to reiterate:

this man:

Not Josh Radnor

is not this man:
Not Adam Busch

That is all.

According to today’s New York Times, the high expectations of American tertiary students are leading them to haggle over their grades. The students argue that if they show up and complete all the required readings, they deserve an A, and that the act of putting in effort to meet the standards should be viewed positively during grading. Lecturers argue that merely meeting the standards required to pass a course – in other words, showing up and doing the reading – should only earn a C, as this constitutes the bare minimum required to pass. It’s the kind of argument that could easily rant on for pages, but there’s one line which, for me, perfectly sums up why the professors, and not their students, are correct. As James Hogge puts it:

“Students often confuse the level of effort with the quality of work. There is a mentality in students that ‘if I work hard, I deserve a high grade.'”

This, to me, is as perfect a summation as one could find of the ultimate consequence of turning education into a commodity. In a society where a majority of students complete at least some tertiary study, the bar for excellence has been raised. Mechanically showing up and sitting through the allotted lectures or tutes is not the same as comprehending – or, indeed, caring about – their content. Reading something to fulfil course requirements is not commensurate with reading for pleasure. What lecturers are identifying, and what some students are evidently struggling with, is the notion that education should be more than a chore, or a means to an end: that it should be delightful in its own right, encouraged for its own sake. Under this model, the extra engagement required to reach an A grade comes from genuine interest, and, if we’re honest, a certain amount of intelligence, neither of which can be faked. And as the ultimate products of standardised testing, a system under which a love of learning is palpably secondary to meeting benchmarks, students are, unsurprisingly, floundering.

More and more, the question of how to engage students is one I find myself grappling with, despite being neither a parent nor an educator. For me, the most important components of schooling should be instilling a desire to learn while providing the tools, guidance and encouragement for pupils to do so. One of these tools, non-negotiably, is language, without which it is impossible to read, write or effectively communicate ideas. Beyond that, any decision as to which disciplines are most important is arbitrary, and while there’s certainly sense in providing as many people as possible with a base level of knowledge in a broad range of fields, such as maths and geography, it’s no substitute for producing an individual capable of selecting their own interests and researching them independently.

Which is where, for me, the entire basis of modern education comes tumbling down like London Bridge: it graphically fails to achieve this most basic and vital of outcomes. Rather, eager students tend to flourish in opposition to the very system that should be supporting them, springing up like hardy plants between cobblestones. They learn to love knowledge despite the way it is taught to them, despite having their interests routinely cordoned off by the arbitrary barriers of syllabi. In democratising education and providing it to all, we’ve forgotten why it should be provided to anyone. Teaching all children under equal circumstances and without prejudice is not the same as believing that a single mode of tuition will be of equal benefit to everyone: quite the opposite. Except that, in commodifying education, exactly this assumption has been made.

Here’s an elitist thought: some people are brighter than others. They can learn things faster, more thoroughly and in greater number than the average student. Similarly, some people are slower: it takes them more time to register fewer concepts to a lower degree of proficiency. Education does not eradicate this fact. It’s not simply a matter of native intelligence, either: some students might be slower due to language barriers, behavioural problems, poor teaching or any number of social difficulties. Others might be faster because they enjoy a certain subject, because they appear stronger by comparison to their peers, because of an excellent teacher, or because their parents help at home. This is evident to kindergarten teachers the world over – and yet all students are given the same goals. The habit of standardised testing is not so bad in Australia as in America, but one can still draw the same conclusion of education in both countries: that passing grades are held to be more important than retaining knowledge. Obvious though it seems, the argument that those who know will pass holds little water, for three important reasons:

1. Rote-learning a concept to pass a test is not the same as understanding it;

2. Those who rote-learn are, through primary and secondary school, treated identically to those who genuinely seek knowledge; and

3. There is no extra reward provided to students who demonstrably want to learn for learning’s sake.

Psychologically, this sets up an expectation in students that wanting to explore a subject further isn’t worth their while – and, academically, it isn’t. They will receive no tangible reward for reading about Henry VIII in their spare time; neither will displaying extra knowledge allow them to move forward at a faster pace, and while the outcome should be to teach a love of learning for its own sake, the way to encourage this from an early age is through reward. If students who show initiative aren’t treated any differently under the education system, then the majority will, through apathy or disappointment, revert to meeting only the minimum requirements. If they are bright, this is looked upon as coasting, a behaviour which, ironically, is discouraged. Much like the ‘intangible benefits’ so laughingly touted by many corporations in place of actual staff bonuses or health care plans, the architects of the modern educational system seem to assume that an absence of reward will nonetheless encourage students to excel in their own time. As for arbitrary in-school awards, such as often take the form of laminated and calligraphied cardboard, these are nice mementos, but ultimately meaningless, comparable to the much-loathed ‘quality awards’ of the new corporate sphere. They are the lowest possible recognition of achievement, inadequate placeholders for actual change, innovation or development.

Which brings us back to American college students and their sense of entitlement. Consider them anew in light of the above. They have been taught for thirteen years that meeting the requirements of the system is all that matters, and that going above and beyond, while perhaps an idealistic concept, results only in extra work for no gain, and, quite possibly, in social mockery. At the same time, they have been told, repeatedly and with emphasis, that holding a degree is vital to their future success: they must continue to work hard. And the operative word here is work, because this is what education means to them. Not knowledge, not pleasure, not investigative thrills, but work, a difficult, laborious and time-consuming means to an unspecified end. They are waiting, like so many of us in the modern world, for the joy to kick in: to reach the end of the academic rainbow and find the job they love. But learning to love our jobs is, in many ways, identical to learning how to love knowledge: a process which is the direct antithesis of modern education.

Many people don’t hit their stride until university. For some, it’s the first opportunity to explore ideas that interest them as a part of learning, and not just in their own time. Others finally break through the limits of school and attack the discipline they’ve been hankering for, be it geology or medicine. But for many – and, I fear, for most – it’s a startling disappointment. Like pigeons raised in a dark coop, they have no idea how to stray beyond the bounds in which they’ve been raised. They never realised that was the point – nobody ever told them. Certainly, the system didn’t. They drop out, feeling betrayed, or go on to feel naggingly unhappy in their jobs, cloaking their disquiet in the assumption that it indicates adulthood. And as the twin stranglehold of commodified education and standardised testing tightens, more and more people will be squeezed into a mould inimical to learning. Those who might love university will, by the time they reach it, feel exhausted at the thought of jumping through yet more hoops, and have no appetite for any educational institution; others will have long since given up. And meanwhile, those few people who excel at the standardised system will rocket through with glowing recommendations, completely ill-equipped to enter any profession which requires not only passion, but imagination.

The weight of such people is already warping the tertiary system. In Australia, the rise of full-fee paying students, both nationally and from overseas, has placed enormous pressure on lecturers to pass inadequate learners. This payment for education turns the degree into a product, moving the customer to demand value for money. Invariably, such students view their own role as passive. Education is something the university must do to them, not a thing in which they must participate, or for which they might ultimately be ill-suited. And such mindsets, both in the long and short term, can only be harmful to the intellectual development of society.

Because in a time of such need for genius, when genius is thin on the ground; when innovation is desperately needed at every turn, and social, economic and environmental pressures are forcing the reinvention of long-held or unquestioned systems, we need every intellectual iconoclast, high-school anarchist and rule-breaking miscreant to remember what they loved about knowledge: that it improves those people and institutions who lovingly and eagerly receive it, and rewards those who strive in its pursuit.

I signed my first book contract yesterday. In a way, it was a more momentous event than actually hearing the novel had been accepted, because it was concrete, fixed in paper. For the first time, I spoke to my publisher on the phone. We chatted about the contract, diverting fragmentarily into what comes next, and now it’s finally hit me that there is a next, that I don’t have to start reshopping again, and that all the emailed back-and-forth about series names and schools and libraries had a point.

I’m actually getting published.

Dazedly, I keep wandering into Reader’s Feast at lunch, greedily eyeing the ‘M’ slot on shelves and noting where my book, potentially, could sit. At home, working on the next volume, it startles me to think of not needing to submit all over again; that, like a privileged second child, it will never know the anxiety and heartache of its elder sibling’s early days. Wandering into Readings, I feel my stomach jump to recognise books on display as originating from my new publishing house. And so on.

I don’t have many details yet. I’m new at this. But the book, for those who are interested, will be called Solace and Grief. It’s young adult fantasy. I’m working with Ford Street Publishing and the wonderful Paul Collins. Also, I’m now on Twitter. And I am – and will continue to be – extremely, wonderfully excited.

Fire/Poem

Posted: February 12, 2009 in Ink & Feather

1.

just a spark. a tiny star,
winking in dropped glass
beside sticky tarmac, or else
an ember squeezed from a cigarette,
a sharp red dream in a firebug’s heart.
what madness, pain, will it impart?

2.

roaring gold, the maw devours
homes, lives, plants
as easily
as terror, longing, grief
steal hours.
a cancerous lung, the smoke consumes;
pauses, gathers strength
& then resumes.

3.

the wind is wild as a witch’s curse,
stinging with scarlet thorns
its Phaeton-mares, frenzied,
pulling a charcoal hearse.
sun’s chariot falls like a hammer-blow,
a wall of burning grief,
a searing loss, & while the anguish lasts
it will not cease.

4.

they hide in the earth,
seek sanctuary
that Dresden’s force denies.
above, dams boil & hearts explode
& weep as dogs lie bravely down,
a sea of guardians who will not rise.
they could have strayed,
but faithfully did not:
their masters stayed.

5.

trees shatter into swollen skies,
bursting like ripened fruit
in the fire’s hard hand. we knew the risks;
we understood
the perils of our lovely, sunswept land.
they were not this: to stay or go, but burn
without a choice. birds died aloft:
small angels, lacking voice.

6.

now only ash remains, & twisted shells.
where once sang lyrebirds,
we sift the wrecks, the dark, unlovely hells
of loss. such wounds run deep,
& still the fires burn.
we dare not sleep.