
Warning: absolutely giant massive spoiler alert!

OK, so: part one of the final David Tennant episode of Doctor Who, The End of Time, has now aired in the UK. The fact that I’ve been predicting the return of the TimeLords ever since Tennant first announced his retirement has left me with a warm, glowy feeling of narrative vindication. (The fact that said glow has undoubtedly been heightened by the large glass of eggnog sitting to my left is by the by and nothing to do with it.) As soon as the Ood declared that ‘they are returning’, I knew it was game on, a view ultimately proven correct when Timothy Dalton appeared mid-episode wearing the unmistakeable red and gold of Rassilon. It makes perfect sense that the Tenth Doctor’s exit would in some way be tied to the return of the denizens of Gallifrey, as his tenancy (hah – pun!) has been entirely characterised by their absence. In terms of mining the original show, the other TimeLords are the single facet yet to be brought back, and as the Daleks have turned up numerous times despite their supposed destruction during the Time War, finding a means of resurrecting their enemies is an act of natural balance. The trailer for the final act has also revealed that the drumming tune in the Master’s head – the inspiration for the four knocks prophesied to precede the Doctor’s death – represents the double beat of a TimeLord’s heart. Armed with this knowledge and a glimpse of the finale, therefore, here are my predictions for the final ever episode of David Tennant’s term in Doctor Who.

Back in The Sound of Drums, it was revealed that what originally sent the Master mad was the TimeLord ritual of staring into the Time Vortex through the Untempered Schism. From that point on, the drums in his head were always calling to him. We know, too, that the Doctor can sense the presence of other TimeLords alive in the galaxy – but there are exceptions to this ability. Consider that creator Russell T. Davies, much like Joss Whedon, has a habit of planning his storylines long in advance, such that he is in a position to drop hints as to their eventual conclusion. One such notable clue is the Medusa Cascade, a place the Doctor was reported to have sealed off during the Time War, but where, at the end of Season 4, Davros and the Daleks were proven to have been hiding – along with a number of stolen planets – by staying a second out of sync with the rest of the universe. I won’t venture an explanation as to how, but my speculative guess, after the Ood announced that ‘things which have already happened are happening now’, is that those TimeLords who survived the Time War did so by a similar trick of temporal displacement, perhaps even utilising one of the Nine Gallifreys of old. Which is why, when the Master gazed into the Vortex all those years ago, the sound of drums was embedded in his head: he could hear the future/present of the timeless TimeLords, and was irrevocably altered by their (which is to say, Timothy Dalton and his prophetess’s) call to war. The Ood can sense this displacement at a psychic level, and now that the Master has turned everyone on Earth into copies of himself, the rest of the TimeLords will be able to return: because of what he is, and of what was originally done to him.

Which leads us to Wilf, who appears to be having visions of a female TimeLord council member, and to Donna Noble, who is no longer quite human, and who has been forced to remember everything she was made to forget. This is somewhat interesting, as the Doctor has explained that Donna can’t remember without dying; but if she can, then what does this say about her deeper nature? Perhaps – one might speculate – her survival has something to do with those Huon particles she imbibed so long ago, given their relationship to TimeLord technology. We were told earlier that there was no coincidence in the Doctor meeting Donna more than once, and now we know that there is no coincidence to Wilf’s continued appearances, either. Why is he the only man to remember his bad, precognitive dreams? Perhaps this is an example of cyclic time: due to the Doctor’s protection, he was never going to turn into a copy of the Master, and was therefore able to remember in the present what his future self would eventually learn. Wilf is a stargazer, a soldier who has never killed a man; alternatively, his significance might lie in the fact that he is human – wholly human, unlike Donna – and therefore represents a viable template from which the human race might be restored. But he also has a choice to make, a life to take: the Doctor’s, the Master’s, or perhaps Timothy Dalton’s.

So, to wrap up all these vague speculations, I’ll end on a more solid, if perhaps more obvious note: Timothy Dalton’s character will die; Gallifrey will return; the Doctor will be offered the mantle of Lord President (again) and refuse; the Master will escape to fight another day, as per his speciality; and Donna’s memories will be restored.

There. How’s that for a prophecy?

Warning the First: This is what happens when I read about Twitter coverage of the Iranian election and start thinking about Little Brother, Serenity, The Gone-Away World and The X-Files in confluence. (With apologies to Cory Doctorow, Joss Whedon, Nick Harkaway and Chris Carter.)

Warning the Second: I am a giant geek.

Are you sitting comfortably? Then we’ll begin.

 

spooks in the machine

 

& in my head I hear them shouting –

Take it back! wrote Doctorow; & as the smoke

of bloody bombs and tiger-fires lights the way,

young fingers dance a typeset revolution, row on row, and say! –  

don’t updates sound like Mr Universe? You cannot stop the signal. 

True. Technology’s an everloving curse.

 

The youth are fighting back. From pens & swords

we battle guns & tweets; and shockingly the old wives’ mandate 

(tell it to the bees) has proven true:

the hivemind waits, a hydra craving news.

The truth is out there. All along the pipe of pipes,

we raise a cry: the FOX is going up!

& when we look and look again

there is no lie, no crawling, poor excuse to tell

that begs our ignorance of broken men,

the brimstone-charred apologists of hell.

 

Words thrive in spaces other norms refuse: they

grow like ivy, breed like mushrooms, eat the smart refuse

of dreams, & when the firewalls are trashed 

they revel in it: long live youth! whose busy thumbs in World War III

(should trenches ever come again, & schnapps, & soccer skirting bombs)

might save the Christmas Truce!

For some time now, I’ve been a serial language learner. In primary school, my Year 3 teacher spoke Japanese and taught some of it to my class, which we dutifully learned. Hearing of this, my grandmother, who taught Japanese immigrants to speak English after World War II, gave me the books she’d used to study the language herself. In this context, I started taking extra-curricular Japanese lessons. I was not, however, a dedicated student: detesting repetitive practice in the manner of children who otherwise learn so quickly as to find it tiresome, I made no effort to learn my katakana or kanji, and despite the fact that I enjoyed counting and making origami figures, the lessons eventually stopped. My occasional childhood encounters with Japanese culture, however, continued: first in the form of Miyoko Kyushu, an exchange student who stayed with my family for several weeks, and then in the guise of new neighbours, who, though Norwegian by descent, had lived in Japan for many years. All three sons learned the language, while both parents spoke it fluently. Like Miyoko, they kept the Japanese tradition of bringing gifts to one’s hosts, so that when we first met the Johansens at a welcoming-the-neighbours barbecue, the wooden geisha doll, Japanese picture book and hand-sewn juggling balls Miyoko had given me found company with a puzzlingly shaped Japanese bag and several boxes of sticky (but delicious) Hello Kitty candies. With the exception of these last edible items, I still have everything else. Like my knowledge of Japanese numbers, it seems, they’ve never quite slipped away.

In high school, I learned French and German as part of the school curriculum. Some words from each have stuck with me, such as counting sequences, greetings and a handful of random nouns, although somewhat inexplicably, I’ve also retained a teaching song in French detailing the birthday gifts received by a fictitious singer from his various relatives. Around the same time, I decided that archaeology was my destined career, and was advised that the best languages to learn for this were Latin (for antiquity) and German (for reasons which now seem both dubious and odd). Given that I went to a public school, such a decision was problematic: with seventeen interested students deemed not enough to sustain a full class, I ended up taking German after school, while for Latin, I was forced to resort to a correspondence course.

When I changed schools the following year, the German didn’t last; but Latin did. I kept it up through all of high school, even taking advanced Latin units for the HSC despite my appalling grasp of grammar. Once again, my lack of enthusiasm for rote learning saw any chance at fluency well and truly shot, although my pronunciation skills and stock vocabulary were generally on par. By the time university rolled around, my interests had swung from archaeology to the history of the Middle East, such that, rather than continuing Latin, I started learning Arabic instead. I stuck it out for one year, but was still, ultimately, a lazy student: I simply couldn’t (or wouldn’t) motivate myself to do the required homework and memorisation necessary for learning a spoken language, despite the fact that learning a new script had proved a cinch – after all, I used to invent alphabets in class when I was bored, memorise them in that hour, then write to myself in that cypher for a day, or a week, or however long it took me to lose interest or start again. But vocal fluency is different. Historically, I’ve been unjustly apathetic in this regard, perhaps because I find it frustrating to have to actually work at acquiring a new language, when in almost every other discipline – the exception being maths, which I’ve never liked – I’m able to osmose comprehension with a comparative lack of effort, especially when interested in the subject. That’s the irony of native intelligence: without a competitive drive, learning becomes purely a matter of convenience. And I’m not a competitive person.

For a while, then, I stopped learning languages – until a few weeks ago, when a friend offered to teach a beginner’s course in Mandarin Chinese. I went to four or five of his classes, and had a good time: if nothing else, I can now count to ten in Mandarin, and at least in the short term, I can recognise certain words and written characters. As with Arabic, however, there’s a strong chance I’ll forget most, if not all, of it, although my track record suggests that if anything stays, it will be the numbers. This might seem paradoxical, given my dislike of maths, but remembering things in sequence is always easier than remembering them individually, at least for me.

Since stopping the Mandarin classes, I’ve been thinking about my history of trying and failing to acquire new languages. I like the idea of being bilingual, but short of actually moving to a non-English-speaking country, could I ever convince myself to put in the required effort? Certainly, I’m more dedicated now than I was then, and more patient; this time around, it was time constraints which caused the change of heart, not lack of interest. Which brings me back to Japanese, the first language I ever tried to learn, and the one which, oddly, I still have the most to do with. Although my foray into learning karate ended several years ago, I remain extremely interested in anime. Since discovering anime and manga through a friend at the start of high school, I’ve never wavered in my affection for the genre, and although at times it’s been a secondary interest, I’m currently undergoing a surge of renewed fandom. Which makes me realise that, far from having forgotten the little Japanese I learned as a child, I’ve actually built upon it, albeit in a highly specialised area. Thanks to the catchy themes of shows like Cowboy Bebop and Evangelion, I’ve taken the time to write down and memorise the written-English phonetics of several Japanese songs, learning them by heart. Through comics, interested Googling and contextual exposure, I’ve picked up the various Japanese terms of address, the rules governing their usage, and a smattering of vocab. Cumulatively, this represents the greatest interest I’ve ever directed towards learning a language, despite having nothing to do with academics. And it’s been fun.

All of which leads me to conclude that, if I were to sit down as an adult and properly attempt a language, in my own time and of my own volition, I’d be well advised to try Japanese, coming full circle. And all for the geekiest, laziest possible reason. Which makes me grin.

Ah, irony!

Barack Obama has been inaugurated as President of the United States. Already, he’s signed an order to close Guantanamo Bay within a year and another to prevent the CIA from using illegal interrogation techniques. Innumerable stories of goodwill, tolerance and humanity have bloomed into the media since election day, and are yet to cease. Even for those of us overseas, there is a sense of hope: that something, finally, somewhere, is being done.

And yet, in the midst of all these history-making declarations, powerful speeches and political events, what’s really brought home the Obama win to me is a single line, delivered by the new Presidential spokesman, Bill Burton, on the technological inadequacies of the White House.

‘It’s like going from an Xbox to an Atari,’ he said.

And the fact, the glorious, stupid, wonderful, geeky fact that someone in the White House actually knows what an Atari is, makes me grin like a damnfool yokel.

Bring on the revolution, guys. We’re with you.

When it comes to spending money, the LSH* and I have two core weaknesses: books and DVDs. Back in Melbourne, there’s a Swirling Vortex of Fiscal Doom between JB Hi-Fi and the comic shop guaranteed to extract a minimum of $40 on an average day, which is why I avoid that street unless all bills are paid. But holidays – ah, blissful holidays! – are a different matter. Where other people spend vast quantities of moolah on spas, luxury accommodation, souvenirs and exotic locations, we buy books. Lots of them. (To give you a rough idea, we bought upwards of thirteen – each – on our honeymoon. Suffice to say, our luggage was several kilos heavier on the return trip.)

Our current jaunt has proved no exception. Being a philosopher and, more particularly, a logician, the LSH tends to buy books with scintillating titles like Logic, Logic and Logic (seriously), and is possibly the only person ever to be wildly excited by a 40% discount on Cambridge University Press textbook editions. Meanwhile, my own papery hoard has been enriched by the acquisition of no fewer than 11 titles:

– The Midnighters Trilogy (Scott Westerfeld);

– The Last Days (also by Scott Westerfeld);

– The Kingdom Beyond the Waves (Stephen Hunt);

– Perdido Street Station (China Mieville);

– The Book of Dead Philosophers (Simon Critchley);

– Myth and Symbol in Ancient Egypt (R. T. Rundle Clark);

– The Alchemyst (Michael Scott);

– City of Saints and Madmen (Jeff VanderMeer); and

– Cairo Jim and Doris in Search of Martenaten (Geoffrey McSkimming). 

Thanks to a saved wedding voucher, we’ve also gained a rather large quantity of Doctor Who DVDs from the Tom Baker era – and for those who might protest the usage, really: we’ve got all the household stuff we could possibly need, and in any case, the giver would approve. All in all: a most satisfactory harvest. (And just to dispel the image of the LSH and me as a pair of sedentary layabouts, we’ve been out to Taronga Zoo, walked around the Blue Mountains and caught up with friends, too. We just love us some books.)

The Blue Mountains, speaking of which, were spectacular in just about every respect – we even got snowed on, which is a novelty in most of Australia and particularly for us. It was even cold enough to justify the purchase of what shall hereafter be referred to as the Coolest Hat Ever (which, given that I actually collect weird hats, is no small boast). Behold!

Nifty, eh?

 

* Long-Suffering Husband

Alright. Let’s lay some cards on the table.

I’m a would-be fantasy novelist. I’ve written 2.5 actual books, but none are published, nor are any currently en route to being published. The first of these manuscripts was the end-product of my high school schemes, a 160,000-word, first-volume behemoth. Between the ages of 13 and 18, it went through approximately five different iterations, each new interpretation resulting in the total abandonment of the one before, to the point where you could reasonably add another 100,000-odd words to the total project. That still doesn’t include multiple rewrites, countless hand-written notes, several different maps and all the creative angst and sanity of five years’ effort. The irony was, I changed the plot so many times that by the fourth version, I realised (belatedly) that my original framework had ceased to be viable. I scrapped it all, started again, and finished the final product not long before my 19th birthday. It took that long.

Of course, it’s rubbish. There are interesting characters, some nice ideas, a few paragraphs I’m not entirely ashamed of, and that’s about it. But it wasn’t a waste of time. From the experience, I learned patience, editing and self-analysis, and proved, once and for all, that I was capable of writing an entire book. I edited and submitted, but deep down, I knew it was time to move on: I hadn’t started the sequel, and realistically, I never would.

Enter my mind-numbing stint as a legal secretary, and the oodles of spare time in front of a computer it entailed. In the middle of an exceptionally long day, I started writing a new story, in no small way inspired by a recent spate of Buffy-watching. It grew longer. And longer. A plot arc formed. Characters developed. And all of a sudden, without quite intending to, I’d written a 75,000 word quasi-young-adult fantasy novel, with jokes (or at least, my own would-be version of Douglas Adams/Neil Gaiman comic asidery) and the expectation of two more books to come. I submitted; it was rejected, but kindly, and once with actual praise. I managed to wrangle a literary agent, who sent it to Penguin. I started writing the next volume. The agent closed her agency. I kept writing. The novel made it through the first round of Penguin approvals, but was knocked back at the second. I made final contact with my ex-agent, thanking her for the opportunity, and started a new edit of the first volume.

And that brings us up to date.

Something I find intensely problematic about being a would-be author: there’s lots of us. Some are exceptional, some are average, and some are frankly appalling. As best I can tell, the vast majority of people who get rejected by publishers belong to the last category: it’s a base assumption, and one most people tend to make. Despite my own views, I might objectively be godawful, or at least mediocre. There are many styles of writing, after all, and blogging is no guarantee of narrative chutzpah. And there’s always room for improvement.

But what I want – what I really want – is to be a fantasy author. It’s no good pretending otherwise. I can’t vouch for my skills, but I can vouch for my determination. A small, stubborn core of me is devoted to that end. It’s why my name, and not a pseudonym, is on this blog: I want to succeed, and be known in that success. I don’t want vast riches, or to be the next J. K. Rowling: were that the case, my naivete would be frightening. What I dream about – the dream of dreams – is meeting the writers I love, as a published author.

In the aftermath of Comic-Con, the longing hits me powerfully, and twists. Over at DeepGenre, Kevin Andrew Murphy pens a write-up that makes me ebb and wrench with jealousy; Scott Kurtz at PvP and Jerry Holkins of Penny Arcade, aka Tycho, aren’t helping, either. Clearly, there are some issues here on my part, but I just want to be there, you know? The fact that I live on a different continent is just another reason to succeed.

I’d planned not to write here about trying to get published. Let’s face it: the blogosphere is a fantastic (ha!) outlet for angst, and while I’m as fond of ranting as the next person, I don’t want to whine at each and every hurdle. (Not much, anyway.) I’ll try to be good. I won’t let it hog the spotlight. But that’s where I’m coming from, and – with a bit of effort – where I’m going.

As computer games go, it’s a simple premise: collect a menagerie of different animals, level them up and fight a series of identically-staged, increasingly-difficult battles with your favourites. Every instalment boasts the same story arc: young protagonist befriends helpful professor, sets out on cross-island journey, fights villains and ultimately becomes League Champion. The stuff dreams are made of, if yours happen to be particularly one-dimensional. There is no great dialogue, plot, characterisation or underlying moral. The battle options are limited to decision trees, two functional buttons and a D-pad – exactly the same setup as the original black-and-white Gameboy of eighties fame.

So why is Pokemon still so fucking addictive?

The best explanation is digital sorcery: a devious balance of intangible, acquisitive elements. You collect rare, interesting animals – animals with special powers, animals that can evolve into other, equally interesting animals. Data is revealed with each new find, and the ultimate possible goal of a Full Set is, I believe, something which calls to our inner obsessive.

As a game mechanism, levelling up has its own inexplicable power. It’s an end in itself which, for some people, borders on the addictive: you gain a level in order to improve, so that you can gain yet more levels. Why this formula holds such hypnotic sway over me and others is perhaps the deepest mystery of our times – just take a glance at the World of Warcraft community. In Pokemon, levelling up appears in a pure, uncluttered form, to the point of constituting the whole game – and therein, methinks, lies the reason for its success. 

Riddle me this: what do Barbie dolls, teddy bears, Lego and cardboard boxes have in common? Answer: a simple interface. These are all favourite children’s playthings, not because of the number of add-on features, but exactly because their mode of use isn’t prescriptive. A Barbie doll will always be a Barbie doll, but within those limits, imagination makes any game possible. This principle of creative simplicity is, I believe, an active ingredient in the best videogames, albeit present (due to the nature of the medium) in an altered form.

Thus: games like Pokemon are addictive because, within the simple parameters of the game system, endlessly imaginative combinations become possible. I can only take one path through the story, but the way I conduct my battles, which elements I favour, the creatures I choose and which attributes I value are infinitely customisable. There is a terrible attraction to minutiae in such instances: I’ve never liked maths, but will happily spend my free time calculating D&D stats and arranging the best possible combination of weapons and armour in Final Fantasy. It’s not the same kind of free-play offered by a Lego set, but they are cousins, and the former design elements have arguably gone on to inspire their digital equivalent.

Alternatively, I’m just a grown geek who enjoys Pokemon. There’s no particular justification, but damned if it isn’t fun.

Surfing online yesterday, I ended up reading about Generation Y and our relationship to digital technology. We are (said Wikipedia) Digital Natives, having grown up with video games, computers, the internet and mobile phones, compared to Generation X (Digital Adaptives), the Baby Boomers (Digital Immigrants) and the war-era Builders, or Silent Generation (Digital Aliens). Strange and old-timey as the phrase ‘I remember when’ makes me feel, I do remember life before the internet, digital cameras, flatscreen TVs and mobile phones, however barely. There was a dot matrix printer and an early Mac in my Year 1 classroom; a favourite pastime was removing the twin perforated strips from the printer paper and twisting them into a concertina-worm. In Year 4, good students were allowed to play Sim City 2000 at recess or lunch, begging coveted knowledge of the godmode password – which unlocked unlimited resources and special building options – from a privileged few. Apart from the pre-installed features on our old family Osborne computer, the first game I ever bought was Return to Zork. Up until that point, I’d thought the graphics on Jill of the Jungle and Cosmo were far out; but this reset the whole scale.

My mother’s first mobile phone was a brick, bigger than the average landline receiver and three times as heavy. Digital cameras didn’t start becoming commonplace until the mid-nineties; previously, you paid for film and took random shots of the family pet to use up the end of a roll before development. When it finally became clear that traditional cameras were being outmoded, there was a rash of media worry about the economic and social consequences – not from a technological perspective, but because Kodak and others were forced to lay off thousands of photo lab staff. I remember when laser printers were new and fax machines a strictly corporate affair. But ancient as all that reminiscing makes me feel, it’s nothing to the realisation that my own children won’t know a time before Tivo, Facebook, 3-D graphics, game consoles with internet access and iTunes. Hell – they won’t even know about VHS, walkmans, discmans and cassette tapes, unless someone tells them. Generation Z is already partway there.  

All of which shouldn’t surprise me, if I’d ever stopped to think about it. But most people tend to assume, however unconsciously, that certain types of knowledge remain static: that no matter what social, political or technological developments occur in their lifetimes, everyone will always know what came first, because they do: it’s just paying attention, isn’t it? But when technology becomes outdated or old customs are cast aside, they don’t stick around and explain themselves. Outside of history lessons or personal curiosity, the next generation just won’t realise – and to a certain extent, it’s wrong to expect they will. Not everyone cares about history, although perhaps they should; but even then, not all of it is relevant. Does Gen Z actually need to know about non-digital cameras in order to function? Are we really taking consoles for granted if we’ve never seen 8-bit graphics? More relevant than such minutiae, surely, is an awareness of social privilege, and the fact that we have no innate entitlement to the status quo.

But people will get bogged down in details. Often, older generations interpret this non-knowledge of younger people as deliberate impudence, and subsequently refuse to engage with the new technology. Others find it intimidating, or assume that the only obvious applications must be personally irrelevant or childish, pertinent only to younger people. There’s some truth to the saw about old dogs and new tricks, particularly given the vast removal of digital technology from anything in my father’s Builder generation, and individuals shouldn’t be forced beyond their comfort zones. But in many cases, it’s simply hard to perceive how a new tool can help when the use for which it’s intended is similarly foreign. When my parents first started to talk about getting the internet, I remember thinking, with typically childish conservatism, ‘What use could it possibly be?’ Because until you’d seen the concept up and running, it was almost impossible to comprehend. (After all, the creator of television intended it for educational purposes, and envisaged no scope for it as an entertainment outlet.)

There’s always going to be new developments, and it’s silly to expect that everyone keep up with the technocrati. Ultimately, we need to keep our own knowledge in perspective, because not all information is timeless. There’s something wonderful in the ability to witness change, and at the current rate of technological advancement, those of us in Gen Y are ideally placed to realise exactly how far we’ve come in how short a time. But until another half-century has come and gone, we might do well to impose a moratorium on tech-history anecdotes.

After all, ‘I remember when’ doesn’t sound nearly so authoritative without bifocals and false teeth.

Ever since a friend introduced me to Penny Arcade back in Year 10, I’ve been a devout gaming/geek webcomics fan. At one point, I was checking seventeen different strips on a daily basis; realising this was insane, I scaled back to fourteen, where I settled until my first year of college. Probably, this would’ve continued, except that the internet connection in my new room was mysteriously broken, and took three weeks, umpteen phonecalls and five consultations with university IT support to fix. By that time, the number of banked strips had reached critical mass; I didn’t have enough time to catch up on them all, and so I pared back to a bare ten, farewelling 8 Bit Theatre, GPF, Nodwick and others with a heavy heart.

Since then, different strips have come and gone – Mac Hall and Demonology 101 have run their course, while Dresden Codak is a new favourite – but my affection for the genre has remained. As has my admiration for the creators of my favourite strips. After eight years of being exposed to their humour, social commentary and general musings, watching the changes in art style and hearing snippets of personal data, they somehow feel more like acquaintances than anything else, people I could bump into and share a laugh with. This is, perhaps, the big difference between webcomics and traditional print media: connection to the creators. I grew up on Snoopy and Garfield, but couldn’t have picked Charles M. Schulz or Jim Davis out of a crowd; I knew nothing about them, their lives or interests beyond an intangible sense that it must somehow influence what they drew and why they drew it.

Not so Fred Gallagher, Scott Kurtz, Jerry Holkins and Michael Krahulik, Greg Dean, Randy Milholland and Tatsuya Ishida. Perhaps more consistently than any other creators, these guys have been with me through the most formative years of my life. I’ve changed since I started reading them, and they’ve changed, too: since my readership began, two have been married and three have had their first children. I’ve left school, gone to university, moved states and tied the knot – but even on my honeymoon, I was still checking comics along with email.

It’s strange to think of geeks grown up – at least, so mainstream society would have us believe. There’s still a strong bias against the idea that you can play video games, enjoy fantasy or sci-fi and read comics as an adult without being just as immature as you were at fourteen, because of the perception that these are childish pursuits. As a kid, I was a geekling born to normals; and worse, I was a girl, which made it harder for my parents to notice. Had I been male, perhaps my compulsive interest in dinosaurs, Mario and Transformers would have fit a pattern, rather than seeming incongruous compared to a similar fixation on My Little Pony. The penny finally dropped when, after years of playing every console and computer game my friends possessed and saving hundreds of dollars of pocket money for a colour Gameboy, I woke up one Christmas to my very own PlayStation. Since then, I’ve never looked back – but had I not stumbled on a group of like-minded webcomic geeks, things might have turned out differently.

One of the greatest trials in growing up is figuring out who you are, not just in relation to other people, but on your own terms. Without friends who shared my interests, I never would have discovered webcomics; but without webcomics, I might have lost confidence in the idea that I could succeed that way, too. Because that’s the other thing I learned: that quirky, geeky, interesting, creative people can, with sufficient effort and support, earn a living through what they love. Although I read books, watch films and listen to music, I’m not privy to the everyday struggle and success of the creators. The end product just appears, disconnected from any personal genesis: like a magic trick, it entertains and inspires, but the mechanics are deliberately concealed. Authors like Neil Gaiman lift the veil through individual blogs, but back then, it was webcomics that got the message through.

Unlike Peter Pan (or today’s lost boys), geeks can grow up. And if webcomics are anything to go by, they can be happy and creatively successful into the bargain.

Thanks, guys.  

Hold on to your mittens, kittens. Not since the Pan Galactic Gargle Blaster have geekery and alcohol crossed paths in such a pas de deux of awesome as they will the next time I play bartender.

Behold my revelation: Final Fantasy themed cocktails.

Breathtaking, isn’t it? Imagine: dark, brooding Leonhearts; tropical Zidanes; a whiskey-based Tifa that kicks like a mule. Aeris would be strong, but girly – champagne and hibiscus, with a dash of vanilla-infused vodka. Set alight, a mix of brandy and bourbon poured over ice might be a Sephiroth or One-Winged Angel, while Jenova could kill you outright: vodka, absinthe and tequila shaken with citrus and served straight-up. Lulu would be dark, but subtle: kahlua, chocolate liqueur and frangelico with cream and shaved chocolate. Cloud would refresh: apple-infused vodka with soda, lime and vermouth. Auron needs must involve rum, kahlua and coke, but an Eidolon would be kinder: midori, brandy and lemonade with a lemon twist.

Merciful Squaresoft. I’ve gone and made myself thirsty.