
It’s fair to say I think about elves more than the average person; that is to say, firstly, that I think of them at all, and secondly, that a sizeable chunk of this time is dedicated to theorising what elves really are. Among other things, this makes me slightly crazy. But I’ve come up with a theory. And now, rejuvenated by the illustrious Harkaway’s recent musings on cryonics, I’m ready to show and tell. Or maybe just tell, in this case. Whatever.

Anyway.

Elves, according to a wide range of fantastic and mythological sources, are essentially very pretty people who live damn near forever in beautiful cities considerably superior to those of other races by the grace of their higher intellect, magic, advanced technology or a combination of all three. Outside of cities, they dwell in forests or natural areas, usually in a deeply symbiotic relationship with their surroundings, but in either instance, elven society is usually lauded as being progressive, or at least very successful. They are highly culturally advanced, but despite professing a preference for peace, tend, when roused, to be lethal in war. Outsiders often know little about them, as they prefer not to mingle with humankind, and their settlements are often isolated; typically, they also exhibit a low birth-rate in compensation for their incredible longevity. There is also a strong tendency to posit relationships between elves and dragons, or between elves and white horses of superior stamina and intellect, both of which species are, in such instances, rarely if ever found elsewhere, granting their elvish masters the exclusive advantage of swift transport in largely medieval settings. Finally, elves are frequently described as placing a dual emphasis on learning, academic or otherwise, and on leisurely, creative pastimes.

Got all that?

Good.

Now, if we take the above hallmarks of elfness, remove the fantasy connotations, and render them as a set of socio-cultural markers, we end up with the following list of real-world characteristics:

1. Longer than average lifespans;

2. Objectively exceptional but culturally normative looks;

3. Technological superiority at an everyday level;

4. An outward preference for pacifism underwritten by extreme martial capabilities;

5. A preference for isolation from less advanced societies;

6. Largely urban lifestyles balanced against deeply held environmental convictions;

7. Access to superior modes of transportation and information relay;

8. A low average birthrate; and

9. A largely functional societal model extolling the virtues of both learning and leisure.

Sound familiar?

I find it both amusing and ironic that the mythical beings of early European culture are starting to look like the end point of modern Western society. True, we don’t live hundreds of years, but our lifespans are ever-increasing thanks to the ongoing advance of medical science. Give it another couple of decades, and who knows where we’ll be? And true, we’re not universally beautiful, but there is an increasing emphasis on physical perfection and achieving a set body type. With the advent of plastic surgery, many people now choose to alter their own appearance; consider, too, the unveiling of the first ‘designer baby’ clinic in LA, where the new practice of cosmetic medicine allows parents to select the appearance of their future children.

Technological superiority? While it’s true that most of the world is now online, it’s certainly accurate to say that affluent western, eastern and northern European nations have access to more and better gadgets than their counterparts in Africa, South-East Asia and South America. Similarly, technological prowess confers the advantage of both swift, secret information relay and rapid transportation worldwide. Espousing pacifism while practicing violence has been a hallmark of nations throughout history; nonetheless, it seems particularly apt in a day and age when countries can initiate wars or engage in battles so geographically removed from their own turf that no risk of invasion is run, and where stockpiling WMDs has become routine practice. As for isolation, one need only look at the recent global tightening of immigration laws, particularly in the west: we might praise the notion of living in multicultural societies, but we still remain fearfully recalcitrant when it comes to the very process which allows them to take shape.

The recent passion for reducing our carbon footprint while retaining an urban lifestyle is, to me, a particularly elvish dualism, and one which is sweeping most of the developed world. Similarly, while it’s difficult to argue for a lowered birthrate on such an enormous and diverse scale (although China’s One Child Policy is an intriguing counter-example), anecdotally, there seems to be a trend of affluent, educated women giving birth later and to fewer children, while our childhoods – or, more particularly, the time we spend at school and under the parental roof – are growing longer. Our current social model promotes a minimum of thirteen years’ schooling, while more and more people are attending university as a matter of course. At the same time, we deeply value labour-saving devices, the creation of entertainment and the right to leisure time, which is arguably a kind of social symbiosis: we work hard at learning how to do less in one sphere of daily life in order to create more time for learning in another, which in turn frees up more time, and also obliges each generation to learn enough to keep up.

In short, we are growing into elves: not the fey creatures of our early imaginings, but into long-lived, scientific, face-selecting humans of a new technological era. Whether for good or ill, I’m not prepared to judge, but in either case, the comparison seems warranted. Which leaves only the question of magic, that elusive quality so associated with mythological elfhood; and yet even here, we might find a real-world comparison, in the words of Arthur C. Clarke, who wrote that “any sufficiently advanced technology is indistinguishable from magic,” a sentiment with which I wholeheartedly agree.

Because if any one of us went back in time to the genesis of elven myths; if we stood before our ancestors, iPhone-wielding, smooth-skinned, nylon-wearing, bedecked in even the cheapest, meanest jewellery and spoke of our world, what else could they name us – what else could they think us – but elves?

Consider the following four articles on the dangers of youth exposure to too much digital culture:

iPod Safety: Preventing Hearing Loss in Teens;

Twittering brains withering, expert warns;

Teens flaunt sex, drugs on MySpace; and

Too much PlayStation causes painful lumps,

all of which cropped up in today’s online news. Together, they pretty much exemplify the fears of the Builders, the Baby Boomers and, to a certain extent, the elder members of Generation X, not just as regards their own offspring, but concerning all of modern society. Loud and clear, they’ve been wringing their hands for the past few years over the perils of digitisation, and every time, I’ve experienced a disquieting lurch of anger. It’s taken today’s media quartet for me to understand why this is: after all, cynic though I may be, I still put a certain base faith in the opinions of scientists and sociologists, especially when backed up by established studies. As a member of Generation Y, I’m hardly an impartial observer, and to a large extent, my negative reactions stem from a sense of being personally maligned, even where certain behaviours or criticisms don’t apply either to me as I am now, or to my historic teenage self. Rather, I feel outraged on behalf of my generation and those younger: that we are, in some sense, being fundamentally misunderstood. I can hack criticism, yes; but the sheer weight of professional authorities whose time has been purposefully devoted to proving that almost everyone under the age of 25 is steering a course towards social oblivion has begun to seem less like the amalgamated findings of unbiased research and more like an unconscious desire to demonise technology.

When it comes to growing up, it’s human nature to get fixed in whatever era raised us. Modern society is shaped, by and large, to ensure this happens – advertising and television timeslots, for instance, aren’t scheduled at random, but painstakingly tailored to particular demographics. Thus, once we lose interest in the trappings of a given age and progress to playing with a new kind of gadget or watching a different kind of film, we effectively graduate from one type of newsfeed to another. Not watching weekend and afterschool cartoons, for example, means that we no longer learn which shows are cancelled and which will replace them, and that certain products, like the latest toys and games, will no longer form part of our daily media experience. Because our interest in such things has waned, we don’t notice the dissonance: rather, we assume that things have remained static in our absence, and are often startled in a moment of later nostalgia when, flipping on the TV at 3pm, we recognise none of the cartoon characters, none of the hosts, and none of the merchandise. Such disorientation provokes outrage: who are these strangers, and what have they done with our childhood? This biases our opinion of the new product towards hostility and scepticism from the outset; and even when we take the time to watch these new shows, the magic is missing, because we are no longer children. Wrongheadedly, however, we don’t immediately identify this as the problem, and tend to believe, rather, that the product itself is at fault. In fact, it becomes difficult to fathom what kind of person such programmes are aimed at, and so, by extension and quite unselfconsciously, we have already taken the first steps towards discrediting the intelligence and taste of the next generation. This outrage slumbers in us, omnipresent but quiescent, until we have children of our own, or are forced to deal with someone else’s. Nonetheless, it is there.

Consider, then, that the technological advances of the past few decades have leapt ahead at unprecedented speeds. In the space of twenty years, we have moved from cassette tapes and walkmans to CDs and discmans to the now-ubiquitous mp3s and iPods of the new millennium. For a generation who started out buying their albums on LP, this is triply disconcerting, while for the generation who thought themselves blessed by the miracle of radio, it seems like a kind of magic. This is all common knowledge, of course, and therefore glossed with the shiny patina of frequent repetition: by itself, the comparison doesn’t provide an explanation for the hostility of older generations. Until, that is, we combine it with the above example about treasured childhood cartoons, because in this instance, not only are the new characters unrecognisable, but they don’t even appear on the same device.

And adults struggle with this. They are disconnected from their offspring, from their students; more important than connectivity and reminiscence, however, is the loss of firsthand advice. They simply cannot guide today’s teenagers through the digital world, which leaves most youth to discover it on their own. Most of us who grew up with computers and videogames are either several years away from reproducing or blessed with children still in early primary school: in other words, we are yet to witness what happens when a generation of adolescents is reared by a generation of adults anywhere near as technologically literate as their teenage progeny – adults who remember what it was like to hang out on Trillian or MSN chat all night, to experiment with cybersex, to write achingly of school crushes in their LiveJournal or to download music at home. Members of Generations Y and Z, in other words, in addition to being burgeoning iFolk, are also a social anomaly: a group whose own adolescence is so far removed from the experience of their caretakers as to prevent their parents and teachers, in many instances, from adequately preparing them for the real (digital) world.

But the gap will close. Already there are children in the world whose parents own game consoles, who will guide them online from a young age, and whose joint mannerisms both in real and virtual company will be drawn from a single parental source. Navigating away from certain parts of the internet will be taught in the same way as stranger danger and the implicit lesson to avoid dangerous parts of the local neighbourhood. We teach what we know, after all, and yet a large number of commentators seem not to have realised this – which is why I react badly to their writings. They never purport to be talking about teenagers now so much as teenagers always, or from this point on – a frustrating alarmism that takes no account of what will happen when such adolescents leave home, stumble into the bright sunlight, go to university, get jobs, fall in love and maybe have children of their own. In short, they have no sense of the future, or if they do, they picture a world populated by antisocial digital natives, uprooting the fruits of their hard labour out of ignorance, apathy and poor management. Either they can’t imagine us growing up, or they fear what we’ll turn into.

I’m speaking here in broad-brush terms. Obviously, the distinction between those who are technologically literate and those who aren’t can’t simply be reduced to their year of birth. Every generation has its Luddites (and, if we remember the political motivations of those original iconoclasts, this is often a good thing) as well as its innovators, its geeks and scientists. And many such worried articles, irksome though I may find their tone, are still correct: listening to your iPod on full volume will probably damage your hearing, just as it’s not a wise idea to post intimate details of your sex life on MySpace. The latter concern is markedly new, and something teens certainly need to be made aware of – indeed, professionals new to Facebook are still themselves figuring out whether to friend coworkers or employers, thereby allowing them to view the results of various drunken nights out, or to keep a low digital profile. Such wisdom is new all round, and deeply appreciated. On the other hand, parents have been telling their kids to turn down their damn music in one form or another ever since Elvis Presley first picked up a guitar, and while the technology might’ve become more powerful in the intervening decades and the studies into auditory damage more accurate, the warning remains identical (as does the inter-generational eye-roll with which it tends to be received).

In short, the world is changing, and people are changing with it, teachers, teens and parents alike. And in my own curious optimism, I cannot help but see this as a positive thing: in a world where technology moves so swiftly, older generations must constantly remain open to the idea of learning from their younger counterparts, while those in the know must become teachers earlier. There is so much to be gained in the coming years, and so many problems, both great and small, to be solved. The gap between adults and adolescents has never been so large, but while it always behooves those in the former category to teach and aid the latter, this should never be at the expense of at least trying to understand their point of view. And this, ultimately, is what causes me to bristle: whether or not playing videogames can hurt your hands or spending too much time online can damage your real-world social skills, such pastimes aren’t going anywhere. Rather than condemning such things outright, sneering or tutting sadly, the more productive path is to consider how best they can be incorporated into modern life without causing harm, and to study how they work in confluence with real-world interactions, rather than just fretting about what happens if they’re used exclusively.

Because technology – and future generations – aren’t going anywhere. We might not look like Inspector Gadget, but baby, we’re his heirs. Or rather, Penny’s. You get the idea.

Imagine this: a human brain in a vat. The brain has been removed from a real, live person and painstakingly wired into a machine which keeps it alive, utterly duplicating the necessary processes of organic flesh. Sight, sound and smell are simulated by clever contraptions; emotional surges provoke the correct chemical and hormonal reactions. To all intents and purposes, the being – the brain – is real, their sense of self intact: they are simply no longer housed in a body.

Which raises the question: do they still have a gender?

It’s an interesting problem. Socially, gender is assumed through assessment of a person’s physical body, their voice, mannerisms, clothes and so on: but strip away all these things – remove even their possibility – and what is left? Is the brain (we’ll call it Sam, a neatly androgynous handle) gendered depending on the sex of its original body? Is it possible for a ‘female’ brain to wind up ensconced in male flesh, or vice versa? If one accepts that homosexuality is more often an innate predilection than a conscious choice (certainly, I believe, it can be both or either), what role does the physical wiring of our brain play? Is it the only factor? Does nurture always prevail over nature in matters of sexuality, or vice versa? Is it a mixture? If so, does the ratio vary from person to person? Why? And so on.

Let’s lay some cards on the table. When it comes to sexual orientation, my two rules of thumb are: 

(a) mutual, intelligent consent; and

(b) the prevention of harm to others.

In a nutshell: all parties have to agree to what’s happening, and no bystanders can be hurt or unwillingly drawn in. While this doesn’t rule out BDSM (provided, of course, it keeps within the bounds of said rules), it definitively excludes rape and paedophilia, which, really, is common sense. Anything relating to homosexuality and transsexuality, however, is fair game.

A few more points, in no particular order:

1. Life is often unfair.

2. Life is often weird.

3. Insofar as evidence is concerned, human beings are still shaky on the definitive origins of personhood (souls v. genes, or possibly a blend of both), but most people will agree that brains and gender play a more important role in this than, say, knees and elbows.

4. Original notions of gender roles developed in the context of reproduction and childrearing, but provided both these things still occur in sufficient numbers to ensure the survival of the species, there is little harm in broadening or questioning their parameters.

5. People have, or should have, a basic right to assert their identity. Reasonably, there must be some limits of credulity – there was only ever one Napoleon, and humankind is distinct from dolphins – but within the recognised sphere of human gender and sexual orientation, it seems counter-intuitive that appearance should dictate black-and-white rules for what is, quite evidently, an internal and subtle determination.

Witness, then, the idea of transgender couples, in which one partner may undergo a sex change without ending the relationship. Witness, too, the case of Aurora Lipscomb, born Zachary, who identified as a girl from the age of two and was removed from her parents when they refused to forcibly contradict her. These are just two examples that buck the trend of traditional gender ideas, and rather than making us squirm, they should make us think. When and why did certain socio-cultural ideas of gender develop, and how do they change? Consider, for instance, the well-documented and widespread instances of winkte, berdache and two-spirit people in Native American cultures, compared to the deep-seated fear of these concepts in western traditions. Look at the long-standing tradition of male homosexuality in Japan, particularly among samurai, and the role of Sappho in ancient Greek lesbianism. Think of hermaphrodites.

Point being, there’s a wealth of diverse and fascinating history surrounding the ideas of gender, sexuality and male/female roles, to the extent that many legal restrictions now placed on non-heterosexual couples and individuals are faintly ridiculous. Throw in the question of child-rearing, and there’s a tendency to reach for the nearest pitchfork. Personally, I find debating my views in this matter difficult, if only because debate is meaningless without a modicum of mutually accepted middleground, and where my opponents object to homosexuality and transsexuality as an opening gambit, it’s well-nigh impossible to discuss the matter of non-heteros breeding, adopting and/or applying for surrogacy without both sides resorting to instant moral veto of the contrary position.

Still, it’s always worth trying, and the whole issue fascinates me. Socially, I marvel at where the next hundred years could take us, and cringe at how far we might also fall. But in the interim, I return to the question of brains in vats, and how, within the parameters of such a hypothetical, gender is determined. Is it innate, biological, genetic, spiritual, chosen consciously, chosen unconsciously, socially conditioned, random, nurtured, culturally selected; or can the glorious gamut of human existence countenance the possibility that these options simultaneously coexist as true, contributing on an individual basis, in individual ratios? Or is that too confronting a thought?