
First, some links:

Clay Shirky on the collapse of traditional newspapers and the need to find alternative means of journalism;

Natalia Morar, who organised an anti-government flashmob on Twitter and is now hiding from arrest;

Oprah and other celebrities battling to be the first on Twitter with a million followers; and

SR7, a company for hire that specialises in digging up dirt on employees for other companies.

Now, some thoughts, in no particular order:

1. Journalism is essential. People both like and need to know what’s going on. However, journalism is not a naturally occurring resource. People must go out, obtain information, then analyse, write and relay it – a time-consuming process traditionally deemed deserving of monetary compensation. No matter how easy it is to copy an existing source online, that source first needs to come from somewhere; and before that, someone must decide that the source itself is newsworthy.

2. As has always been true of all creative endeavours (singing, painting, dancing), there are vastly more people who participate in these activities than are paid to do so. Largely, this is a question of enjoyment, creative expression and ease. Blogs have tapped into this in a big way. Most bloggers make no money. Many blogs are read by only a tiny handful of people known to the writer, or not at all. And yet, blogs are prolific, because even without monetary compensation, their writers simply enjoy producing them. Many readers employ a similar logic: they read for pleasure, not payment.

3. Despite having been around for a number of years, Twitter has only just hit the collective journalistic hivemind. Recent weeks have seen an explosion of articles on how it is being used, why it is damaging people, whether the concept is utterly pointless, and the implications of its ongoing development. Diverse examples of all these include:

– the now-notorious #amazonfail incident and its aftermath;

– the Times bemoaning Twitter as a ‘rolling news service of the ego’ and then promptly signing up;

– a warning that social networking sites are damaging kids’ brains at the same time Twitter is being added to the British school curriculum; and

– the use of Twitter in both the Mumbai bombings and hyperlocal news sites.

4. Writing on the collapse of newspapers as we know them, Clay Shirky sums up the process of social revolutions thusly: “The old stuff gets broken faster than the new stuff is put in its place. The importance of any given experiment isn’t apparent at the moment it appears; big changes stall, small changes spread. Even the revolutionaries can’t predict what will happen.” He concludes by saying that what we need is a “collection of new experiments” to help us figure out how journalism – as distinct from newspapers – can keep working.

5. TV news isn’t going anywhere. Neither is radio, which has survived bigger technological upheavals. Print journalism is failing because the internet has ruined its monopoly on exclusive media. Unlike free-to-air radio and television, which have always had to contend with the notion that a majority of listeners won’t be paying directly for their content, newspapers have thrived as a one-to-one exchange: a set amount of money per customer per paper, with very few exceptions. It’s not that the internet devalues the written word, or that making journalism freely available is inimical to notions of profit: it’s that, without being able to charge on that one-to-one basis, newspapers cannot command anything like their previous volume of revenue. They’ve simply never had to compete with a medium that could do the same thing, better, for a fraction of the cost. And now they’re floundering.

6. Spare a moment to consider the notion of Digital Rights Management – DRM – and its relationship to the newspaper fiasco. The debate is concerned primarily with digital music copyright, but it also takes in encryption for games and, with the advent of the Kindle and other such devices, the piratability of digital books and audiobook rights; the underlying problem is the same in each instance: defining notions of ownership for both users and creators in an era where digital copies are readily available. Books in particular have always been subject to the whims of borrowing and lending without falling apart, but might their new digital formats change that? Or are they an exception to the rule? For long stints of time, it’s nicer to read on a page than a screen, but what if screens are improved, or some other technology developed that is just as comfortable to use as paper? Will we still crave tactile connections?

7. People might not like to pay for content, but as Wikipedia, YouTube and Linus Torvalds have already proven, many are ready, willing and able to create content for free. Open source principles clearly predate the current revolution, and consciously or not, they’re informing it. Remove money from the equation (or at least, give it a drastically reduced emphasis) and gaze anew at the crisis of print journalism. Blogs, tweets, viral news: many of the new news staples are ungoverned, unruly, disparate products of the hivemind – flashmobs, crowdsourcing – but that doesn’t mean they go utterly unpoliced or work without change or criticism. Hey, it’s a revolution, folks. We’re breaking and making at the speed of thought. Give us time to learn the ropes.

8. Way back in 1995, Major Motoko Kusanagi mused, “And where does the newborn go from here? The net is vast and infinite.” In 2006, she reaffirmed the sentiment. We’re not yet ghosts in the shell, but let’s keep an open mind. The future rests in us.

Surfing online yesterday, I ended up reading about Generation Y and our relationship to digital technology. We are (said Wikipedia) Digital Natives, having grown up with video games, computers, the internet and mobile phones, compared to Generation X (Digital Adaptives), the Baby Boomers (Digital Immigrants) and the war-era Builders, or Silent Generation (Digital Aliens). Strange and old-timey as the phrase ‘I remember when’ makes me feel, I do remember life before the internet, digital cameras, flatscreen TVs and mobile phones, however barely. There was a dot matrix printer and an early Mac in my Year 1 classroom; a favourite pastime was removing the twin perforated strips from the printer paper and twisting them into a concertina-worm. In Year 4, good students were allowed to play Sim City 2000 at recess or lunch, begging coveted knowledge of the godmode password – which unlocked unlimited resources and special building options – from a privileged few. Apart from the pre-installed features on our old family Osborne computer, the first game I ever bought was Return to Zork. Up until that point, I’d thought the graphics on Jill of the Jungle and Cosmo were far out; but this reset the whole scale.

My mother’s first mobile phone was a brick, bigger than the average landline receiver and three times as heavy. Digital cameras didn’t start becoming commonplace until the mid-nineties; previously, you paid for film and took random shots of the family pet to use up the end of a roll before development. When it finally became clear that traditional cameras were being outmoded, there was a rash of media worry about the economic and social consequences – not from a technological perspective, but because Kodak and others were forced to lay off thousands of photo lab staff. I remember when laser printers were new and fax machines a strictly corporate affair. But ancient as all that reminiscing makes me feel, it’s nothing to the realisation that my own children won’t know a time before TiVo, Facebook, 3-D graphics, game consoles with internet access and iTunes. Hell – they won’t even know about VHS, walkmans, discmans and cassette tapes, unless someone tells them. Generation Z is already partway there.

All of which shouldn’t surprise me, if I’d ever stopped to think about it. But most people tend to assume, however unconsciously, that certain types of knowledge remain static: that no matter what social, political or technological developments occur in their lifetimes, everyone will always know what came first, because they do: it’s just paying attention, isn’t it? But when technology becomes outdated or old customs are cast aside, they don’t stick around and explain themselves. Outside of history lessons or personal curiosity, the next generation just won’t realise – and to a certain extent, it’s wrong to expect they will. Not everyone cares about history, although perhaps they should; but even then, not all of it is relevant. Does Gen Z actually need to know about non-digital cameras in order to function? Are we really taking consoles for granted if we’ve never seen 8-bit graphics? More relevant than such minutiae, surely, is an awareness of social privilege, and the fact that we have no innate entitlement to the status quo.

But people will get bogged down in details. Often, older generations interpret this non-knowledge of younger people as deliberate impudence, and subsequently refuse to engage with the new technology. Others find it intimidating, or assume that the only obvious applications must be personally irrelevant or childish, pertinent only to younger people. There’s some truth to the saw about old dogs and new tricks, particularly given the vast gulf between digital technology and anything familiar to my father’s Builder generation, and individuals shouldn’t be forced beyond their comfort zones. But in many cases, it’s simply hard to perceive how a new tool can help when the use for which it’s intended is similarly foreign. When my parents first started to talk about getting the internet, I remember thinking, with typically childish conservatism, ‘What use could it possibly be?’ Because until you’d seen the concept up and running, it was almost impossible to comprehend. (After all, the creator of television intended it for educational purposes, and envisaged no scope for it as an entertainment outlet.)

There are always going to be new developments, and it’s silly to expect that everyone keep up with the technorati. Ultimately, we need to keep our own knowledge in perspective, because not all information is timeless. There’s something wonderful in the ability to witness change, and at the current rate of technological advancement, those of us in Gen Y are ideally placed to realise exactly how far we’ve come in how short a time. But until another half-century has come and gone, we might do well to impose a moratorium on tech-history anecdotes.

After all, ‘I remember when’ doesn’t sound nearly so authoritative without bifocals and false teeth.