Posts Tagged ‘Ideas’

First, some links:

Clay Shirky on the collapse of traditional newspapers and the need to find alternative means of journalism;

Natalia Morar, who organised an anti-government flashmob on Twitter and is now hiding from arrest;

Oprah and other celebrities battling to be the first on Twitter with a million followers; and

SR7, a company for hire that specialises in digging up dirt on employees for other companies.

Now, some thoughts, in no particular order:

1. Journalism is essential. People both like and need to know what’s going on. However, journalism is not a naturally occurring resource. People must go out, obtain information, then analyse, write and relay it, a time-consuming process traditionally deemed deserving of monetary compensation. No matter how easy it is to copy an existing source online, that source first needs to come from somewhere; and before that, someone must decide that the source itself is newsworthy.

2. As has always been true of all creative endeavours (singing, painting, dancing), there are vastly more people who participate in these activities than are paid to do so. Largely, this is a question of enjoyment, creative expression and ease. Blogs have tapped into this in a big way. Most bloggers make no money. Many blogs are read by only a tiny handful of people known to the writer, or not at all. And yet, they are prolific, because even without monetary compensation, the vast majority of people simply enjoy writing them. Many readers employ a similar logic: they read for enjoyment, not because they’ve paid for the privilege.

3. Despite having been around for a number of years, Twitter has only just hit the collective journalistic hivemind. Recent weeks have seen an explosion of articles on how it is being used, why it is damaging people, whether the concept is utterly pointless, and the implications of its ongoing development. Diverse examples of all these include:

– the now-notorious #amazonfail incident and its aftermath;

– the Times bemoaning Twitter as a ‘rolling news service of the ego’ and then promptly signing up;

– a warning that social networking sites are damaging kids’ brains at the same time Twitter is being added to the British school curriculum; and

– the use of Twitter in both the Mumbai bombings and hyperlocal news sites.

4. Writing on the collapse of newspapers as we know them, Clay Shirky sums up the process of social revolutions thusly: “The old stuff gets broken faster than the new stuff is put in its place. The importance of any given experiment isn’t apparent at the moment it appears; big changes stall, small changes spread. Even the revolutionaries can’t predict what will happen.” He concludes by saying that what we need is a “collection of new experiments” to help us figure out how journalism – as distinct from newspapers – can keep working.

5. TV news isn’t going anywhere. Neither is radio, which has survived bigger technological upheavals. Print journalism is failing because the internet has ruined its monopoly on exclusive media. Unlike free-to-air radio and television, which have always had to contend with the notion that a majority of listeners won’t be paying directly for their content, newspapers have thrived as a one-to-one exchange: a set amount of money per customer per paper, with very few exceptions. It’s not that the internet devalues the written word, or that making journalism freely available is inimical to notions of profit: it’s that, without being able to charge on that one-to-one basis, newspapers cannot command anything like their previous volume of revenue. They’ve simply never had to compete with a medium that could do the same thing, better, for a fraction of the cost. And now they’re floundering.

6. Spare a moment to consider the notion of Digital Rights Management – DRM – and its relationship to the newspaper fiasco. Although the debate is concerned primarily with digital music copyright, it extends to encryption for games and, with the advent of the Kindle and other such devices, to the pirateability of digital books and audiobook rights. The underlying problem is the same in every instance: defining notions of ownership for both users and creators in an era where digital copies are readily available. Books in particular have always been subject to the whims of borrowing and lending without falling apart, but might their new digital formats change that? Or are they an exception to the rule? For long stints of time, it’s nicer to read on a page than a screen, but what if screens are improved, or some other technology developed that is just as comfortable to use as paper? Will we still crave tactile connections?

7. People might not like to pay for content, but as Wikipedia, YouTube and Linus Torvalds have already proven, many are ready, willing and able to create content for free. Open source principles clearly predate the current revolution, and consciously or not, they’re informing it. Remove money from the equation (or at least, give it a drastically reduced emphasis) and gaze anew at the crisis of print journalism. Blogs, tweets, viral news: many of the new news staples are ungoverned, unruly, disparate products of the hivemind – flashmobs, crowdsourcing – but that doesn’t mean they go utterly unpoliced or work without change or criticism. Hey, it’s a revolution, folks. We’re breaking and making at the speed of thought. Give us time to learn the ropes.

8. Way back in 1995, Major Motoko Kusanagi mused, “And where does the newborn go from here? The net is vast and infinite.” In 2006, she reaffirmed the sentiment. We’re not yet ghosts in the shell, but let’s keep an open mind. The future rests in us.

Back in Ye Olde Shakespearean times, there was a fantastic word for what happened when one man shagged another man’s wife: cuckolding. Contrary to how it might sound, a ‘cuckold’ was the injured party, while the wife-snatcher was said to have ‘put horns on another man’s head’. Although I can’t vouch for the origins of this latter colloquialism, cuckold was aptly inspired by the French word for cuckoo – that is to say, a bird which lays its eggs in other birds’ nests.

So on that note, let’s talk about Pete Doherty – or, more specifically, his none-too-subtle appearance in this article on bad boys. More specifically still: the fact that, in keeping with popular mythology, they apparently do get all the girls.

Short term, anyway. According to extensive new research, males who exhibit dark characteristics such as narcissism, deceitfulness and thrill-seeking do better in the mating stakes; or, to quote researcher David Schmitt, “They are more likely to try and poach other people’s partners for a brief affair.”

The cuckoo’s strategy is, biologically speaking, ingenious: have all the fun of mating with another cuckoo, find some poor devoted wren or other smaller bird, replace their eggs with your own and fly off into the sunset. In times past, one suspects this tactic might have worked well for cuckoos of the human variety, too, thus ensuring these traits were passed on – but that was before contraception.

Which raises the question: if these treacherous, deadbeat cuckoos remain a genetic mainstay despite our ability to block their genes from the pool, is it because we’re stupid? Or are they all just sneakier than we thought?

Some infinities are bigger than others.

Literally.

Imagine the top left corner of an infinite square grid. Infinite rows begin at the left-hand edge: infinite in length, and infinite in number.

Each row contains a unique arrangement of zeroes and ones – infinitely diverse combinations.

The first digit of the first row nestles in the corner of the grid. It is also the first digit on the diagonal, cutting evenly between the top of the grid and the left-hand edge.

Descending along this diagonal, imagine the reverse of every digit. Zeroes become ones. Ones become zeroes. Completely opposite.

Make this alternate diagonal a horizontal row, and set it above the grid. Does it have a twin in the infinite rows? 

No. You have just created an entirely new combination: one which cannot be found anywhere in the infinite list of rows.

If you tried to match up your new row, you would find that one digit is always reversed. If the grid begins with a zero, your new row begins with a one. If the second digit in the second row is a one, your second digit must be zero. This pattern holds no matter how far down, or how far across, you check, because your row is composed entirely of changes to the original, infinite, rows.

Which means, put simply, that no list of rows – however infinite – can contain every possible row. The set of all possible rows is a bigger infinity than the list itself.
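If you’d rather see the trick run, here’s a minimal sketch in Python. The digit function is a hypothetical stand-in for whatever listing of rows you care to propose (here, row n is just n written in binary, padded with zeroes); by construction, the flipped diagonal escapes it:

    def digit(row, col):
        """Digit `col` of row `row` in one proposed listing of rows.
        Hypothetical example listing: row n is n in binary, padded
        with zeroes. Swap in any enumeration; the argument still holds."""
        bits = bin(row)[2:]  # e.g. row 5 -> "101"
        return int(bits[col]) if col < len(bits) else 0

    def flipped_diagonal(width):
        """First `width` digits of the new row: flip digit i of row i."""
        return [1 - digit(i, i) for i in range(width)]

    new_row = flipped_diagonal(10)

    # By construction the new row disagrees with row i at position i,
    # so it matches none of the listed rows, however far down you look.
    for i in range(10):
        assert new_row[i] != digit(i, i)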

Nifty, huh?

As a concept, maturity hinges on two things: emotional experience, and learning from it. While neither qualification can be rushed (or, indeed, predicted), we nonetheless have a tendency to associate maturity with people above a certain age – 18, say, or 21. This is based on the not-unreasonable hypothesis that most people will have undergone significant life or emotional change by this point, but it fulfils only one half of the equation. Learning from our experience is, quite arguably, the more relevant point, and is appropriately hard to test for. This difficulty is compounded by two other problems: firstly, the potential for youthful maturity and aged immaturity to wreck the curve; and secondly, a social habit of conflating maturity with intelligence.

This is where the issue becomes confused. Information, maturity, experience, knowledge and wisdom are all learned, but intelligence is native. It can be stunted, developed, delayed, repressed or encouraged according to circumstance, but there is still a level of predetermination to how smart you can actually be. A person’s base intelligence exists beyond their level of maturity: it doesn’t kick in at a certain birthday, nor does it require random, unpredictable change to flourish. Instead, intelligence blossoms as we use it. In pragmatic terms, this is still no substitute for acquired knowledge and life skills, nor can it stand in for maturity; and while an argument might be made that native intelligence governs how swiftly we mature once the opportunity has presented itself, this is of secondary concern. Consider, then, the following concept: that at some point in your teenage years, you become as technically bright as you’ll ever be.

Although the human brain doesn’t finish developing in its entirety until around age 24, the cerebral development of intelligence stops between the ages of 8 and 12. Hormonal change starts not long after this process ends: even at 14, 15 and 16, we are still dealing with teenagers whose intelligence is set, and whose emotional development – at least on a biological level – is well underway.

The main difference between adults and teenagers, then, is one of experience. Firsts are a big part of human growth, and with a whole new set of chemicals informing teenage behaviour, it’s not only circumstances that are changing, but the knowledge of how an individual might react. Unsurprisingly, this can lead to mood swings, irrational behaviour, poor decisions and other such adolescent staples – but underneath all the drama lies an essentially adult intelligence, assessing situations, collating data, drawing conclusions and commenting on the process.

This is what society tends to forget: that teenagers are entirely capable of rational thought and observation, not just by childish standards, but in an adult context. Rather than an inability to form cogent arguments or opinions, the struggle is processing the sheer volume of information on offer, much of which is new. Areas which before held little or no interest whatsoever, such as politics, have suddenly hit the radar, while much of what was previously deemed important, like old hobbies, is being discarded.

There’s more to be said on the subject of teenagers: high school, stereotyping by adults, social issues. But for now, I’m content to distinguish between maturity and intelligence, and to point out that both are equally significant to adulthood and development. The one might take years to learn, but the other is all too often overlooked.