"The internet has quietly infiltrated our lives, and yet we seem to be remarkably unreflective about it." #wymhm

We're living through a radical transformation of our communications environment. Since we don't have the benefit of hindsight, we don't really know where it's taking us. And one thing we've learned from the history of communications technology is that people tend to overestimate the short-term impact of new technologies — and to underestimate their long-term implications.

We see this all around us at the moment, as would-be savants, commentators, writers, consultants and visionaries tout their personal interpretations of what the internet means for business, publishing, retailing, education, politics and the future of civilisation as we know it. Often, these interpretations are compressed into vivid slogans, memes or aphorisms: information "wants to be free"; the "long tail" is the future of retailing; "Facebook just seized control of the internet", and so on. These kinds of slogans are really just short-term extrapolations from yesterday's or today's experience.

While it definitely qualifies as "tl;dr," it's worth a read for the clarifying perspective and reflection on what the internet is, does and could/will be.

"if people in the future are going to understand what this society is like, they need to understand gaming" #wymhm

Ars: Why do librarians and archivists want to preserve games?

JM: The really simple, one-sentence answer is because games are important. In the United States we're looking at about 80,000 people who are directly employed by the gaming industry and maybe another 240,000 people involved in related, tangential industries that rely on gaming companies for their existence. So just as a monetary phenomenon, games are important. You probably saw the sales for Modern Warfare? We're talking a single game that realized over a billion dollars in sales. Sort of shows on a monetary level the importance that games have taken within our economy.

This has certainly made librarians take note of games, but also they've become important culturally. There's a long history of wanting to say "popular culture is lower culture and therefore we should not be preserving it." For all of us in our project, we're rejecting that point of view. Popular culture is the most important culture we need to preserve. It shows what people were actually interested in and what they were doing.

Further ammunition for those arguing for the validity of videogame studies as well as the preservation of games themselves.

Also: on this here laptop, I have Fallout, World of Goo and two emulators, one for DOS and one for NES.

"Soon there will be no reason to have a big, boxy computer on your desk" #wymhm

In spite of their name, desktop PCs often have several users. Laptops, netbooks, and tablets are usually single-user machines—that is, they really are personal. Modern mobile operating systems are built with room enough for one—Apple's iOS and Google's Android are both tied in to a single user's e-mail, calendar, and app-purchasing accounts. Forrester's numbers also suggest that in the future we'll have many such machines around the house. Your "main" computer will be a laptop—and you'll probably have several smaller, tablet-type machines that you use regularly as well.

My laptop is my "main" computer, my only computer, at least until my current wireless carrier contract ends and I'm able to land a Google Android. Then I'll have two single-user machines, three and four if the PS3 and Xbox 360 qualify, five if I dare include my Moleskine journal.

"We are reading more text, writing far more often, than we were in the heyday of television" #wymhm

It’s no accident that most of the great scientific and technological innovation over the last millennium has taken place in crowded, distracting urban centers. The printed page itself encouraged those manifold connections, by allowing ideas to be stored and shared and circulated more efficiently. One can make the case that the Enlightenment depended more on the exchange of ideas than it did on solitary, deep-focus reading.

Quiet contemplation has led to its fair share of important thoughts. But it cannot be denied that good ideas also emerge in networks.

Yes, we are a little less focused, thanks to the electric stimulus of the screen. Yes, we are reading slightly fewer long-form narratives and arguments than we did 50 years ago, though the Kindle and the iPad may well change that. Those are costs, to be sure. But what of the other side of the ledger? We are reading more text, writing far more often, than we were in the heyday of television.

Right now, I'm reading A Better Pencil, by Dennis Baron. It's a fantastic, timely work about arguments against writing technologies from Plato onward. It also provides excellent counterpoints to Nick Carr and others about the influence of technology on our thinking processes.

"Open worlds are so popular now, but only a few developers know how to make them truly work." #wymhm

Non-player characters are very important when creating this kind of world. BioWare can get away with having everyone stand around forever, but in an open world, the people must be moving and acting. It’s surprising how many games fail at this. Assassin’s Creed, The Saboteur, and Red Faction: Guerrilla are all high-profile open worlds filled with people that do nothing but wander aimlessly. They feel like artificial obstacles in our path. Rockstar is great at creating emergent moments of NPC interaction, moments that occur regardless of our presence. From the spontaneous gang wars in GTA to another gang dragging some poor sap through a town in Red Dead Redemption, Rockstar uses these NPC interactions to make their worlds feel persistent.

I look for opportunities in games to forget my responsibilities to missions and NPCs. Accelerating along the San Fierro Highway with the radio blaring was one of the most memorable experiences for me in San Andreas. Wandering the Capital Wasteland was often more engaging than searching for Liam Neeson. Riding the rails in Empire City was consistently exhilarating. For some reason, knowing I was the lone non-NPC in the gameworld was a comforting freedom, too.

"“I write games for old machines for the sheer fun and sense of freedom it gives me" #wymhm

One of the main motivations is being able to concentrate on gameplay. The programmers understand the age-old languages well. “I write games for the Commodore 64 because it's very simple and quick to learn,” admits a coder who calls himself, rather ironically, Richard of The New Dimension. “It also shows support for the retro gamers who still love this retro machine today.”

The guys who code retro games (they're invariably all men) also know the technology inside and out and it allows them to push the boundaries of what the machines can do.

“People are pushing the hardware to do things they never did back in the 80s and 90s,” says Jason Mackenzie, owner of retro label Psytronik. He explains that people are not just interested in creating the very best games and says some also want to hack the hardware too. A new graphics mode for the Commodore 64 has been created called NUFLI. It displays high resolution, full colour, bitmap images on a standard machine.

I admire and appreciate this kind of nostalgia much more than what Nintendo churns out on a regular basis.

In Japan, Twitter is "playing out as a rediscovery of the Internet" #wymhm

One reason is language. It's possible to say so much more in Japanese within Twitter's 140-character limit. The word "information" requires just two characters in Japanese. That allows academics and politicians to relay complex views, according to Tsuda, who believes Twitter could easily attract 20 million people in Japan soon.

Another is that people are owning up to their identities on Twitter. Anonymity tended to be the rule on popular Japanese Web sites, and horror stories abounded about people getting targeted in smear-campaigns that were launched under the shroud of anonymity.

In contrast, Twitter anecdotes are heartwarming. One well-known case is a woman who posted on Twitter the photo of a park her father sent in an e-mail attachment before he died. Twitter was immediately abuzz with people comparing parks.

I look forward to future comparative studies on how citizens in different countries use Twitter. I'm also taken by the translation of "tweet" as "mumble." For me, there's a humility and a demureness there that isn't very present in much of American tweeting, particularly by 'social media experts.'
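To make that character-count point concrete, here's a quick toy sketch in Python. It's my own illustration, not from the article; the second word pair is just an example I picked.

```python
# Toy illustration: how far a 140-character limit stretches when a single
# kanji compound carries a whole English word. (Word pairs are my own picks.)
LIMIT = 140

pairs = {
    "information": "情報",  # jouhou: 2 characters vs. 11
    "economy": "経済",      # keizai: 2 characters vs. 7
}

for english, japanese in pairs.items():
    print(f"{english}: {len(english)} chars ({len(english) / LIMIT:.1%} of the limit) "
          f"vs. {japanese}: {len(japanese)} chars ({len(japanese) / LIMIT:.1%})")
```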

"Pick up just about any novel and you'll find the phrase 'somewhere a dog barked.'" #wymhm

Most authors, however, employ the trope as a narrative rest stop, an innocuous way to fill space and time; since the bark is hollow, a reader can read anything into it, or nothing at all. Charlaine Harris, queen of the vampire authors, in Dead as a Doornail: "The entire parking lot was empty, except for Jan's car. The glare of the security lights made the shadows deeper. I heard a dog bark way off in the distance." The chief of Scandinavian crime writers, Henning Mankell: "She begins to tell him. The curtain in the kitchen window flutters gently, and a dog barks in the distance" (The Eye of the Leopard). And "genre" books aren't the only guilty category. Take 2666, Roberto Bolaño's magnum opus: "The window looked out over the garden, which was still lit. A scent of flowers and wet grass drifted into the room. In the distance he heard a dog bark." For all we know, these dogs are off-camera sound machines set to woof.

Is there a similar trope in academic scholarship and writing, one that is as concrete and noticeable? I can think of certain turns of phrase, transitions and the like, but nothing comparable to "somewhere a dog barked."

"what we really need is to break the carrier’s stranglehold on devices" #wymhm

We should free the makers and small companies of the world to make devices without having to negotiate with carriers to get their approval.

Say you wanted to make a phone just for weekend nights, say one that included a lighter and a slot for holding whatever kind of cigarette you like. What carrier would offer that phone?

Or how about ones designed for kids, the elderly or the disabled?

A company could make a phone with guts that mesh with a number of networks, making the wireless companies have to compete for your business.

Google made a half-hearted effort to break the carrier’s grip with its Nexus One, which they wanted to sell directly to individuals who could then choose their carrier. Among the problems leading Google to close its online store was that the carriers soon decided that playing that game wasn’t in their long-term interest. Verizon and Sprint backed out of their commitment to support the device — leaving U.S. customers with only T-Mobile.

The carriers’ lobbying association likes to point to all the cool new phones and ask “Where’s the harm?” The problem is the harm comes from the devices and services that haven’t been invented yet, because wireless isn’t an open platform.

We literally don’t know what we are missing.

I worry that this piece, while I agree with it, is too idealistic in its suggestions. Perhaps my cynicism is showing too much in this regard, but given how the FCC and the Obama administration have (mis)handled net neutrality, I haven't much hope for this kind of revolution for wireless.

"Slow reading, like slow food, is about savouring rather than gobbling." #wymhm

We conspire in an unspoken agreement that our carefully considered choices are more a measure of students' inadequacy than our hopes for them, so they increasingly stay home as the weeks, and the novels, fly by. Like a high-speed train through gorgeous countryside, a novel a week turns the lovely hinterland of literature into a meaningless blur. Slow down, and the landscape changes: tempting byways appear; curiosity is given a chance to supplant urgent strategy. Acoustic engineers like to leave "headroom" in a recording, fine wine must apparently be allowed to breathe, and great books deserve space to come into their own.

Slow reading, like slow food, is about savouring rather than gobbling. The alternative, as voluntary reading continues to decline, is the futile effort of policing: quizzes, exams and journals that, whatever their other merits, purchase reassurance about students' reading at the cost of deep engagement.

Thomas Newkirk isn't the first or most prominent proponent of the so-called "slow reading" movement, but he argues it's becoming all the more important in a culture and educational system that often treats reading as fast food to be gobbled up as quickly as possible.

"You see schools where reading is turned into a race, you see kids on the stopwatch to see how many words they can read in a minute," he said. "That tells students a story about what reading is. It tells students to be fast is to be good."

Newkirk is encouraging schools from elementary through college to return to old strategies such as reading aloud and memorization as a way to help students truly "taste" the words. He uses those techniques in his own classroom, where students have told him that they've become so accustomed to flitting from page to page online that they have trouble concentrating while reading printed books.

"One student told me even when he was reading a regular book, he'd come to a word and it would almost act like a hyper link. It would just send his mind off to some other thing," Newkirk said. "I think they recognize they're missing out on something."

I fail to see how words acting as hyperlinks is a problem. If a particular idea, phrase or single word inspires a certain synapse to fire in my brain, I follow its trajectory and come back to the original reading when I'm ready. I make and see connections within and beyond the text in front of me.

I also have some measure of resistance to memorization and even to reading aloud. I can see some benefits to both, but memorization outside of an acting or poetry class seems rather pointless to me. Is it possible to savor an essay, such as "Consider the Lobster," by David Foster Wallace, without memorization? Absolutely. How can this happen? By engaging with it through discussion.