On rhetoric, Anonymous, bookstores, connectedness, videogames, digital natives, and slang #dyr

The reason that rhetoricians have never preponderantly been the primary sources that media go after is that we are just one of many competitors interpreting reality, and often we are looked at as purveyors of "mere rhetoric"...rhetoricians, although they are often aligned with the political zeitgeist of academia, must compete with other high-ethos sources of social commentary which, again, have more credentials to be able to sort out reality: political scientists, historians, journalists, bloggers, etc. In fact, the fragmentation of prominent sources of rhetoric demands even more the approach to rhetoric argued in the "Myth" piece. Imagine how increasingly irrelevant situationally-grounded rhetoricians' depictions and interpretations of reality must seem to political principals, political professionals, and even average citizens.

part of Anonymous has over the last three years moved from disaggregated practices rooted in the culture of trolling to also become a rhizomatic and collective form of action catalyzed and moved forward by a series of world events and political interventions.

a small tribe of devoted book lovers with a business bent say that the economic setting has been right for small, highly personal ventures.

The lesson in the decline of big stores, these owners say, is not that no one wants to buy books. It’s that the big stores were too big. They had overreached and, in trying to be all things to all readers, had lost a sense of intimacy that books and reading seem to thrive on.

The Internet has had a dual effect on the level of connectedness I feel with the people I know in my offline life. On one hand, the basic communication tools now available make distance almost a non-issue...On the other hand, when I am actually with my friends and family, I find myself (and increasingly, my companions) distracted by a smartphone that’s either the object of my gaze or being fingered in my front pocket.

People have less time to play games than they did before. They have more options than ever. And they're more inclined to play quick-hit multiplayer modes, even at the expense of 100-hour epics.
via cnn.com

So Prensky was right the first time – there really is a digital native generation? No, certainly not – and that's what's important about this study. It shows that while those differences exist, they are not lined up on each side of any kind of well-defined discontinuity. The change is gradual, age group to age group. The researchers regard their results as confirming those who have doubted the existence of a coherent 'net generation'.

There's no grand unified theory for why some slang terms live and others die. In fact, it's even worse than that: The very definition of slang is tenuous and clunky. Writing for the journal American Speech, Bethany Dumas and Jonathan Lighter argued in 1978 that slang must meet at least two of the following criteria: It lowers "the dignity of formal or serious speech or writing," it implies that the user is savvy (he knows what the word means, and knows people who know what it means), it sounds taboo in ordinary discourse (as in with adults or your superiors), and it replaces a conventional synonym. This characterization seems to open the door to words that most would not recognize as slang, including like in the quotative sense: "I was like … and he was like." It replaces a conventional synonym (said), and certainly lowers seriousness, but is probably better categorized as a tic.

"There is, has been and will be no shortage of grand talk of the Internet’s potential." #wymhm

In his recent book “Cognitive Surplus,” Clay Shirky, the New York University lecturer and Web pontificator, suggests that the shift from passive media consumption to active and democratized media creation means we will all work in previously impossible concert to build astonishing virtual cathedrals of the mind, solving the world’s problems instead of vegging out in front of “Gilligan’s Island.” As it happens, he even mentions Lolcats. Because Lolcats are both made and shared by the Internet-connected masses, they are examples of how Web tools have “bridged that gap” between passivity and activity. But this lasts only a few paragraphs (in which Lolcats are characterized as “dumb,” “stupid” and “crude”). He quickly pivots back to the more high-minded stuff about how “the wiring of humanity lets us treat free time as a shared global resource.”

Shirky is among the thinkers engaged in the popular debate over whether the Internet makes us smarter or dumber. And that question is interesting, but let’s face it: it’s not awesome. What Tim Hwang and his cohorts basically hit upon was the conclusion that, while that debate drags on, funny cat pictures and so on are really, really popular. And maybe another question to consider is what that means — to consider the Web not in terms of how it might affect who we become but rather in terms of how it reflects who we are.

"Getting your Internet Surfing License is a necessary prerequisite in making the web safe for everyone." #wymhm

Before governments made the ISL mandatory, people often found themselves lost in the myriad of web sites, naively double-clicking "Hit The Monkey to Win iPad" ads, finding themselves spammed by pop-unders. Acquiring the license typically takes only 2-5 days of education at your local Surf Training School. You will need to carefully prepare for the final test, in which you are required to answer simple questions like:

  • What is a pyramid scheme, and do they really work?
  • How do I replace the solar cells on my cyber glove?
  • Why exactly is it bad for people to badmouth their governments or big companies online?
  • Why is it illegal to surf without a RealIdentity card?
  • In which year did Google buy the internet?

"The internet has quietly infiltrated our lives, and yet we seem to be remarkably unreflective about it." #wymhm

We're living through a radical transformation of our communications environment. Since we don't have the benefit of hindsight, we don't really know where it's taking us. And one thing we've learned from the history of communications technology is that people tend to overestimate the short-term impact of new technologies — and to underestimate their long-term implications.

We see this all around us at the moment, as would-be savants, commentators, writers, consultants and visionaries tout their personal interpretations of what the internet means for business, publishing, retailing, education, politics and the future of civilisation as we know it. Often, these interpretations are compressed into vivid slogans, memes or aphorisms: information "wants to be free"; the "long tail" is the future of retailing; "Facebook just seized control of the internet", and so on. These kinds of slogans are really just short-term extrapolations from yesterday's or today's experience.

While it definitely qualifies as "tl;dr," it's worth a read for the clarifying perspective and reflection on what the internet is, does and could/will be.

In Japan, Twitter is "playing out as a rediscovery of the Internet" #wymhm

One reason is language. It's possible to say so much more in Japanese within Twitter's 140-letter limit. The word "information" requires just two letters in Japanese. That allows academics and politicians to relay complex views, according to Tsuda, who believes Twitter could easily attract 20 million people in Japan soon.

Another is that people are owning up to their identities on Twitter. Anonymity tended to be the rule on popular Japanese Web sites, and horror stories abounded about people getting targeted in smear-campaigns that were launched under the shroud of anonymity.

In contrast, Twitter anecdotes are heartwarming. One well-known case is a woman who posted on Twitter the photo of a park her father sent in an e-mail attachment before he died. Twitter was immediately abuzz with people comparing parks.

I look forward to future comparative studies on how citizens in different countries use Twitter. I'm also taken by the translation of "tweet" as "mumble." For me, there's a humility and a demureness there that isn't very present in much of American tweeting, particularly by 'social media experts.'

"I like your idea of an internet-incapable computer." #wymhm

I’ve set up a second computer, devoid of internet, for my fiction-writing. That’s to say, I took an expensive Mac and turned it back into a typewriter. (You should imagine my computer set-up guy’s consternation when I insisted he drag the internet function out of the thing entirely. “I can just hide it from you,” he said. “No,” I told him, “I don’t want to know it’s in there somewhere.”) In fact, you ask me whether I feel there’s any difference between my fiction and essay—well, not (I ardently hope) either quality or commitment-wise (in that sense, yes, writing is writing), but lately, à la David Shields, process-wise I find I do want to Google while I essay, and while I’m always certain I need that other, internet-disabled computer for writing fiction.
via pen.org

An email correspondence between Jonathan Lethem and David Gates, touching on the influence of technology in all things writing. I have a growing interest in what we deprive ourselves of when we get down to writing, whether we perform better by way of sensory deprivation or overload (writing in silence vs. writing to music), whether we invest in programs like Anti-Social and Freedom and/or simpler word processors like OmmWriter and WriteRoom.

How many of us become cloistered when we write?

prevailing visibility: a bit further

Three days ago, I posted a gloss of Bill Wolff's Deliverator at #cw2010, attempting to reinforce some of the ideas he put forth. I focused on his notion of expanding "composition" to include bookmarking and tagging (among other online actions). I also carried this into how I've reorganized my offline research library. Brian McNely was kind enough to acknowledge the post, mentioning its connection to his and Derek Mueller's #cw2010 talks. McNely's talk is on his blog, so I'll let him speak for himself: "Mentorship and Professionalization in Networked Publics."

One of the recent links I shared on Twitter, which Alan Benson retweeted, was a piece in the Atlantic about how disorganized terrorists often are and the importance of emphasizing this fact. "Can being more realistic about who our foes actually are help us stop the truly dangerous ones?" the authors ask. It's a piece worth reading, hence the sharing via Twitter (and here later tonight as part of #wymhm), but I mention it in this moment for another reason, one (I think) related to Bill Wolff's Deliverator and Brian McNely's talk at #cw2010. 

I maintain four print subscriptions: the Atlantic, Paste, Lansing State Journal and Wired. I read each issue in its entirety within 24 hours of its arrival. The aforementioned article is in the most recent issue of the Atlantic, and it was in print that I first encountered it. Most Atlantic pieces appear online soon after the print publication is out in circulation, so I went there to find it and subsequently share it. Is it safe to assume that those seeing my tweet about this Atlantic piece figured I found it online first? I think an argument for that could be made, but it would be incorrect. 

I mention all this here in the interest of that prevailing visibility, and because of my increasing interest in how we come to share information online, and even in marking the paths that lead to sharing.

"There is little doubt that the Internet is changing our brain. Everything changes our brain" #wymhm

Carr’s argument also breaks down when it comes to idle Web surfing. A 2009 study by neuroscientists at the University of California, Los Angeles, found that performing Google searches led to increased activity in the dorsolateral prefrontal cortex, at least when compared with reading a “book-like text.” Interestingly, this brain area underlies the precise talents, like selective attention and deliberate analysis, that Carr says have vanished in the age of the Internet. Google, in other words, isn’t making us stupid — it’s exercising the very mental muscles that make us smarter.

This doesn’t mean that the rise of the Internet won’t lead to loss of important mental talents; every technology comes with trade-offs. Look, for instance, at literacy itself: when children learn to decode letters, they usurp large chunks of the visual cortex previously devoted to object recognition. The end result is that literate humans are less able to “read” the details of the natural world.