On texting, videogames, and writing #dyr

Last year, 4.16 billion users made SMS the most popular data channel in the world. An estimated 6.1 trillion texts were sent, up from 1.8 trillion in 2007. And while the proportion of customers using SMS for more than simple messaging is still small, in poor nations these services are already changing the nature of commerce, crime, reporting news, political participation, and governing.

 

The subjects and themes and audiences of games should be no less of a concern than the contexts and purposes to which they are put. Not just adolescent fantasy and political activism, but everything in between.

 

Research found that giving players the chance to adopt a new identity during the game and to act through that new identity – be it a different gender, a hero, a villain – made them feel better about themselves and less negative.

Looking at the players' emotions after play as well as their motivation to play, the study found the enjoyment element of the videogames seemed to be greatest when there was the least overlap between someone's actual self and their ideal self.

 

Video games aren't science. They are not a mystery of the universe that can be explained away via testable predictions and experimentation. We need to stop looking for answers, whether those answers would come from a technical innovation whose arrival only renews obsession with the next breakthrough, or from the final exploitation of the true nature of our medium by means of a historical discovery so obvious that it will become indisputable. The answers lie not in the past or the future, but in the present.

 

We enter college hoping to learn effective communication skills—the kind of skills the recruiter in the Wall Street Journal article wished we possessed. But the joke is on us: the professors from whom we seek guidance themselves don’t know good prose from porridge.

When we attend college, we throw our impressionable young minds headlong into this bog of “scholars” such as Parsons; headlong into this asylum in which esteemed academic journals will publish gibberish if one uses the right buzzwords; headlong into this funhouse in which a computer program can generate random meaningless prose that reads passably like the stuff assigned in most graduate and undergraduate humanities classes. And from within this stylistic cesspool, we hope to get an education in how to write good prose.

 

Future assignment: The reverse-engineered essay

Among the ideas behind assignments like Mashup Scholarship and Pop Up Scholarship are that students need lots of practice writing and that they need to perform this practice in different ways. Because students explain why and reflect on how they perform academic writing instead of just producing 4-5 essays over 16 weeks, I like to think that more is happening here. The performance, i.e., the essay, is but one way of showing proficiency. The ability to reflect on that performance before, during, and after is just as important.

Such assignments also challenge how we should conduct ourselves in relation to academic writing. We need not be as serious as a heart attack when discussing the particulars of the kinds of writing students will be expected to do in college. There is no reverence in Pop Up Scholarship, given how much I encourage students to approach the assignment in much the way Pop Up Video did its own subject matter, i.e., with an affable, critical, knowledgeable, and playful edge. Furthermore, Mashup Scholarship invites students to do what other writing instructors may balk at.

However, both assignments focus on somewhat specific aspects of writing, including audience, grammar and syntax, organization, and source materials. Neither accounts much for argument or idea development, though, focusing and reflecting instead on the end result over whatever process produced it. This is not to say that my first-year writing courses are bereft of discussion concerning argument, idea development, or writing processes, only that I haven't devised a clear assignment addressing them. I think I might have something for my Fall 2011 class that does, though.

During my first year at UM-Flint, I had the privilege of working with a fourth-year student on an independent study project about comic-book writing. The student's semester-end project was a 25-page script of an origin story for a new superhero. Helping the student get to that point was an earlier assignment in which he reverse-engineered Dennis O'Neil's Batman: Birth of the Demon, breaking it down into constituent parts for further examination, seeing how the frames, panels, and pages fit together, and where O'Neil likely began.

A recent column on plagiarism got me thinking about reverse-engineering again. Even if a student were to reverse-engineer a plagiarized essay, they would undoubtedly still learn something worthwhile in the process, yes? Both Mashup and Pop Up Scholarship as well as some of the required reading in my first-year courses lean toward the idea of reverse engineering. This is something I want to explore more with students. So, similar to Mashup Scholarship in that I ask students to do something they and/or other professors may have reservations about, reverse-engineering an essay asks students to fill in the steps that led to final publication.

I'm still working on the specifics of the assignment, but here's some starting language:

Choose one of the longform articles below or submit another for approval. In an entry this week, provide context for and a summary of your chosen article. That is, note the author, the publication in which the article appears and when, etc. It's pretty much impossible to reverse-engineer an essay you haven't read.

Using either the sample steps provided below or your own identified process, reverse-engineer your chosen article. Over the next two weeks, we will move from revised drafts to shitty drafts to outlines to brainstorming to initial curiosity/perplexity. In other words, we will work backwards until we have reached a satisfactory starting point for your chosen article.

 

I welcome any/all feedback on this. Do you see particular use in devising such an assignment?

On blogging, social media, self-publishing, and teaching reading and writing #dyr

What does that mean: Blogging is writing without a safety net?

This means that you are on your own. Your work is all yours, and it rises or falls on its own merits. Nobody is fact-checking you before you hit “Publish” (though many commenters will afterwards), and nobody has your back after you publish – you are alone in defending your work against the critics. If you are good and trusted, you may have a community of bloggers or commenters who will support you, but there is no guarantee.

You can see, from the above paragraph, that there are two senses of “blogging is writing without a safety net”. One concerns pre-publication – there is no editor to check your work. The other concerns post-publication – nobody protects you.

 

While the blogs have exposed wrongdoers and broken news before, this week’s performance may signal the arrival of weibos as a social force to be reckoned with, even in the face of government efforts to rein in the Internet’s influence.

The government censors assigned to monitor public opinion have let most, though hardly all, of the weibo posts stream onto the Web unimpeded. But many experts say they are riding a tiger. For the very nature of weibo posts, which spread faster than censors can react, puts weibos beyond easy control. And their mushrooming popularity makes controlling them a delicate matter.

 

We assume that Facebook is something we should associate with the young, but my evidence suggests that this is entirely mistaken.

If there is one obvious constituency for whom Facebook is absolutely the right technology, it is the elderly. It allows them to keep closely involved in the lives of people they care about when, for one reason or another, face-to-face contact becomes difficult... Its origins are with the young but the elderly are its future.

 

Twitter/Facebook/G+ are secondary media. They are a means to connect in crisis situations and to quickly disseminate rapidly evolving information. They are also great for staying connected with others over similar interests (the Stanley Cup, the Olympics). Social media is good for event-based activities but terrible when people try to make it do more – such as, for example, nonsensically proclaiming that a hashtag is a movement. The substance needs to exist somewhere else (an academic profile, journal articles, blogs, online courses).

 

There are many reasons potential authors want to publish their own books, Mr. Weiss said. They have an idea or manuscript they have passed around to various agents and publishers with no luck; they may just want to print a few copies of, say, a memoir for family members; they want to use it in their business as a type of calling card; or they actually want to sell a lot of books and make their living as writers.

 

In a hyper–abundant book world, where previous patterns of discovery may not work as well as they used to, readers are developing, and increasingly will need to develop, new ways of discovering titles that might interest them. Marketing and discovery are moving to the forefront of book marketplace activity, and social networks are adding new ideas and opportunities to the stable of traditional ways to bring books to the attention of potential readers.

 

The academic study of literature is a wonderful thing, and not just because it has paid my salary for most of my adult life, but it is not an unmixed blessing, and teachers will rarely find it possible simply to inculcate the practices of deeply attentive reading.

Over the past 150 years, it has become increasingly difficult to extricate reading from academic expectations; but I believe that such extrication is necessary. Education is and should be primarily about intellectual navigation, about—I scruple not to say it—skimming well, and reading carefully for information in order to upload content. Slow and patient reading, by contrast, properly belongs to our leisure hours.

 

Email has such obvious promise as a tool for writing, and sharing writing, and teaching writing. It takes words and it sends them anywhere right away. If in 1976 you wanted to see a student's work in progress, you needed a printer and an appointment. The student had to take notes while you talked, walk home, remember what exactly you said, and work up a new draft. If he came to another impasse he'd probably keep it to himself -- nobody is going to office hours five times in three days. (Nobody is holding office hours five times in three days.)

Today each of these transactions -- copy, paste, send; receive, annotate, reply -- might take a few minutes. Emails can be composed and consumed anywhere, privately, quietly, at one's convenience. It is the free ubiquitous highway for words. It is exactly the tool you'd invent if you were a teacher of writing who wanted a better way to teach people to write.

 

from my cold dead hands, or: why i won't "get over it" and ditch the pen

[inspired by "12 Reasons to Ditch the Pen"]

The pen is not being replaced by digital writing tools. Well, it might be, but that doesn't mean it should be forgotten and forsaken. Just because some new technology works for one person doesn't mean I should go all in on it, too.

I admit that I like the feel of writing by hand, but I'm more impressed with how cursive looks on the page, even with how I look while writing. I revel in those compliments about my handwriting. I smile at the shocked faces of colleagues, strangers and students when they see that I still write with a pen. How can someone who studies computers and writing, social media, or videogames still write like that? Well, this is how.

I write by hand because I lost everything due to a zip disk error while in undergrad. I lost the paper I was writing at the time. I lost all previous coursework. I lost every essay I wrote in high school. I lost every terrible poem, every awful short story. That loss taught me to never put complete trust in a computer again. Everything I write now is handwritten first. Everything. This includes blog entries, emails and even Twitter updates. And there's nothing for me to "get over," because this method, this outdated, time-wasting method of writing by hand, works. I make it work. 

Writing by hand doesn't mean you are irrelevant to yourself, your colleagues or your students. It means you understand what technology works for you and in what capacity. 

The computer keyboard is not the same as a pen and paper. I need a Pilot G2 0.5 with blue ink and a blank page from a Moleskine notebook in order to write. This approach, this method, focuses me and my thoughts to an indescribable degree.

Taking notes is not an outdated skill; neither is taking notes with a pen. Ask my #eng112 students, who just completed Pop Up Scholarship, about the newfound value they have for writing in the margins of academic articles.

Being fast isn't (and shouldn't be) everything. Writing by hand forces me to take the time to really develop my ideas for myself before putting them anywhere else.

Handwriting means that editing happens during transcription. One sentence handwritten often becomes a full paragraph typed. I fail to see any harm in recopying, either, as it's important to back up everything, whether analog or digital.

I don't do much collaborative work (which is unfortunate, I know), so the ease-of-editing-by-others argument doesn't apply to me. I do think, though, that writing by hand can, in the end, better facilitate collaboration as it adds another layer of earlier editing. But how does writing by hand prevent the sharing of ideas and the making of meaning? This one extra step between having an original thought and sharing it is not a big deal. Writing by hand gives me an additional filter, perhaps making me think twice about something before posting it online.

With a pen in hand, I don't waste time figuring out spelling and grammar. I never have to worry about red, squiggly lines appearing under my words. I annotate when I can't think of a word (or how to spell it) and come back to it later.

If you have any idea of how to organize anything, there's no reason for clutter if you write by hand. I've been using the same black Moleskine notebooks for years now. Sure, I can't tag or apply a Google search, but it wasn't very difficult to develop (and refine) my own unique system for finding particular entries.

Since I know my own writing and my searching system, the results are much more satisfying.

My Moleskine notebook is a working file cabinet and it's with me wherever I go. A single notebook isn't burdensome, no matter the book bag, briefcase, etc. All the ideas and information I need fit in one notebook, which lasts about nine months before it's full and I need to start a new one. 

I don't need an Internet connection or even electricity when writing by hand. A pen and paper are all I need. These physical materials aren't an end in themselves, of course, only a beginning. Together, they are a beginning I never want to lose.

 

Also: the idea that ditching the pen and paper and going digital is an environmentally friendly move is laughable.

Also, too: initial commentary on Google Buzz about "12 Reasons" and my response

Also, too, also: Jerrid Kruse's comment on "12 Reasons" 

poignant passages, 9.14 #eng112

No columnist or reporter or novelist will have his minute shifts or constant small contradictions exposed as mercilessly as a blogger’s are. A columnist can ignore or duck a subject less noticeably than a blogger committing thoughts to pixels several times a day. A reporter can wait—must wait—until every source has confirmed. A novelist can spend months or years before committing words to the world. For bloggers, the deadline is always now. Blogging is therefore to writing what extreme sports are to athletics: more free-form, more accident-prone, less formal, more alive. It is, in many ways, writing out loud.

 

Kill your word-processor

Word, Google Office and OpenOffice all come with a bewildering array of typesetting and automation settings that you can play with forever. Forget it. All that stuff is distraction, and the last thing you want is your tool second-guessing you, "correcting" your spelling, criticizing your sentence structure, and so on. The programmers who wrote your word processor type all day long, every day, and they have the power to buy or acquire any tool they can imagine for entering text into a computer. They don't write their software with Word. They use a text-editor, like vi, Emacs, TextPad, BBEdit, Gedit, or any of a host of editors. These are some of the most venerable, reliable, powerful tools in the history of software (since they're at the core of all other software) and they have almost no distracting features — but they do have powerful search-and-replace functions. Best of all, the humble .txt file can be read by practically every application on your computer, can be pasted directly into an email, and can't transmit a virus.

 

There's no doubt that social-media networks are fantastic communication machines. They allow people to feel connected to a virtual community, make new friends and keep old ones, learn things they didn't know. They encourage people to write more (that can't be bad) and write well and concisely (which is hard, trust us). They are a new form of entertainment (and marketing) that can occupy people for hours in any given day.

"Great blogging is great writing, and it turns out great Twittering is great writing — it's the haiku form of blogging," says Debbie Weil, a consultant on social media and author of The Corporate Blogging Book.

But the art of the status update is not much of an art form for millions of people on Facebook, where users can post details of what they're doing for all their friends to see, or on Twitter, where people post tweets about what they're doing that potentially every user can see.

 

I fail to see any clear distinction between someone's boring Twitter feed – considered only semi-literate and very much bad – and someone else's equally boring, paper-based diary – considered both pro-humanist and unquestionably good.
Kafka would have had a Twitter feed! And so would have Hemingway, and so would have Virgil, and so would have Sappho. It's a tool for writing. Heraclitus would have had a f***ing Twitter feed.

 

On plagiarism (link bundle) #wymhm

Professors used to deal with plagiarism by admonishing students to give credit to others and to follow the style guide for citations, and pretty much left it at that.

But these cases — typical ones, according to writing tutors and officials responsible for discipline at the three schools who described the plagiarism — suggest that many students simply do not grasp that using words they did not write is a serious misdeed.

It is a disconnect that is growing in the Internet age as concepts of intellectual property, copyright and originality are under assault in the unbridled exchange of online information, say educators who study plagiarism.

In the broader intellectual sphere, incidents of plagiarism skyrocketed in universities in the late 1990s, and some people reached for scapegoats like the Evil Internet. But others began to rethink plagiarism, not only what it was but what it meant that administrators and instructors reacted as they did. Rebecca Moore Howard, a professor at Syracuse University, sensed that her students were lifting sentences from published sources not because they were bad people or didn’t know how to cite things, but because they didn’t understand the texts they were reading well enough to synthesize them. Howard realized that what was monolithically labeled “plagiarism” by institutions was actually a bunch of activities. Some you could legitimately condemn. Some you could teach through. Others were culturally acceptable practices, even time-honored and literary ones. The students had simply done them awkwardly or badly. Howard advocated that policies on student authorship abandon the monolith and try to find students where they were, morally and cognitively.

If you’re a student, plagiarism will seem to be an annoying guild imposition without a persuasive rationale (who cares?); for students, learning the rules of plagiarism is worse than learning the irregular conjugations of a foreign language. It takes years, and while a knowledge of irregular verbs might conceivably come in handy if you travel, knowledge of what is and is not plagiarism in this or that professional practice is not something that will be of very much use to you unless you end up becoming a member of the profession yourself. It follows that students who never quite get the concept right are by and large not committing a crime; they are just failing to become acclimated to the conventions of the little insular world they have, often through no choice of their own, wandered into.

People have been going to libraries and using books and then not citing them forever. I don't think there's anyone who hasn't plagiarized. When I was in elementary school, for instance, we'd go to the library with index cards and open up the encyclopedia and write down exactly what it said. The difference is that now we can type the things we think are plagiarized into Google and see what comes up. But in a sense more students getting caught is a positive thing, because it creates a real teachable moment for us, when we can explain very thoroughly why it's not OK to write like that.

"Twitter...always has the potential to be a two-way conversation." #wymhm

What works about Twitter? It’s not anonymous. I’ve found it to be a marvelous medium to engage critics in a low-key, non-defensive way, to say, hey, I’m listening, I’m a real person here, can you let me know a little more about what you’re thinking? I’ve turned critics into supporters and I think softened the tone of some debates over the book.  (I recently had a critic who had trashed a speech of mine on Twitter DM me “You have far too many fans for me to criticize you in public!”  After a little back and forth about his concerns, he went on to write a glowing review of the book.)

"Writing is hard...But it's not nearly as hard, in my experience, as not writing" #wymhm

I know a lot of writers, both published and not, and so I know that for every book that makes it to stores, several are never published, and several more are never finished. Many of my friends and acquaintances from graduate school published right away, but most still haven't. No doubt some will publish in the coming years. And some have gone into social work or law or medicine and seem to have left fiction writing behind, happily, like an old hairstyle.

And what about the rest of them? These are the people—many of whom write beautifully—I wonder about. And I wonder about strangers in similar situations, artists of all ilks. I wonder if they wake in the night, their hearts racing, unable to feel anything but the fear and frustration and disappointment of the fact that they haven't finished anything in a month. I wonder if they're anything like me. My guess is that many of them are—and naturally I feel tremendous empathy. Having been there, I know there are no magic words of encouragement, no surefire tough-love tactic. I wish there were.

"A new colon is on the march. For now let’s call it the 'jumper colon.'" #wymhm

For grammarians, it’s a dependent clause + colon + just about anything, incorporating any and all elements of the other four colons, yet differing crucially in that its pre-colon segment is always a dependent clause.

(Yikes.)

For everyone else: its usefulness lies in that it lifts you up and into a sentence you never thought you’d be reading by giving you a compact little nugget of information prior to the colon and leaving you on the hook for whatever comes thereafter, often rambling on until the reader has exhausted his/her theoretical lung capacity and can continue to read no longer.

(Breathe.)

See how fast that goes? The jumper colon is a paragraphical Red Bull, a rocket-launch of a punctuator, the Usain Bolt of literature. It’s punchy as hell. To believers of short first sentences–Hemingway?–it couldn’t get any better. To believers of long-winded sentences that leave you gasping and slightly confused–Faulkner?–it also couldn’t get any better. By itself this colon is neither a period nor a non-period… or rather it is a period and it is also a non-period. You choose.

"It doesn’t stop them if you say, ‘This is plagiarism. I won’t accept it.’" #wymhm

The Pritchard axiom — that repetitive cheating undermines learning — has ominous implications for a world in which even junior high school students cut and paste from the Internet instead of producing their own writing.

If we look closely at plagiarism as practiced by youngsters, we can see that they have a different relationship to the printed word than did the generations before them. When many young people think of writing, they don’t think of fashioning original sentences into a sustained thought. They think of making something like a collage of found passages and ideas from the Internet.

They become like rap musicians who construct what they describe as new works by “sampling” (which is to say, cutting and pasting) beats and refrains from the works of others.

This comparison to rap musicians doesn't read right to me. Sampling demands a certain knowledge and awareness. Extending the idea that students aren't learning when they cheat to how many successful rap artists make a living reveals a lack of understanding. Such a view is overly simplistic, and its definition of "learning" might be too narrow.

In my first-year and advanced composition courses, I have an assignment, "Mashup Scholarship," that asks students to put together a minimum of five sources using none of their own words. They have to use the transitions provided in the sources they've chosen. They read Lethem's "The ecstasy of influence" and watch YouTube videos for modeling purposes. They come up with alternative citation methods, from color-coding to ISBN to something else that only makes sense to them. In reflective writing about this assignment, students often name "Mashup Scholarship" among the hardest composing they've completed.