On composition pedagogy, the syllabus, Twitter, journalism, privacy, copyright, and videogames #dyr

it’s too easy to allow the classroom work associated with composition courses to focus on activities other than writing. I’ve been in many composition classes here and at other institutions where the students discuss readings and approaches and the teachers facilitate work and manage discussion and sometimes stand at the front of the classroom and show students things. Compositionists know and agree and emphasize that the work of the writing class is writing, and yet — in many classes — students simply don’t produce much text, largely because of the way we apportion the work of the course.


we may be a little too fond of limits and certainty. These days, syllabi are looking more and more like those Terms of Service that pop up when we use software... They are contracts that we can’t negotiate, and they contain provisions we might not agree to, if we understood what they actually meant. But the most striking thing about TOS is that they are full of rules – and very few people read them.


what are the teaching and learning practices of the networked classroom? No doubt there are people out there doing that work, and those of us who have taught in computer labs have related, relevant experiences. In both cases, it's a matter of turning the focal point away from the professor. Even in the class discussion format, among faculty committed to "decentering" the classroom, conversation generally runs through the professor, or at least the professor steers conversation through its iterations. As we have discovered, nothing decenters the classroom quite like a room full of laptops and smartphones, eh? The networked student is only partly in the classroom and partly distributed.


The reality of the Twitter effect isn’t just that President Obama has Twitter town halls now where he talks directly to American citizens, nor is it just that someone with no journalism background sitting in a house in Pakistan can report on a military raid that kills the world’s most notorious terrorist. It’s that journalism of all kinds has now become something you do, not something you are. Anyone can do it, whether they call themselves a journalist or not.


The critique of politics presented as entertainment charges the press with failing to treat the serious stuff seriously. And that is a valid critique. But here’s a trickier problem: even when the press is trying to be serious, to provide, say, “analysis” instead of a good yarn, it increasingly relies on an impoverished notion of politics, a cluster of bad ideas that together form the common sense of the craft.


The definition of privacy has been thrown out the window, and we have a new one: whether we have control of what companies are doing with this information and whether we have knowledge of how it’s being used.


you can’t motivate monopoly legislation based on your costs, when others are doing the same thing for much less — practically zero. There has never been as much music available as now, just because all of us love to create. It’s not something we do because of money; it’s because of who we are. We have always created.


Just like in the best zombie movies, the real drama in L4D lies in the relationships between the living, not the dead. The infected are just a pretext for collapsing the social order and forcing people to depend on one another to survive. It’s the ultimate online co-op experience, a game that requires not just headshot skills but communication, collaboration and confidence in your fellow player.


On rhetoric, Anonymous, bookstores, connectedness, videogames, digital natives, and slang #dyr

The reason that rhetoricians have never preponderantly been the primary sources that media go after is that we are just one of many competitors interpreting reality, and often we are looked at as purveyors of “mere rhetoric”... rhetoricians, although they are often aligned with the political zeitgeist of academia, must compete with other high-ethos sources in academia or in social commentary which, again, have more credentials to be able to sort out reality: political scientists, historians, journalists, bloggers, etc. In fact, the fragmentation of prominent sources of rhetoric demands even more the approach to rhetoric argued in the “Myth” piece. Imagine how increasingly irrelevant situationally-grounded rhetoricians’ depictions and interpretations of reality must seem to political principals, political professionals, and even average citizens.

part of Anonymous has over the last three years moved from disaggregated practices rooted in the culture of trolling to also become a rhizomatic and collective form of action catalyzed and moved forward by a series of world events and political interventions.

a small tribe of devoted book lovers with a business bent says that the economic setting has been right for small, highly personal ventures.

The lesson in the decline of big stores, these owners say, is not that no one wants to buy books. It’s that the big stores were too big. They had overreached and, in trying to be all things to all readers, had lost a sense of intimacy that books and reading seem to thrive on.

The Internet has had a dual effect on the level of connectedness I feel with the people I know in my offline life. On one hand, the basic communication tools now available make distance almost a non-issue... On the other hand, when I am actually with my friends and family, I find myself (and increasingly, my companions) distracted by a smartphone that’s either the object of my gaze or being fingered in my front pocket.

People have less time to play games than they did before. They have more options than ever. And they're more inclined to play quick-hit multiplayer modes, even at the expense of 100-hour epics.
via cnn.com

So Prensky was right the first time – there really is a digital native generation? No, certainly not – and that’s what’s important about this study. It shows that while those differences exist, they are not lined up on each side of any kind of well-defined discontinuity. The change is gradual, age group to age group. The researchers regard their results as confirming those who have doubted the existence of a coherent ‘net generation’.

There's no grand unified theory for why some slang terms live and others die. In fact, it's even worse than that: The very definition of slang is tenuous and clunky. Writing for the journal American Speech, Bethany Dumas and Jonathan Lighter argued in 1978 that slang must meet at least two of the following criteria: It lowers "the dignity of formal or serious speech or writing," it implies that the user is savvy (he knows what the word means, and knows people who know what it means), it sounds taboo in ordinary discourse (as in, with adults or your superiors), and it replaces a conventional synonym. This characterization seems to open the door to words that most would not recognize as slang, including like in the quotative sense: "I was like … and he was like." It replaces a conventional synonym (said), and certainly lowers seriousness, but is probably better categorized as a tic.

On reading, writing, social media, surveillance, videogame violence, and genre #dyr

the human brain was never meant to read. Not text, not papyrus, not computer screens, not tablets. There are no genes or areas in the brain devoted uniquely to reading. Rather, our ability to read represents our brain's protean capacity to learn something outside our repertoire by creating new circuits that connect existing circuits in a different way. Indeed, every time we learn a new skill – whether knitting or playing the cello or using Facebook – that is what we are doing.

Touch typing allows us to write without thinking about how we are writing, freeing us to focus on what we are writing, on our ideas. Touch typing is an example of cognitive automaticity, the ability to do things without conscious attention or awareness. Automaticity takes a burden off our working memory, allowing us more space for higher-order thinking. (Other forms of cognitive automaticity include driving a car, riding a bike and reading—you're not sounding out the letters as you scan this post, right?) When we type without looking at the keys, we are multi-tasking, our brains free to focus on ideas without having to waste mental resources trying to find the quotation mark key. We can write at the speed of thought.

Facebook, Twitter, and other forms of social software are about consumption and production, about dialectic interaction on the read/write web. It’s no wonder short-form writing in sociotechnical networks is epistemologically productive, often leading to richer, longer-form writing work. Savvy writers might intentionally deploy sociotechnical notemaking as a powerful heuristic strategy for moving from short-form to long-form writing practices. Sociotechnical notemaking may therefore be defined as short-form writing work that is typically enacted informally via the enabling technologies of social software, with explicit heuristic, inventional, and epistemological implications.

before we give more attention to having students write briefly to fit their text-messaging sensibilities and the latest technologies, we should be more forceful about expecting and bringing their attention to accuracy and precision. Strunk and White, in their classic The Elements of Style, caution against a predilection for brevity over precision in their 19th style reminder: “Do not take shortcuts at the cost of clarity.” I suspect most instructors would agree with this admonition, as I trust that precision of thought and expression from our students is paramount for most of us.

Ideas don’t need the media any more than the media need ideas. They’ve relied on each other in the past, true enough — media as the gatekeepers, ideas as the floods — but the present media moment is characterized above all by the fact that ideas, Big and otherwise, can be amplified independently of traditional media filters. The public, online, is empowered to decide for itself which ideas are worthy of changing the world.

In their concern to stop not just mob violence but commercial crimes like piracy and file-sharing, Western politicians have proposed new tools for examining Web traffic and changes in the basic architecture of the Internet to simplify surveillance. What they fail to see is that such measures can also affect the fate of dissidents in places like China and Iran. Likewise, how European politicians handle online anonymity will influence the policies of sites like Facebook, which, in turn, will affect the political behavior of those who use social media in the Middle East.

Through two online surveys and four experimental studies, the researchers showed that people stayed glued to games mainly for the feelings of challenge and autonomy they experience while playing. Both seasoned video gamers and novices preferred games where they could conquer obstacles, feel effective, and have lots of choices about their strategies and actions.

These elements, said coauthor Richard Ryan, a motivational psychologist at the University of Rochester, represent "the core reasons that people find games so entertaining and compelling. Conflict and war are a common and powerful context for providing these experiences, but it is the need satisfaction in the gameplay that matters more than the violent content itself."

There are no meaningful genres in games anymore. It’s a good thing that developers are pushing back borders and finding interesting ways to combine old mechanics, but as a consequence, there’s no way of separating works with huge and obvious disparities. There ought to be a meaningful, succinct way to categorize games that doesn’t implicitly suggest a high art/low art dichotomy.

On writing, grammar, gamification, videogames, education, academia, and data preservation #dyr

I don’t know the origin of the “write what you know” logic. A lot of folks attribute it to Hemingway, but what I find is his having said this: “From all things that you know and all those you cannot know, you make something through your invention that is not a representation but a whole new thing truer than anything true and alive.” If this is the logic’s origin, then maybe what’s happened is akin to that old game called Telephone. In the game, one kid whispers a message to a second kid and then that kid whispers it to a third and so on, until the message circles the room and returns to the first kid. The message is always altered, minimized, and corrupted by translation. “Bill is smart to sit in the grass” becomes “Bill is a smart-ass.” A similar transmission problem undermines the logic of writing what you know and, ironically, Hemingway may have been arguing against it all along. The very act of committing an experience to the page is necessarily an act of reduction, and regardless of craft or skill, vision or voice, the result is a story beholden to and inevitably eclipsed by source material.


there is still no widely-accepted gender-neutral pronoun. In part, that’s because pronoun systems are slow to change, and when change comes, it is typically natural rather than engineered.


Game developers and players have critiqued gamification on the grounds that it gets games wrong, mistaking incidental properties like points and levels for primary features like interactions with behavioral complexity. That may be true, but truth doesn't matter for bullshitters. Indeed, the very point of gamification is to make the sale as easy as possible.


I have never been much for handheld games, cell-phone games, or smaller games in general, but after spending several weeks playing games on my iPad, I can say that the best of them provide as much, if not more, consistent engagement than their console brethren. In fact, a really fine iPad game offers an experience in which many of the impurities of console gaming are boiled away.


Games are based on problems to solve, not content. This doesn't mean that game-based problem-solving should eclipse learning content, but I think we are increasingly seeing that a critical part of being literate in the digital age means being able to solve problems through simulations and collaboration.   

Videogames, and the type of learning and thinking they generate, may serve as a cornerstone for education and economies of the future.

via pbs.org


Simply put, we can’t keep preparing students for a world that doesn’t exist. We can’t keep ignoring the formidable cognitive skills they’re developing on their own. And above all, we must stop disparaging digital prowess just because some of us over 40 don’t happen to possess it. An institutional grudge match with the young can sabotage an entire culture.


Everyone benefits from more education. No one benefits from an educational system that defines learning so narrowly that whole swaths of human intelligence, skill, talent, creativity, imagination, and accomplishment do not count.


Thesis Whisperer is part of a growing trend for PhD students to meet and support each other through social media as they pursue the long, demanding and often draining journey to a completed thesis.


At first glance, digital preservation seems to promise everything: nearly unlimited storage, ease of access and virtually no cost to making copies. But the practical lessons of digital preservation contradict the notion that bits are eternal. Consider those 5 1/4-inch floppies stockpiled in your basement. When you saved that unpublished manuscript on them, you figured it would be accessible forever. But when was the last time you saw a floppy drive?


On texting, videogames, and writing #dyr

Last year, 4.16 billion users made SMS the most popular data channel in the world. An estimated 6.1 trillion texts were sent, up from 1.8 trillion in 2007. And while the proportion of customers using SMS for more than simple messaging is still small, in poor nations these services are already changing the nature of commerce, crime, reporting news, political participation, and governing.


The subjects and themes and audiences of games should be no less of a concern than the contexts and purposes to which they are put. Not just adolescent fantasy and political activism, but everything in between.


research found that giving players the chance to adopt a new identity during the game and to act through that new identity – be it a different gender, a hero, a villain – made them feel better about themselves and less negative.

Looking at the players' emotions after play as well as their motivation to play, the study found the enjoyment element of the videogames seemed to be greater when there was the least overlap between someone's actual self and their ideal self.


Video games aren't science. They are not a mystery of the universe that can be explained away via testable predictions and experimentation. We need to stop looking for answers, whether those answers would come from a technical innovation whose arrival only renews obsession with the next breakthrough, or from the final exploitation of the true nature of our medium by means of a historical discovery so obvious that it will become indisputable. The answers lie not in the past or the future, but in the present.


We enter college hoping to learn effective communication skills—the kind of skills the recruiter in the Wall Street Journal article wished we possessed. But the joke is on us: the professors from whom we seek guidance themselves don’t know good prose from porridge.

When we attend college, we throw our impressionable young minds headlong into this bog of “scholars” such as Parsons; headlong into this asylum in which esteemed academic journals will publish gibberish if one uses the right buzzwords; headlong into this funhouse in which a computer program can generate random meaningless prose that reads passably like the stuff assigned in most graduate and undergraduate humanities classes. And from within this stylistic cesspool, we hope to get an education in how to write good prose.


"Video game artificial intelligence is a fascinating merger between programming and artistic deception." #wymhm

An academic trying to simulate a human brain has a massive supercomputer devoted to the task of thinking, while an AI programmer for a video game is instead working with a small percentage of processor power. The majority of the computing power in games goes instead toward depicting graphics, sound, physics, and cowbell-type things. For example, the AI in Halo 3 is technically less sophisticated than in Halo 2 because most of the processor has to be devoted to graphics. A giant, open-world game will inherently have stupider AI because there just isn’t enough power to go around. So the art of video game AI is in making a player think that they’re interacting with something more sophisticated than it really is.
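That tradeoff is easy to see in a toy sketch. The Python below is purely illustrative (it comes from no actual engine, and every name in it is invented): a crude state machine whose apparent awareness is manufactured from a randomized reaction delay and a per-frame thinking budget rather than from real computation.

    import random

    # Illustrative sketch: a "time-sliced" enemy AI. The illusion of
    # sophistication comes from reaction delays and infrequent
    # re-planning, not from heavy computation.

    class Enemy:
        def __init__(self, name):
            self.name = name
            self.state = "patrol"   # patrol -> chase -> attack
            self.cooldown = 0       # frames until this enemy may "think" again

        def update(self, player_visible, player_close):
            # Most frames the enemy does nothing smart at all;
            # it just coasts on its last decision.
            if self.cooldown > 0:
                self.cooldown -= 1
                return self.state
            # Re-plan rarely, with a random delay, so the enemy seems
            # to notice the player rather than tracking perfectly.
            self.cooldown = random.randint(10, 30)
            if player_close:
                self.state = "attack"
            elif player_visible:
                self.state = "chase"
            else:
                self.state = "patrol"
            return self.state

    # With a fixed AI budget per frame, only a few enemies get to think
    # each frame; the rest keep acting on stale decisions.
    def run_ai(enemies, frame, budget=2):
        start = (frame * budget) % len(enemies)
        for i in range(budget):
            enemy = enemies[(start + i) % len(enemies)]
            # Stand-in perception checks, randomized for the demo.
            enemy.update(player_visible=random.random() < 0.5,
                         player_close=random.random() < 0.2)

Players tend to read the cooldown as deliberation: the enemy seems to hesitate, then pursue. Almost no processor time is spent, which is exactly the deception the quote describes.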

"Games work with a new and different palette" #wymhm

It's architectural, it’s spatial, it’s procedural, it’s aural, it’s happening on dozens of different physical platforms. We don't have ways to talk about the aesthetic experience of mastering a system of rules, the growing understanding of how that system treats, rewards, or punishes you, where its peaks and valleys are. We don’t know what it means to have art you can win or lose. (I’ve run workshops where we try and adapt Romeo and Juliet to interactive media: What happens to tragedy when you can win?)

We’re not even talking about a stable target. The medium is still the site of constant experimentation — technological, formal, aesthetic ideas emerge every year. And compared to the mainstream games that take two or three years to make, the worlds of hobbyist, indie, and academic game development are on fast-forward, and more ambitious developers are constantly watching and learning from them. We know we can do better. If we're honest with ourselves, BioShock isn't good enough. Grand Theft Auto IV isn't good enough.

"Taking video games seriously is not unlike taking television seriously." #wymhm

Anyone who has played one of the new generation of video games and not taken some pleasure in doing so is either a liar or not a human being (alien, skin job, etc.). We play these video games because they are fun, and fun does not easily break down into units of greater analysis. Fun is fun. Exploring new worlds containing creatures strange and hilarious is fun. Hunting zombies is fun. The feeling of speed and adventure created by a good game is fun. I remember the sheer sense of exhilaration playing one of the first Sonic the Hedgehog games. In the real world, I would never move so fast and with such abandon. I remember the genuine, giddy fear that shot through me the first time I was manhandled by a zombie in Resident Evil.

Because videogames are experiential and fun, we have writing about videogames that has almost exclusive focus on experience and fun. This is fine as a starting point, but it shouldn't be the alpha and omega of how and why we talk and write about videogames. As I think we can see in Bissell's Extra Lives and Rossignol's This Gaming Life, such focus has a tendency to cloud, overlook or simplify what videogames are and do. There are exceptions, of course, but not overwhelmingly so. I'm also unsure about the successes of Bissell and Rossignol in terms of reaching a larger audience and/or convincing skeptics of why videogames matter.

Fun is not a good enough answer. Experiential writing is not a good enough answer. At least not for me.

"Ms. Pac-Man assumes her husband's phantoms, and Jr. Pac-Man in turn inherits the burden." #wymhm

Two hierarchies enmesh themselves in the routine of consuming pellets and hunting phantoms, producing a single myth. The first is familial, the patrilineal logic that tells us Pac-Man and his partner, whom we know only by her relation to Pac-Man, should bear a son in the image of themselves and in the image of modern France. The sign of Jr. Pac-Man communicates by way of exclusion that the future France will, too, descend from Charlemagne. Our magazines advertise that France abides "three colors, [but] one empire," while silently assuring the Gaulois bourgeoisie that the face of Pac will be the singular face of the multivalent state.

" video games are less of a discreet category of visual entertainment than they once were." #wymhm

More importantly, I highlight how many video games are now inspiring movies, music, books, and comics, including Prince of Persia, Max Payne, Resident Evil, Tomb Raider, Doom, Final Fantasy, Halo, and Gears of War. The characters and storylines in the books, comics, and movies based on these games often closely track the video games that inspired them. Increasingly, therefore, games are developed along parallel tracks with these other forms of content. Thus, to regulate games under the standard California proposes in this case raises the question of whether those other types of media should be regulated in a similar fashion. Should every iteration of the original game title be regulated under the standard California has suggested if those books, comics, or movies contain violent themes?