On composition pedagogy, the syllabus, Twitter, journalism, privacy, copyright, and videogames #dyr

it’s too easy to allow the classroom work associated with composition courses to focus on activities other than writing. I’ve been in many composition classes here and at other institutions where the students discuss readings and approaches and the teachers facilitate work and manage discussion and sometimes stand at the front of the classroom and show students things. Compositionists know and agree and emphasize that the work of the writing class is writing, and yet — in many classes — students simply don’t produce much text, largely because of the way we apportion the work of the course.

 

we may be a little too fond of limiting and certainty. These days syllabi are looking more and more like those Terms of Service that pop up when we use software...They are contracts that we can’t negotiate, and they contain provisions we might not agree to, if we understood what they actually meant. But the most striking thing about TOS is that they are full of rules – and very few people read them.

 

what are the teaching and learning practices of the networked classroom? No doubt there are people out there doing that work, and those of us who have taught in computer labs have related, relevant experiences. In both cases, it's a matter of turning the focal point away from the professor. Even in the class discussion format, among faculty committed to "decentering" the classroom, conversation generally runs through the professor, or at least the professor steers conversation through its iterations. As we have discovered, nothing decenters the classroom quite like a room full of laptops and smartphones, eh? The networked student is only partly in the classroom and partly distributed.

 

The reality of the Twitter effect isn’t just that President Obama has Twitter town halls now where he talks directly to American citizens, nor is it just that someone with no journalism background sitting in a house in Pakistan can report on a military raid that kills the world’s most notorious terrorist. It’s that journalism of all kinds has now become something you do, not something you are. Anyone can do it, whether they call themselves a journalist or not.

 

Politics presented as entertainment charges the press with a failure to treat the serious stuff seriously. And that is a valid critique. But here’s a trickier problem: even when the press is trying to be serious, to provide, say, “analysis” instead of a good yarn, it increasingly relies on an impoverished notion of politics, a cluster of bad ideas that together form the common sense of the craft.

 

The definition of privacy has been thrown out the window, and we have a new definition of privacy, which is whether we have control of what companies are doing with this information and if we have knowledge of how it’s being used.

 

you can’t motivate monopoly legislation based on your costs, when others are doing the same thing for much less — practically zero. There has never been as much music available as now, just because all of us love to create. It’s not something we do because of money, it’s because of who we are. We have always created.

 

Just like in the best zombie movies, the real drama in L4D lies in the relationships between the living, not the dead. The infected are just a pretext for collapsing the social order and forcing people to depend on one another to survive. It’s the ultimate online co-op experience, a game that requires not just headshot skills but communication, collaboration and confidence in your fellow player.

 

On piracy, nostalgia, book publishing, New Yorker cartoons, composition pedagogy, education reform, and the humanities #dyr

The film industry loses $6.1 billion annually to digital piracy, according to a study conducted by economist Stephen Siwek and cited recently by the Motion Picture Association of America (MPAA). And the Independent Film and Television Alliance (IFTA) says royalty rights for indie films have been halved from what they were five years ago. The John Doe lawsuits are a way for desperate movie studios and distributors to recoup those losses. Armed with a list of IP addresses and draconian copyright laws, lawyers for the scorned studios are treating a broad swath of the Internet-browsing public like their own personal ATM.

 

an undercurrent to grunge retrospection is the music media's and record industry's own nostalgia for the heyday of the rock monoculture. It was already crumbling in the early '90s, thanks to rap (the rebel music of black youth, obviously, but a lot of white kids had defected to hip-hop, too) and to the emergence of rave and electronic dance culture (in America destined always to be a minority subculture, but in Europe the dominant form of '90s pop). Grunge was the last blast of rock as a force at once central in popular culture yet also running counter to mainstream show biz values.

 

There is no way to limit the output of books. But the sense that there may be too many of them is a message to authors, agents, and publishers that they would do well to exercise judgment in choosing which books actually deserve to be written and supported. At the moment, however, the process is moving in the other direction: self-publishing as a business is booming, and Amazon, Apple, and Google, with their various devices and imprints, seem to be lowering the entry bar because these corporate behemoths see new publishing ventures as a source of revenue, pretty much regardless of quality.

 

If making graphic novels felt like a staid long-term relationship, then doing gag comics is like playing the field. One day I could draw a fortuneteller; the next, an astronaut. I went from sultans to superheroes, robots to rabbits. I felt liberated. I refused to get bogged down or fuss over the drawings. I spent no more than an hour with any one cartoon, and many took far less time than that. For the first two weeks I was feeling my oats. I already had a half-dozen keepers and was confident there were plenty more winners on the way. It was at this point that I started dreaming of actually selling a cartoon to The New Yorker.

 

When you are trying to teach writing, you are trying to teach something that, when it comes down to it, we don't know a lot about. Actually, that's not precisely true. We know a great deal about writing, if by writing one is referring to an abstract concept. There is a lot of scholarship that describes and theorizes writing, to say nothing of the scholarship about particular texts. But to understand writing, one would have to understand thinking. While there is a lot of interesting brain research going on, there's nothing that's going to tell you "follow these steps to come up with a good idea for your paper." Instead what we have are lots of techniques that sometimes work. Or, to quote Anchorman, "60% of the time, it works every time." The problem lies in mistaking techniques for empirical facts. There is no definitive "how to write." In short, the goal of the course is to help students become better writers, but there is no definition of "better," there is no clear, general writing practice, and there is no set body of knowledge to impart.

 

Real educational reform, as I see it, requires a fundamental shift in our understanding of the educational process...For starters, it requires that we abandon the idea that adults are in charge of children's learning.  It requires, in other words, that we throw out the basic premise that underlies our system of schooling. 

 

The humanities needs more courage and more contact with the world. It needs to extend the practice of humanism into that world, rather than to invite the world in for tea and talk of novels, only to pat itself on the collective back for having injected some small measure of abstract critical thinking into the otherwise empty puppets of industry. As far as indispensability goes, we are not meant to be superheroes nor wizards, but secret agents among the citizens, among the scrap metal, among the coriander, among the parking meters.

 

On curation, technology, kids today, librarians, blogging, and privacy #dyr

A big part of this new age of creation is that you have infinite choice, and no clear concept of where to start...So what’s the fix? You need a filter. And I strongly believe that while algorithmic filters work, you need people to tell you about things you wouldn’t find that way.
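The contrast is easy to make concrete. Below is a toy sketch of a purely algorithmic filter (my own illustration, with made-up items and tags, nothing from the quoted piece). It recommends by overlap with what you already liked, which is exactly why it can't surprise you the way a human recommender can:

```python
# Toy content-based filter: rank items by tag overlap with a user's history.
# Items, tags, and scoring are hypothetical, for illustration only.

user_history = {"indie-rock", "lo-fi", "shoegaze"}

catalog = {
    "Album A": {"indie-rock", "lo-fi"},
    "Album B": {"shoegaze", "dream-pop"},
    "Album C": {"free-jazz", "improv"},  # shares nothing with the history
}

def score(tags):
    """Jaccard similarity between an item's tags and the user's history."""
    return len(tags & user_history) / len(tags | user_history)

ranked = sorted(catalog, key=lambda item: score(catalog[item]), reverse=True)
print(ranked)  # ['Album A', 'Album B', 'Album C']
```

"Album C" scores zero no matter how good it is: an overlap-based filter can only extend your existing taste, and that gap is precisely what human curators fill.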

technology has never been cold, impersonal, and industrial. We simply chose to understand it that way. Technology has always had a role in shaping the inner life, the intimate life. The telephone - surely a shaping force in the making and shaping of self. The telegram, the letter, the book.

...

Nor was there anything cold about how industrial technologies such as cars and trains shaped our sensibilities, our sense of self, of our sensuality, our possibilities. If we have succumbed to an ideology of technological neutrality that is something that needs to be studied as an independent phenomenon; it is not to be taken as a given.

whatever the flavor of the month in terms of new technologies is, there’s research that comes out very quickly that shows how it causes our children to be asocial, distracted, bad in school, to have learning disorders, a whole litany of things.

Davidson's youth worship, though extreme, is common these days among those who write about technology and society. Individuals born after the dawn of the Internet are not the same as you and me, goes the now-familiar refrain. As a result of their lifelong immersion in electronic media, young people's brains are "wired differently," and they require different schools, different workplaces, and different social arrangements from the ones we have. They are described, with more than a little envy, as "digital natives," effortlessly at home in an electronic universe, while we adults are "digital immigrants," benighted arrivals from the Old World doomed to stutter in a foreign tongue.

students rarely ask librarians for help, even when they need it. The idea of a librarian as an academic expert who is available to talk about assignments and hold their hands through the research process is, in fact, foreign to most students. Those who even have the word “librarian” in their vocabularies often think library staff are only good for pointing to different sections of the stacks.

Facebook and Twitter are too easy. Keeping up a decent blog that people actually want to take the time to read, that’s much harder. And it’s the hard stuff that pays off in the end.

Besides, even if they’re very good at hiding the fact, over on Twitter and Facebook, it’s not your content, it’s their content.

The content on your blog, however, belongs to you, and you alone. People come to your online home, to hear what you have to say, not to hear what everybody else has to say. This sense of personal sovereignty is important.

I have always understood that these tech giants need to make money. Supporting tens of millions of users takes time and a whole lot of resources. While it’s in Google, Facebook, and LinkedIn’s interests to attract as many users as possible – and clearly free is the way – there are obvious consequences: Users get to play without paying, but every few months we get kicked in the face when our digital profiles get abused.

employers are increasingly aware of and keen to use the huge informational resource that social media serves up on a plate; all kinds of information are in the public domain, and incredibly easy to find – particularly if the applicant has an unusual name. As the chief executive of Social Intelligence has said, with something of a corporate shrug, "All we assemble is what's publicly available on the internet today". Nothing underhand going on here, they say; the company believes that the information is out there to be evaluated.

On rhetoric, Anonymous, bookstores, connectedness, videogames, digital natives, and slang #dyr

The reason that rhetoricians have never preponderantly been the primary sources that media go after is that we are just one of many competitors interpreting reality, and often we are looked at as purveyors of ‘mere rhetoric’...rhetoricians, although they are often aligned with the political zeitgeist of academia, must compete with other high-ethos social commentary sources which, again, have more credentials to be able to sort out reality: political scientists, historians, journalists, bloggers, etc. In fact, the fragmentation of prominent sources of rhetoric demands even more the approach to rhetoric argued in the ‘Myth’ piece. Imagine how increasingly irrelevant situationally-grounded rhetoricians’ depictions and interpretations of reality must seem to political principals, political professionals, and even average citizens.

part of Anonymous has over the last three years moved from disaggregated practices rooted in the culture of trolling to also become a rhizomatic and collective form of action catalyzed and moved forward by a series of world events and political interventions.

a small tribe of devoted book lovers with a business bent say that the economic setting has been right for small, highly personal ventures.

The lesson in the decline of big stores, these owners say, is not that no one wants to buy books. It’s that the big stores were too big. They had overreached and, in trying to be all things to all readers, had lost a sense of intimacy that books and reading seem to thrive on.

The Internet has had a dual effect on the level of connectedness I feel with the people I know in my offline life. On one hand, the basic communication tools now available make distance almost a non-issue...On the other hand, when I am actually with my friends and family, I find myself (and increasingly, my companions) distracted by a smartphone that’s either the object of my gaze or being fingered in my front pocket.

People have less time to play games than they did before. They have more options than ever. And they're more inclined to play quick-hit multiplayer modes, even at the expense of 100-hour epics.
via cnn.com

So Prensky was right the first time – there really is a digital native generation? No, certainly not – and that’s what’s important about this study. It shows that while those differences exist, they are not lined up on each side of any kind of well-defined discontinuity. The change is gradual, age group to age group. The researchers regard their results as confirming those who have doubted the existence of a coherent ‘net generation’.

There's no grand unified theory for why some slang terms live and others die. In fact, it's even worse than that: The very definition of slang is tenuous and clunky. Writing for the journal American Speech, Bethany Dumas and Jonathan Lighter argued in 1978 that slang must meet at least two of the following criteria: It lowers "the dignity of formal or serious speech or writing," it implies that the user is savvy (he knows what the word means, and knows people who know what it means), it sounds taboo in ordinary discourse (as in with adults or your superiors), and it replaces a conventional synonym. This characterization seems to open the door to words that most would not recognize as slang, including like in the quotative sense: "I was like … and he was like." It replaces a conventional synonym (said), and certainly lowers seriousness, but is probably better categorized as a tic.
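Dumas and Lighter's test is concrete enough to mechanize, which makes the quotative "like" borderline case easy to see. Here is a toy sketch, with the criteria and judgments transcribed from the passage above; the verdicts are illustrative, not authoritative:

```python
# Dumas & Lighter (1978): a term is slang if it meets at least 2 of 4 criteria.
# The True/False judgments for quotative "like" follow the passage above
# and are illustrative, not definitive.

CRITERIA = [
    "lowers the dignity of formal or serious speech or writing",
    "implies the user is savvy",
    "sounds taboo in ordinary discourse",
    "replaces a conventional synonym",
]

def is_slang(judgments):
    """Apply the two-of-four threshold."""
    return sum(judgments) >= 2

quotative_like = [
    True,   # "certainly lowers seriousness"
    False,  # implies no special in-group savvy
    False,  # not taboo around adults or superiors
    True,   # replaces the conventional "said"
]

print(is_slang(quotative_like))  # True, even though it is arguably just a tic
```

By the letter of the definition it qualifies, which is the point: the criteria open the door to words most people would not recognize as slang.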

On reading, writing, social media, surveillance, videogame violence, and genre #dyr

the human brain was never meant to read. Not text, not papyrus, not computer screens, not tablets. There are no genes or areas in the brain devoted uniquely to reading. Rather, our ability to read represents our brain's protean capacity to learn something outside our repertoire by creating new circuits that connect existing circuits in a different way. Indeed, every time we learn a new skill – whether knitting or playing the cello or using Facebook – that is what we are doing.

Touch typing allows us to write without thinking about how we are writing, freeing us to focus on what we are writing, on our ideas. Touch typing is an example of cognitive automaticity, the ability to do things without conscious attention or awareness. Automaticity takes a burden off our working memory, allowing us more space for higher-order thinking. (Other forms of cognitive automaticity include driving a car, riding a bike and reading—you're not sounding out the letters as you scan this post, right?) When we type without looking at the keys, we are multi-tasking, our brains free to focus on ideas without having to waste mental resources trying to find the quotation mark key. We can write at the speed of thought.

Facebook, Twitter, and other forms of social software are about consumption and production, about dialectic interaction on the read/write web. It’s no wonder short-form writing in sociotechnical networks is epistemologically productive, often leading to richer, longer-form writing work. Savvy writers might intentionally deploy sociotechnical notemaking as a powerful heuristic strategy for moving from short-form to long-form writing practices. Sociotechnical notemaking may therefore be defined as short-form writing work that is typically enacted informally via the enabling technologies of social software, with explicit heuristic, inventional, and epistemological implications.

before we give more attention to having students write briefly to fit their text-messaging sensibilities and the latest technologies, we should be more forceful about expecting and bringing their attention to accuracy and precision. Strunk and White, in their classic The Elements of Style, caution against predilection for brevity over precision in their 19th style reminder: “Do not take shortcuts at the cost of clarity.” I suspect most instructors would agree with this admonition, as I trust precision of thought and expression from our students is paramount for most of us.

Ideas don’t need the media any more than the media need ideas. They’ve relied on each other in the past, true enough — media as the gatekeepers, ideas as the floods — but the present media moment is characterized above all by the fact that ideas, Big and otherwise, can be amplified independently of traditional media filters. The public, online, is empowered to decide for itself which ideas are worthy of changing the world.

In their concern to stop not just mob violence but commercial crimes like piracy and file-sharing, Western politicians have proposed new tools for examining Web traffic and changes in the basic architecture of the Internet to simplify surveillance. What they fail to see is that such measures can also affect the fate of dissidents in places like China and Iran. Likewise, how European politicians handle online anonymity will influence the policies of sites like Facebook, which, in turn, will affect the political behavior of those who use social media in the Middle East.

Through two online surveys and four experimental studies, the researchers showed that people stayed glued to games mainly for the feelings of challenge and autonomy they experience while playing. Both seasoned video gamers and novices preferred games where they could conquer obstacles, feel effective, and have lots of choices about their strategies and actions.

These elements, said coauthor Richard Ryan, a motivational psychologist at the University, represent "the core reasons that people find games so entertaining and compelling. Conflict and war are a common and powerful context for providing these experiences, but it is the need satisfaction in the gameplay that matters more than the violent content itself."

There are no meaningful genres in games anymore. It’s a good thing that developers are pushing back borders and finding interesting ways to combine old mechanics, but as a consequence, there’s no way of separating works with huge and obvious disparities. There ought to be a way to categorize games in a meaningful, succinct way that doesn’t implicitly suggest a high art/low art dichotomy.

On writing, blogging, design, fake Twitter accounts, spoilers, and death on Facebook #dyr

research verifies that taking notes makes writing easier—as long as you don't look at them while you are writing the draft! Doing so causes a writer to jump into reviewing/evaluating mode instead of getting on with the business of getting words on the screen.

 

I have come up with a conceptual framework that explains what I believe to be the core elements--and the essential worth--of a blogging initiative, either within a course or across an entire program. I've built the framework out of three imperatives: "Narrate, Curate, Share." I believe these three imperatives underlie some of the most important aspects of an educated citizen's contributions to the human record. And in my experience, blogging offers a uniquely powerful way of becoming a self-aware learner in the process of making those contributions.

 

Breakthroughs in all fields—science and engineering, literature and art, music and history, law and medicine—all come about when people find fresh insights, new points of view and propagate them. There is no shortage of creative people in this world, people with great ideas that defy conventional wisdom. These people do not need to claim they have special modes of thinking, they just do what comes naturally to them: break the rules, go outside the existing paradigms, and think afresh. Yes, designers can be creative, but the point is that they are hardly unique.

 

One of my indulgences, however, is reading well-crafted tweets from satirical Tweeters who've taken on the persona of someone else. To do it right is like being a method actor: You have to get inside the head of a famous person but with a twist; the post has to be funny and insightful. It isn't easy and Twitter is littered with failures.

 

Subjects liked the literary, evocative stories least overall, but still preferred the spoiled versions over the unspoiled ones.

Why? The answers go beyond the scope of the study, but one possibility is perhaps the simplest one: that plot is overrated.

 

Nobody is resting in peace anymore. The suicide, the aneurysm, the overdose. Distilled into how they died because their [Facebook] pages are a persistent reminder they are dead, not of how they made me feel alive. I’d like to believe a legacy is in memories made, not the unintended irony of a last status update.

 

On writing, grammar, gamification, videogames, education, academia, and data preservation #dyr

I don’t know the origin of the “write what you know” logic. A lot of folks attribute it to Hemingway, but what I find is his having said this: “From all things that you know and all those you cannot know, you make something through your invention that is not a representation but a whole new thing truer than anything true and alive.” If this is the logic’s origin, then maybe what’s happened is akin to that old game called Telephone. In the game, one kid whispers a message to a second kid and then that kid whispers it to a third and so on, until the message circles the room and returns to the first kid. The message is always altered, minimized, and corrupted by translation. “Bill is smart to sit in the grass” becomes “Bill is a smart-ass.” A similar transmission problem undermines the logic of writing what you know and, ironically, Hemingway may have been arguing against it all along. The very act of committing an experience to the page is necessarily an act of reduction, and regardless of craft or skill, vision or voice, the result is a story beholden to and inevitably eclipsed by source material.

 

there is still no widely-accepted gender-neutral pronoun. In part, that’s because pronoun systems are slow to change, and when change comes, it is typically natural rather than engineered.

 

Game developers and players have critiqued gamification on the grounds that it gets games wrong, mistaking incidental properties like points and levels for primary features like interactions with behavioral complexity. That may be true, but truth doesn't matter for bullshitters. Indeed, the very point of gamification is to make the sale as easy as possible.

 

I have never been much for handheld games, cell-phone games, or smaller games in general, but after spending several weeks playing games on my iPad, I can say that the best of them provide as much, if not more, consistent engagement than their console brethren. In fact, a really fine iPad game offers an experience in which many of the impurities of console gaming are boiled away.

 

Games are based on problems to solve, not content. This doesn't mean that game-based problem-solving should eclipse learning content, but I think we are increasingly seeing that a critical part of being literate in the digital age means being able to solve problems through simulations and collaboration.   

Videogames, and the type of learning and thinking they generate, may serve as a cornerstone for education and economies of the future.

via pbs.org

 

Simply put, we can’t keep preparing students for a world that doesn’t exist. We can’t keep ignoring the formidable cognitive skills they’re developing on their own. And above all, we must stop disparaging digital prowess just because some of us over 40 don’t happen to possess it. An institutional grudge match with the young can sabotage an entire culture.

 

Everyone benefits from more education. No one benefits from an educational system that defines learning so narrowly that whole swaths of human intelligence, skill, talent, creativity, imagination, and accomplishment do not count.

 

Thesis Whisperer is part of a growing trend for PhD students to meet and support each other through social media as they pursue the long, demanding and often draining journey to a completed thesis.

 

At first glance, digital preservation seems to promise everything: nearly unlimited storage, ease of access and virtually no cost to making copies. But the practical lessons of digital preservation contradict the notion that bits are eternal. Consider those 5 1/4-inch floppies stockpiled in your basement. When you saved that unpublished manuscript on them, you figured it would be accessible forever. But when was the last time you saw a floppy drive?

 

On texting, videogames, and writing #dyr

Last year, 4.16 billion users made SMS the most popular data channel in the world. An estimated 6.1 trillion texts were sent, up from 1.8 trillion in 2007. And while the proportion of customers using SMS for more than simple messaging is still small, in poor nations these services are already changing the nature of commerce, crime, reporting news, political participation, and governing.

 

The subjects and themes and audiences of games should be no less of a concern than the contexts and purposes to which they are put. Not just adolescent fantasy and political activism, but everything in between.

 

research found that giving players the chance to adopt a new identity during the game and act through that new identity – be it a different gender, hero, villain – made them feel better about themselves and less negative.

Looking at the players' emotion after play as well as their motivation to play, the study found the enjoyment element of the videogames seemed to be greater when there was the least overlap between someone's actual self and their ideal self.

 

Video games aren't science. They are not a mystery of the universe that can be explained away via testable predictions and experimentation. We need to stop looking for answers, whether those answers would come from a technical innovation whose arrival only renews obsession with the next breakthrough, or from the final exploitation of the true nature of our medium by means of a historical discovery so obvious that it will become indisputable. The answers lie not in the past or the future, but in the present.

 

We enter college hoping to learn effective communication skills—the kind of skills the recruiter in the Wall Street Journal article wished we possessed. But the joke is on us: the professors from whom we seek guidance themselves don’t know good prose from porridge.

When we attend college, we throw our impressionable young minds headlong into this bog of “scholars” such as Parsons; headlong into this asylum in which esteemed academic journals will publish gibberish if one uses the right buzzwords; headlong into this funhouse in which a computer program can generate random meaningless prose that reads passably like the stuff assigned in most graduate and undergraduate humanities classes. And from within this stylistic cesspool, we hope to get an education in how to write good prose.

 

On blogging, social media, self-publishing, and teaching reading and writing #dyr

What does that mean: Blogging is writing without a safety net?

This means that you are on your own. Your work is all yours, and it rises or falls on its own merits. Nobody is fact-checking you before you hit “Publish” (though many commenters will afterwards), and nobody has your back after you publish – you are alone in defending your work against the critics. If you are good and trusted, you may have a community of bloggers or commenters who will support you, but there is no guarantee.

You can see, from the above paragraph, that there are two senses of “blogging is writing without a safety net”. One concerns pre-publication – there is no editor to check your work. The other concerns post-publication – nobody protects you.

 

While the blogs have exposed wrongdoers and broken news before, this week’s performance may signal the arrival of weibos as a social force to be reckoned with, even in the face of government efforts to rein in the Internet’s influence.

The government censors assigned to monitor public opinion have let most, though hardly all, of the weibo posts stream onto the Web unimpeded. But many experts say they are riding a tiger. For the very nature of weibo posts, which spread faster than censors can react, puts weibos beyond easy control. And their mushrooming popularity makes controlling them a delicate matter.

 

We assume that Facebook is something we should associate with the young, but my evidence suggests that this is entirely mistaken.

If there is one obvious constituency for whom Facebook is absolutely the right technology, it is the elderly. It allows them to keep closely involved in the lives of people they care about when, for one reason or another, face-to-face contact becomes difficult... Its origins are with the young but the elderly are its future.

 

Twitter/Facebook/G+ are secondary media. They are a means to connect in crisis situations and to quickly disseminate rapidly evolving information. They are also great for staying connected with others on similar interests (Stanley Cup, Olympics). Social media is good for event-based activities. But terrible when people try to make it do more – such as, for example, nonsensically proclaiming that a hashtag is a movement. The substance needs to exist somewhere else (an academic profile, journal articles, blogs, online courses).

 

There are many reasons potential authors want to publish their own books, Mr. Weiss said. They have an idea or manuscript they have passed around to various agents and publishers with no luck; they may just want to print a few copies of, say, a memoir for family members; they want to use it in their business as a type of calling card; or they actually want to sell a lot of books and make their living as writers.

 

In a hyper–abundant book world, where previous patterns of discovery may not work as well as they used to, readers are developing, and increasingly will need to develop, new ways of discovering titles that might interest them. Marketing and discovery are moving to the forefront of book marketplace activity, and social networks are adding new ideas and opportunities to the stable of traditional ways to bring books to the attention of potential readers.

 

The academic study of literature is a wonderful thing, and not just because it has paid my salary for most of my adult life, but it is not an unmixed blessing, and teachers will rarely find it possible simply to inculcate the practices of deeply attentive reading.

Over the past 150 years, it has become increasingly difficult to extricate reading from academic expectations; but I believe that such extrication is necessary. Education is and should be primarily about intellectual navigation, about—I scruple not to say it—skimming well, and reading carefully for information in order to upload content. Slow and patient reading, by contrast, properly belongs to our leisure hours.

 

email has such obvious promise as a tool for writing, and sharing writing, and teaching writing. It takes words and it sends them anywhere right away. If in 1976 you wanted to see a student's work in progress, you needed a printer and an appointment. The student had to take notes while you talked, walk home, remember what exactly you said, and work up a new draft. If he came to another impasse he'd probably keep it to himself -- nobody is going to office hours five times in three days. (Nobody is holding office hours five times in three days.)

Today each of these transactions -- copy, paste, send; receive, annotate, reply -- might take a few minutes. Emails can be composed and consumed anywhere, privately, quietly, at one's convenience. It is the free ubiquitous highway for words. It is exactly the tool you'd invent if you were a teacher of writing who wanted a better way to teach people to write.