On writing, grammar, gamification, videogames, education, academia, and data preservation #dyr

I don’t know the origin of the “write what you know” logic. A lot of folks attribute it to Hemingway, but what I find is his having said this: “From all things that you know and all those you cannot know, you make something through your invention that is not a representation but a whole new thing truer than anything true and alive.” If this is the logic’s origin, then maybe what’s happened is akin to that old game called Telephone. In the game, one kid whispers a message to a second kid and then that kid whispers it to a third and so on, until the message circles the room and returns to the first kid. The message is always altered, minimized, and corrupted by translation. “Bill is smart to sit in the grass” becomes “Bill is a smart-ass.” A similar transmission problem undermines the logic of writing what you know and, ironically, Hemingway may have been arguing against it all along. The very act of committing an experience to the page is necessarily an act of reduction, and regardless of craft or skill, vision or voice, the result is a story beholden to and inevitably eclipsed by source material.

 

there is still no widely accepted gender-neutral pronoun. In part, that's because pronoun systems are slow to change, and when change comes, it is typically natural rather than engineered.

 

Game developers and players have critiqued gamification on the grounds that it gets games wrong, mistaking incidental properties like points and levels for primary features like interactions with behavioral complexity. That may be true, but truth doesn't matter for bullshitters. Indeed, the very point of gamification is to make the sale as easy as possible.

 

I have never been much for handheld games, cell-phone games, or smaller games in general, but after spending several weeks playing games on my iPad, I can say that the best of them provide as much consistent engagement as their console brethren, if not more. In fact, a really fine iPad game offers an experience in which many of the impurities of console gaming are boiled away.

 

Games are based on problems to solve, not content. This doesn't mean that game-based problem-solving should eclipse learning content, but I think we are increasingly seeing that a critical part of being literate in the digital age means being able to solve problems through simulations and collaboration.

Videogames, and the type of learning and thinking they generate, may serve as a cornerstone for education and economies of the future.

via pbs.org

 

Simply put, we can’t keep preparing students for a world that doesn’t exist. We can’t keep ignoring the formidable cognitive skills they’re developing on their own. And above all, we must stop disparaging digital prowess just because some of us over 40 don’t happen to possess it. An institutional grudge match with the young can sabotage an entire culture.

 

Everyone benefits from more education. No one benefits from an educational system that defines learning so narrowly that whole swaths of human intelligence, skill, talent, creativity, imagination, and accomplishment do not count.

 

Thesis Whisperer is part of a growing trend for PhD students to meet and support each other through social media as they pursue the long, demanding and often draining journey to a completed thesis.

 

At first glance, digital preservation seems to promise everything: nearly unlimited storage, ease of access and virtually no cost to make copies. But the practical lessons of digital preservation contradict the notion that bits are eternal. Consider those 5 1/4-inch floppies stockpiled in your basement. When you saved that unpublished manuscript on them, you figured it would be accessible forever. But when was the last time you saw a floppy drive?

 

On social media, terrorism, and academia #dyr

working online also pushes education beyond the confines of school, allowing kids to broaden discussion of their work. And it forces them to do "authentic" work that gets tested out in the real world, as outside viewers see it and respond to it.

 

Today's online experience is really the experience of being part of a gigantic crowd of people, said Jon Kleinberg, the Tisch University Professor of Computer Science at Cornell, in a lecture about what social media and other popular websites can teach us about ourselves, July 20 in Kennedy Hall.

When we go online, we do not just learn about an event, said Kleinberg...We also learn about the experiences, opinions and reactions of millions of people.

 

psychologists call it "deindividuation". It's what happens when social norms are withdrawn because identities are concealed...And it's why under the cover of an alias or an avatar on a website or a blog – surrounded by virtual strangers – conventionally restrained individuals might be moved to suggest a comedian should suffer all manner of violent torture because they don't like his jokes, or his face. Digital media allow almost unlimited opportunity for wilful deindividuation. They almost require it. The implications of those liberties, of the ubiquity of anonymity and the language of the crowd, are only beginning to be felt.

 

Don't talk, then, about the wildness in our rhetoric today, and its undeniable roots in that deep strain of political violence that runs through our national DNA, on a gene that is not always recessive. Don't relate Centennial Park in Atlanta in 1996 to Oklahoma City to murdered doctors to Columbine, and then to Tucson and to the bag on the bench in Spokane. Ignore the patterns, deep and wide, that connect each event to the other like a slow-burning fuse to a charge. That there are among us rage-hardened, powerless people who resort to the gun and the bomb. That there are powerful people who deplore the gun and the bomb, but who do not hesitate to profit from their use. And when the gun goes off or the bomb explodes, the powerful will deplore the actions of the powerless, and they will reassure the rest of us that We are not like Them, who are violent and crazy and whose acts have no reason beyond unfathomable madness. But above all, they will say, Ignore the fact that there is still a horrible utility in political violence, the way there was during Reconstruction, or during the labor wars of the early twentieth century. If there were not, it wouldn't be so hard to get an abortion in Kansas, and assault weapons would not have been accessories of choice at recent rallies purportedly held to discuss changes in the way the country organizes its health-care system.

 

Breivik wrote about different classifications of “traitors,” or individuals he felt could be killed during his imagined revolution. In his handbook, he suggested that revolutionaries consider attacking both “literature conferences and festivals” and “annual gatherings for journalists.”

 

Almost by definition, academics have gotten to where they are by playing a highly scripted game extremely well. That means understanding and following self-reinforcing rules for success.

 

Academic journals generally get their articles for nothing and may pay little to editors and peer reviewers. They sell to the very universities that provide that cheap labour.

 

I can only recommend graduate school in the humanities—and, increasingly, the social sciences and sciences—if you are independently wealthy, well-connected in the field you plan to enter (e.g., your mom is the president of an Ivy League university), or earning a credential to advance in a position you already hold.

 

"where wikis are jibing with the culture of academe, and where they are not" #wymhm

In most cases, using wikis to pool human knowledge of various topics into single, authoritative accounts falls into the “not” category. Academic culture abhors mass authorship. This is not only because many disciplines are given to disagreement and conflicting interpretations, but because scholars tend to chafe at the notion of not getting credit for their work, or having it fussed with by others.

“Literature reviews and summaries of articles are never going to be entirely objective; it would be difficult to write useful ones, for example, that conformed to Wikipedia's NPOV [neutral-point-of-view] requirements,” says Jason B. Jones, an associate professor of English at Central Connecticut State University.

Anyway, Jones says, the professoriate is too entrenched in traditional publishing to summon much interest in helping curate academic wikis.

"Students of all ability levels are studying less." #wymhm

Some question whether college students ever could have studied 24 hours a week — roughly three and a half hours a night. But even if you dispute the historical decline, there is still plenty of reason for concern over the state of 21st-century study practices. In survey after survey since 2000, college and high school students are alarmingly candid that they are simply not studying very much at all. Some longtime professors have noted the trend, which rarely gets mentioned by college admissions officials when prospective students visit campus.

But when it comes to “why,” the answers are less clear. The easy culprits — the allure of the Internet (Facebook!), the advent of new technologies (dude, what’s a card catalog?), and the changing demographics of college campuses — don’t appear to be driving the change, Babcock and Marks found. What might be causing it, they suggest, is the growing power of students and professors’ unwillingness to challenge them.

"a new, broader approach to tenure when considering public historians"

Public historians conduct history research and promote history in museums, parks, schools, nonprofit groups and elsewhere -- designing exhibits, overseeing archives and developing educational programs, all based on scholarship, but for a broader audience than scholars. As more history departments have created public history programs (in part because it is considered a growth area in which historians may find jobs), they have hired more public historians -- often creating tension over how to evaluate them.

To what degree are such public approaches available to academics in other fields and, perhaps more importantly, supported?