It bewilders me to visit big law firms that clearly spent top dollar on their websites and their Aeron chairs but that still use Times for their correspondence and internal documents. It equally bewilders me to visit small firms that, despite not having to go through twenty layers of approval, are still using Times.
Did you make your business cards and letterhead on a photocopier at Kinko’s? No, you didn’t, because you didn’t want them to look shoddy and cheap. If you cared enough to avoid Kinko’s, then you care enough to stop using Times.
Even those of us who write for publication can conclude, once we have clarified certain thoughts, that these thoughts are not especially valuable, or are not entirely convincing, or perhaps are simply not thoughts we want to share with others, at least not now. For many of us who love the act of writing—even when we are writing against a deadline with an editor waiting for the copy—there is something monastic about the process, a confrontation with one’s thoughts that has a value apart from the proximity or even perhaps the desirability of any other reader. I believe that most writing worth reading is the product, at least to some degree, of this extraordinarily intimate confrontation between the disorderly impressions in the writer’s mind and the more or less orderly procession of words that the writer manages to produce on the page.
Hundreds of students worldwide apply to snare one of 80 available spots in a separate 10-week “graduate” course that costs $25,000. Chief executives, inventors, doctors and investors jockey for admission to the more intimate, nine-day courses called executive programs.
Both courses include face time with leading thinkers in the areas of nanotechnology, artificial intelligence, energy, biotech, robotics and computing.
On a more millennialist and provocative note, the Singularity also offers a modern-day, quasi-religious answer to the Fountain of Youth by affirming the notion that, yes indeed, humans — or at least something derived from them — can have it all.
They are showing that Top 100 lists can be gamed and that entertaining content can reach mass popularity without having any commercial intentions (regardless of whether or not someone decided to commercialize it on the other side). Their antics force people to think about status and power and they encourage folks to laugh at anything that takes itself too seriously. The mindset is deeply familiar to me and it doesn’t surprise me when I learn that old hacker types get a warm fuzzy feeling thinking about 4chan even if trolls and griefers annoy the hell out of them. In a mediated environment where marketers are taking over, there’s something subversively entertaining about betting on the anarchist subculture.
Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of “you are what you eat.” As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.
Three days ago, I posted a gloss of Bill Wolff's Deliverator at #cw2010, attempting to reinforce some of the ideas he put forth. I focused on his notion of expanding "composition" to include bookmarking and tagging (among other online actions). I also carried this into how I've reorganized my offline research library. Brian McNely was kind enough to acknowledge the post, mentioning its connection to his and Derek Mueller's #cw2010 talks. McNely's talk is on his blog, so I'll let him speak for himself: "Mentorship and Professionalization in Networked Publics."
One of the recent links I shared on Twitter, which Alan Benson retweeted, was a piece in the Atlantic about how disorganized terrorists often are and the importance of emphasizing this fact. "Can being more realistic about who our foes actually are help us stop the truly dangerous ones?" the authors ask. It's a piece worth reading, hence the sharing via Twitter (and here later tonight as part of #wymhm), but I mention it in this moment for another reason, one (I think) related to Bill Wolff's Deliverator and Brian McNely's talk at #cw2010.
I maintain four print subscriptions: the Atlantic, Paste, Lansing State Journal and Wired. I read each issue in its entirety within 24 hours of its arrival. The aforementioned article is in the most recent issue of the Atlantic, and it was in print that I first encountered it. Most Atlantic pieces appear online soon after the print publication is out in circulation, so I went there to find it and subsequently share it. Is it safe to assume that those seeing my tweet about this Atlantic piece figured I found it online first? I think an argument for that could be made, but it would be incorrect.
I mention all this here in the interest of that prevailing visibility, and because I have an increasing interest in how we come to share information online, and even in marking the paths that lead to such sharing.
One of the biggest challenges of not being at a university that has an existing OCW infrastructure is that you can't rely on the strength of said infrastructure to publicise your courses. This means that you are going to have to do a lot of the legwork yourself if you want to get the word out about your OCW materials. Social media is obviously good for this. Every time you put up new material (or an entirely new course), send out the link (and a brief description) to colleagues using your social network platform of choice (for me, it's Twitter). That way, your professional colleagues will always know what you are up to (and will pass on the link to their colleagues). You might also consider going "old school." Every time you publish new course material, send out a quick email to your department chair, campus colleagues, and dean. These are the people, after all, who might have a significant amount of influence over your career.
While it may require more personal effort from faculty, the reward is a unique opportunity to create a new model for publishing academic learning content that avoids the mistakes of the old system. Faculty can learn from their librarian colleagues, whose past experience in managing scholarly communication offers a lesson in how not to structure a publishing model.
Here’s where things went wrong. For years faculty gave away (and in many cases continue to give away) their scholarly works to publishers who would edit and repackage the content and then sell it back, at hard-to-justify prices, to the same academic institutions that had produced it. Few faculty paid attention as the institution’s librarians financed the system by paying exorbitant journal subscription fees. Give away the intellectual capital of the institution; buy it back at unsustainable prices. An oversimplification, but that’s roughly how we got into this mess.
While professors across the tenure spectrum scored within the average range for “emotional exhaustion,” tenure-track faculty had the highest score at 22.3, edging toward the “high degree of burnout” designation. In contrast, those not on the tenure track had the lowest score at 16.4, and tenured faculty were in the middle at 20.9.
Surveyed faculty also fell within the scale’s average range for burnout when assessed for “depersonalization,” a category marked by heightened cynicism and a tendency to abandon tasks. Notably, however, tenure-track faculty had the most heightened levels of depersonalization as well, coming in on the high end of the scale’s overall average at 6.8. As with the “emotional exhaustion” category, non-tenure-track faculty scored the lowest, at 5.2, and tenured professors were in the middle at 6.6.
While copyright holders assert that copyright violators are “stealing” their “property,” people everywhere are remixing and recreating artistic works for the very same reasons the Glee kids do — to learn about themselves, to become better musicians, to build relationships with friends, and to pay homage to the artists who came before them. Glee’s protagonists — and the writers who created them — see so little wrong with this behavior that the word ‘copyright’ is never even uttered.