On curation, technology, kids today, librarians, blogging, and privacy #dyr

A big part of this new age of creation is that you have infinite choice, and no clear concept of where to start...So what’s the fix? You need a filter. And I strongly believe that while algorithmic filters work, you need people to tell you about things you wouldn’t find that way.

technology has never been cold, impersonal, and industrial. We simply chose to understand it that way. Technology has always had a role in shaping the inner life, the intimate life. The telephone - surely a shaping force in the making and shaping of self. The telegram, the letter, the book.

...

Nor was there anything cold about how industrial technologies such as cars and trains shaped our sensibilities, our sense of self, of our sensuality, our possibilities. If we have succumbed to an ideology of technological neutrality that is something that needs to be studied as an independent phenomenon; it is not to be taken as a given.

whatever the flavor of the month in terms of new technologies is, there’s research that comes out very quickly that shows how it causes our children to be asocial, distracted, bad in school, to have learning disorders, a whole litany of things.

Davidson's youth worship, though extreme, is common these days among those who write about technology and society. Individuals born after the dawn of the Internet are not the same as you and me, goes the now-familiar refrain. As a result of their lifelong immersion in electronic media, young people's brains are "wired differently," and they require different schools, different workplaces, and different social arrangements from the ones we have. They are described, with more than a little envy, as "digital natives," effortlessly at home in an electronic universe, while we adults are "digital immigrants," benighted arrivals from the Old World doomed to stutter in a foreign tongue.

students rarely ask librarians for help, even when they need it. The idea of a librarian as an academic expert who is available to talk about assignments and hold their hands through the research process is, in fact, foreign to most students. Those who even have the word “librarian” in their vocabularies often think library staff are only good for pointing to different sections of the stacks.

Facebook and Twitter are too easy. Keeping up a decent blog that people actually want to take the time to read, that’s much harder. And it’s the hard stuff that pays off in the end.

Besides, even if they’re very good at hiding the fact, over on Twitter and Facebook, it’s not your content, it’s their content.

The content on your blog, however, belongs to you, and you alone. People come to your online home, to hear what you have to say, not to hear what everybody else has to say. This sense of personal sovereignty is important.

I have always understood that these tech giants need to make money. Supporting tens of millions of users takes time and a whole lot of resources. While it’s in Google, Facebook, and LinkedIn’s interests to attract as many users as possible – and clearly free is the way – there are obvious consequences: Users get to play without paying, but every few months we get kicked in the face when our digital profiles get abused.

employers are increasingly aware of and keen to use the huge informational resource that social media serves up on a plate; all kinds of information is in the public domain, and incredibly easy to find – particularly if the applicant has an unusual name. As the chief executive of Social Intelligence has said, with something of a corporate shrug, "All we assemble is what's publicly available on the internet today". Nothing underhand going on here, they say; the company believes that the information is out there to be evaluated.

"The device came out of the box and my world was transformed." #wymhm

The first thing that happened was that New York fell away around me. It disappeared. Poof. The city I had tried to set to the page in three novels and counting, the hideously outmoded boulevardier aspect of noticing societal change in the gray asphalt prism of Manhattan’s eye, noticing how the clothes are draping the leg this season, how backsides are getting smaller above 59th Street and larger east of the Bowery, how the singsong of the city is turning slightly less Albanian on this corner and slightly more Fujianese on this one — all of it, finished. Now, an arrow threads its way up my colorful screen. The taco I hunger for is 1.3 miles away, 32 minutes of walking or 14 minutes if I manage to catch the F train. I follow the arrow taco-ward, staring at my iPhone the way I once glanced at humanity, with interest and anticipation. In my techno-fugue state I nearly knock down toddlers and the elderly, even as the strange fiction and even stranger reality of New York, from the world of Bartleby forward, tries to reassert itself in the form of an old man in a soiled guayabera proudly, openly defecating on Grand Street. But sorry, viejo, you’re not global enough to hold my attention.

"Books were good at developing a contemplative mind. Screens encourage more utilitarian thinking." #wymhm

A new idea or unfamiliar fact will provoke a reflex to do something: to research the term, to query your screen “friends” for their opinions, to find alternative views, to create a bookmark, to interact with or tweet the thing rather than simply contemplate it. Book reading strengthened our analytical skills, encouraging us to pursue an observation all the way down to the footnote. Screen reading encourages rapid pattern-making, associating this idea with another, equipping us to deal with the thousands of new thoughts expressed every day. The screen rewards, and nurtures, thinking in real time. We review a movie while we watch it, we come up with an obscure fact in the middle of an argument, we read the owner’s manual of a gadget we spy in a store before we purchase it rather than after we get home and discover that it can’t do what we need it to do.

"that’s going to be the digital divide...the ability to deal with information" #wymhm

The assumption that today’s students are computer-literate because they are “digital natives” is a pernicious one, Zvacek said. “Our students are task-specific tech savvy: they know how to do many things,” she said. “What we need is for them to be tech-skeptical.”

Zvacek was careful to make clear that by tech-skeptical, she did not mean tech-negative. The skepticism she advocates is not a knee-jerk aversion to new technology tools, but rather the critical capacity to glean the implications, and limitations, of technologies as they emerge and become woven into the students’ lives. In a campus environment, that means knowing why not to trust Google to turn up the best sources for a research paper in its top returns, or appreciating the implications of surrendering personal data -- including the propensities of one’s bladder -- to third parties on the Web.

"The more you to do with robots the more you realise just how good humans are" #wymhm

"They should be able to do more," says Joseph Engelberger (pictured above), the founding force behind industrial robots and considered the father of the modern robotics industry. "We need multitasking robots that can think for themselves and do something useful. Working robots have to be something more than this," he says, referring to the impracticality of most robots, at least as far as the media's opinion goes.

"computers seem to have further separated children in low-income households" #wymhm

The Duke paper reports that the negative effect on test scores was not universal, but was largely confined to lower-income households, in which, the authors hypothesized, parental supervision might be spottier, giving students greater opportunity to use the computer for entertainment unrelated to homework and reducing the amount of time spent studying.

The North Carolina study suggests the disconcerting possibility that home computers and Internet access have such a negative effect only on some groups and end up widening achievement gaps between socioeconomic groups. The expansion of broadband service was associated with a pronounced drop in test scores for black students in both reading and math, but no effect on the math scores and little on the reading scores of other students.

"the pace of innovation is such that these machines should begin to learn as they teach" #wymhm

the most advanced models are fully autonomous, guided by artificial intelligence software like motion tracking and speech recognition, which can make them just engaging enough to rival humans at some teaching tasks.

Researchers say the pace of innovation is such that these machines should begin to learn as they teach, becoming the sort of infinitely patient, highly informed instructors that would be effective in subjects like foreign language or in repetitive therapies used to treat developmental problems like autism.

Related to a recent tweet by @stevendkrause, a thought bundle on reading books and e-books, print and screen

we want the fruits of our labor to exist between hard or even soft covers in our own time and after us (and accept that the pages containing our being will turn brown and become brittle), it means something to us to see and speak of a book as a weighty tome or a slender volume, we like to be able to locate a passage we've already read spatially on a page, we are interested, even as we are dismayed, to discover that we are the first person in 61 years, eight months, and three days (according to the "due date" slip) to check a book out of the library, it pleases us to think of Whitman's leaves of grass as pages of a book
via tnr.com

For me, reading is a physical experience, one that vanishes, evaporates completely, the minute you read something on a screen. Books also have an architectural dimension. Rooms full of books are meaningful places where people assemble. And yet, one of the things that defines reading is its very intimacy—which is what I love about it.

So real books and e-books will coexist. That has happened time and again with other new technologies that were prophesied to kill off old ones. Autos didn't wipe out horses. Movies didn't finish theater. TV didn't destroy movies. E-books won't destroy paper and ink. The Internet and e-books may set back print media for a while, and they may claim a larger audience in the end. But a lot of people who care about reading will want the feel, the smell, the warmth, the deeper intellectual, emotional, and spiritual involvement of print.

"The internet has quietly infiltrated our lives, and yet we seem to be remarkably unreflective about it." #wymhm

We're living through a radical transformation of our communications environment. Since we don't have the benefit of hindsight, we don't really know where it's taking us. And one thing we've learned from the history of communications technology is that people tend to overestimate the short-term impact of new technologies — and to underestimate their long-term implications.

We see this all around us at the moment, as would-be savants, commentators, writers, consultants and visionaries tout their personal interpretations of what the internet means for business, publishing, retailing, education, politics and the future of civilisation as we know it. Often, these interpretations are compressed into vivid slogans, memes or aphorisms: information "wants to be free"; the "long tail" is the future of retailing; "Facebook just seized control of the internet", and so on. These kinds of slogans are really just short-term extrapolations from yesterday's or today's experience.

Though it definitely qualifies as "tl;dr," it's worth a read for the clarifying perspective and reflection on what the internet is, does, and could/will be.

"Soon there will be no reason to have a big, boxy computer on your desk" #wymhm

In spite of their name, desktop PCs often have several users. Laptops, netbooks, and tablets are usually single-user machines—that is, they really are personal. Modern mobile operating systems are built with room enough for one—Apple's iOS and Google's Android are both tied in to a single user's e-mail, calendar, and app-purchasing accounts. Forrester's numbers also suggest that in the future we'll have many such machines around the house. Your "main" computer will be a laptop—and you'll probably have several smaller, tablet-type machines that you use regularly as well.

My laptop is my "main" computer, my only computer, at least until my current wireless carrier contract ends and I'm able to land a Google Android. Then I'll have two single-user machines, three and four if the PS3 and Xbox 360 qualify, five if I dare include my Moleskine journal.