Audrey Watters has noted the essentially apocalyptic flavor of what she describes as “the myth and the millennialism of disruptive innovation” – mythic in the sense that it prophesies “the destruction of the old and the ascension of the new” and constitutes a narrative that “has been widely accepted as unassailably true.” When applied to education, disruptive innovation promises nothing less than “the end of school as we know it.” In “A Future with Only Ten Universities” (referring to the Udacity founder Sebastian Thrun’s vision of fifty years from now, as expressed in a Wired magazine profile), Watters laid out what might lie beyond “the end”:
To get to 10 universities, higher education as we know it today will have to be “unbundled.” Someone other than universities will have to provide the services that extend beyond “content delivery.” Private companies will run football. Private companies will run tutoring. Private companies will run research. Private companies will run assessment. Private companies will run student housing, student daycare, student mental health services. Private companies will help guide students through their career paths. Thanks to big data, the University of Walmart and the University of Google will be particularly adept at this.
See also, “The Undoing of Disruption”:
…a new paper, the most extensive test yet of Christensen’s theory, may prove more difficult to dismiss [than Jill Lepore]. Andrew A. King, a professor at the Dartmouth College business school, and Baljir Baatartogtokh, a graduate student at the University of British Columbia, spent two years digging into disruption, interviewing scores of experts, trying to determine whether 77 of Christensen’s own examples conformed to his theory, studies involving big names like Ford, McDonald’s, and Google, along with lesser-known makers of blood-glucose meters and blended plastics. Only a tiny minority — 9 percent — fit Christensen’s criteria. Disruption is real but rare, King and Baatartogtokh conclude, which suggests that it’s at best a marginally useful explanation of how innovation happens.
Artificial intelligence, automated text analysis and creation, and the future of work
Tasks that would seem to require a distinctively human capacity for nuance are increasingly assigned to algorithms, like the ones currently being introduced to grade essays on college exams. Particularly terrifying to me, computer programs can now write clear, publishable articles, and, as Ford reports, Wired magazine quotes an expert’s prediction that within about a decade 90 percent of news articles will be computer-generated.
Workplace monitoring – The Spy Who Fired Me
Pearson described the Work Diary as “the equivalent of being able to walk up to someone’s desk and see how they’re doing.” But it is much more than that. Once every ten minutes while you’re logged in, the program takes a snapshot of your computer’s desktop. It’s a detailed image that shows, for example, all the tabs open on your Web browser. The program also records minute-by-minute keystroke and mouse data, along with a productivity rating. The exact timing of the snapshot is unpredictable. It could happen at the moment you open iTunes to start a new playlist. Or when your boyfriend sends you an instant message. An icon pops up on your screen whenever a screenshot is captured, and you can review them and delete any troubling images. “The application is not a surveillance system,” oDesk’s online Help Center says. “You have full control over what it records . . . deleting those [screenshots] you choose not to share with your client.” But the Help Center fails to note that for each screenshot you delete, you sacrifice ten minutes of guaranteed pay.
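The mechanics Kaplan describes reduce to two simple rules: one snapshot at an unpredictable moment within each ten-minute window, and ten minutes of guaranteed pay forfeited for every screenshot the worker deletes. A minimal sketch of that logic, purely for illustration — the function names and structure here are my own assumptions, not oDesk's actual code:

```python
import random

WINDOW_SECONDS = 600  # one screenshot per ten-minute window


def schedule_snapshot(window_start: float, rng=random) -> float:
    """Pick an unpredictable moment within the ten-minute window.

    The worker cannot know when it will fire -- it could be the
    moment iTunes opens or an instant message arrives.
    """
    return window_start + rng.uniform(0, WINDOW_SECONDS)


def guaranteed_pay(hours_logged: float, hourly_rate: float,
                   deleted_screenshots: int) -> float:
    """Each deleted screenshot forfeits its ten-minute block of pay."""
    forfeited_hours = deleted_screenshots * (WINDOW_SECONDS / 3600)
    return max(0.0, (hours_logged - forfeited_hours) * hourly_rate)
```

The second function is the detail the Help Center omits: "full control" over the record comes at a direct wage cost, e.g. deleting three screenshots out of an eight-hour day removes half an hour from the paid total.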