Anyone who has read The Clutter Museum for a while knows I’m not a Luddite. I like to play with technology, and I encourage my students to be curious about digital media, and particularly about how they might use it to build thoughtful public history projects and programs.
However, there’s a constellation of higher ed “innovations” that has me worried. A couple of these innovations, taken alone, might not be cause for concern, but because they’re emerging at the same moment, they’re troubling.
First, there’s the university’s adoption of minimum viable product development strategies, and all the tech-marketing rhetoric and thinking such strategies seem to require.
Second, there are MOOCs, the massive open online courses being peddled by universities and start-ups alike. (If you’re unfamiliar with the phenomenon, Jonathan Rees consistently writes the hardest-hitting posts about both the academic labor implications of MOOCs and their (utter lack of) impact on student learning.)
Third, there are badges, alternative forms of assessment that circumvent traditional academic accreditation.
Fourth, we have the New University of California, where there are no classes—only high-stakes exams.
Fifth, we have companies that students can hire to take tests, write assignments, or even complete entire classes on their behalf. Students don’t even have to take the courses for which they’re “earning” credit.
Finally, we have automated essay-grading software from EdX. Faculty no longer need to grade the “work” of the “students” “enrolled” in their “classes.”
Anyone want to call the tech-induced time of death on faculty governance and authentic student learning?
[Update: Jonathan Rees has already called it, and he points out faculty autonomy and student learning aren’t the only casualties.]