"We don't yet know what the body can do."
It's hard to describe the disappointment I felt while reading Adam Briggle's "Technology Changes 'Cheating'", after Steve Fuller, whom I count among the most formative figures in my intellectual development, had described it as "one of the smartest things on academic cheating in a long time" on Twitter yesterday. But, then again, it's always hard to interpret a tweet. I guess Steve might have meant it sarcastically. Perhaps Adam Briggle himself is just being ironic; perhaps the piece is a pastiche of empty-headed techno-hype carping its diem. Maybe Steve just meant it as a comment on the generally poor quality of discussion in the debate. But I think we are justified in taking it straight, at least for the sake of argument. So what, you may be wondering, do I think Briggle gets so wrong?
In a word, everything. He begins by observing, uncontroversially, that "students are now using cell phones and other devices to text answers to one another or look them up online during tests" and points out, quite rightly, that "these sorts of practices are widely condemned as cheating". (The Harvard cheating scandal, whose very existence Farhad Manjoo denied in a not quite, but almost, as ill-conceived piece, is of course somewhere in the background here.) But he then attempts to challenge this conventional wisdom (that if you use your cell phone in a closed-book exam you are cheating) by suggesting that similar rules don't apply in the workplace. "Imagine a doctor or a journalist punished for using their smart phone to double-check some information," he balks.
Well, okay. Let's imagine a doctor who is unable to even begin to diagnose your condition without first Googling your symptoms. Or a journalist who can't engage in a lively panel discussion without the help of his smartphone. And do note that journalists do get fired for not attributing their work to their rightful sources, and doctors are in fact expected to do some things, like take your blood pressure or remove your brain tumor, relying heavily on skills they carry around with them in their bodies, and which they were presumably asked to demonstrate to their examiners before they were granted their degrees and licenses.
But to point this out, of course, only exposes how hopelessly backward I am. This is the pre-emptive point of Briggle's familiar techno-fantastic boilerplate that now follows. It might well get him invited to a TED conference one day, but we must really stop letting this sort of thing serve as a universal game changer:
The human used to be that creature whose brain occupies the roughly 1,600 cubic centimeters inside of the skull. Call that humanity 1.0. But now we have humanity 2.0. Cognition extends beyond the borders of the skull.
Humans are nebulous and dispersed: avatars, Facebook profiles, YouTube accounts and Google docs. These cloud selves have the entire history of human knowledge available to them instantaneously and ubiquitously. Soon we will be wearing the Internet, and then it will be implanted in our bodies. We are building a world for humans 2.0 while our education system is still training humans 1.0.
Welcome to the twenty-first century! The truth is, and has always been, that humans are creatures whose brains occupy roughly 1,600 cubic centimeters of a body that also comprises muscles, bones, nerves, skin, a heart, guts .... We are building a world, if we are, for disembodied brains, not "humans 2.0".
That's the whole problem with Briggle's enthusiasm for the new technologies and social practices. It's not a new kind of "human" environment, it's simply an inhuman one. We have to learn how to face this environment resolutely, not simply be assimilated by it. Human bodies don't improve by way of implants, but by training and discipline. We don't need to "upgrade" them, we just need to learn how to use them, as Spinoza noted in his Ethics.
Last year Arum and Roksa made a good case for something many of us had long suspected. Social study practices (and Humans 2.0 are hyper-social, if they're anything) don't make students smarter. They may, of course, improve their ability to work in groups, and therefore should not be entirely done away with. But what actually improves critical thinking and analytical reasoning are the old-school, individual arts of reading and writing. And the only way to test whether students can do these things is to get them to close their books (after having read them carefully), unplug, and tell us what they think, on their own. We've never trusted a student who couldn't talk about a book without having it open on their desk. Why, then, should we grant that their constant gestures at their touch screens count as a demonstration of intelligence? Note that Briggle is not just for socially-mediated examination: he is against old-school conceptions of cheating. He wants to do away with traditional values, not just augment them with new technologies.
This, like I say, displays a desire to do away with the body itself: to jack the (disembodied) brain ("wetware") directly into the cloud of human knowledge. But it doesn't take long to realize how utterly empty Briggle's image of "cloud selves" who "have the entire history of human knowledge available to them instantaneously and ubiquitously" ultimately is. After all, our stone-age forebears had "the entire world of nature's bounty", you know, "at their fingertips". The problem was just sorting the dangerous things in the environment from the useful ones, and getting to where the good stuff was, while also finding places to weather a storm. That problem remains even if everything that's ever been written can be found by Google (something we're far from even today). Let's keep in mind that the vast majority of the "cognitive" capacity of the "collective mind" (a.k.a. the Internet) is devoted to circulating spam and porn. We can "instantaneously and ubiquitously" jump into that sewer. But that's only where the fun begins.
Briggle closes the piece with something that looks like an insight. "The ultimate problem is not with students but our assessment regime that values 'being right' over 'being intelligent'. This is because it is far easier to count 'right' answers than it is to judge intelligent character." This has been known to progressive educators for a long, long time, of course. And Briggle doesn't realize that all the technologies he's so hyped about letting students use presume precisely that the point is finding the right answer, not being able to think. That's all we can test if we don't count their use as cheating.
His protestations to the contrary notwithstanding, Briggle really does value the right answer over the exercise of real intelligence. How else could he suggest that "it doesn't matter how you get there"? Yes, life's the destination now, not the journey. In fact, we've already arrived. There's nothing more to learn except what "the entire history of human knowledge" (which is so readily available to everyone!) has already brought us.
The brain doesn’t obey the boundaries of the skull, so why do students need to cram knowledge into their heads? All they need in their local wetware are the instructions for accessing the extended mind. This is not cheating, it is simply the reality of being plugged into the hive mind. Indeed, why waste valuable mental space on information stored in the hive?
Right! Of course! Why hadn't we thought of that sooner? We should have abandoned painting after we invented photography. And stopped teaching anyone to play the violin after we had recorded the first thousand concertos! And I guess all these doping scandals in the Tour de France are also going to be a thing of the past—when it becomes a motorcycle race? What absolute nonsense.
Someone has to tell Professor Briggle (and it may as well be me) that nobody is asking anyone to cram knowledge into their heads. Students are being asked to train their minds, which, because mind and body are part of the same being, is to say they are being asked to train their bodies: to be able to hold their own in an ongoing conversation ("of mankind", if you will) with other knowledgeable peers. To read and write coherent prose paragraphs. That's a distinctly "academic" virtue, to be sure. But surely there is room for academics in the new era?
Ben Lerner once proposed that poetry is part of the "struggle against what Chuck D has called the ‘dumbassification’ of American culture, against the deadening of intellects upon which our empire depends". I don't think it is too much to ask our philosophers to help out, rather than promoting the spread of these deadening social media as some kind of glorious new stage of human evolution. Philosophers could, at the very least, resist, rather than celebrate, the dumbassification of academia.