
Person-Plus or Person-Solo

John Tenniel’s original (1865) illustration for Chapter IX of Lewis Carroll’s Alice in Wonderland. In this chapter the Gryphon and the Mock Turtle tell Alice about the “best of educations” they received at their “school in the sea,” the Gryphon remarking, “I went to the Classical master, though. He was an old crab, he was.” “I never went to him,” the Mock Turtle said with a sigh. “He taught Laughing and Grief, they used to say.”


The present tendency to destroy all tradition or render it
unconscious could interrupt the normal process of
development for several hundred years and substitute an
interlude of barbarism.
                    C. G. Jung, Aion

Willful blindness is the refusal to know something that
could be known…It’s refusal to admit error while
pursuing the plan.
                    Jordan Peterson, 12 Rules for Life

Among the many perils we face today—political, economic, social—there is one that is seldom remarked because it is so pervasively interior. By this I mean the danger to the self. The etymology of the word is pertinent in this regard. “Self” derives from Indo-European *selo < se, reflexive pronoun = “separate, apart,” whence Latin sibi, se + *(o)lo, pronoun suffix = itself, by itself.

The general sense seems to be that of something apart, uniform, consistent, unique, autonomous, i.e., the essential qualities of the person. This is the staple Western conception of the self, understood, in the words of anthropologist Clifford Geertz in Local Knowledge, as “a bounded, unique, more or less integrated motivational and cognitive universe, a dynamic center of awareness, emotion, judgment, and action organized into a distinctive whole.” A bit “heavy,” perhaps, as a definition, but plainly native to the Western cultural and philosophical heritage. The implication is conveyed by the very word “individual,” i.e., non-divisible.

The core-identity of the individual person—founded in conscience, self-awareness and memory—has been radically destabilized in our day. The cultural UV level (if we read UV as a siglum for something like Ultimate Vicariousness or Utter Virtuality) is alarmingly high. The real person, whose kernel of perishable subjectivity at one time at least theoretically provided the source and touchstone of meaningful experience, has been left unprotected in this age of supposedly millennial emancipations.


It is as if we have taken up residence in a parallel but phantom and decontextualized microworld characterized by pure events. These are, in effect, media events—video games, TV programs, dataflows, infomercials, VGA color charts, simulations, telecommutes, news “stories,” imaginary encounters—streaming ceaselessly by and yet temporally static, a flux of information always present to be activated and yet distressingly ethereal, the world of the dominant present. We may in fact be dealing with what naturalist Lyall Watson in Dark Nature called pathics, “a set towards a world-system in which order is disturbed by loss of place, disrupted by loss of balance, and diluted by the impoverishment of real connections and associations—deficits in quality, quantity, and relation which characterize the environment we are busy creating for ourselves.” The gulf between information processing and meaning production is vast.

In short, we have entered into the condition of pure participle, a flowing and perpetual “nowhere” which, in an anagrammatic prank, is always “now here”—and only here. It is a world we can “save” but never redeem, to which we can return but paradoxically never leave, a world of riotous and dedicated “users” who have succumbed to the Twentieth Century version of the Nineteenth Century disease, consumption, where it is the consumer who has been consumed, the user who has been “usenetted” into Lethean abeyance.

An important example of this tendency toward dispossession, this unhealthy inclination to insert a V-chip into the heart of the responsive self, may be found in the work of David Perkins, a director of the aptly named Harvard Project Zero, who claims in Smart Schools that much of our thinking is actually done by the cognitive environment and that what really counts is the “access characteristics of relevant knowledge”—what he denominates as the “person-plus.” It’s the education version of Barack Obama’s “You didn’t build that.”

Former US President Barack Obama
Photo credit: J. Scott Applewhite (AP)

In his Roanoke speech while running for re-election in 2012, Obama said: “Somebody invested in roads and bridges. If you’ve got a business—you didn’t build that. Somebody else made that happen.”

As David Henderson writes for the Foundation for Economic Education: “Why rehash this now? Because Cato policy analyst Derek Bonett has a particularly nice way of laying out what’s wrong with Obama’s thinking. He does it with his ‘Tale of Two Commuters.’ Here’s an excerpt:

‘Imagine two commuters living equidistant from a downtown city law firm. One is an attorney at the firm, the other is her secretary. Each drives to work, thereby obtaining some value from the use of public roads. Each, in turn, imposes a roughly equal amount of depreciation on those roads, the cost of which must be defrayed via taxes. But what about the value “built” by each of them once they reach their office?

The attorney will almost certainly command a far higher salary than will her secretary. Insofar as these salaries emerge from a competitive market for labor, they reflect, at least within an order of magnitude, the respective marginal products of these commuters’ labor. But, crucially, the attorney’s higher salary is not attributable to a greater consumption of public goods. She traversed the same roads on the way to work as did her secretary. The two of them rely on the same police and fire departments. They may have even attended the same local public K-12 schools. The attorney’s higher salary is instead attributable to her command over a set of skills and human capital, which are more scarce—and more valuable—on the market than are secretarial skills. The salary differential, and the difference in productivity it reflects, cannot be explained by differential public goods consumption. In each case, some degree of public goods and services may be a necessary complement to these employees’ labor, but they are not sufficient to explain their differential success in earning taxable income. In what way is society justified in expropriating a greater percentage of the attorney’s income because her labor is more productive and therefore commands a higher salary?’”

David Perkins claims that what is in the notebook, the concept map, the hard drive, the video cassette or CD, “whether the person-solo remembers it or not, is part of what the person-plus has learned.” The operative phrase here is “whether the person-solo remembers it or not.” (Perkins does not recognize that his proposal justifies cheating—the inside of a shirt cuff or a plagiarized internet source is also an “access characteristic” and part of the “person-plus.”) The problem is that the person-solo, irrespective of his relationships and technical appendages, is the person. As Bertolt Brecht wrote in The Mother, “What you don’t know yourself, you don’t know!” Or as Nassim Taleb points out in his new book, Skin in the Game, if you don’t have skin in the game, then you’re not in the game. For Perkins, the dispersal of knowledge through a cognitive field is intended to replace the operation of real memory. All one need remember is the code that gives access to the system, bypassing the self and replacing it with a sort of virtual ghost learner.

Consider, too, the premise of Lewis J. Perelman’s School’s Out, which confidently asserts that knowledge is “embedded in networks and smart tools, rather than in personal masters, as learning comes progressively to rely on the impersonal structures of the new information economies” (italics mine). The ideal of the autonomous individual has been supplanted by that of the corporate automaton, the cyberclone, the remote specialist, the purveyor of mere technique, in short, the modern sophist, wherever we may find him, whether in the political community or the discourse community—and especially in the education apparatus. (One can’t help thinking that our education pundits are about as fatuous as our political impresarios.) What Perelman, Perkins and others like them fail to consider is that in the circumstances they envision, education and intellectual growth would be stripped of the vitality and intimacy that only the “living text” in the custody of the “personal master” can ensure.

This is precisely the argument that the late president of the Czech Republic, Václav Havel, makes in his Summer Meditations, in which he stresses that “the basic component” in education must be “the human personalities of the teachers.” There is no other way to fulfill the purpose and historic mandate of the schools: to eschew the production of “idiot-specialists” and “to send out into life thoughtful people capable of thinking about the wider social, historical, and philosophical implications of their specialties.” But the “human personalities” of the teachers as well as those of the students fall by the wayside of the information highway on which the vehicle has altogether dispensed with the driver. It parks by itself, as it were. And runs over strolling pedestrians.

We cannot fix education simply by fixing education; we also need to address ourselves to the culture of which education is a part. The essential dilemma by which we are haunted and in which we remain embroiled is that we have become both morally and memorially absent. We have become netizens and denizens of an essentially contentless and dangerously vaporous society, mere entries in the encyclomedia. As agents pursuing one or another of our technological projects and innovations, we are superbly effective, clever and practical, at least in the short run. But as testamentary selves who query the past as well as answer to the future we are scarcely on the scene at all. We are eminently forgettable in our forgetfulness, suffering the eclipse of time, and critically deficient in that hypothetical substance that Jerome Bruner in The Culture of Education jokingly calls glueterium, a kind of self-and-time-binder that prevents our “world” from foundering in the short-term temporal range. 

For the self, as the locus of awareness and reflexivity, is constitutively non-distributable. Nevertheless, the individual person is increasingly regarded as an accessory portion of the larger cognitive system, a sort of epistemological collectivism that mirrors what is occurring in the social and political realms: the redistribution of resources that ultimately thins out the accumulated capital of a given society, whether fiscal, cultural or individual, and the scourge of identity politics in which the individual is subsumed into the category and so deprived of personal agency.

Here it is important to understand that the notion of the “centered self” that I have premised has nothing to do with, and is in fact the complete antithesis of, the moral and epistemological solipsism fostered by identity-politics and “social justice” ideology—the “feeling” individual. As Michael Rectenwald, author of Springtime for Snowflakes and the forthcoming The Google Archipelago, has persuasively argued: “Social justice holds that membership in a subordinated identity group accords members exclusive access to particular knowledge, their own knowledge, their own reality”—which happens to be impenetrable to others and treats of a fictitious or imaginary world. As with the technological prepossession, it is really a form of unselving. The moral narcissist who is his only truth is the epistemic shadow of Perkins’ tenuous and insubstantial subject.


This is the calamity we are confronting now: the dismantling of the central, stable, unique, and perduring self which, as I argue in Lying about the Wolf, cannot rightly be construed as an instrumental applet regulating external performance or as some kind of clipper chip or amplifier or cognitive subsystem plugged into a composite abstraction. This deconstruction of the holistic self has become the decisive and supervening project, the hidden agenda, of the contemporary Zeitgeist. For deep down, if such a tropological stratum may still be said to exist, I suspect that no sane man or woman doubts that only the person-solo is real, that is, constitutively non-accessory and non-distributable, that it transacts with a real and objective rather than a virtual and fantasy world, and that the person-plus as conceived by Perkins et al. is just a euphemistic term for the person-minus.

It is precisely the person-solo, the inalienable individual, whose center in conviction, ardor, knowledge, and thought must be defended if we are not to become drones in the electronic hive, shapes blown about in the ethernet and dispersed in a featureless and anonymous realm without substance. We now have instantaneous access to one another but—apart from a number of valuable and revelatory entries in the encyclopedic library of the Internet, which must be fairly acknowledged—find that on the whole we have little to offer but “information,” that is, facts, recipes, instructions, addresses, blueprints, propaganda, “messages,” bulletins, images, advertisements, and cheaply-fabricated, styrofoam-light pseudo-narratives known as “chats.” We don’t engage with books; we engage with Facebook. We don’t debate; we tweet. We don’t think; we parrot.

This bastard notion of person-plussery, of a nomadic and distributable self progressively inhabiting the technological surround, is swiftly becoming the cultural norm. We see it happening in all sectors of communal life and perhaps most alarmingly in education where the emphasis is beginning to fall more and more on what is misleadingly called “hyperlearning”: that is, on telecommunication networks, groupware facilities, distance-delivery packages, satellite classrooms, and cybernetic mediation in the classroom itself.


Even science is not immune to such attenuations. In an article for American Affairs (Spring 2019, Volume III, Number 1), Edward Dougherty, Robert M. Kennedy ’26 Chair and Distinguished Professor of Electrical Engineering at Texas A&M University, puts the question squarely. “The ultimate toy, the computer with a graphical user interface,” allows for humongous amounts of data to be processed while requiring no scientific understanding—“but which make[s] one feel knowledgeable and provide[s] entertaining slide shows… We can compute much faster today, and this is important to science and engineering,” he continues, “but if the experimental process itself is allowed to atrophy as a consequence, then what is the payoff?” The computer is obviously a necessary tool; nevertheless, “computer game playing—data mining, vacuous simulations, and superficially dazzling visualizations—needs to be dropped in order for there to be a return to real science.” Or in short, a return to real learning. (See also Dougherty’s The Evolution of Scientific Knowledge, SPIE Press, Bellingham, Washington, 2016.) I regard the seductive powers of these new technologies, scrubbed of their functional context, as a form of dismemberment or disrememberment, as a devitalization rather than an exaltation of the authentic, learned and capable self.

I hope it is understood that I am not regressively advocating, like a latter-day Luddite, that we abandon our rampant new technologies and return to some imaginary state of pastoral felicity. Not only would the attempt be ill-advised and quixotic, these technologies are unstoppable in any case. The faint-hope clause I am lobbying for here is simply this: if enough people in sensitive or authoritative positions—parents, teachers, school administrators—can be brought to see the implications of the prosthetic technology they use, recommend, or actively promote, then some of its deleterious effects may be to some degree preventable. One of my more precocious students remarked in a critical essay on the relation between technology and education, “Technological advances really do screw up the world by making everything seem more easy and user-friendly. But learning is damn hard work. The problem today is how to unfuck this weird world and make it real again.”

The irony is, or should be, unmistakable. The idea of the true person-plus was always there as part of the educational mission inherent in the classical model of the Western academy, in which the untutored mind was to be trained in the fundamentals of language usage, the principles of civics, the study of history and at least a partial command of the monuments of the literary and cultural tradition. It was a system in which intellectual regimen was paramount and the formation of what used to be called the “rounded” individual was the goal it envisioned—that is, a system in which the Humanities lived up to their name. The idea of merit was the bedrock on which the entire edifice was founded. Specialized expertise and discipline-specific knowledge followed upon the acquisition of broad-based scholarship. Professional competency and what E. D. Hirsch called “cultural literacy” went hand in hand as elements of the institutional mandate.

The student “initiate” with his callow sensibility was to be gradually transformed through diligent study, rigorous testing and responsible teaching into the “person-plus” as a matter of course. More important by far than even the sympathetic application of advanced methods, techniques and devices was the strict maintenance of high standards. Indeed, the meritocratic paradigm is perennial—one recalls Juvenal’s Fourteenth Satire with its resonant motif that careless education, a falling away from basics, was ruining the country. Ultimately, one can say that the person-plus is nothing other than the mature person, the true “person-solo.”

Fictitious portrait of Juvenal (1837)
by S.H. Gimber

From Juvenal’s Satire XIV
(transl. Peter Green):

Here lies the root of most evil: no human passion provides / So frequent an incentive to mix up a dose of poison / Or slip a knife in the ribs as our unbridled craving / For limitless wealth. The man whose goal is a fortune / Wants it double quick; and how much respect for the law, what / Decent moral scruples can such a go-getter afford? / ‘Be content with a humble cottage, my boy: don’t look beyond / These hills of ours’ — that’s what old mountain peasants / Used to tell their sons. ‘The ploughshare should furnish men / With sufficient bread for their needs: the gods of the countryside / So ordained it, whose generous bounty brought us the blessing / Of wheat, and rescued us from our old, crude acorn diet. / The man who doesn’t disdain to wear kneeboots when it’s freezing, / And keeps off an east wind’s chill with sheepskins, fleece inside — / He’ll never turn out a bad hat: it’s these strange foreign / Luxuries, purple robes and the like, that lead to crime / And wickedness.’ Such were the maxims the ancients gave their children…

Of course, as noted above, the technological imperative is not the only disruptive force undermining the sovereignty of the individual mind. Under the mantle of the “diversity and inclusion” mantra, and through outright political indoctrination, free expression and the unfettered exchange of ideas have ceded the dais to a politically correct, left-wing camarilla that effectively prohibits conservative thinkers and patriots from speaking freely and engaging students in discussion. Its unspoken purpose is to graduate an army of student radicals and so-called “social justice warriors” subject to a vast propaganda machine and intent on social and cultural destabilization. Clearly, genuine scholarship must be disinterested, differing points of view should be presented and debated, strict research methods must be inculcated, and the mind needs to be trained to learn, judge, and think independently. Pedagogical influence is meant to be cognitive, not political.

Bias, obviously, is humanly inevitable, but the work of the moral conscience in the act of teaching, which monitors our prejudices and proclivities and keeps them under relative control, is now rapidly becoming a dead letter. The schools, to quote David Horowitz, have been turned into “indoctrination platforms for leftist agendas,” the result of the political pincer in the double-pronged attack on the thoughtful, curious and independent individual. Adopting Louis Althusser’s term, Michael Rectenwald (mentioned above) refers to academia as an ISA, or Ideological State Apparatus, whose ideological dominance of society is massively disproportionate under the rubric of “social justice.” Along with the mycelial stratum of IT junkies and StatArb quants, its graduates are the low-information voters, partisan pedants, liberal socialists, leftist ideologues, suborned journalists and entitlement parasites who from a plenary perspective might be denominated as persons-zero, echoing Perkins’ Project Zero. But this is an issue best left for another time. (I have written several books and dozens of articles for the American conservative media dealing at length with this subject. My concern here is chiefly with the incursion of technics into the educational process and with the urgent need for what is coming to be called “techlash.”)

I am aware that this technological prepossession may have incorporated itself as what we have learned to call a meme. According to Richard Dawkins’ The Selfish Gene, memes are ideas which function as the bearers or units of transmission of human culture—ideological genes, so to speak, which are largely immune to conscious intervention. Nonetheless the techno-meme must be resisted and humanized. Speculative as Dawkins’ concept of the presence or reflexive continuity of the meme may be, there is no replacement in the unfolding drama and complexity of human experience for the whole and conscious presence of the self to itself and to others, or at the very minimum for the sincere and passionate effort to approximate this elusive condition of intimacy and plenitude—in solitude, in dialogue, or even in the dialogic solitude of writing.

If we are not to surrender our ethical and intellectual patrimony as human beings, we must continue to believe that freedom—the ability to attain our best selves, to resist the tyranny of psychological as well as social, political and technological forces, and so to become the persons-plus, aka the persons-solo, we are meant to be—remains always possible, if improbable. If we shirk this responsibility, we will eventually find ourselves in the same diminished condition as Lewis Carroll’s Mock Turtle in Alice in Wonderland: “‘Once,’ said the Mock Turtle at last, with a deep sigh, ‘I was a real Turtle’.”


David Solway is a Canadian poet and essayist. His most recent volume of poetry, The Herb Garden, appeared in spring 2018 with Guernica Editions. A partly autobiographical prose manifesto, Reflections on Music, Poetry & Politics, was released by Shomron Press in spring 2016. A CD of his original songs, Blood Guitar and Other Tales, appeared in 2016, and a second CD, Partial to Cain, accompanied by his pianist wife Janice Fiamengo, appeared in June of this year. Solway continues to write for American political sites such as PJ Media, American Thinker and WorldNetDaily. His latest book, The News from Pluto, was recently accepted for publication by Black House in the U.K.
