See also: [Part Zero] [Part One] [Part Two] [Part Three] [Part Four] [Part Five] [Part Six] [Part Seven] [Part Eight]
The awareness we have of the world is mediated through signals from sensory organs and the meaning we make of them. As a practical matter, the information we receive has to be compressed in order to make sense of it. For example, we take in more than a million bytes per second through vision alone, and formulating cause/effect hypotheses about the world without compression would be practically impossible. Right now there is a fork lying on the table to my right, but the tines are hidden by a bag of dried fruit. That sentence comprises a hundred or so bytes of information, yet it admits many possible visualizations (decompression)--picking one makes it concrete enough to build a narrative from. This is only possible because of very high data compression.
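To make the scale of that compression concrete, here is a minimal back-of-the-envelope sketch in Python. The visual data rate is the rough order-of-magnitude figure assumed above, not a measurement, and the "description" is just the fork sentence itself.

```python
# Back-of-the-envelope illustration (figures are assumptions, not measurements):
# compare one second of the assumed visual data rate to the size of the
# verbal description of the fork-behind-the-fruit-bag scene.

visual_bytes_per_second = 1_000_000  # assumed order-of-magnitude rate from the text

description = ("Right now there is a fork lying on the table to my right, "
               "but the tines are hidden by a bag of dried fruit.")

description_bytes = len(description.encode("utf-8"))
ratio = visual_bytes_per_second / description_bytes

print(f"Description: {description_bytes} bytes")
print(f"One second of vision: {visual_bytes_per_second:,} bytes")
print(f"Effective compression ratio: roughly {ratio:,.0f} : 1")
```

Even with generous assumptions, the ratio comes out on the order of ten thousand to one, which is the point: the narrative sentence is a drastic, lossy compression of the raw sensory stream.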
We also have simplified internal signals. We can be "hungry for red beans and rice," but when our stomach grumbles, it just signals a generic need to be indulged. Pain certainly comes in flavors too, but an itch on the back is pretty similar to an itch on the leg--the most important information (an itch, a burn, a bug crawling up your neck) is signaled efficiently. By contrast, imagine if you were presented with a full account at the cellular level of all the relevant activity and had to sort through it all for meaning.
Perhaps one of the fundamental attributes of being human is the ability to recognize perceptual signals on this meta level (as abstractions, in other words) and then manipulate them. New ones can be created, for example by slipping small magnets under the skin to directly feel electrical/magnetic flux, or by developing a taste for Scotch whisky. More familiar is the interdiction of signals, as with pain medication. A more fanciful idea is described in NYmag's "Is It Possible to Create an Anti-Love Drug?".
Two heirs to classical Cynicism, the Stoics and Epicureans, addressed internal signals. For example, ideas about the nature of grief and what to do about it are described in "How (And Maybe Why) To Grieve Like an Ancient Philosopher". The signals-based viewpoint also leads directly to the idea that death is not something to be feared, because it is simply an absence of signals. Contrast this with religions that recommend optimizing actions in life so as to produce the attractive signals in the "afterlife."
We can think of internal signals as "coins of the realm" and proceed to debase them. Drug addiction is one way to do that, but meditation, counseling, and meta-cognition can also subvert our out-of-the-box internal signals. Traditional liberal arts curricula explore this idea from many angles, even if it's not usually packaged that way. For example, the tension between intuition and rational thought (signals of what's real) is a topic in psychology (see, e.g., Daniel Kahneman's book Thinking, Fast and Slow). Add social, political, ethical, and biological signals: these are all explored from innumerable angles in the sciences and humanities. These perspectives--if taken seriously--can create in the learner a sophisticated meta-cognition that can be practically applied as an existentialist project. It goes like this: all signals are abstract by definition, which means there is a fundamental arbitrariness to them from the point of view of the receiver of the signal. Given the possibility of preferring some signals over others, we can imagine a project of internal engineering to attenuate or amplify signals according to our most demanding desires.
This is a caustic process, and fully as dangerous as any Cynical enterprise. If one strips away too much, tossing aside all social and moral guides, for example, one could become a sociopath (this resembles Marquis de Sade's Cynical project, as described in The Cynic Enlightenment, starting on page 106). Or strip all the signals away and you get nihilism or suicide. But, more positively, the ongoing process of constructing a personal ontology can produce a freedom of mind that was modeled by Diogenes.
Liberal arts curricula expose internal signals and ways of attacking them, through relativism, post-modern thought, critical theory, and simply the exposure to many ways of thinking, historical decisions, thought experiments, and so on. As with the academy in general, the approach is mostly theory and exposition rather than active mind-engineering. There is undoubtedly more colleges could do to enable self-subversion, but it would also be dangerous. I think there is some middle ground where we could operate in sandbox mode, so that students could gain some experience, and there are some experiences like this available. For example, an assignment to sleep on the street for a couple of nights or to practice asceticism in some form. My daughter's high school history teacher runs a weeks-long project that consists of secretly identifying students as 'communist' or 'capitalist' and prohibiting one side from communicating with the other. Students don't know which side they are on, and the teacher has spies everywhere--he shows them photos and social media screenshots of their interactions, and deducts points accordingly. This is Cynical in that it undermines normal discourse--designed to loosely model The Terror, I'm sure. The benefits to students potentially include reflection on the active management of feelings of unfairness or even fear. Anyone who can't see the applicability to a work environment isn't trying.
Beyond dramatic life changes, the internal freedom to attenuate and amplify signals has the potential to produce better workers too. How many of our new graduates are going to fall into their dream jobs right away? How many workplaces are unfair to employees, or have abusive bosses, mean co-workers, arbitrary rules, or demeaning requirements? What, exactly, in "jobs training" is supposed to prepare a young mind for these assaults? Wouldn't it be better if they'd read and internalized The Prince? Wouldn't it be better if they knew about Foucault and the evolution of ontology and power, and how signals are ultimately arbitrary and malleable, and constantly being subverted by those who can do so to further their own ends?
Well, no. That's probably not what the employer wants. Foxconn's replacement of humans with robots apparently involves collaboration with Google to design an appropriate operating system. This is, in effect, an attempt to specify in code what a perfect employee is. You can bet there won't be a subroutine named for Machiavelli or Diogenes. (Update: apparently Google's self-driving cars have never gotten a traffic citation.)
Next time: signals and subversion at work, or "Diogenes as assistant to the regional manager."
[Go to Part Ten]