Monday, November 04, 2019

A Cynical Argument for the Liberal Arts, Part Seventeen

Previously: Part Zero...Part Sixteen

Note: I pulled this one from the draft folder. It is the most recent article in the sequence linked above. The theme was inspired by The Cynic Enlightenment.

The Cynics were not theorists, but the disruptive attack on knowledge represented by the charge "debase the coin of the realm" has epistemological implications that I want to develop here. Diogenes would be horrified, no doubt.

By interpreting "coin of the realm" as trusted signalling within a system of information processing, we have a general starting point. I will just assume that all perception is a signal of this sort, and that all belief comes from perception. If I look at a counterfeit coin and believe it is real, this "fact" becomes part of my reality. Subsequently, if someone tells me the coin is fake, this is an assault on my reality (my understanding of the world), which is generally painful and to be avoided. As a reminder, little-c cynical attacks are those that verbally impugn a signal, and big-C Cynical attacks are those that spoof it (faking perceptions).

So far we have:
  1. All perceptions are signals that can be faked
  2. Beliefs are constructed from perceptions
  3. "False" beliefs are undesirable
It's the last one that has the most interesting consequences, because we need a non-circular definition of 'false belief'. Bear in mind that the 'beliefs' in question can be within an individual person or within any other system that relies on perception. These are generally the "realms" in which Cynics debase coins.

Belief is transitory, constantly competing with new perceptions for consistency. But because changing a favorable belief can be painful, we resist it. I assume this is for good evolutionary reasons. Similarly, we embrace changing an unfavorable belief. So we hope for good news and worry about bad news. Despite this, systems that endure have "sticky beliefs" that we informally call knowledge. For convenience, then,
Knowledge is belief that hasn't changed for a while.
In practice, knowledge may have to contend with perceptions that contradict it, but these have to reach a certain level before the knowledge becomes outdated and must be replaced. For example, I used to "know" the quickest way to drive to work, from my home in Charlotte to the university. But for a period of a week or so there was a road construction project that routed additional traffic down my "fastest route," slowing me down considerably. This counterexample to my knowledge about the best route was dealt with as an exception, not by overturning the knowledge itself.

So knowledge rests on a stream of perceptions congruent enough with the belief in question that they cannot overcome the inertia of continued belief. Counterexamples will be dealt with as exceptions for a while, but if there are enough of them, or if being wrong becomes painful enough, then the knowledge becomes historical and is replaced with something new.
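To make that inertia concrete, here is a toy sketch in Python. Everything in it (the Belief class, the exception_threshold parameter, the observe method) is my own illustrative invention rather than anything from the essay or its sources; it just encodes the idea that counterexamples are booked as exceptions until enough of them pile up to retire the old knowledge.

  # Toy model of "sticky" belief: counterexamples are logged as exceptions
  # until they cross a threshold, at which point the old knowledge is
  # retired and replaced. All names here are illustrative assumptions.
  class Belief:
      def __init__(self, claim, exception_threshold=5):
          self.claim = claim                        # e.g., "Route A is fastest"
          self.exceptions = 0                       # contradicting perceptions so far
          self.exception_threshold = exception_threshold

      def observe(self, supports_claim, replacement=None):
          """Feed one perception into the system; return the surviving claim."""
          if supports_claim:
              # Congruent perception: ongoing verification; old exceptions fade.
              self.exceptions = max(0, self.exceptions - 1)
          else:
              # Counterexample: treat it as an exception, not a refutation...
              self.exceptions += 1
              # ...until the inertia is finally overcome.
              if self.exceptions > self.exception_threshold and replacement:
                  self.claim = replacement
                  self.exceptions = 0
          return self.claim

  # A week of road construction, in miniature:
  route = Belief("Route A is fastest", exception_threshold=5)
  for day in range(7):
      route.observe(supports_claim=False, replacement="Route B is fastest")
  print(route.claim)   # the old knowledge is now historical: "Route B is fastest"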

Therefore at the heart of knowledge (as defined above) is an ongoing verification through perceptions. We agree with ourselves because of what we see. These ongoing verifications are the most trusted parts of our systems, and we mostly take them for granted. They are the targets of the most devastating Cynical attacks.

Why do perceptions persist? I propose two reasons. First, I assume that there is an external reality that's inaccessible to us except through perceptions. So the reliability of a cup falling when you drop it is due to Nature. This is unprovable, but it makes for an understandable theory. Second, some perceptions persist because of the nature of the system's internal rewards. For example, I know sugar tastes good. This is an artifact of my biological evolution, not a property of the universe. At the level of societies, all the conventions (like language and custom) we follow are fixed points in the evolution of the social system. The idea of a "dog" is not a fundamental part of physical reality, but is socially constructed.

The stream of perceptions coming into a system causes it to change constantly, but some knowledge is fixed or changes very slowly. I'm positing that these fixed points are due either to external reality (and its relationship to the perceptive apparatus) or to the way the system operates, including its reward system. For convenience, let me refer to these as "real" knowledge (meaning pertaining to reality) versus "system" knowledge, having to do with the way we process information as humans and societies, with their inherent motivations.

An interesting intersection between system and real knowledge is land surveying. All land in the US has been surveyed, and it's illegal to remove the markers that have been placed. Nevertheless, many of them are gone or unfindable. With increasingly precise equipment, the placement of new markers has a better claim to 'real' knowledge than the old ones, but the latter are retained for social reasons--we have enough reality, thank you very much. Imagine the chaos otherwise--property lines constantly being redrawn.

All knowledge is vulnerable to counterexamples, and the plucked-chicken-throwing Diogenes blazed the trail. His was also a public act, and these two elements--counterexample and publicity--are found in the beginnings of modern science. Boyle's experiments with the vacuum, which I read about in Bruno Latour's We Have Never Been Modern, model this Cynical approach. Latour describes the tension between social authority and empiricism as a possibly contradictory authority, which maps onto the distinction between what I'm calling system knowledge and real knowledge. But given that science takes place within society, what test can tell the difference?

The Cynical challenge to knowledge-as-coin-of-the-realm is to publicly debase its value by producing obvious counterexamples. For example, if I claim that "an unsupported object will fall to earth," the Cynic can point to flying birds as counterexamples (not plucked ones this time). I can accommodate this by making exceptions to the statement, or by refining it. The relationship between system and real knowledge can be troubled in different ways.

Real counter-example to system knowledge. If new physical evidence (like DNA) shows that a convicted murderer really couldn't have committed the crime, this is an empirical challenge to system knowledge that was established after a process (viz., a trial). There is no automatic sovereignty of real knowledge over system knowledge. In fact, the reverse is generally true. On the scale of an individual human, sugar and salt will still taste good, no matter how much science tells us that too much is bad for us.

System counter-example to real knowledge. Religious dietary restrictions are examples, providing a system-level "this behavior is bad for you" that doesn't correspond to perceived physical realities. My real knowledge might be that "pork chops are good to eat," but it is contradicted by a system prohibition that claims "pork chops are not good to eat." This may be physically enforced. Another example: an unethical scientist can fake results so that real knowledge is contradicted. In this case, it may not be obvious that the system is the culprit. Or it may be more subtle than fakery: for example, the long-lived Aristotelian prejudice that the planets must move in circles, which stood against Kepler's careful calculations.

Real counter-example to real knowledge. This is one way science proceeds: by making observations (via experiments) that contradict existing knowledge. For example, the experiments that showed that the speed of light in a vacuum is constant.

System counter-example to system knowledge. Quite dangerous sometimes. This territory belongs to Germany, says France. Doch! says Germany. People get hurt. Even within a single integrated system like, say, the legal system, contradictions arise. One might moot that the whole idea of a trial is the resolution of contradictory realities. When laws change, realities change.
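For reference, the four cases above can be laid out as a small lookup, keyed by (source of counterexample, target knowledge). The examples are the ones from the text; the Python structure and its key names are mine, offered only as a reading aid.

  # The 2x2 grid of counterexamples, keyed by (source of counterexample, target knowledge).
  counterexamples = {
      ("real",   "system"): "DNA evidence overturning a verdict reached at trial",
      ("system", "real"):   "a dietary prohibition (or faked results) overriding perception",
      ("real",   "real"):   "experiments showing the speed of light in a vacuum is constant",
      ("system", "system"): "two legal or national realities contradicting each other",
  }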

So What?

What does any of this have to do with college? 

For starters, students arrive with a full supply of reality. Being young, they are probably immersed in it in ways I can't remember. This is their real knowledge of the world. We, in turn, try to inculcate them with system knowledge: the methods and theories of a field of study.

In one of my undergraduate math classes, there was a guy who would always try to find a problem with the setup of some proof underway on the board. His favorite trick was to think of some edge condition and point it out: "This doesn't seem to work if S is the empty set," he'd say, and the long-suffering professor would stop what he was doing, walk back to the front of the proof, where the assumptions were listed, and--loudly with the chalk--bang out that S was assumed not to be empty.

In retrospect, that was a Cynical act, if a pretty minor one. The 1960s were long gone by that point, and the tectonic collisions of realities had gone with them. The only relic left was Faner Hall, a building with the single laudable trait of not being burnable down.

Maybe we should teach students how to be Cynics. How to look for the cracks in reality--theirs and ours--that always exist. It can be dangerous to do that in the real world, and they arguably need a place to practice. More anon. In the meantime you might be interested in this lurid example in the New Yorker: A cybersecurity firm's sharp rise and stunning collapse. It's a tale of creating new realities and selling them. Like in Hollywood, only more illegal.

