Friday, September 18, 2009

Complexity and Reboots

[Photo: one of two stacks of books on my nightstand. The thin volume on top is on simplicity; under it, a thick textbook on complexity theory.]

Complexity is one of those really profound ideas that is surprisingly easy to access. The basic idea is that the complexity of some set of data is the size of its smallest description. So a one followed by a million zeros is not very complex, because you can write it as 10^1000000. Or as 10^(10^6), which shortens it by one character. This idea wouldn't work if all methods of description weren't somehow interchangeable. We're really talking about computer-code-like descriptions, and the argument goes like this: any computer language with certain basic functionality can be used to write an interpreter for any other such language. So we can switch between one language and another by paying the cost of that translation, and the crucial point is that this cost is fixed: it's the size of the interpreter, which doesn't depend on the data being described. It is in this sense that descriptions are language-invariant, up to a constant, and so there is something like an absolute measure of complexity in this formal sense.
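To make "size of the smallest description" concrete, here's a toy sketch in Python (my choice of language; the formal theory doesn't depend on it). The raw string is over a million characters long, but a program that produces it fits in a couple dozen:

    # Toy illustration of description length (an informal stand-in for
    # Kolmogorov complexity, not a real measure of it).

    data = "1" + "0" * 10**6              # a one followed by a million zeros
    program = 'print("1" + "0" * 10**6)'  # a short program that produces it

    print(len(data))     # 1000001 -- the raw data
    print(len(program))  # 24      -- the description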

It's a powerful idea, even in mundane affairs. I'm a simplicity hawk when it comes to creating new bureaucracy or designing a web interface, for example. Most things end up with more bells and whistles than are good for them. And the more complex a system is, the more unexpected its behavior can be. This is probably why evolution hasn't invested a lot of "effort" in making organisms live forever.

Ever wonder about that? Why do our bodies suffer senescence? Why didn't we evolve robust repair mechanisms so that we could go on reproducing ad infinitum? Think of all the biological effort it takes to start over with a baby and grow a new specimen capable of reproduction. There must be some good reason.

It's like a reboot. A computer, unless its software is perfectly stable, tends to accumulate problems in its state the longer it runs. As I'm sure you've experienced, this can cause the thing to blue-screen and freeze. Rebooting eliminates the accumulated complexity. Same thing with biological reproduction, I assume. I imagine there's a limiting factor that works like this: any self-repair mechanism incurs a complexity cost. That is, it's easier to build a system that can't self-repair than one that can. So you have to add complexity in order to reduce it. At some point, it's self-defeating to add more repair ability, because the repair can't recover its own cost. I have no idea if this is true, but it's an amusing theory.
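Just to check that this theory is at least coherent, here's a toy model with entirely made-up functional forms (my assumptions, not anything from biology): say repair machinery of size r costs r units of complexity and divides accumulated damage d by (1 + r). The total burden falls at first, bottoms out, and then rises again, so past a certain point more repair really does cost more than it saves:

    # Toy model of "repair can't recover its own cost."
    # Assumed (purely for illustration): repair machinery of size r costs
    # r units of complexity and divides accumulated damage d by (1 + r).

    def total_burden(r, d=100.0):
        """Repair machinery's complexity plus the damage it fails to fix."""
        return r + d / (1 + r)

    for r in [0, 2, 5, 9, 15, 30]:
        print(f"repair={r:2d}  total burden={total_burden(r):6.2f}")

    # The burden drops to a minimum near r = sqrt(d) - 1 = 9, then climbs
    # again: beyond that point, extra repair costs more than it saves.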

What does this have to do with higher education? More generally, it has a lot to do with self-governance. Faculty senates, state and federal legislative bodies, and so forth are run by systems of rules, at least in theory. Over time these accumulate complexity as exceptions are carved out and new conditions added: think of the tax code. I'd hypothesize that building a reboot procedure into such systems would be a good thing.

Consider a faculty senate that, in a moment of singular passion, changes its bylaws so that all future motions can be carried only with every voting member present and voting in favor. This is tantamount to self-destruction. At the other extreme, rules and committees could proliferate to such a degree that processes slow to a crawl. I almost wrote "glacial speeds," but the ice mountains are moving faster these days than most committees I observe.
