Wednesday, May 06, 2009

Difficultie, er, Difficulty vs C0mp13xi7y

Sorry for the l337 garbage in the title. I've been absorbing the ideas from Dr. Ed Nuhfer (see previous post) about knowledge surveys, and in particular the idea that complexity and difficulty are orthogonal--that is, different dimensions. We could think of an abstraction of learning that incorporates these into a graph. Mathematicians love graphs. Well, applied mathematicians love graphs anyway.

I have used my considerable expertise with Paint to produce the glorious figure shown above. It's supposed to lay bare an abstraction of the learning process if we consider only difficulty (expressed here inversely as probability of success) and complexity. I think it's fair to say that we sometimes think of learning happening this way--that no matter what the task, probability of success is increased by training, and that more complex tasks are inherently more difficult than less complex ones. This may, of course, be utterly wrong.

We encountered Moravec's Paradox in a previous episode here. The idea is that some things that are very complex seem easy. For example, judging another person's character, or determining if another beer is worth four bucks at the bar. So, it may be that the vertical dimension of the graph isn't conveying anything of interest. But I have a way to wiggle out of that problem.

If we restrict ourselves to thinking about purely deductive reasoning tasks, then success depends on the learner's ability to execute an algorithm. Multiplication tables or solving standard differential equations--it's all step-by-step reasoning. In this case, it seems reasonable to assume that increased complexity (number of steps involved) reduces chance of success. In fact, if we abstract out a constant probability of success for each step, then we'd expect an exponentially decaying probability of success as complexity increases because we're multiplying probabilities together (if they're independent anyway).
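To make the decay concrete, here's a minimal sketch of the independent-steps model. The per-step probabilities below are illustrative assumptions, not data; the point is just that p^c = exp(c * ln(p)), which is exactly the exponential form used in the next paragraph with d = ln(p) < 0.

```python
# Minimal sketch of the independent-steps model: if each step succeeds
# with probability p_step, a c-step problem succeeds with probability
# p_step**c = exp(c * ln(p_step)). The p_step values are made up.

def p_success(p_step: float, c: int) -> float:
    """Probability of completing all c independent steps correctly."""
    return p_step ** c

for p_step in (0.95, 0.90, 0.80):
    row = " ".join(f"{p_success(p_step, c):.2f}" for c in range(1, 9))
    print(f"p_step={p_step}: {row}")
```

Even with a 95% per-step success rate, an eight-step problem drops to roughly a two-thirds chance of success, which is why complexity alone could swamp everything else in this model.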

We could test this with math placement test results. An analysis of the problems on a standard college placement test should pretty quickly give us the approximate number of steps required to solve any given problem (with suitable definitions and standards). The test results will give us a statistical bound on the probability of success. Looking at the two together should be interesting. If we assume that Pr[success on item of complexity c] = exp(d*c), where d is some (negative) constant to be determined and c is the complexity measured in solution steps, then we could analyze which problems are unexpectedly easy or difficult. That is, we could gauge the preparation of students taking the placement test not just by the number of problems they got correct, but by the level of complexity they can handle.
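Here's a rough sketch of how that fit might go. Everything in it is assumed for illustration: the step counts and success rates are invented, and the no-intercept log-linear least-squares fit is just one plausible way to estimate d.

```python
# Sketch of estimating d from item-level results. All numbers are invented;
# real inputs would be hand-counted step counts and observed success rates.
import numpy as np

steps = np.array([1, 2, 3, 4, 5, 6])                      # complexity c per item
success = np.array([0.92, 0.81, 0.74, 0.55, 0.52, 0.31])  # observed Pr[success]

# Model: Pr[success] = exp(d*c)  =>  log(success) = d*c (no intercept),
# so the least-squares estimate of d is sum(c * log s) / sum(c^2).
d = np.sum(steps * np.log(success)) / np.sum(steps ** 2)
predicted = np.exp(d * steps)

print(f"estimated d = {d:.3f}")
for c, obs, pred in zip(steps, success, predicted):
    gap = obs - pred
    flag = ("easier than expected" if gap > 0.05
            else "harder than expected" if gap < -0.05 else "")
    print(f"c={c}: observed {obs:.2f}, expected {pred:.2f} {flag}")
```

The residuals are the interesting part: items that sit well above or below the fitted curve are the ones whose difficulty isn't explained by step count alone.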

I've written a lot about complexity and analytical thinking, which you can sieve from the blog via the search box if you're interested. I won't recapitulate here.

I'll see if I can find some placement test data to try this out on. It should make for an interesting afternoon some day this summer. Stay tuned.

Update: It occurs to me upon reflection that if the placement test is multiple-choice, this may not work. Many kinds of deterministic processes are easier to verify backwards than to run forwards. For example, if a problem asks for the roots of a quadratic equation, it's likely easier to plug in the candidate answers and see if they work than it is to use the quadratic formula or complete the square (i.e. actually solve the problem directly). This would make the complexity weights dubious.
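To see the asymmetry, here's a toy example with a made-up multiple-choice item: verifying a candidate root is one substitution, while solving forward takes the whole quadratic-formula procedure.

```python
# Toy verify-vs-solve illustration on an invented item: find the roots of
# x^2 - 5x + 6 = 0. Checking each answer choice is a single substitution.
def f(x):
    return x * x - 5 * x + 6

for x in (1, 2, 3, 4):  # hypothetical answer choices
    print(f"f({x}) = {f(x)}" + ("  <- a root" if f(x) == 0 else ""))

# Solving forward instead means computing the discriminant, taking a square
# root, and evaluating (5 +/- 1)/2 -- several steps rather than one per check.
```

So a test-wise student working backwards from the choices pays a much lower complexity cost than the step count of the forward solution would suggest.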
