Thursday, February 12, 2009

More Critical Thinking Debate

InsideHigherEd posts an article by Merilee Griffin that frames the debate over learning outcomes and the political pressures to measure them for accountability reasons. Her reference example is that of measuring thinking skills:
At the core of the problem is an issue that hasn’t been discussed much when it comes to measuring learning outcomes in the higher cognitive skills: We literally don’t know what we’re talking about.


There is widespread agreement that “critical thinking,” for example, is terribly important to teach: the term pops up in nearly every curriculum guide and college catalog. There is no agreement, however, about what critical thinking is.
The argument recapitulates a running one I've presented here: testing is easy for analytic/deductive processes, hard or impossible for inductive/creative processes. What's more interesting is the long list of comments on the article--the beauty of the internet. These run the gamut from blaming the faculty to "of course it can be measured" to the interesting notion that the faculty themselves can't think critically, so why should they be expected to teach such a thing?

It's interesting to view this debate through the lens it attempts to focus: does this debate demonstrate critical thinking? Much of the debate embodied in the comments is speculation and opinion. I think we can agree that doesn't qualify. There's a claim or two of "such and such is supported by the evidence," but the hyperlink provided is tangential or lacking.

One comment author with some empirical evidence to bring to bear is (self-named on the comment form) Robert Tucker, President of InterEd, Inc. The conclusion of his experience in trying to build critical thinking tests is that:
Critical thinking is not so much a construct as a family resemblance concept (i.e., there is no single criterion common to all cases we want to call “critical thinking;” instead, like a braided rope where no single strand runs the full length, individual criteria are shared with a subset of cases with considerable overlap across various criteria and cases). I have come up empty whenever I have attempted to find a single non-trivial, non-tautological facet of critical thinking that all cases of critical thinking have in common.
This adds another dimension to the 'I know it when I see it' definition of critical thinking and its ilk--it isn't even the same thing to everyone. You may think that X is evidence of critical thinking, and I may disagree. In fact, one can argue that we should be teaching our students to be good citizens by thinking critically about civic engagement, including assessing who is the best candidate. But first we'd have to agree on the process and outcomes. So theoretically, we could present an array of facts and test students on whether or not they judged the correct candidate to be the best choice. This gives rise to a couple of questions:
  1. Are we so sure that this magic process exists, of finding the best solution to a fuzzy problem?
  2. If so, then why are we so hesitant to apply it and advertise the results?
Any instructor who did what I suggest--'analyzing' a political race and pronouncing which candidate is the 'best'--would be pronounced biased and unobjective. Why? Because a significant portion of the population will disagree with any decision. Are they all faulty in the thinking department?

Big fuzzy problems don't admit nice neat solutions. Evidence of critical thinking is found not in the solution but in the methods of approach, which are not guaranteed to work. So forget about treating the outcome as the demonstration of critical thinking. It's much better to focus on building thinking tools individually: tools and techniques that can be used in appropriate contexts. Higher education has bitten off more than it can chew with this unfortunate idea of teaching "critical thinking."
