Thursday, October 29, 2009

Critical Thinking Buzz

Yesterday I promised to revisit the different points of view about critical thinking evidenced at the Assessment Institute. This is a big topic, and I've babbled about it before (here). It's important because so many institutions plug it in as a learning objective. It's especially near and dear to the circulatory organs of liberal arts schools.

First, I have some bad news. Just look at the graph of search hits from Google Trends to see the sad tale unfold.
The story is clear: searchers are becoming less interested in thinking and more interested in ignorance, with the latter spiking in about June 2009. In case you think student engagement might be riding to the rescue, the sad little red line at the bottom says otherwise. Interestingly, searchers seem to lose interest in either aspect of cognition around the holidays. After much research, I discovered the reason behind this, shown below.

Okay, maybe that's enough silliness :-)
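
(Before moving on, a note for the data-curious: here's a rough sketch of how one could pull that Google Trends comparison programmatically. This isn't how I made the graph, I just used the Trends site; the sketch assumes the third-party pytrends package, which is only one of several ways to query the data.)

```python
# Sketch only: fetch relative search interest for the three terms from
# Google Trends via the (assumed) third-party pytrends package.
import matplotlib.pyplot as plt
from pytrends.request import TrendReq

terms = ["critical thinking", "ignorance", "student engagement"]

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(kw_list=terms, timeframe="2004-01-01 2009-10-01")

df = pytrends.interest_over_time()                    # weekly relative interest, scaled 0-100
df = df.drop(columns=["isPartial"], errors="ignore")  # drop Google's bookkeeping column

df.plot(title="Relative search interest (Google Trends)")
plt.show()
```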

Here's what I heard about the subject at the conference. Three things. First, as I mentioned yesterday, the opening plenary revealed a divide among the panelists when the topic came up as a part of a wider conversation. I was writing furiously and am not sure who to attribute this statement to:
There is no evidence that there are generalizable skills like critical thinking. You have to master one domain first.
This echoes my own thoughts and classroom experiences on the matter, which I've described previously. In a nutshell: developing significant ability to do deductive reasoning is a prerequisite to doing interesting inductive reasoning. If this is true, major programs are more likely to cultivate complex thinking skills than a broad curriculum like general education.

Other panelists disagreed, but there wasn't time to have a proper debate on the issue.

The second encounter was in a session about integrated assessment, where gen ed-type skills like communicating and thinking are threaded throughout the curriculum. Critical thinking was explicitly one of those. During Q&A I offered my own (heretical, I suppose) thoughts on the topic and was rather sternly admonished that the problem had been solved by psychologists and all I had to do was look in the literature. Heck, that may be true, and I've started looking. I'm a math guy, after all, not a psychometrician. But if there were overwhelming theory and empirical evidence for a particular model, why would there still be debate? Is it like Darwinian evolution, where some simply reject it because of dogma? I doubt it, but I'll try to find out. In the meantime, color me dubious. I emailed the session speaker yesterday to ask for some references.

I did find an accessible article by Tom Angelo, who was on the plenary panel, called "Beginning the Dialogue: Thoughts on Promoting Critical Thinking." It was published in 1995, and it opens by saying of critical thinking that "Despite years of debate, no single definition is widely accepted." This was actually confirmed in the session I mentioned, where it was taken for granted that critical thinking in an English course is different from critical thinking in a math course. This by itself isn't fatal to the idea; after all, writing is different across disciplines too, but we can still try to teach writing across the curriculum. But the subtleties are important.

For one thing, on a very basic level, I can watch Tatiana write a paragraph and say with confidence "this student did some writing." Evidence of thinking is different. I can look at that same paper and try to imagine what went on inside her head when she wrote it, but I can't really know that she was thinking at all. Maybe she put random words on the paper, or quoted something she had memorized. There's an empiricism gap. I can count words written. Quantifying (even with a binary yes/no) critical thinking is not so straightforward.

To push that a bit further, imagine taking a piece of work from an English class and sticking it into a stack of math papers being graded. The math prof squints at it and wonders what the heck this is. Can the prof then pronounce whether or not critical thinking has taken place in the English class by inspection? Or is he/she only competent to judge in the domain of math? Reverse the situation. An English instructor sees a complex page of handwritten formulas and text, purporting to settle the Continuum Hypothesis once and for all. If you don't have technical expertise in an area, it's virtually impossible to judge what level of thinking has occurred. But maybe that's not what we mean. Here's a definition Dr. Angelo likes in the article, quoting Kurfiss:
[Critical thinking is] an investigation whose purpose is to explore a situation, phenomenon, question, or problem to arrive at a hypothesis or conclusion about it that integrates all available information and that can therefore be convincingly justified. In critical thinking, all assumptions are open to question, divergent views are aggressively sought, and the inquiry is not biased in favor of a particular outcome.
Allow me to point out a couple of things here. First, the creation of a hypothesis is (if it holds up) the creation of new knowledge. And because we want it to conform to facts and hold up to new evidence that arrives, this is very similar to inductive reasoning. It sounds like the scientific method. But in order to do any kind of inductive reasoning, you have to have some knowledge of the deductive processes active in that domain. You can't write a math proof without knowing propositional logic; you can't solve a problem with the rudder on your 747 unless you know how the thing works; you can't create complex financial leveraging instruments unless you understand the risks. Well, maybe that last one was a bad example.
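
To make "knowing the deductive machinery" concrete, here is about the smallest example I can offer: modus ponens, written out as a machine-checked proof. (The Lean syntax below is just my own illustration, not anything from the panel; the point is that this kind of elementary deductive step is the raw material any interesting mathematical hypothesis eventually has to be justified with.)

```lean
-- Modus ponens: the most basic deductive move in propositional logic.
-- Given "p implies q" and "p", we may conclude "q".
theorem modus_ponens (p q : Prop) (hpq : p → q) (hp : p) : q :=
  hpq hp

-- Chaining two implications: still purely deductive, no hypothesis-forming yet.
theorem chain (p q r : Prop) (h₁ : p → q) (h₂ : q → r) (hp : p) : r :=
  h₂ (h₁ hp)
```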

One curious aspect of critical thinking assessment is that although the language surrounding it mentions all kinds of desirable habits of mind, higher ed kind of dodges the issue. Maybe there's some institution out there that really tackles the teaching of self-assessment, open-mindedness, thoroughness, focus, and so forth. I'd love to know. Outcomes like what a student wrote on a piece of paper are far downstream from the actual events that led to their creation. There are some surveys that try to assess habits of mind--the CIRP for one. But who's trying to teach them outside of orientation class? There is therefore a curious disconnect between our actual desired outcomes and what we teach.

I think it's healthy to ask another level of why. Why do we want students to think critically? We get stuck in our own curricular bubbles, perhaps. Let's step outside for a moment. What is gained if critical thinking--however you define it--is employed?

Is it because we want active citizens? Or good problem-solvers? Or entrepreneurs? Whatever the answer is, it's likely to be more easily pinned down than the amorphous one of critical thinking. We can actually look at evidence of citizenship or problem-solving. We can create programs and curricula to address them. And don't forget the habits of mind thing--for my money, that's an untapped vein of gold, and also amenable to assessment. At the right point in the report cycle, the assessment director can still aggregate the heck out of all the types of "critical thinking" on record and serve up a glop of statistical goo to the admins. Just put the graphs in color--that means more than any data you put on there. And be sure to put the right logo on the thing.
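
(If you're curious what that goo looks like in practice, here's a tongue-in-cheek sketch. Everything in it is hypothetical: the program names, the 1-4 rubric scale, the scores themselves.)

```python
# Hypothetical rubric scores, aggregated into report-ready "statistical goo".
import pandas as pd

# Made-up data: one row per scored artifact, rated 1-4 on "critical thinking",
# whatever that term meant locally when the rating was assigned.
scores = pd.DataFrame({
    "program":  ["ENG", "ENG", "MTH", "MTH", "BIO"],
    "ct_score": [2, 3, 3, 4, 2],
})

goo = scores.groupby("program")["ct_score"].agg(["mean", "count"]).round(2)
print(goo)  # now just add a color graph and the right logo
```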

The third encounter with critical thinking I had was indirect; I heard about it through a conversation. What I was told was that in a Tuesday session, U. Phoenix served up "Assessment Methods: Creating a Critical Thinking Scoring Instrument as a Tool for Programmatic Assessment." I can't find the slides online, but I'll try to get them and ask the presenters what happened. Apparently the presenter(s) were essentially heckled over the definition of critical thinking, and my companion's assumption was that this was related to the fact that it was U. Phoenix presenting and not, say, Alverno College. If so, this is a sad irony. If a group of professionals gathers to talk about critical thinking and doesn't actually demonstrate the ability to do it, where are we? I will suggest to the organizers that they replace the iced tea refreshments with Scotch next time.

2 comments:

  1. Here's my completely opinionated and subjective view: the teaching of writing, combined with lots of reading and real science instruction (as opposed to following recipes and calling them experiments, or building volcanoes every year), is the best route to critical thinking.
    Allowing students to experience the sheer pleasure of deep thinking and discussion without testing anything works best. Their writing will show the depth of their thinking perfectly well without any need for critical thinking assessment tests.

  2. Rebecca, that's how we structured our portfolio system review at my previous place. Even in math, we used written evidence from the capstone course to look for analytical and creative thought. We set it up to compare to sophomore efforts, so we could look for growth. Very subjective, but rich in meaning.
