With ADS, you get minute-by-minute teaching guides, thousands of practice tests, and other materials for turning your school into a 21st-century test-preparation factory.

The whole thing is pretty cringe-worthy, because like any good parody it's not far from the truth. For example, the SAT critical reading section gives the test-taker a chance to show how well she understood what she read. From the College Board's SAT test preparation web page:
There are two types of multiple-choice questions in the critical reading section:
- Sentence completion questions test your vocabulary and your understanding of sentence structure. (19 questions)
- Passage-based reading questions test your comprehension of what is stated in or implied by the passage, not your prior knowledge of the topic. (48 questions)

Here, 'critical reading' has been reduced to simple understanding of what you've read. No wonder, since actual critical thinking is such a mess to assess.
So it's easy to take shots at the current methodology of mass-production testing, which I sometimes refer to as neo-phrenology, but is it as easy to put your finger on the exact problem? Obviously, we want to know whether or not the students are learning, and so we test them. If the tests and test conditions are all the same, then it's fairer than testing willy-nilly. Two short steps, and we have arrived at industrial testing. So where did we go wrong?
The problem is a nasty little enthymeme.* Instruction and testing, as I experienced them in school and as my daughter is experiencing them now, focus primarily on analytic/deductive processes. That is, low-complexity thinking. Let me explain.
Suppose you take a quiz on US state capitals, and you know 49 of them. How hard is it for you to figure out the 50th from your knowledge of the other 49? It's clearly impossible--there is no information in the first 49 that contributes to knowledge of the 50th. In informational terms, the names are random. This just means that you have to know it, because it's not something you can figure out. In other words, you can't get there by generalizing.
Much of education is of this sort. What date was the Pearl Harbor attack? You either know or you don't. It's not like you can go scribble in the corner for a while and come up with the answer through sheer mental effort.
Some problems are solvable from given information, of course. These are found largely in the sciences, where some raw materials are given and the solution to a problem is latent in those materials. Chemical reactions, physics modeling, math word problems, and the like all depend on more advanced forms of deductive reasoning (or analytical thinking, if you like). This can become quite sophisticated, such as solving systems of linear differential equations by finding eigenvalues of matrices, and so forth. But this is still not generalizing; it's just applying more complicated methods of analysis--the rules are denser.
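To take a standard textbook example (my illustration, not anything from the College Board material above): solving the linear system \(\dot{x} = Ax\) is entirely rule-driven. Find the eigenvalues of \(A\), find the eigenvectors, and assemble the solution. With

\[
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix},
\]

the characteristic polynomial \(\lambda^2 - 4\lambda + 3\) gives eigenvalues 3 and 1, with eigenvectors \((1, 1)^T\) and \((1, -1)^T\), so the general solution is

\[
x(t) = c_1 e^{3t} \begin{pmatrix} 1 \\ 1 \end{pmatrix} + c_2 e^{t} \begin{pmatrix} 1 \\ -1 \end{pmatrix}.
\]

Sophisticated, yes, but every step follows mechanically from known rules; nothing in it requires generalizing beyond what's given.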
The hidden assumption inherent to standardized testing is that if students are good at analytical/deductive thinking, they will be good at the creative/inductive thinking that leads to insightful generalization. That is, that students who test well will be good at generalized problem solving. This skill is much harder than analytical reasoning, because there is only trial and error, seasoned by experience, as a guide. And yet the ability to solve problems creatively, generalizing from what one already knows, is arguably the most prized kind of thinking; this is where new knowledge comes from.
In science class, students are sometimes taught the scientific method by looking at information and forming hypotheses. Literature classes are hard because of the fuzziness inherent in generalized, creative thinking. The same goes for geometry, where analytical methods are taught as a conduit to creative proof construction. These are valuable experiences because they teach new modes of thought: essentially, finding patterns and exploiting them.
It's hard to use a fill-in-the-bubble test to look for generalized thinking, because the answers are right in front of you. The whole point of a creative endeavor is that the answer isn't given, and may not even exist! Even free-form elements like a short essay are subjected to analytical measures when scored in a standardized context. How do you teach a computer to measure creativity, after all?
By placing so much importance on standardization, we emphasize rote problem solving and step-by-step methods--the hallmarks of deductive reasoning. This is not unimportant--one does need to know one's times tables--but it's setting the bar too low. Conveniently, it's far easier to assess and teach analytical skills. Unfortunately, doing so misses the larger point. We need thinkers who can do the following:
- Look at what's given and recognize connections because of analytical training
- Form ideas about patterns that may exist and test these guesses against what is known
- Generate paths of inquiry to find new information that solidifies or overturns existing knowledge
*I think I may have gotten that line from Giles Goat Boy (another parody of higher ed) by John Barth. Highly recommended.
Although it's probably more legend than fact, this discussion makes me recall the story of Frederick Smith.
When Smith went to Yale, he wrote a paper for an economics class that showed how an overnight delivery service could operate using modern computers and transportation techniques. Supposedly the professor gave him a "C" and commented that while the idea was interesting, it would have to actually be feasible before he could give Smith a higher grade.
Smith went on to found FedEx, which delivers packages all over the world overnight using computers and modern transportation techniques.
The sample package displayed in the company's print advertisements encouraged this idea by featuring a return address at Yale.
Even if it's not true, the story illustrates some of what you're saying: Smith was given a lower grade as "punishment" for a creative idea that "couldn't work," despite the fact that the global business market depends on that very idea to conduct business today.
Just because it's not a "fact" doesn't mean that it won't be one down the road.