Tuesday, August 18, 2009

Assessing Away Grades

It's an Assessment Director's dream or perhaps nightmare: to replace course grades with assessments. Further, to have such buy-in for said instruments that students need not even take a course to get credit. This is radical and commonsensical at the same time. If it's the result we care about, and not the process by which it was reached, then why not check for the result right off the bat? We do that with placement tests, for example. There's also the practice of granting course credit for "life experiences," but this conjures diploma mills, and tends to lift eyebrows among the conservative types who think mill work should be properly done overseas if at all.

The conservative types are going to be interesting to watch. Remember when digital cameras came out? I was a hard-core darkroom, wet film rolled into the can, fixer mixed from a bag, stink of stop bath kind of guy. It took me a long time to buy a digital camera and put my 35mm Pentax on the shelf for good. It's tough to give up a whole skill set and have to learn a new one (Photoshop, or the free alternative GIMP). Imagine what it would be like for an entire industry to change from grades to assessments. Minds will be boggled.

It could be bad. Assessments can be what they want to be now, more or less. As long as a program makes a credible effort to close the proverbial loop, there's a pat on the back and a smile from the accreditor for the hard-working Director. But that all changes if the ultimate certification of a course, program, or institution rides on those assessments. One effect could be to squeeze subjectivity out of the process, because it has to be defensible in court if necessary. (I've argued that subjectivity is not a bad thing when considering complex outcomes.) In order to protect the part of one's body most necessary for meetings, instructors would (I imagine) tend to inflate assessments that can be questioned. I'm certain this happens with grades--basic game theory. In any event, it's a more visible and probably more stressful job for the Assessment Director.

It would be good, however, to focus on actual outcomes rather than the statistical goo that results from a formalized mélange of marks. This comes back to process vs. result. In my experience, most grades are a weighted combination of all the work in a course. This may have resulted from the need to motivate students to do work continuously throughout the semester. It works differently in other countries, of course, where Das Exam may be the single grade a student earns.
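The process-vs-result point can be made concrete with a little arithmetic. Here is a hypothetical sketch (the weights and scores are invented for illustration) showing how a weighted course grade can certify a student who never demonstrated the final outcome, while an outcome-only check would not:

```python
# Hypothetical course: grade is a weighted blend of process and result.
weights = {"homework": 0.30, "participation": 0.20, "midterm": 0.20, "final": 0.30}
scores  = {"homework": 0.95, "participation": 1.00, "midterm": 0.80, "final": 0.55}

# The traditional course grade averages everything together.
course_grade = sum(weights[k] * scores[k] for k in weights)
print(round(course_grade, 3))  # 0.81 -- a solid B, despite a failing final

# An outcome-only certification looks at the result alone
# (0.70 is an assumed passing threshold).
passes_outcome = scores["final"] >= 0.70
print(passes_outcome)  # False
```

The diligent student passes the course but fails the outcome check; the blend rewards process, the single assessment measures only the result.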

Extending the concept of assessment-as-certification, one might imagine that a university would grant a degree without the student ever participating in a class at all, but rather by passing all of the outcome assessments. If these assessments are similar, I presume that this could occur at multiple institutions, so that a resume could record several degrees from different colleges, all obtained at the same time. This is weird stuff from a registrar's perspective--it changes the way higher education works.

Not far from this is the related idea of a central agency that does the actual certification. This solves legal problems for an institution regarding the validity of assessments, and creates a standardization that would warm the heart of the erstwhile Dept. of Ed. (I'm not yet sure about the current one), and probably accreditors too. I'm sure ETS or ACT would kill to get into that business. Individual departments and programs wouldn't like it much, I imagine--there's nothing left for them to do but play the tune on the sheet.

If the outcome is the thing, the process can be standardized and probably should be; this is the central philosophical fulcrum upon which this debate must turn. Years ago I remember reading about a commercial chemical lab coming up with a formula for creating Scotch imitations that didn't need to be aged twelve years. If it's chemically the same, is it the same? It's difficult to find a reason to say no.

This debate is not "merely academic." Western Governors University uses outcomes assessment exclusively to offer a full catalog of undergraduate and graduate programs. I mentioned in a recent post that it's accredited by the Northwest Commission. It also has NCATE and CCNE accreditations. The process for earning credit is described on the web site:
Our online degrees are based on real-world competencies as opposed to seat time or credit hours. Our focus is on ensuring you possess the skills and knowledge you need to be successful, not whether you’ve attended class or not. ("Show us what you know, not how long you’ve been there.")
Results are valued over process, in other words. I exchanged emails with Tom Zane, who is the Director of Assessment Quality and Validity. I asked about the assessment process and got his permission to post his response. Here it is:
Western Governors University is a competency-based institution, so we warrant that our graduates possess certain knowledge, skills, abilities, and dispositions. Therefore, it is essential that we measure all aspects of competence that our faculty places into our domains. I am proud to say that our assessment systems are exemplary. We follow the Joint Standards (AERA, APA, NCME) and I’m confident that we have fair and legally defensible assessments. The assessment system is very large, with hundreds of professionally developed instruments of many types. Our assessment department staff builds, administers, scores, and maintains most of the assessments in-house. We use some standardized national exams when the exams appropriately match the domains (e.g., state mandated teacher licensure exams, IT certifications, etc.).
Tom also shared with me a success story:
A single mother of three had been a teacher for several years, but she didn’t have a degree or a license. She was a great teacher. She understood her students, pedagogical strategies, and the content she was tasked to teach. She was slated to lose her job soon after NCLB began. Her local colleges of education told her she would need to attend classes for anywhere from three to five years. Obviously, this was not a good fit for her situation. Instead, she came to WGU. Because she already knew so much, and more importantly, could teach successfully in a live classroom, she was able to skip classes or other learning opportunities and move directly to the assessments to prove her competence. In her program, that meant passing 59 assessments of various types (e.g., traditional exams, performance assessments, portfolio requirements, reflection pieces, a series of 8 live in-classroom performances during student teaching, and others). She was able to hold onto her job, feed her family, and graduate in 18 months.
Finally, here are some technical references for the assessment leaders out there who want to know more about Tom's work at WGU:
  • Zane, T. W. (2009, June). Reframing our thinking about university-wide assessment [PowerPoint slides]. Presentation at the NASPA Annual Conference, New Orleans, June 10-13, 2009.
  • Zane, T. W. (2009, April). Tenets of psychological theory that guide performance assessment design and development [PowerPoint slides]. Presentation at the annual meeting of the American Educational Research Association, San Diego, April 12-18, 2009.
  • Zane, T. W., & Johnson, L. J. (2009, April). Decision precision: Building low-inference rubrics for scoring essays, projects, and other performance-based artifacts [PowerPoint slides]. Workshop presented at the annual meeting of the American Educational Research Association, San Diego, April 12-18, 2009.
  • Zane, T. W. (2009). Performance assessment design principles gleaned from constructivist learning theory (Part 2). TechTrends, March/April 2009. http://www.springerlink.com/content/g180503016228083/
  • Zane, T. W. (2009). Performance assessment design principles gleaned from constructivist learning theory (Part 1). TechTrends, January/February 2009. http://www.springerlink.com/content/g180503016228083/
  • Zane, T. W. (2009). Guided reflection: A tool for fostering and measuring deep learning of characteristics such as open-mindedness and critical thinking. Paper presented at the American Association of Colleges and Universities Conference, Baltimore, February 26-28, 2009. http://epac.pbworks.com/2009_02-AACU-Network-for-Academic-Renewal {Note: The paper is unavailable because it is under consideration for publication.}
  • Zane, T. W. (2008). Domain definition: The foundation of competency assessment. Assessment Update: Progress, Trends, and Practices in Higher Education, 20(1), 3-4. http://www3.interscience.wiley.com/journal/117887274/abstract?CRETRY=1&SRETRY=0 {Note: This article was published in a special edition that contains multiple articles about WGU.}
  • Zane, T. W., & Schnitz, J. A. (2008). Linking theory to practice: Using performance assessment and reflection to prepare candidates for student teaching and reflective practice. Paper presented at the 14th Annual Sloan-C International Conference on Online Learning, Orlando, November 5-7, 2008. {Note: The paper is unavailable because it is under consideration for publication.}
  • Zane, T. W. (2008). How online programs can assure students attain crosscutting themes: Diversity, reflection, and more. Paper presented at the 14th Annual Sloan-C International Conference on Online Learning, Orlando, November 5-7, 2008. {Note: The paper is unavailable because it is under consideration for publication.}
  • Zane, T. W. (2008). Large-scale e-portfolio testing program design and management [PowerPoint slides]. Presentation at the Assessment Institute conference, Indianapolis, October 26-28, 2008. http://planning.iupui.edu/625.html
