Monday, October 27, 2008

IUPUI Presentation: Why it's hard to use assessment results...

Thanks to all who attended the session! This post summarizes a couple of main points, gives you a link to the PowerPoint slides, and gives you an opportunity to continue the discussion by leaving comments.

First, the slides are here. You can also find information about other efforts, like our general education assessment program here.

Main points from the talk to consider in designing, analyzing, and reporting:
  • Do your own design for anything complex. Only you know what you really want, and remember that you'll have to understand and explain the results. Sorry, CLA!
  • You can assess even what you're not teaching. Faculty love it, if it's done right. For more information, you can read a whole report about our gen ed assessment.
  • For learning outcomes, try to find a scale that matches what you want to report longitudinally. For example, remedial work through graduate level. We use four levels, which has worked really well.
  • We don't really measure learning. We estimate parameters associated with it, in a statistical sense. This makes aggregation difficult to justify in most cases (a thousand blabbering fools cannot be summed to equal the product of a fine orator). So avoid averages like the plague--they compress the data too much to be useful. Use proportions and frequencies, or minimum and maximum ratings, for example (there's a short sketch of this after the list). Of course, when you want pretty graphs that go up, feel free to foist off averages on some constituents who need that kind of thing.
  • [not mentioned] You don't have to report assessment data to deans and presidents. Doing so can make faculty and chairs suspicious about how the information is used, and makes it more likely that the numbers will be 'improved' artificially. Instead, you can have them report improvements they've made, and ask them to demonstrate how the data clued them in. This is what you want anyway--continual improvement, not proof of absolute achievement.
  • Learn how to use Excel pivot tables! Really--you'll be glad you did. Logistic regression is useful for predicting either/or conditions like retention. ANOVA is good for sifting through a lot of data for resonant bits. (See the second sketch after this list for a pivot-table and logistic-regression example.)
  • Attitude and behavior surveys are great for linking to achievement data. NSSE, CIRP, and the rest are very useful, but you need student IDs to link with (a linking example follows below).
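
To make the "proportions, not averages" point concrete, here is a minimal sketch in Python with pandas. The column names, ID values, and ratings are made up for illustration--substitute whatever your rubric actually uses. It just shows the kind of summary I mean: the distribution across our four levels rather than a single compressed number.

```python
import pandas as pd

# Hypothetical ratings: one row per student, scored on a four-level
# scale (1 = remedial through 4 = graduate level, per the post).
ratings = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105, 106],
    "outcome": ["writing"] * 6,
    "level": [1, 2, 2, 3, 3, 4],
})

# Report the distribution, not the mean: the proportion of students
# rated at each level of the scale.
proportions = (
    ratings.groupby("outcome")["level"]
    .value_counts(normalize=True)
    .rename("proportion")
)
print(proportions)

# Minimum and maximum ratings are also defensible summaries.
print(ratings.groupby("outcome")["level"].agg(["min", "max"]))
```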
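
On the pivot-table and logistic-regression point: pandas gives you the same cross-tabulation idea as Excel's pivot tables, and statsmodels fits a logistic model for an either/or outcome like retention. I'm swapping in Python for Excel here just to keep everything in one runnable sketch; the data below is simulated for illustration, not anything from our program.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated student records (illustrative only): retention is built
# to rise with GPA and credit load.
rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "gpa": rng.uniform(1.5, 4.0, n),
    "credit_hours": rng.integers(6, 18, n),
    "cohort": rng.choice(["2006", "2007"], n),
})
p = 1 / (1 + np.exp(-(-6 + 1.5 * df["gpa"] + 0.15 * df["credit_hours"])))
df["retained"] = rng.binomial(1, p)

# The pivot-table idea: retention rate (a proportion of 0/1 flags)
# broken out by cohort.
print(pd.pivot_table(df, values="retained", index="cohort", aggfunc="mean"))

# Logistic regression for an either/or outcome like retention.
X = sm.add_constant(df[["gpa", "credit_hours"]])
model = sm.Logit(df["retained"], X).fit(disp=False)
print(model.summary())
```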
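
And for linking surveys to achievement data: once both files carry a student ID, the join itself is one line. The file layouts and variable names here are invented for the example--a real NSSE or CIRP extract will look different, but the idea is the same.

```python
import pandas as pd

# Hypothetical extracts: local achievement ratings and NSSE-style
# survey responses, both keyed on a student ID.
achievement = pd.DataFrame({
    "student_id": [101, 102, 103, 104],
    "writing_level": [2, 3, 4, 1],
})
survey = pd.DataFrame({
    "student_id": [101, 103, 104, 105],
    "hours_studying": [10, 25, 5, 15],
})

# An inner join keeps only students who appear in both files.
linked = achievement.merge(survey, on="student_id", how="inner")
print(linked)
```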

I'd love to know if any of this helps you out!