Wednesday, July 14, 2010

Publishing Learning Outcomes

The National Institute for Learning Outcomes Assessment (NILOA), which George Kuh directs, has a new report, "Exploring the Landscape: What Institutional Websites Reveal About Student Learning Outcomes Assessment Activities" (pdf). It compares what schools reported they were doing on a NILOA survey to what actually appears on their websites. The findings won't shock anyone who has gone looking for learning outcomes data (as opposed to plans, which are plentiful). See my post "404: Learning Outcomes," for example.

Judging from the web, the report says, there's a lot going on that's invisible to the public. Learning outcomes assessment also seems to be driven by the regional accreditors, which is not surprising. One of the interesting bits is the detail of what institutions publicize as their outcomes assessments. The graph below is taken from page 14 of the report:

This graph is not about who is doing what so much as who is bragging about doing it. So the privates are advertising capstones more than the publics. Unfortunately, the capstone experience was not an item on NILOA's earlier "what are you doing" survey cited above. The closest comparison I could find is the following graph from page 11 of that report.
The mismatch may be that capstone experiences by themselves aren't really assessments in the usual sense. It's interesting that the for-profits report doing more assessment than the other types almost across the board, with the notable exception of national surveys. That may be because national surveys are expensive, or because the for-profits don't want to invite comparison to national benchmarks, or simply because such surveys are deemed inappropriate for what are mostly online programs. The graph does show that the for-profits can play the "assessment game" better than traditional institutions: they have more control and more money. (By "game" I mean satisfying the external requirements.)

Conclusion of the report:
Institutions have more student learning outcomes assessment activities underway than they report on their websites. To meet transparency obligations and responsibilities, institutions should make more information about student and institutional performance accessible via their websites. 
I'm not quite sure what transparency obligations there are; I'm unaware of any from our regional accreditor. Perhaps this just picks up language from the Spellings Report. I do think institutions miss a trick by not using learning outcomes to market their programs, although the privates at least seem to be talking about capstone courses. If "transparency" means comparable, meaningful information you could put in a spreadsheet to compare colleges, that's probably a pipe dream.

So much of a college experience is singular, random, and individual. I am on vacation back in Illinois this week, and arranged to have lunch with my math advisor and eventual collaborator on a software project. My daughter was at the lunch too, so we bored her with the story of my first encounter with the guy who would change my life. I mean this literally: without his encouragement I would never have gotten a doctorate in math.

I was seriously into computers at the time, banging out artificial life simulations and space simulators on an Apple II, and was attracted to MAT 475 Numerical Analysis as a junior at SIU. The undergrad advisor I was assigned to told me I'd have to get permission of the instructor, so I went up to the fourth floor with some trepidation.
Me: I'd like to be in the Numerical Analysis class.
He: It'll be bloody hard.
Me: Okay.
He wasn't kidding. I think I had a 50% average going into the final, but was having too much fun to quit. I didn't realize that 50% was a solid B in his class until later. That class led to a long friendship and collaboration that has now lasted decades. But it was random, not the sort of thing that is ever going to be revealed with mass "transparency" reports. This is not to say transparency is a bad thing, but like standardized testing, it's not a substitute for the real thing (the actual experience of an individual student), which is probably not remotely predictable.
