In grad school there was a horror story that circulated about a friend of a friend of a cousin, who was a math grad student in algebra. He had created a beautiful theory with wonderful results, and was ready to submit when it was pointed out to him that his axioms were inconsistent--they contradicted one another. The punchline is that you can prove anything of the empty set. This sometimes also happens to degree programs that suddenly have to prove that they've been doing assessment loops, except in reverse: building grand theories from the empty set.
I complained the other day that there weren't many completed learning outcomes reports from universities to be found on the web. So when I noticed Mary Bold's post at Higher Ed Assessment, "Reporting Assessment Results (Well): Pairing UNLV and OERL," I thought I'd hit paydirt. The hyperlink took me to a page at the University of Nevada, Las Vegas with a link advertising "Student Learning Outcomes by College." Without further ado, here's the page:
That's just too funny. There are, however, excerpts from actual results listed in the teacher education site, which you can find here. That site is the OERL that Mary refers to in her post.
It did make me think, however. There must be a bunch of SACS compliance certifications out there on the web now, and section 3.3.1 (used to be 3.4.1) covers learning outcomes. Want to see how your peers have handled it? The name of the school in the table below links to the compliance certification home page for that institution. For good measure I'll throw in 3.5.1, general education assessment, too. You're welcome.
I did not try to make a complete list of all available reports. If you find a good one, send me the URL and I'll add it. Here's my Google search criterion.
Disclosure: I was the liaison, IE chair, webmaster, and editor for the Coker College process (as well as doing the IR and a bunch of other stuff--no wonder I have white hair). The linked documents to that one are turned off for confidentiality, but you can find the complete list of program learning outcomes plans and results here.
Observations:
First, hats off to all the institutions who make these reports public. This is a great resource to anyone else going through the process.
I only scanned through the reports, looking for evidence of learning outcomes. I probably missed a lot, so take my remarks with a grain of salt--go look for yourself and leave a comment if you find something interesting. It should go without saying that in order to be helpful, this has to be a constructive dialogue.
For learning outcomes I didn't find as much evidence-based action as I would have expected from all the emphasis that SACS puts on it. My own experience was that programs were uneven in their application of what is now 3.3.1 (at the time SACS didn't even have section numbers for the requirements--how crazy is that? I invented my own, and then they published official ones just before we had to turn the thing in.). So there was a lot of taking the empty set and trying to build something out of it. That can take various forms, which one notices in scanning certification reports:
- Quick fixes: use a standardized instrument like MAPP, MFAT, CLA, or NSSE. Of course, it's not really that quick, since it would take at least a year to get results, analyze them, and act on them. The conceptual problem is tying results to the curriculum (except for the MFAT).
- Use coursework: passing a certain course certifies students in X (e.g. use of technology), or passing a capstone course with grade X ensures broad learning outcomes. This is fairly convincing as gatekeeping, but hard to link to changes unless specific learning outcomes are assessed.
- Rubric-like reporting. Okay, I'm not a big fan of rubrics when employed with religious zeal and utter faith in their validity. But I have to admit that the most convincing report summary I saw on learning outcomes was the one below from Mitchell Community College. Not all the data points are there, but that's realistic. Take a look.
I believe that if our efforts as assessment leaders are to be maximally useful, we have to make the annual, messy, incomplete, inconsistent, but authentic program-level plans and results available to the public. This would encourage us to adopt some uniformity in reporting and improve the quality of the presentation (maybe I'm a fool for saying that). The only downside is that if we're honest, there will be empty sets here and there--programs that have not been dragged into the 21st century yet. But transparency can help there too, perhaps by shaming some into compliance. Just imagine (really dreaming now) if the quality of the reports were good enough to use for recruiting and to paint across the program web page.
The Voluntary System of Accountability tries to do something like that. Unfortunately, that group seems to be enamored of standardized tests for learning outcomes. There's a validity study they just published here that you can consider. This post isn't the place to go into all the reasons I think standardized testing is the wrong approach, so let me just leave it at that.
Thinking more positively: is there any interest out there in forming a loose consortium of schools that report out annual learning outcomes for programs? The role of the consortium could be to settle on standard ways of reporting and to define best practices.
Fascinating topic, but I really have a hard time with "404s" at Coker -- why is the Faculty Manual off-line? How can I link to your SACS faculty rosters?
Glen S. McGhee, FHEAP
Glen, There's no reason for those particular docs to not be linked. When I got approval to post the report online, it was with the understanding that sensitive documents like board minutes would not be included. At the time, the easiest thing to do was simply not upload the reference document folder, so I wouldn't have to inspect the thousand or so documents one by one to see what should be protected. Coker did make its learning outcomes reports available in their entirety, which is very unusual. I don't actually work there anymore, so it's not in my power to go back and link to the faculty manual or rosters.
dave
"I don't actually work there anymore, so it's not in my power to go back and link to the faculty manual or rosters."
I am very sorry to hear that, although it doesn't surprise me.
Often, IE folks are like hired guns who come into town to clean it up on contract, make lots of enemies getting the hapless school through review, only to be dumped over the side at the first opportunity. It is not a life for the faint-hearted.
I quite enjoyed your Assessing the Elephant ...
Actually, it was a very tough decision to leave. Coker is a wonderful place to work, and I'd been there 17 years.
Thanks for the note on the Elephant. Have to find time to update it.
This is very interesting, as my community college has not yet been required to do learning outcomes assessment. A colleague and I attended a conference for student housing where learning outcomes were discussed. I had never heard of the topic before then, and just now I'm looking for more information so my college can get a head start.
The questions I have so far are:
1. What departments are required to assess?
A> Looks like the answer is "all non-administrative departments" from what I've read so far...
2. How often does the assessment have to be applied?
A> Annually...?
3. Whom do we report to?
A> SACS?
4. From whom will the requirement come?
A> SACS...?
5. What format must the assessment reports be in?
A> ???
Any input would be appreciated, we'll keep searching. Thanks so much for the article and the links.
The game is afoot!
Cannon, I am most familiar with SACSCOC standards, so your mileage may vary. All academic programs need learning outcomes plans, as do student services units. Assessment cycles are typically annual, but the activities happen all year long. You would be responsible to your regional accreditor's standards. As for the format of assessment reports, it's generally three or more objectives in this form, though there are a number of variations:
Date
Program Name
Coordinator Name
Statement of Objective
Assessment method for the objective
Assessment results
Analysis of assessment results
Actions planned to be taken
Follow-up: what was actually done?
It takes a whole cycle to be able to write down all of this. Up through "assessment method" is the plan of action, and everything afterwards is what actually happened.
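The outline above is really a small data structure with a plan half and a results half. Here is a minimal sketch in Python of what one program-level record might look like; the field names and the `is_plan_only` helper are my own illustration, not a SACSCOC-prescribed format.

```python
# Hypothetical sketch of one annual assessment record, following the
# report outline above. Nothing here is an official accreditor format.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AssessmentObjective:
    statement: str                          # Statement of Objective
    method: str                             # Assessment method for the objective
    results: Optional[str] = None           # Assessment results
    analysis: Optional[str] = None          # Analysis of assessment results
    planned_actions: Optional[str] = None   # Actions planned to be taken
    follow_up: Optional[str] = None         # Follow-up: what was actually done?

    def is_plan_only(self) -> bool:
        """True during the first half of the cycle, before results come in."""
        return self.results is None

@dataclass
class ProgramReport:
    date: str
    program_name: str
    coordinator: str
    objectives: List[AssessmentObjective] = field(default_factory=list)

# Start of the cycle: only the plan (statement + method) exists.
obj = AssessmentObjective(
    statement="Graduates can write a coherent literature review.",
    method="Capstone papers scored with a department rubric.",
)
report = ProgramReport("2010-05-15", "English B.A.", "J. Doe", [obj])
print(obj.is_plan_only())  # True until results are entered
```

The point of splitting the record this way is the one made above: everything up through the method is the plan of action, and the later fields can only be filled in after a full cycle has run.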
See "Assessment Essentials" by Banta and Palomba for a good reference--one of many out there. There's also an email list called ASSESS-L you can join.
Good luck!
Thank you kindly, Dave! This is superb; more than I hoped for. I'll certainly look up "Assessment Essentials" by Banta & Palomba.
I look forward to checking out your other writings - you've made a fan of me. :]
Big thanks from West Texas!
You're quite welcome! My contact information is on my resume (upper left on this page). Please call or email if there's something I can help with.