Friday, June 09, 2006

General Education Assessment

We now have three years of data from our general education assessment model. You can read about it in great detail in Assessing the Elephant. So far I have focused on checking the validity of the data to see whether they actually mean anything, and I'm beginning to think they do. Here's a graph showing how the distribution of scores shifts over time as a class ages.
The data here aren't truly longitudinal; they're a composite of three classes, combined to give a reasonable sample size and to show performance shifts across four years. A similar approach compares students grouped by their overall grade point averages. All students in these data sets were still attending as of Spring 2006, so survivorship effects are controlled for.
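For anyone who wants the flavor of how a composite like this can be put together, here's a minimal sketch in Python. It assumes invented record fields (cohort, gpa, and a per-year score list) rather than our actual data layout, so treat it as an illustration of the bucketing, not the real analysis code.

    from statistics import mean

    # Hypothetical records, one per persisting student: a cumulative GPA
    # and an assessment score for each year the student has been tested.
    students = [
        {"cohort": 2002, "gpa": 3.4, "scores": {1: 2.1, 2: 2.6, 3: 3.0, 4: 3.3}},
        {"cohort": 2003, "gpa": 1.8, "scores": {1: 1.9, 2: 2.0, 3: 1.9}},
        {"cohort": 2004, "gpa": 2.5, "scores": {1: 2.0, 2: 2.4}},
        # ...more students from the three classes being composited
    ]

    def gpa_band(gpa):
        """Bucket students into the GPA groups used in the graph."""
        if gpa >= 3.0:
            return "GPA >= 3"
        if gpa >= 2.0:
            return "2 <= GPA < 3"
        return "GPA < 2"

    # Composite the classes: collect scores by GPA band and year of study.
    by_band_year = {}
    for s in students:
        for year, score in s["scores"].items():
            by_band_year.setdefault((gpa_band(s["gpa"]), year), []).append(score)

    for (band, year), scores in sorted(by_band_year.items()):
        print(f"{band:>13}  year {year}: mean score {mean(scores):.2f} (n={len(scores)})")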
In the graph you can see that the better students actually seem to be learning. The middle group of students improves more slowly, and the students with GPA < 2 don't seem to be improving at all. This analysis and others like it seem to validate our approach. The holy grail of this investigation is to pinpoint the contribution of individual courses to a student's skill improvement. I've worked on that quite a bit, but I've concluded that I don't have enough samples yet, and that I haven't developed a sophisticated enough approach to the problem. Stay tuned.
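For the curious, the kind of approach I've been experimenting with for the course-contribution question amounts to regressing each student's score gain on indicators for the courses taken in between. The sketch below is just that idea in miniature, with invented course names and numbers; with so few samples the estimates would be meaningless, which is exactly the problem.

    import numpy as np

    # Hypothetical data: each row is a student; each column flags whether
    # that student took a particular course between the two assessments.
    courses = ["ENG 101", "MAT 110", "PHI 201"]
    took = np.array([
        [1, 1, 0],
        [1, 0, 1],
        [0, 1, 1],
        [1, 1, 1],
        [0, 0, 1],
    ])
    score_gain = np.array([0.6, 0.4, 0.5, 0.9, 0.2])  # change in assessment score

    # Add an intercept and solve the ordinary least-squares problem.
    X = np.column_stack([np.ones(len(took)), took])
    coef, *_ = np.linalg.lstsq(X, score_gain, rcond=None)

    print(f"baseline gain: {coef[0]:.2f}")
    for name, c in zip(courses, coef[1:]):
        print(f"estimated contribution of {name}: {c:+.2f}")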

Wednesday, June 07, 2006

Article on the Commission on Higher Ed

The article in University Business is out now. The online version has a cool Flash-driven interface that lets you flip pages. I had a conversation about the topic with a friend--I'll just call him Bob--the other night. He was pretty critical of the idea of using earned income as a proxy for learning. Bob was also quite dubious about the prospect of isolating the effect of college from other factors, a problem that would, of course, be shared by any other approach. The reason is that people who choose to go to college are different from those who do not. How large that difference is might be detectable by comparing outcomes across levels of attainment: partial completion, two-year versus four-year degrees, and so on. I think this can be overcome, and it would certainly be easier with income than with scores on a test. Another problem is how to account for graduates who start their own businesses or otherwise have "income" that isn't comparable to a salaried position. And what about people who intentionally choose lower-paying jobs for personal reasons? Someone who graduates and heads off on a Peace Corps mission can't fairly be judged on salary alone. This begins to prompt questions about what it is we really value. That discussion should precede any measurements and judgments we make about the effectiveness of higher education, and certainly precede any actions.
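To make the attainment comparison concrete: if income were the yardstick, a first cut might simply group earners by how much college they completed. The figures in this sketch are invented and the grouping is deliberately crude; it only shows the shape of the comparison Bob and I were arguing about, not an answer to it.

    from statistics import median

    # Invented sample: (years of college completed, annual income).
    people = [
        (0, 28000), (0, 31000), (2, 34000), (2, 36000),
        (4, 45000), (4, 52000), (4, 30000),   # e.g., a graduate in low-paid service work
    ]

    for years in sorted({y for y, _ in people}):
        incomes = [inc for y, inc in people if y == years]
        print(f"{years} years of college: median income ${median(incomes):,.0f}")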

Friday, June 02, 2006

WEAVEonline

I went to a very good conference at Elon University on Wednesday--the North Carolina Independent Schools and Universities Assessment Conference. One of the speakers was Jean Yerian from Virginia Commonwealth University. VCU has developed its effectiveness planning management system into a commercial product, which she demonstrated for us. You can get more information at www.weaveonline.net. I'm not sure what the cost is, other than that it's FTE-driven. The strengths of the system seem to be its rich reporting features, including an administrative overview that quickly shows compliance status. So if the English Department is slow about putting its plans, assessments, actions, or mission into the system, that shows up as Not Begun, In Progress, or Complete. They are planning to add a curriculum mapping feature that creates a matrix of courses and content for each discipline. There is also a comment box for notes about the budgetary impact of activities and follow-up plans. As yet there is no document repository built in, but I think that may be in the works too.
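The curriculum mapping feature, as I understood it, boils down to a matrix of courses against outcomes or content areas. Here's a rough sketch of how I picture that structure; the course names, outcomes, and I/R/M codes are my own placeholders, not anything taken from the demonstration.

    # Hypothetical curriculum map for one discipline: which courses
    # introduce (I), reinforce (R), or master (M) each learning outcome.
    curriculum_map = {
        "ENG 101": {"Written communication": "I", "Critical reading": "I"},
        "ENG 230": {"Written communication": "R", "Critical reading": "R"},
        "ENG 450": {"Written communication": "M", "Research methods": "I"},
    }

    outcomes = sorted({o for row in curriculum_map.values() for o in row})

    # Print the map as a simple course-by-outcome matrix.
    print(f"{'Course':<10}" + "".join(f"{o:>22}" for o in outcomes))
    for course, row in curriculum_map.items():
        print(f"{course:<10}" + "".join(f"{row.get(o, '-'):>22}" for o in outcomes))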

This product has many more features than openIGOR does, and it's designed with a slightly different purpose in mind. openIGOR's functionality starts with the file repository and builds up from there, whereas WEAVEonline works in the other direction. A current weakness of both systems is that neither ties directly to evidence of student learning without extra work. The logical extension of each would be a link to a portfolio system. We already have an electronic portfolio system, but it's not yet connected to assessment reporting. I hope to have that programming done this summer, though.
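The connection I have in mind could be as modest as attaching portfolio artifact references to each reported assessment finding. The sketch below is purely hypothetical; none of the field names, values, or URLs come from openIGOR, WEAVEonline, or our portfolio system.

    # Hypothetical record tying one assessment finding to portfolio evidence.
    finding = {
        "unit": "English Department",
        "outcome": "Students write a coherent analytical essay",
        "result": "most sampled papers scored 3 or higher on the shared rubric",
        "evidence": [  # pointers into the electronic portfolio system
            {"student": "anon-0417", "artifact": "https://portfolio.example.edu/a/1234"},
            {"student": "anon-0981", "artifact": "https://portfolio.example.edu/a/5678"},
        ],
        "action": "Add a drafting workshop to the sophomore writing course.",
    }

    for item in finding["evidence"]:
        print(f"{finding['outcome']}: see {item['artifact']}")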