Thursday, January 14, 2010

2020 Goals and Recruiting Costs

Everyone, it seems, is developing "2020" plans, a riff on the 20/20 score for perfect vision. The best quote I saw (on reddit) was that in 10 years we'll look back with 2020 hindsight. In that vein, The Chronicle has published an article called "The College of 2020: Students." You have to pay for the full report, but the executive summary is here (pdf). The abstract:
This is the first Chronicle Research Services report in a three-part series on what higher education will look like in the year 2020. It is based on reviews of research and data on trends in higher education, interviews with experts who are shaping the future of colleges, and the results of a poll of members of a Chronicle Research Services panel of admissions officials. ... Later reports in this series will look at college technology and facilities.
For the IR folks, Noel-Levitz has an interesting Cost of Recruiting report, which you can find in pdf form here. This gives good benchmarking information for assessing the efficiency of your admissions office.

Wednesday, January 13, 2010

Writing Projects

In the Southern Association (SACS) region, a Quality Enhancement Plan is now part of the decennial accreditation reaffirmation process. This is a project to improve student learning. At Coker we focused on writing, and I've stayed interested in the idea of how to better teach and assess writing. After bumping into several others with similar challenges in this area at the annual SACS meeting, I decided to try to make a list of writing QEPs. The list is necessarily incomplete; if you have others I can add, please email me.

The hyperlinks are to QEP documents where I could easily find them. I will update this list as I get more information.

Auburn University-Montgomery (WAC site)
Caldwell Community College & Technical Institute
Catawba Valley Community College
Central Carolina Community College
Clear Creek Baptist Bible College
Coker College
Columbus State University
Judson College
King College
Liberty University (pdf)
Lubbock Christian University (pdf)
South College
Texas A&M International University
The University of Mississippi
University of North Carolina Pembroke (pdf)
University of Southern Mississippi (pdf)
Virginia Military Institute (qep) (core curriculum)

One source: List of 2004 class QEPs from SACS (pdf)

My blog posts on writing assessment

Friday, January 08, 2010

Assessment Committees

In planning a university-wide review of SACS 3.3.1 compliance this spring, we had to consider how to structure the process and the committees behind it. After discussion, it occurred to me that there are really two different things going on, and that they might be profitably separated. One is the ubiquitous Assessment Committee, which has been useful over the last year as a kind of R&D group--thinking up ways to assess learning outcomes more effectively. One subcommittee worked on technology (eportfolios, for example), and another on results and meaning (I called it the epistemology committee). Both of these are composed mostly of faculty.

But the process of review is something else. For one thing, assessment is only one component, and arguably not even (gasp) the most important. For the assessments to be useful, several things have to go right: thinking about what assessments would be meaningful in advance of other planning, organizational follow-through, and use of results with the big picture in mind. To me, this sounds like a job for department chairs. So I will see if I can get a small group of chairs to form an Academic Effectiveness Committee to complement one on the administrative side. The Assessment Committee can still do the R&D, but the actual review of program reports will be done by the new creature.

Speaking of which, there is an interesting discussion on the SACS-L listserv (see this post) about what the standard of success should be for 3.3.1. For the non-SACS folks, this is accreditation speak for the requirement to close the loop in effectiveness planning, expressed for learning outcomes thus:
3.3.1 The institution identifies expected outcomes, assesses the extent to which it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: educational programs, to include student learning outcomes
At the heart of the discussion is the meaning of the language in 3.3.1, which hasn't changed much since 2006, when I wrote to the authors to ask that they resolve the ambiguity (posted here).

It occurred to me that there is an odd thing about 3.3.1, if interpreted most strictly. It essentially asks every institution and every program to conduct independent scholarly research on the learning of students. To do this right is obviously impractical, so it starts to resemble China's Great Leap Forward:
Mao encouraged the establishment of small backyard steel furnaces in every commune and in each urban neighborhood. Huge efforts on the part of peasants and other workers were made to produce steel out of scrap metal. To fuel the furnaces the local environment was denuded of trees and wood taken from the doors and furniture of peasants' houses. Pots, pans, and other metal artifacts were requisitioned to supply the "scrap" for the furnaces so that the wildly optimistic production targets could be met. Many of the male agricultural workers were diverted from the harvest to help the iron production as were the workers at many factories, schools and even hospitals. [wikipedia]
It would make more sense to give programs a choice: either sign up to do real research on outcomes, or simply adopt proven techniques that resulted from actual research at a well-funded institution. Why run thousands of ill-designed experiments on the same subject in parallel instead of a few good ones, and just use the results of those? Of course, this has a dark side, since the big psychometric industry would love to lock up this business. Still, if improving learning outcomes is our goal, why aren't we focused more on using techniques that are known to work instead of continually trying to discover them?

Wednesday, January 06, 2010

Sliding Toward Vocation?

One of the things I learned about the history of the academy came from Robert Zimmer's speech (see this post). He quotes the Berlin model's objectives, the first of which was "to teach students to think, not simply to master a craft." Compare that to some statistics from a recent NY Times article, "Making College 'Relevant'."
Consider the change captured in the annual survey by the University of California, Los Angeles, of more than 400,000 incoming freshmen. In 1971, 37 percent responded that it was essential or very important to be “very well-off financially,” while 73 percent said the same about “developing a meaningful philosophy of life.” In 2009, the values were nearly reversed: 78 percent identified wealth as a goal, while 48 percent were after a meaningful philosophy.
Of course, some of this must have to do with the economic malaise. From the article:
In Michigan, where the recession hit early and hard, universities are particularly focused on being relevant to the job market.
On the other hand, the democratization of higher education--as an expectation for a white-collar job rather than an experience for elites--would be expected to have this kind of effect. Given the time frame of the survey (1971 to 2009), however, the democratization argument fails, because the number of college grads has grown very slowly over that period (see this post for a graph). It would be interesting to see the year-by-year data from the survey to perhaps decipher the effect of economics on the results.

Whatever the causes, the effect is pressure on college curricula to shed 'irrelevant' programs like classics and philosophy. The latter is particularly ironic, considering how well philosophy majors actually do in their careers (I refer back to the WSJ article I've cited several times, here).

I have my own biases, which I'll declare here. I think it's not necessarily a bad thing to have a Malthusian weeding of content now and again. Given unlimited resources, academia naturally expands to create ever more esoteric disciplines. The test for real relevance is how a subject of study ultimately intersects with the universe outside academia. This need not be vocational; philosophy has obvious and real applications.

The article quotes the AAC&U survey that led to the LEAP initiative as evidence that a freshman interest in vocational outcomes may be misplaced:
There’s evidence, though, that employers also don’t want students specializing too soon. The Association of American Colleges and Universities recently asked employers who hire at least 25 percent of their workforce from two- or four-year colleges what they want institutions to teach. The answers did not suggest a narrow focus. Instead, 89 percent said they wanted more emphasis on “the ability to effectively communicate orally and in writing,” 81 percent asked for better “critical thinking and analytical reasoning skills” and 70 percent were looking for “the ability to innovate and be creative.”
Notice that these skills are not particularly discipline-based. This suggests some strategies for institutions that are light on their feet:
  • Create a liberal arts curriculum that can allow any major to sell itself as preparation for the workforce
  • Infuse thinking and communication skills into the majors through a senior experience that assesses and amplifies them
  • Use alumni surveys to see what the effects are 5, 10, and 15 years after graduation, and use that information to improve and sell the programs
Perhaps it is the case that the market for freshmen is so consumer-oriented that they just want a shiny package that says "New and Improved!" "Get a Great Job!" "Make Lotsa Dough!", and comes in various colors and sizes with different brands like "Dolphin Psychologist" or "Business Consultant." Just like buying cereal at the grocery store. If this is the case, then one strategy to combat that is to wrap those marketing slogans around the institution itself and de-emphasize the majors. The easier and more cynical strategy is to play to the audience and try to create slick marketing for majors and let Darwinian evolution take over. Infomercials for your MBA program might be a good idea.

Tuesday, January 05, 2010

Happy New Year! Oh, and SACS

Dear blogospherians, cyberdenizens, and hypertextuals,

Here's hoping your 2010 is a happy and productive one! I've had a lovely and relaxing two week break, in which to read and write and do the inevitable projects around the house. At some point I came across an article about why we humans behave inconsistently. For example, we may have fabulous will power one day to stick to a diet, and then throw it out the window the next. The explanation in this psychology piece was that we have what you might call different personalities that inhabit our cranial chambers at different times. This sounds creepy and like something out of a horror film, but it was meant in a mild sense. Whatever the case, I decided to embrace that idea and just read sci-fi novels and not think about work for two weeks, and boy did it feel good. I discovered Richard K. Morgan, Charles Stross, and Jack McDevitt. Now I have a stack of unread books by these gentlemen that will take me months to read. At the bitter end I did some writing myself on a novel that progresses asymptotically (i.e. only to be finished if there is infinite time available).

If you are in the South, the Southern Association needs no introduction. There is now a listserv for sharing ideas about accreditation issues, thanks to Patrick S. Williams, PhD, Associate VP for Institutional Effectiveness, University of Houston-Downtown. This is very welcome! You can sign up here (instructions copied from Pat's email):
To subscribe:
- Send an email to LISTSERV@LISTSERV.UHD.EDU (caps not required; they're used here only for clarity).
- You can leave the subject line blank.
- In the body of the email, write SUBSCRIBE SACS-L YOURFIRSTNAME YOURLASTNAME (be sure to substitute YOUR first and last names).
- It's best (but not essential) to delete your signature or anything else that follows the SUBSCRIBE message.
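For the scripting-inclined, the steps above can be composed programmatically. Here is a minimal Python sketch that builds the subscription message (the listserv address and command are from the instructions above; the sender address and names are placeholders you would substitute). It only constructs the message; actually sending it would require an SMTP server you're authorized to use.

```python
from email.message import EmailMessage

def make_subscribe_message(first: str, last: str, sender: str) -> EmailMessage:
    """Compose a LISTSERV subscription email for SACS-L.

    Per the instructions: the subject may be left blank, the body
    should contain only the SUBSCRIBE command, and signatures
    should be omitted.
    """
    msg = EmailMessage()
    msg["To"] = "LISTSERV@LISTSERV.UHD.EDU"
    msg["From"] = sender
    # Body is just the command, with the subscriber's real names substituted.
    msg.set_content(f"SUBSCRIBE SACS-L {first} {last}")
    return msg

# To send, hand the message to smtplib, e.g.:
#   import smtplib
#   with smtplib.SMTP("your.smtp.server") as s:
#       s.send_message(make_subscribe_message("Jane", "Doe", "jane@example.edu"))
```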
Note that the listserv isn't actually moderated by SACS officials--when they do weigh in, they tend to hint and nudge rather than say anything that's not official policy (and for good reason). But crowdsourcing the expertise of member institutions is a wonderful way to solve problems. Ultimately, it would be really nice to have something like StackOverflow for accreditation issues; it's a very smart interface for group problem solving. The overhead is considerably higher than for a listserv, however, and it requires a large body of consistent users to be of much use.

In other news, there's an interesting comment exchange on my post "Assessment and Automation". Scroll to the bottom to see it.