Saturday, February 17, 2007

First, Do No Harm

Our state meeting of independent schools' institutional researchers was yesterday. As usual, I picked up some new ideas. I learned about Remark Office OMR (optical mark recognition) software, which can be used to create your own scannable forms for doing surveys. This eliminates having to buy the expensive forms from Scantron (which, to be fair, work well--that's what we currently use without problems). The scanner is also apparently more forgiving about what kinds of marks it accepts. So no more sharpening 1,000 pencils for student evaluation day.

There was also a good discussion of the role of the IR office in strategic planning. The 'safe' option for an IR office is to produce reports and forward them on to the decision-makers. I did a fair amount of that myself while finding my way as a newly appointed IR director. The problem is that decision-makers aren't necessarily in the best position to use the data. Also, the information we work from is often incomplete and open to interpretation. Accreditors like SACS seem to imagine a perfect world where assessment inevitably leads to improvement. But the essence of leadership is courage, not data. In mulling this over, I have come to the conclusion that we in the IR business should help the decision-makers as follows:
  1. First, do no harm. If the data are pretty strong against a current or proposed policy, use IR's influence to get the policy changed. Example: One of my colleagues at the meeting mentioned that he'd just finished a project having to do with AP testing. In this case, the faculty were convinced that students who used AP credit to skip the introductory class in a subject were ill-prepared to handle the second course. In looking at the data, however, he found that the opposite was true, except in one case. In that case the curriculum for the AP class wasn't a good match for the 'target' course. So he helped kill a proposal that would have eliminated AP credit at the university--a decision that would have had massive admissions implications.
  2. Most of the time, the data aren't going to point one way or the other. In these cases, it is better to encourage decision-makers to adopt policies that subjectively may have a positive effect. That is, rather than creating 'data paralysis' when there is no clear direction, keep moving and trying out new things. At worst you identify which policies are bad, and at best you may find the right one accidentally. Example: We adopted an initiative to improve student writing. We have some assessment data on writing, but nothing that clearly and unequivocally says DO THIS! So the committee assembled a variety of opinions about remedies and eventually adopted some of these as policy. They are quite reasonable and may be expected to improve the situation. In parallel, our assessments are improving, so we may be able to tell in a year or two whether or not we have succeeded. This is much better than simply giving up because there is no clear way forward.
  3. If the data clearly indicate the need for a new policy, do whatever you can to get it implemented. This is what an IR officer should live for--that one study or report that shows YES--this thing really matters and we should act on it. This is where the courage part of leadership comes into play, because said IR director will be out there making the case as the spokesperson. Simply passing it on isn't good enough. Make it happen! Example: We recently did a shotgun-style survey of our students and found stark differences between first-generation students and students whose parents graduated from college. The differences were in attitudes about money, plans after graduation, and attitudes toward offices and services provided by the college. Given the national statistics on attrition for this group, these data cry out for action. So I've used three committees to sell the idea--the IE committee, our retention committee, and the one on improving writing. The issue of preparing first-gen students for their first year cuts across all these areas.
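Both the AP-credit study and the first-generation survey boil down to the same analysis: compare an outcome between two groups and ask whether the difference is big enough to act on. Here's a minimal sketch of that comparison in Python--the grades and group names are invented for illustration, not real data from either study:

```python
from statistics import mean, stdev
from math import sqrt

# Hypothetical second-course grades on a 4.0 scale (invented numbers).
ap_skippers = [3.4, 3.7, 3.1, 3.9, 3.5, 3.3]   # used AP credit to skip the intro course
intro_takers = [3.2, 3.0, 3.5, 2.9, 3.4, 3.1]  # took the intro course first

def compare_groups(a, b):
    """Difference in group means and its standard error (Welch-style)."""
    diff = mean(a) - mean(b)
    se = sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    return diff, se

diff, se = compare_groups(ap_skippers, intro_takers)
print(f"mean difference = {diff:.2f}, SE = {se:.2f}")
```

In this toy example the AP-skippers come out ahead, which mirrors what my colleague found: the faculty's intuition pointed one way, the data the other. That's exactly the kind of result worth carrying into a committee room rather than leaving in a report.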
In summary, I advocate the position that the IR director form opinions about the usefulness of the data and then take leadership in acting on them. If the data are equivocal, encourage the debate and look for opportunities to use data or at least improve assessment. But don't just publish a report full of charts and numbers with lists of standard errors and dump it on somebody's desk. Taking action by advocating for or against a policy takes courage because you can find yourself on the 'wrong' side of the administration, AND you can be wrong! But I believe it vastly increases the usefulness of the IR office to the institution. There is a danger, of course, that IR will seem more politicized. But if you call them as you see them rather than being a sycophant to the administration, you'll be respected for it.

This relates to another role of IR--to provide artificial certainty. The question "how many students do we have?" probably has a dozen possible answers, depending on what you count as a student (part-time? FTE? degree-seekers?) and when you take the snapshot. Decision-makers don't need to know all the possibilities. So figure out a method of measuring this fuzzy number and then publish it as fact. And it shall be so. As a story about three umpires has it, the third says of called plays: "They ain't nothing until I call them."
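The point is easy to make concrete: each "official" number is just a different function applied to the same enrollment snapshot. A small sketch, with invented records and field names (your student-information system will call these something else), showing three common definitions side by side:

```python
# Hypothetical fall-census snapshot; ids, credits, and flags are invented.
students = [
    {"id": 1, "credits": 15, "degree_seeking": True},
    {"id": 2, "credits": 6,  "degree_seeking": True},
    {"id": 3, "credits": 3,  "degree_seeking": False},
    {"id": 4, "credits": 12, "degree_seeking": True},
]

FULL_TIME_CREDITS = 12  # assumed full-time load; institutions define this differently

# Three answers to "how many students do we have?"
headcount = len(students)
degree_seekers = sum(1 for s in students if s["degree_seeking"])
# One common FTE convention: full-timers count as 1, part-timers as credits / full-time load.
fte = sum(min(s["credits"] / FULL_TIME_CREDITS, 1.0) for s in students)

print(f"headcount={headcount}, degree-seekers={degree_seekers}, FTE={fte:.2f}")
```

Four students, three degree-seekers, or 2.75 FTE--all true, all different. Pick one definition, document it, and publish it as the number; that's the umpire's call.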
