Wednesday, May 05, 2010

FACS Redux

I have finally gotten the faculty assessment of core skills (FACS) up and running here at the new place. My first two rounds of encouragement generated 552 ratings. This time I built the system on a database, so I can get the results instantly. I read somewhere that students are more likely to take work seriously if they get immediate feedback, so I figure it works well on faculty too. The way it works is that faculty log into the staff portal (built on the open-source project openIGOR).

The link at the bottom brings up a list of sections that were taught by that prof, showing what work has been done already.

A list of students for that section comes up, showing existing ratings as red numbers beside the drop-downs (this is so they don't get submitted again). I obviously redacted the names and IDs.  The skills are shown across the top (more on that below).  The drop-down menus can be used to make new ratings.


Once submitted, a report on the current results appears instantly, including graphs generated with the Google Chart Tools API, which is very cool.  An example is shown below.  If you look at the image source, you'll see the Google code that defines it.  Basically, the graph is defined entirely by its URL, with parameters specified in the documentation you can find online.
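To give a flavor of how a URL defines a chart, here is a small sketch in Python. The parameter names (`cht`, `chs`, `chd`, `chxt`, `chxl`) come from the Google Chart API documentation; the data values and labels are made up for illustration, not taken from the actual FACS results.

```python
# Sketch: build a Google Chart image URL from a list of percentages.
# The numbers here are invented for illustration only.
from urllib.parse import urlencode

def chart_url(percentages, labels):
    """Return a chart URL: cht=bvs is a vertical bar chart, chd=t:...
    carries the data in text encoding, chxl supplies x-axis labels."""
    params = {
        "cht": "bvs",                                   # vertical bar chart
        "chs": "400x200",                               # width x height, px
        "chd": "t:" + ",".join(str(p) for p in percentages),
        "chxt": "x,y",                                  # show both axes
        "chxl": "0:|" + "|".join(labels),               # x-axis labels
    }
    return "https://chart.apis.google.com/chart?" + urlencode(params)

# Hypothetical distribution across the four scale categories:
url = chart_url([12, 30, 40, 18], ["Dev", "Fr-So", "Jr-Sr", "Grad"])
print(url)
```

Pasting a URL like this into a browser (while the service was live) returned the rendered PNG, which is what makes it so convenient to embed in a report page.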

The graph below shows results for Analytical/Deductive, with N=130.  The bars are percentages, showing how the responses are distributed across the range "developmental" to "graduate."  The small numbers at the top are the average number of credit hours in that category.  Take a look.

Although I hope to do a lot better on sample size (1300 students x 5 classes = a max of 6500 ratings per skill!), it's fun to glean meaning from this little sample.  We have increased our admissions standards dramatically, which probably has something to do with the credit hour oddity in the graph.  Another interesting one is for Effort, with N=198 currently:

If this pattern holds up, it may be evidence that only students who are showing exceptional effort stand out in terms of persistence, which says interesting things about the curriculum.  This will be a good conversation to have, regardless of the solidity of the data.  This is the hallmark of a good assessment technique, I think: to draw attention to issues that are important.  Statistical certainty is not required (good thing, since it doesn't exist).

I also report which disciplines have submitted ratings.

Here's the key at the bottom of the survey.

Key:
Analytical/Deductive: Ability to learn facts, processes, theory, language, and terminology, and to apply them correctly in cases of increasing sophistication. For example, following formulas with deductive reasoning: going from general statements to specific applications.
Creative/Inductive: Ability to reason inductively, from specific to general, showing increasingly sophisticated insights; to demonstrate the production of new knowledge, formulas, theory, language, or terminology; and to use trial and error to solve problems non-deductively. Usually a strong analytical/deductive base is required in order to understand problems and check solutions.
Writing Effectiveness: Formal writing in the context of the course or discipline.
Speaking Effectiveness: Formal speech and use of language in the context of the course or discipline.
Effort: Expenditure of time and energy in a good-faith effort to succeed.
Scale:
Developmental: Not performing at the level we would expect of incoming freshmen.
Fresh-Soph: Performing at the level we would expect of a first- or second-year student.
Jr-Sr: Performing at the level we would expect of a third- or fourth-year student.
Graduate: Performing at the level we expect of our graduates.


General: Only rate students you have something to say about, and rate only the skills you feel you can speak to. You do not need to have taught them writing (for example) to rate their writing, if you have seen it. If you have no opinion, leave it blank.
