Portfolios can be accumulators for this sort of work. There are commercial systems available, as well as open source ones like OSP (Open Source Portfolio, literally enough), which integrates with the Sakai online learning system. The OSP home page advertises that portfolio owners have access to:
- tools to collect items that best represent their accomplishments, their learning, or their work;
- tools to reflect upon these items and their connections;
- tools to design a portfolio that showcases the best selections of this work;
- and tools to publish the portfolio to designated audiences.
The advantages of eportfolios are obvious. Years ago I was on a committee to review the old-fashioned kind--manila folders bursting with papers that supposedly documented student accomplishments in language and numeracy. The stakes were high--if a student failed to pass this review, his or her graduation could be held up. In practice, the reliability and validity of the review were very doubtful, and the record-keeping was terribly time-consuming and imperfect. There was little transparency in the process, and it was ultimately abandoned.

A few years later, when I chaired the Institutional Effectiveness Committee, the accreditation process presented me with the problem of assessing our Quality Enhancement Plan, which addressed student writing. This time we built an eportfolio. I took the ASAP approach (as simple as possible) to minimize support problems and maximize usage--it takes very few barriers to keep users away. In this we were very successful: we gathered oodles of writing samples without having to try hard at all. I called the thing {iceBox} because that's what my grandmother always called her refrigerator, and it tickled me to give new meaning to the term.
Assessment was harder than collection. We tried the obvious approach: creating a rubric for writing, assembling a sample of portfolios (each was a selection of three writing samples, so it was a "portfolio" only in a loose sense), and convening a committee to rate them. After a lot of effort, we concluded that this approach was a good example of an idea that seems obviously correct until you actually try it. It is much better to tie assessment closely to coursework--that is, to leave it centered in the classroom, where it can have an immediate impact and retains its authenticity.
There are two lessons from my experience. First, software that does one thing simply and well is preferable to general tools that try to do everything. In Vinge's novel, this is analogous to the black-box components that fit together like Legos. For two excellent examples of this kind of thing, see my post on collaborative software. There are many, many components on the web that can be used to assemble portfolio-type materials of all kinds, from music to language to geography to graphics to publishing and beyond, and they are evolving all the time. A hyperlink can take you to an audio file of a speech, a movie, or a location in Second Life, to name three of a virtually unlimited number of species of potential portfolio artifacts.
The earliest cars looked like carriages, because that was the obvious transition from horse to no horse. Similarly, our first generation of eportfolios tries to be a big electronic manila folder. This too will evolve, I'm sure, because what's called for is not the design-and-publish components (items 3 and 4 in the OSP list) but the means to link together artifacts that can live anywhere in cyberspace. A student's portfolio per se is just the connections between presentations, and could consist of a single hyperlink to a blog as its entry point.
The second lesson I learned relates to the second requirement of a portfolio used for educational purposes: the assessment documentation has to get done in a way that makes sense, and there are a lot of ways to do it wrong. Assessments need to be pertinent to ongoing educational processes in real time, need to be authentic (related to class work), and the burden of creating them needs to be minimal if they are to get done at all. That's a tall order.
Yesterday I mused about implicit rules in the academy. The Center for Teaching, Learning, & Technology (TLT) at Washington State University has produced a very interesting spectrum of assessment focus. You can download it here as a PDF; I've reproduced the first line below as a sample.
Working through this sheet is like taking an inventory of the implicit rules and assumptions concerning assessment practices. The third item, "Expert consensus from the community of practice validates the assessment instrument," sounds like the primary basis for the general education assessment we had the most success with.
This idea is related to portfolio assessment by WSU's TLT Center through the notion of a "Harvesting Gradebook," which I mentioned briefly yesterday as a kind of disruptive technology--in this case, an idea that challenges the implicit rules of how grades and grading work. On their blog, Nils Peterson writes that
As originally articulated by Gary [Brown], the gradebook "harvested" student work, storing copies of the work within itself where it was assessed. On further discussion, the concept became inverted: what was "harvested" were assessments, from work that remained in-situ.

The key here is that the components of the portfolio can live anywhere on the web. The piece that brings them all together is the assessment. This fits perfectly with the evolution from "horse-drawn" portfolios to an ASAP component model. Do the assessment part really well, in other words, and don't try to recreate a music-editing program or web-based word processor inside your eportfolio software. A sample assignment contains notes about how to write a blog. My understanding is that it doesn't specify where or what software to use, but what the content should look like. This separates the means of creation from the assessment of the product, which both simplifies and enriches the portfolio. I recommend taking a look at the sample survey here, which permits feedback from various types of audiences with rubrics and comment boxes.
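The inverted concept can be sketched in a few lines of code. This is only an illustration of the architecture, not the WSU implementation; all of the class and field names below are hypothetical. The point it makes concrete is that the portfolio stores nothing but links--the work stays in-situ on the web--while the assessments are what get harvested and kept.

```python
# Sketch of the "harvesting" idea: student work stays where it lives on
# the web (identified only by URL); assessments are collected centrally.
# All names here are hypothetical illustrations, not the WSU system.
from dataclasses import dataclass, field

@dataclass
class Assessment:
    rater: str          # e.g. instructor, peer, outside professional
    criterion: str      # the rubric dimension being rated
    score: int          # rating on the rubric scale
    comment: str = ""

@dataclass
class Artifact:
    url: str                                  # the work lives anywhere
    assessments: list = field(default_factory=list)

@dataclass
class Portfolio:
    student: str
    artifacts: list = field(default_factory=list)  # just links, no copies

    def harvest(self, url, assessment):
        """Attach an assessment to the artifact at `url`, adding the
        link first if we haven't seen it before."""
        for artifact in self.artifacts:
            if artifact.url == url:
                artifact.assessments.append(assessment)
                return
        self.artifacts.append(Artifact(url, [assessment]))

p = Portfolio("student@example.edu")
p.harvest("https://example.edu/blog/post-1",
          Assessment("peer", "clarity", 3, "Good thesis, weak evidence."))
p.harvest("https://example.edu/blog/post-1",
          Assessment("instructor", "clarity", 4))
print(len(p.artifacts), len(p.artifacts[0].assessments))  # 1 2
```

Note that multiple audiences can rate the same artifact, which is exactly what the sample survey's mix of rubrics and comment boxes allows.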
There is a lot of material about this project on the WSU/TLT blog, and I look forward to learning more about it. It's interesting that for all the verbiage in the assessment community about grades being lousy assessments, this is the first time I've actually seen a proposal that would radically transform the practice of grading. I hope the registrar has a defibrillator in the office.
David,
Thanks for the summary. Here is a reply to your analysis.
Thanks for the reply and link. It will be interesting to watch how the project evolves, particularly regarding how the community can participate.