Monday, December 14, 2009

Link Salad, Analytics, and Pell

Some links of interest for the numerically inclined:
Also, in "Want a Job? Analytics is the Thing, Says IBM" we find that data mining and business analytics are the new plastic:
“In this world, intelligence is replacing intuition,” said Ambuj Goyal, a former IBM researcher who is now General Manager of IBM’s Business Analytics and Process Optimization Group. 35,000 people now report to Goyal, according to IBM.

Simoudis believes the demand for these jobs will only grow thanks to several big trends. One is the sheer data explosion. When Simoudis was working in the software business in the 1980s, he said, data warehouses used to handle two terabytes of data. Today, just one small online ad network is generating 100 terabytes of data, while social network Facebook is spewing out 1.5 petabytes of data a year, or 1,500 terabytes. All those status updates and party photos consume massive amounts of data.
I see a new learning outcome for general education... At least it's an argument for requiring computer programming in addition to math.
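Just for fun, here's a back-of-the-envelope version of those figures--a minimal sketch in Python, assuming the article means decimal units (1 PB = 1,000 TB):

```python
# Rough comparison of the data-growth figures quoted above.
# Assumes decimal units: 1 petabyte = 1,000 terabytes.

warehouse_1980s_tb = 2        # a 1980s data warehouse, per Simoudis
ad_network_tb = 100           # one small online ad network today
facebook_pb_per_year = 1.5    # Facebook's reported annual output

facebook_tb_per_year = facebook_pb_per_year * 1000
print(f"Facebook: {facebook_tb_per_year:,.0f} TB/year, "
      f"about {facebook_tb_per_year / warehouse_1980s_tb:,.0f}x a 1980s warehouse")
print(f"Ad network: about {ad_network_tb / warehouse_1980s_tb:,.0f}x a 1980s warehouse")
```

By that arithmetic, Facebook alone generates the equivalent of 750 of those 1980s warehouses every year.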

Finally, the story you probably saw in InsideHigherEd "Pell Costs Explode":
Obama administration officials confirmed on Thursday that unexpectedly strong demand for Pell Grants would sharply increase government spending on its primary need-based student aid program, requiring an extra $18 billion over the next three years.
The article links the overage to the economic hard times, and this is likely valid. But I wonder how much of it is due to the cyberdons (i.e., online for-profits). Surely, with their burgeoning market share of the lesser-qualified student, they're creating more of a market for Pell grants. If so, this pressure will not abate but will eventually force Pell grants to be capped, hurting the traditional schools--an indirect effect of competition. There is, ultimately, a finite amount of resources.

Friday, December 11, 2009

Survey Addresses Drop-Outs

Public Agenda's recent report "With Their Whole Lives Ahead of Them" is subtitled "Myths and Realities About Why So Many Students Fail to Finish College." It's essential reading for anyone interested in student retention. Citing an average 40% six-year graduation rate for four-year degrees, the report tries to answer the "why?" question with a survey of 600 young adults who had firsthand experience. The complete methodology can be found here. The report is released under a Creative Commons license.
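One note on precision: with 600 respondents, the percentages that follow carry a worst-case sampling margin of error of about four points. Here's a minimal sketch of that calculation, assuming a simple random sample (the report's actual design and weighting may differ):

```python
# Worst-case 95% margin of error for a survey of n = 600,
# assuming a simple random sample.
import math

n = 600
z = 1.96    # 95% confidence
p = 0.5     # worst-case proportion
moe = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {moe * 100:.1f} percentage points")  # about +/- 4.0
```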

The demographic characteristics of those surveyed belie the image of a typical college student. Quoting from the article:
  • Among students in four-year schools, 45 percent work more than 20 hours a week.
  • Among those attending community colleges, 6 in 10 work more than 20 hours a week, and more than a quarter work more than 35 hours a week.
  • Just 25 percent of students attend the sort of residential college we often envision.
  • Twenty-three percent of college students have dependent children.
The main part of the report is framed around "myths and realities," such as:
MYTH NO. 1: Most students go to college full-time. If they leave without a degree, it’s because they’re bored with their classes and don’t want to work hard.

REALITY NO. 1: Most students leave college because they are working to support themselves and going to school at the same time. At some point, the stress of work and study just becomes too difficult.
According to the survey, "Those who dropped out are almost twice as likely to cite problems juggling work and school as their main problem as they are to blame tuition bills (54 percent to 31 percent)."

The report's graphs display ranked survey items; a portion is shown below.


The third section addresses an issue that I had discovered independently.
MYTH NO. 3: Most students go through a meticulous process of choosing their college from an array of alternatives.

REALITY NO. 3: Among students who don’t graduate, the college selection process is far more limited and often seems happenstance and uninformed.
My retention study at one institution showed that students who had reported on the CIRP that they were at their first-choice college had three other interesting characteristics. One was that they tended to be first-generation students. They were also by far at the highest risk for attrition. And in a subsequent survey two months after the CIRP, many of them had changed their minds about the first-choice qualification. In short, they were uninformed consumers and became quickly disaffected--or at least, that's the way I interpreted the data.
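To make that concrete, here's a hypothetical sketch of the kind of cross-tab I mean. The column names ("first_choice", "first_gen", "retained") and the toy data are invented for illustration; an actual analysis would merge the real CIRP items with an enrollment extract.

```python
# Hypothetical illustration only: toy data and invented column names.
import pandas as pd

df = pd.DataFrame({
    "first_choice": ["yes", "yes", "no", "yes", "no", "no", "yes", "no"],
    "first_gen":    ["yes", "yes", "no", "yes", "no", "yes", "no", "no"],
    "retained":     [0, 1, 1, 0, 1, 1, 0, 1],   # 1 = returned the next fall
})

# retention rate by first-choice response
print(df.groupby("first_choice")["retained"].mean())

# how first-choice responses overlap with first-generation status
print(pd.crosstab(df["first_choice"], df["first_gen"], normalize="index"))
```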

The survey asked respondents what would help other students get a degree. The top responses are shown below.

The other two "myths" presented in the report are:
MYTH NO. 2: Most college students are supported by their parents and take advantage of a multitude of available loans, scholarships, and savings plans.

REALITY NO. 2: Young people who fail to finish college are often going it alone financially. They’re essentially putting themselves through school.
and
MYTH NO. 4: Students who don’t graduate understand fully the value of a college degree and the consequences and trade-offs of leaving school without one.

REALITY NO. 4: Students who leave college realize that a diploma is an asset, but they may not fully recognize the impact dropping out of school will have on their future.
Much more information and analysis is available in the report itself. I'm not crazy about post hoc retention surveys: if we want real predictors, we need to gather information before students leave, and afterwards the causes and effects may shift over time in the minds of the students, as with the first-choice question noted above. On the other hand, this report is thoughtfully done and does seem to illuminate some interesting issues.

There is a section on what can be done to help. Providing more financial aid for part-time students is one suggestion, along with more flexible options for attending. I presume that online courses would fill the second bill nicely. It seems obvious that more work-study options on campus would help too--a double win for the college, since students would be more engaged while providing inexpensive labor. Addressing child-care problems for students ought to be high on the list as well.

Thursday, December 10, 2009

SACS 2009

The annual Commission on Colleges meeting was in Atlanta this year, and I drove down from Charlotte with a colleague under blue skies. Drove back in the rain. In between, the days and nights were packed. I had optimistically taken along one book on higher ed, one text on complexity theory, two sci-fi novels, and a briefing book of articles on college enrollment (is there a word for fear of being stuck somewhere without something to read?). I got through a total of about six pages, I think.

The sessions were mostly good, and the early (as in 7:30am) round tables were even better. It's an open secret that the round table discussions can be the best part of the conference. I tried to find sessions on the SACS five-year report, which is a new mid-cycle reporting requirement. Here are some of the more interesting bits from my notes:
  • You can change the QEP mid-stream. Radical change, like revising the goals, isn't recommended, but it is accepted that institutions change and plans don't always work as intended. Liberty University gave a presentation on this.
  • After the five-year report, the QEP is no longer relevant to accreditation. Some projects may be over after five years. Others are the property of the institution, to do with as it pleases. Some will become institutionalized, others quietly dropped. Basically, after the report, it's time to start thinking about the next one.
  • For the pilot institutions--the first to come under the Principles of Accreditation--no one flunked the impact report on the QEP except institutions that simply didn't execute it at all. On the other hand, several sections of the limited compliance certification that goes with the report were problematic, including documenting a policy for handling student complaints and properly addressing distance learning programs.
  • Some institutions have prepared drafts of the impact report and are willing to share. For example, University of West Florida has a wealth of public documents about their QEP here.
  • On the SACS website, under Institutional Resources, there are the official documents about the report. The direct page is here, which includes report instructions and timeline.
  • SACS voted to change the rules to make it easier to pass the QEP (section 12 of the Principles). This is a technical change that allows recommendations to be made about the QEP proposal during the decennial reaffirmation process without triggering a full-fledged punitive sanction. This doesn't have anything to do with the five-year report, but signals that SACS is being reasonable about the requirements.
  • The five-year report is encouraged to be electronic (a CD or a website, though a website isn't advised) but should be self-contained. We are not supposed to submit both on paper and electronically, unlike my experience with the compliance certification, where I learned at the last minute that they wanted both (in addition to renumbering all the sections). Any electronic report should be user-friendly. I'd like to underline that: the typical higher ed admin is NOT tech-savvy, so use low-tech solutions. I still like paper, based on my experiences with review committees, and don't see any advantage to risking an electronic submission. We were advised that whatever we present should not depend on links to the main university site--the report has to be self-contained, even if electronic.
There's much more from the conference, which I'll get to later. Let me sign off with some proof of my assertion that SACS attendees aren't much into technology. I used one of the shared computers at the conference to check Twitter and see what the backchannel chatter was. Here's all that was there:
One lousy post. Compare that to #educause...