Testing Colleges

In an effort to prove that college students gain knowledge, administrators are creating assessment tests to measure how much their students actually learn. David Brooks writes about the findings from those tests in today's New York Times.

Colleges are supposed to produce learning. But, in their landmark study, “Academically Adrift,” Richard Arum and Josipa Roksa found that, on average, students experienced a pathetic seven percentile point gain in skills during their first two years in college and a marginal gain in the two years after that. The exact numbers are disputed, but the study suggests that nearly half the students showed no significant gain in critical thinking, complex reasoning and writing skills during their first two years in college.

Parents are demanding that their investment in a college education lead to more than an overpriced T-shirt from the campus bookstore. They want to see results. However, these assessment tests are incredibly difficult to create. How do you make one test that will accurately measure the knowledge that a philosophy major gains versus the knowledge gained by a kid who majors in chemistry? I'm not confident that a perfect test can be created.

However, we are entering a new era where colleges are going to have to prove their worth. Parents aren't going to write blank checks any longer. If colleges can't create a perfect assessment test, then other forms of assessment are going to happen.

I like the idea of a college ranking system that measures a university's commitment to its undergraduate population using quantifiable variables like the percentage of tenured or tenure-track faculty, the percentage of students who graduate in four years, the percentage of faculty teaching a full course load, class size, and average student loan debt. Other useful information, though trickier to fit into a ranking system, would be support for pedagogy, the existence of graduate programs, the availability of faculty, and the availability of required, lower-level classes.
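To make the ranking idea concrete, here is a toy sketch of how such a composite score could be computed. The weights, the scaling for class size and loan debt, and the two sample schools are all invented for illustration, not real data:

```python
# Toy sketch of a composite "undergraduate commitment" score.
# All weights and sample figures below are hypothetical.

def commitment_score(school):
    """Combine the variables into a single 0-100 score.

    Higher is better for every input except average class size
    and average loan debt, which are inverted before weighting.
    """
    weights = {
        "pct_tenure_track": 0.25,      # % of faculty tenured/tenure-track
        "four_year_grad_rate": 0.25,   # % graduating in four years
        "pct_full_load": 0.20,         # % of faculty teaching a full load
        "small_class_score": 0.15,     # 100 - avg class size, floored at 0
        "low_debt_score": 0.15,        # scaled inverse of avg loan debt
    }
    small_class = max(0, 100 - school["avg_class_size"])
    low_debt = max(0, 100 - school["avg_loan_debt"] / 400)  # $40k debt -> 0
    values = {
        "pct_tenure_track": school["pct_tenure_track"],
        "four_year_grad_rate": school["four_year_grad_rate"],
        "pct_full_load": school["pct_full_load"],
        "small_class_score": small_class,
        "low_debt_score": low_debt,
    }
    return sum(weights[k] * values[k] for k in weights)

# Two entirely hypothetical schools:
schools = {
    "State U": {"pct_tenure_track": 55, "four_year_grad_rate": 48,
                "pct_full_load": 70, "avg_class_size": 45,
                "avg_loan_debt": 28000},
    "Small College": {"pct_tenure_track": 80, "four_year_grad_rate": 75,
                      "pct_full_load": 90, "avg_class_size": 18,
                      "avg_loan_debt": 22000},
}
ranked = sorted(schools, key=lambda s: commitment_score(schools[s]),
                reverse=True)
```

Note that class size and loan debt are inverted before weighting, since smaller values are better; any real version would have to defend its choice of weights, which is exactly where the arguments start.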

Higher education is a hot topic right now. I'm glad to be part of the discussion. 

26 thoughts on “Testing Colleges”

  1. Some of those variables are inputs (e.g., percentage of courses taught by tenured faculty). To rank by inputs defeats the goals of (i) having objective measures of value and (ii) encouraging innovation. Maybe colleges would be better if there were no tenured faculty. I’m not particularly advocating such a system–because I know too little to advocate anything in this area–but I wouldn’t want to discourage someone from trying it. However, Laura’s rating would immediately downgrade such a college.
    The same thing might be said about class size. I have no reason to believe that small class sizes enhance learning in higher education; it seems like something that needs examination, not a priori valorization.

  2. I get what you’re saying, y81. I’m making the assumption that if the inputs are good, then the outputs will be good, too.
    I briefly watched the last college that I worked at impose assessment tests. It was very, very tricky. Even within political science, a student who specialized in international relations would come out with an entirely different knowledge base than a student who specialized in American politics.
    I suppose the best evaluation of a liberal arts education would be a test that measured general critical thinking. Perhaps a test could be created that examined a student’s ability to break down a complicated task and write an essay that reflected advanced thinking. However, there are practical complications. Who would grade such a test? Should colleges employ a full-time staff to grade all those essays? Is there really an objective measure of critical thinking? It’s tough. That’s why I went with an input-based assessment.

  3. I’m interested in the percentage of tenured vs. adjunct instruction because, in general, adjuncts are poorly paid and underappreciated. They might not even have an office in which to meet with students. A high level of adjunct employment is a sign to me that a university does not put resources into teaching undergraduates, especially in the lower-level classes.

  4. “The exact numbers are disputed, but the study suggests that nearly half the students showed no significant gain in critical thinking, complex reasoning and writing skills during their first two years in college.”
    Here’s another approach–figure out which students those are and don’t admit them.
    You may wonder how possible it is to predict that sort of thing in advance, but there is software for predicting final student course grades before they even set foot in the classroom. It’s being used to help match students with the right courses (although compliance with recommendations is voluntary).
    “In its first trial, Degree Compass predicted with 90 percent accuracy whether a student would earn a C or higher in any given course.”
    http://communitycollegespotlight.org/content/software-predicts-wholl-pass-the-class_8639/
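    Nothing in that piece describes how Degree Compass actually works, but the general idea (predict a course outcome from past students with similar records) can be sketched in a few lines. The features, the invented history data, and the distance scaling below are all hypothetical:

```python
# Purely hypothetical sketch of predicting whether a student earns a
# C or higher, using a nearest-neighbor average over past students.
# Degree Compass's real model is not described here.

import math

# (gpa, test_score_percentile, earned_C_or_higher) -- invented records
history = [
    (3.8, 90, 1), (3.5, 85, 1), (3.2, 70, 1), (2.9, 60, 1),
    (2.5, 50, 0), (2.2, 40, 0), (2.0, 35, 0), (3.0, 55, 1),
]

def predict_pass_probability(gpa, score, k=3):
    """Average the outcomes of the k most similar past students.

    Test scores are divided by 25 so a 25-percentile gap counts
    roughly as much as a 1.0 gap in GPA (an arbitrary choice).
    """
    by_distance = sorted(
        history,
        key=lambda r: math.hypot(r[0] - gpa, (r[1] - score) / 25),
    )
    nearest = by_distance[:k]
    return sum(r[2] for r in nearest) / k
```

    A real system would train on millions of grade records and far richer features; this only shows the shape of the prediction.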

  5. That sort of software would also be very helpful in figuring out which students should receive federally subsidized student loans (if we insist on offering them) or Pell Grants.
    If we wanted to go all nanny state, there could be a cigarette-package-style warning on admissions letters that says something like “ONLY 10% OF STUDENTS WITH YOUR GRADE AND TEST SCORE PROFILE SUCCESSFULLY GRADUATE IN FOUR YEARS FROM THIS COLLEGE”. Or maybe “50% OF STUDENTS WITH YOUR MAJOR DEFAULT ON THEIR LOANS WITHIN 7 YEARS OF GETTING THEIR ACCEPTANCE LETTER”.

  6. I like the idea of a college ranking system that measures a university’s commitment to their undergraduate population using quantifiable variables like percentage of tenured or tenure track faculty, number of students who graduate in four years, percentage of faculty teaching a full course load, class size, and average student loan debt.
    I think I (uncharacteristically) agree with y81. Setting up a ranking scheme like this encourages schools to exclude poor students (since they need to take out larger loans, and tend to drop out more often) and create “tenured” positions with $22K salaries and 5/5 course loads.
    By most accounts, Penn State is the best college in America for first generation college students. It makes a special effort to provide guidance to kids who have no familial background in how to “do” college. To the extent that it is relatively cheap, broadly available, and makes an effort to help kids reach their goals, it should be on the top of the list.
    But I doubt it would stack up in any of your proposed measures against (say) Bucknell, which is a private school full of over-privileged white kids who don’t need to take out loans because daddy is paying 100%.

  7. For example, U.S. News ranks inputs such as faculty-student ratio, faculty salary, and alumni giving. How did Case Western Law School respond? Brilliantly, I think.
    “The law school hired as adjunct professors local alumni who already had lucrative careers (thereby increasing the faculty-student ratio, a key U.S. News statistic used in determining ranking), paid them exorbitant salaries they did not need (thereby increasing average faculty salary, another U.S. News data point), then made it understood that since they did not really need all that money they were expected to donate it all back to the school (thereby increasing the alumni giving rate, another U.S. News data point): three birds with one stone!”
    http://budiansky.blogspot.com/2012/02/us-news-root-of-all-evil.html#ixzz1saswqKxJ

  8. An interesting question but I’m afraid that what’s really lurking behind such stories is an interest in standardizing college curricula. “How can we test all these different types of history courses or English courses? We can’t, but if we make everyone teach the same thing or offer them course modules which all teach the same thing, that’ll work!”
    We tread a fine line with the current mania for objectives, outcomes and learning assessment. Educational gurus tell us to adopt these premises but rarely explain how this relates to real-world experiences in different disciplines and for different students. It strikes me as a kind of magical thinking, to be honest!
    Yes, it’s a good idea to ask “what’s the purpose of college and how well is it doing?” but I hear warning bells whenever we get more of a push to quantify outcomes and aims, because that treats education as a process that can be streamlined. Each student really is different, each class is different, and the whole process evolves through back-and-forth response.

  9. Oh, yes, Amy P and Ragtime are right. Lots of places improve their retention numbers and GPAs by just not admitting poor students (poor in every sense of the word–needy and students who have fewer skills). But the mission of most educational institutions is not to teach those who don’t need any help, and I don’t think that rejecting these students is a real answer.
    If you go by Academically Adrift, you also know who a lot of those students are (the ones who don’t learn much at college)–they are business students and students in communications and other applied majors. These are the most popular majors and the ones that students are lining up to be in because they think they lead to jobs. Students in traditional liberal arts majors did better on tests of critical thinking than those from applied majors. There is not always a clear connection between the major that will get you a job and the one that will actually make you learn. Many students certainly don’t see one, although those of us in the humanities certainly try to emphasize it.

  10. “There is not always a clear connection between the major that will get you a job and the one that will actually make you learn.”
    I think that a lot of students go into iffy majors (communications!) because they don’t want to learn.
    If you forcibly switched the business majors with the English majors, you’d probably get very similar results. It’s not necessarily the classes themselves that are making the difference–some students are more teachable than others.

  11. Business majors do lead to jobs. http://online.wsj.com/public/resources/documents/info-Degrees_that_Pay_you_Back-sort.html Business Management, Finance, Accounting, Marketing, Communications and Economics degrees are better financial investments than Journalism, Nursing, Art History, Biology, Anthropology, Spanish, Education or English–when you look at the long-term, mid-career median salaries, rather than the starting salaries.
    I have been looking at college data for a few months. I’m not interested in a test purporting to measure students’ “gain in critical thinking, complex reasoning and writing skills during their first two years in college.” Standardized testing is not a secret sauce. As far as I’m concerned, the SAT writing exam and state tests force schools to teach really bad writing. There’s no need to force colleges to continue the practice. Let the five paragraph essay die, already.
    I also shudder at what a committee composed of political appointees would decide a college graduate should know. I do not want to pay tuition for classes in American History Redux and Life Skills, so that some politician can spout off about “improving American higher education.”
    Is there any proof that a college education improves “critical thinking, complex reasoning and writing skills,” in comparison to the same student spending four years in the workforce? It seems to me to be an article of faith, rather than a conclusion based on evidence.
    If you want to compare colleges by their graduates, release GRE, LSAT, MCAT scores for the students who go on to graduate or professional school, and the percentage of students employed within a year (two years, three years) of graduation. A chart of average time needed to repay student loans wouldn’t hurt either.

  12. Cranberry says: “I’m not interested in a test purporting to measure students’ “gain in critical thinking, complex reasoning and writing skills during their first two years in college.” Standardized testing is not a secret sauce. As far as I’m concerned, the SAT writing exam and state tests force schools to teach really bad writing. There’s no need to force colleges to continue the practice. Let the five paragraph essay die, already.”
    Right. A lot of standardized essay tests are pretty dubious exercises.
    “Let’s face it: they throw a question at you that you’ve likely never considered before and have no interest in, and then they give you 25 minutes (30 minutes on the ACT) to think about it, plan it, and answer it in writing. And then they’ll make a determination, based on that sole, bogus writing sample, about how well you write. Ridiculous!”
    http://collegeprepexpress.wordpress.com/2012/03/18/5-tips-to-high-scores-on-sat-and-act-essays/
    Cranberry says: “If you want to compare colleges by their graduates, release GRE, LSAT, MCAT scores for the students who go on to graduate or professional school, and the percentage of students employed within a year (two years, three years) of graduation. A chart of average time needed to repay student loans wouldn’t hurt either.”
    Very true. The last item will reflect both salary and tuition levels, which is not a bad thing.

  13. And the Case Western story illustrates the fundamental point — any high-stakes metric will be gamed. And metrics that don’t measure results are particularly vulnerable (i.e., input measures).

  14. Ditto on the standardized essay testing. My very best student writer got a mediocre score on the GRE essay because, as he said, “They asked whether I thought technology was a good thing or a bad thing, so I spent some time thinking about the question…” Length is what gets the good score, according to a colleague who has studied these things.
    Assessment is a huge deal at most universities these days, at least the ones like Directional State University, where I teach. Unlike big-name colleges, we are scrambling to keep our enrollments stable and cannot afford to reject students who lack basic skills. We also strive to increase the diversity of our rural school by admitting lots of minorities served poorly by the big city school system a few hours away, and we have quite a few first-generation college students from rural areas. So we have several assessment issues:
    1) Can we take students with a low level of basic reading, writing, math etc. skills and get them to the point where we can legitimately give them a college degree? What kind of remedial ed do we do?
    2) Can we reach a consensus within a department on what content needs to be taught (a very challenging proposition, not because of cranky faculty – though those exist – but because it’s a serious question), and then come up with a good way to test this? And is this really worth our time to do, given that it’s something we’re always doing in our individual classes?
    3) Are the broad-based assessments that ask us to measure what percent of our students in these classes have gained the ability to write well, learned basic content, developed an understanding of multiculturalism, etc., worth anything? (Most of us think not – and even if some do it would require much more standardized grading practices than we currently have to assess it.)
    This is for the humanities, so it’s harder than math or the natural sciences, but even those are no piece of cake. I’ve spent a lot of time on committees the past few years addressing these questions and nobody has good answers.

  15. “The exact numbers are disputed, but the study suggests that nearly half the students showed no significant gain in critical thinking, complex reasoning and writing skills during their first two years in college.”
    The more I think about this, the more problematic I think it is.
    1. First of all, I wonder if there isn’t a ceiling on the performance of most students on this sort of random essay prompt. A lot of students will have gotten as good as they’re going to get at that by the end of high school.
    2. I think this sort of standardized essay testing ignores the fact that smart is area specific, and that this will become more and more true as students specialize during their college careers. Writing within a discipline is where there may be much more room for personal growth. I would argue that both “critical thinking” and writing performance are very dependent on background knowledge, and that especially at this level, they should not be divorced from content knowledge. Maybe a writing exercise like the DBQs we did in AP US history would be helpful? In that sort of exercise, they give you raw materials (primary sources), and you are expected to analyze them with the help of your background knowledge and whip up an essay.
    http://apcentral.collegeboard.com/apc/members/courses/teachers_corner/3497.html
    I can imagine something similar for other content areas.
    3. I suspect that writing as a craft differs from area to area. At dinner, I was just hearing a story of how horrified some of the people from philosophy were by the stuff that the writing center was encouraging students to do (it was very reminiscent of the advice in my previous comment on how to game the standardized tests). What students take away from their writing center help is that they need to use big words and combine short sentences into long sentences, none of which cuts ice with analytic philosophers. Meanwhile, a guy from my husband’s department teaches a philosophy writing class where students need to do an essay where all of the sentences have seven words or less.

  16. Right now, as a parent, my kids are still pretty far away from picking a college or a major, and they’re pretty steeped in privilege. They’re going to be able to pay without loans, be able to buy the perks, and use college as a finishing school if they so desire (so, the parents demanding accountability isn’t on my radar). Right now, my general advice is that they should do what their heart desires in college and then, if they find it’s not getting them where they want to go, law school, teaching, and other post-grad choices will still be available.
    Things could change of course, but, if they did, if it was of critical importance that they have skills to be hired after their undergraduate years, I’d heavily recommend a technical field for them (engineering, computer science, potentially a science, but it would have to have serious rigor).
    (And, as a science major, there’s no question that I learned a lot of stuff in my first two years, from matrix algebra to genetics to organic chemistry to thermodynamics. I’m not saying that was a good thing, since I forgot about 90% of it within a few years. But, if you’d tested me after my first year, there’d be no question whatsoever that I’d learned a lot of stuff I didn’t know before).

  17. I think the short version of what I just wrote is that if I were thinking of college as an investment, I’d expect my kids to major in an employable STEM field. I’m not, so I don’t expect to demand it.

  18. “And, as a science major, there’s no question that I learned a lot of stuff in my first two years, from matrix algebra to genetics to organic chemistry to thermodynamics. I’m not saying that was a good thing, since I forgot about 90% of it within a few years.”
    But you perhaps would not have tested much better in writing or critical thinking in general (as opposed to critical thinking as applied to your area of study).

  19. “I would argue that both “critical thinking” and writing performance are very dependent on background knowledge, and that especially at this level, they should not be divorced from content knowledge.”
    This is what the research seems to be showing now, though the reverse is true as well: without critical thinking and reading skills, it’s much harder to acquire content knowledge of any kind. And the question of “transfer” is important: if you learn to write well in one context, how do you transfer that knowledge/ability to the many other contexts in which you will need to write? Both are hot topics.

  20. But, if you’d tested me after my first year, there’d be no question whatsoever that I’d learned a lot of stuff I didn’t know before.
    Certainly, if they tested you on knowledge you’d gained in your area of study. However, after two years at college, were you better at deciphering subway maps than if you had spent two years as an insurance adjustor? Would you be better able to critique an op-ed? Would you produce a higher score on the Collegiate Learning Assessment? (Which the authors of Academically Adrift used to measure student growth.)
    http://www.collegiatelearningassessment.org/files/Architecture_of_the_CLA_Tasks.pdf

  21. I’m thinking today about “who pays” for higher education, and what that should mean.
    In Florida, the (Republican) Governor is famous for saying: “If I’m going to take money from a citizen to put into education then I’m going to take that money to create jobs. So I want that money to go to degrees where people can get jobs in this state. Is it a vital interest of the state to have more anthropologists? I don’t think so.”
    Leaving aside whether he has a good point or not, the practical result of his beliefs is that he cut funding for the University of Florida (hotbed of radical anthropologists?) and has signed on to create “Florida Polytechnic University,” where, presumably, there won’t be many polytechnic anthropologists.
    So, how does UF react to the cut in state funding? They eliminate their computer science department.
    http://ireport.cnn.com/docs/DOC-776687
    (I don’t really know what that link is. It’s under the CNN banner, but appears to be more of an editorial. It’s increasingly rare that I can figure out what’s news and what isn’t anymore.)
    Apparently, no anthropologists were harmed in the working out of this policy. This can’t be the right way to go about doing this stuff, but with money coming from the students, the loan companies, the state, the scholarships, etc., I’m not sure who actually does (or should) get to decide.

  22. U of Florida (or maybe Florida State?) has the top marine archeology program in the US, and probably in the world. It’s also tied to their marine engineering program, because to do archeology in the ocean you need crazy fancy and expensive deep-sea equipment. Just thought I’d throw that out there. Not too many other sorts of anthropologists come out of Florida, and at this point, not too many anthropologists are exactly looking to move there.
    As to how you would measure what students learn in college, it’s an interesting idea, but I’m worried that the cure would be worse than the disease… “teaching to the test” has damaged a lot of K-12 schools, and I’m not sure how it would work in college. Also, the gaming would be absolutely crazy. The gaming that goes on for the U.S. News & World Report rankings is already ridiculous, so this would probably just kick it up a notch.

  23. They’re not eliminating computer science. It looks as if they’re eliminating research, and consolidating much of the department with the Electrical and Computer Engineering Department. From the Dean’s budget for UF: Under this proposed plan, all of the Computer Engineering Degree programs, BS, MS and PhD, would be moved from the Computer & Information Science and Engineering Dept. to the Electrical and Computer Engineering Dept. along with most of the advising staff. This move would allow us to support these degree programs using the existing faculty support staff in other depts. Roughly half of the faculty would be offered the opportunity to move to ECE, BME or ISE. These faculty would continue to support the graduate and research mission in the Computer Engineering degree track. The choice of which faculty and which departments will be made based on fit with the research program and with the receiving departments. Staff positions in CISE which are currently supporting research and graduate programs would be eliminated. The activities currently covered by TAs would be reassigned to faculty and the TA budget for CISE would be eliminated. The faculty remaining in CISE would then focus their efforts on teaching and advising students in the existing Computer Science BS and MS degree programs, offered through both COE and CLAS. Their assignments would change to reflect this new educational mission with sole focus on delivering quality education for students in these degree programs. Any faculty member who wishes to stay in CISE may do so, but with a revised assignment focused on teaching and advising. Tremendous demand for graduates with these degrees exists, and this new mission would allow us to devote more faculty time to grow both the size and excellence of the Computer Science degree program.
    http://dl.dropbox.com/u/72753329/Budget%20Cut%20Plan%202012.pdf
    The Slashdot comment section discussion of the issue is more knowledgeable and has less “spin” than the CNN piece (reprinting a group’s flyer, without comment), and less “spin” than the Forbes piece (which insinuated that the cuts were balanced by increases to the athletics budget.) So, Slashdot’s a better resource than CNN or Forbes on this issue.
    http://developers.slashdot.org/story/12/04/22/2326206/university-of-florida-eliminates-computer-science-department
