College Rankings

Sometimes questions from constituencies about the various rankings require detailed explanations of organizational mission, methodology, and statistical analyses. Other times what is most needed is a succinct response that acknowledges the prevalence (and proliferation) of these rankings and how Davidson views the contribution of any one of them to the best interests of its students. The purpose of this document is to bring the most critical elements together in one place and offer suggestions for responding in those instances when details are not required. A second document, appended here but also available separately, provides only quick identifying information on the various rankings and possible responses.

The various college rankings are based on variables that the organizations doing the rankings believe are important or reflect quality. There is no universal set of variables on which everyone agrees, and the disagreements about what matters can be quite energetic. It is not unusual, therefore, for a college to appear near the top of one list and near the bottom of another. The varying degrees of rigor with which the information that forms the basis of the lists is collected can also result in dizzying changes in placement from one year to the next.

For some organizations, the focus is on factors that create significant differentiation among schools along a small, carefully circumscribed set of dimensions. Others, most notably the federal government, tend to whitewash differences not only among colleges but across higher education sectors and settle on a set of metrics that ostensibly apply to all (public, private, residential, online, vocational, liberal arts, selective, open) but in fact apply, in the aggregate, to none.

So, reader-unfriendly though it may be, methodology matters. In what is often a sincere effort to assist prospective students and their families as they navigate the overwhelming amount of information on an increasingly complex array of higher education options, correspondingly complex numbers are reduced to summary measures that, at their least harmful, mask important context or, at their most harmful, exploit it.

In combination, these two considerations (organizational agenda and the winnowing of complex data into sound bites) can be a disservice to the very students the ranking organizations wish to help.

That said, prospective Davidson students are not helped by a dry reiteration of methodology. Nor does the college want to appear to be dismissive or defensive when asked about rankings that grab the attention of the media. The following are suggestions for responding to questions about the various college rankings.

Comment from the Office of Planning and Institutional Research

The various college rankings are based on variables that the organizations doing the rankings believe are important or reflect quality. There is no universal set of variables on which everyone agrees, and the disagreements about what matters can be quite energetic. It is not unusual, therefore, for a college to appear near the top of one list and near the bottom of another. The varying degrees of rigor with which the information that forms the basis of the lists is collected can also result in dizzying changes in placement from one year to the next.

Rankings reflect opinions about what matters in higher education and how well colleges meet presumed obligations. They are also, candidly, often a vehicle to draw attention (and web clicks) to organizations' agendas, which run the gamut from educational to political to commercial to controversy for controversy's sake. Methodology runs the gamut as well, from data-driven algorithms to anecdotes offered by individuals who may or may not be students, and may or may not attend the school in question. The result of such variety in both philosophy and rigor is that no school, not a single one, appears at or near the top of every ranking, nor is there any guarantee that doing well one year means doing well the next.

Colleges have a choice. They can dismiss all rankings on methodological grounds. They can expend time and resources improving placement on a select few. Or they can focus on aspects of the various lists that, overall rank notwithstanding, resonate with campus mission and priorities. Which of these metrics matter to us? What can we learn from this comparison with our peers? With whom do we share proximity on the list?

Davidson asks these questions every time a rankings list is released. We take a critical look at methodology, and we assess the reliability of what is being measured. We take particular note of where Davidson is relative to schools with similar missions and resources. We are mindful that prospective students, donors, and alumni are unlikely to have delved into the often hard-to-find explanations of data collection and calculation, and that there is a fine line to walk between shining an objective light on those issues and defensively downplaying their impact.

The most productive approach Davidson can take to the proliferation of rankings is to have a solid understanding of what each of these organizations hopes to accomplish with them, a firm grasp on campus priorities and whether or not they are truly reflected in any particular ranking, the means for conveying accurate information to its various constituencies, and the standing to give invalid conclusions only the weight they deserve.

This last point is critical. Schools that have complained most publicly about their own rankings have been, almost without exception, schools whose company Davidson does not wish to keep. The top schools, on the other hand, have the standing to say, "This is not a list to which we aspire," or to greet inclusion among highly ranked schools with "We are pleased to be recognized as part of this group." Davidson is privileged to be counted among the schools able to dedicate time and resources not to what a rankings organization tells us matters but to what the college knows matters to its mission, its aspirations, and its students.

The Rankings

The better-known rankings are listed alphabetically below, accompanied by statements that can be made about them as well as answers, as applicable, to the questions each is most likely to generate. "Possible response" appears twice under each listing: at the top for quick perusal, and again at the end as the conclusion to the information provided.

American Council of Trustees and Alumni (ACTA)

Rankings: What Will They Learn?
How Davidson does: Fairly well

Background. ACTA's mission includes ensuring that "the next generation receives a philosophically rich, high-quality education at an affordable price." This translates into some strongly held beliefs about what constitutes an appropriate course of study for college students. Colleges that do well in the ACTA rankings have core curricula or general education requirements that include U.S. history, economics, and survey courses (that is, an introductory literature class rather than one focused on a particular literary topic, genre, or author). For a composition course to satisfy ACTA's requirement, it must be at the introductory level and include grammar; writing-intensive courses and writing-across-the-curriculum programs are explicitly excluded. ACTA only recently removed a required course in Shakespeare as a criterion of excellence. Schools are assigned letter grades that reflect how closely they meet the ACTA definitions of quality.

Since the criteria for doing well in the ACTA rankings are rather prescriptive, they are also inconsistent with some aspects of Davidson's educational philosophy. It would require a fundamental shift in academic requirements to receive a grade of A from ACTA (and, in fact, only 22 of the 1,091 schools rated received one). It is also worth noting that Davidson's peers, the Ivies, and most of the colleges and universities widely recognized for their academic quality tend to do very poorly. Amherst, Bowdoin, and Berkeley were among the schools receiving an F from ACTA in 2013. Williams and Harvard received a D. Dartmouth joined Davidson with a grade of B. Schools that qualify for an A grade tend to be less selective and more fundamentally religious.

All that counts toward the ACTA grade is a school's core curriculum or general education requirements. It is possible for a school to receive a grade of A while graduating fewer than 30% of its students, as do Kennesaw State University and Colorado Christian University.

Bottom line. Although Davidson's grade of B looks like a good result, this is not a list that should be touted. The schools whose company we like to keep tend to occupy real estate at the bottom of the ACTA list. Nor is ACTA an organization whose criteria for excellence Davidson should be striving to meet.

Forbes Magazine

Rankings: America’s Best Colleges
How Davidson does: Ranks in the past have ranged from 32 to 61

Background. Although the rankings are published in Forbes, they are created by the Center for College Affordability and Productivity (CCAP). CCAP's reliance on self-reported, unvetted data and non-representative sources explains a large part of the wide variation in rank from year to year (swings that apply to virtually all of the colleges ranked).

CCAP prides itself on using "outcomes/satisfaction based criteria, not input/reputation criteria used by some other rankings services." Presumably they are referring to US News and its inclusion of standardized test scores, selectivity, and peer assessment. However, CCAP's focus on satisfaction is driven by this philosophy, which figures prominently in its explanation of methodology: "Asking students what they think about their courses is akin to what some agencies like Consumers Report or J.D. Powers do when they provide information on various good and services." Note that CCAP views this analogy as positive.

The CCAP philosophy also counts high salaries and high profiles as positives. The problems with the way these data are collected are myriad, but the larger issue is that salary, and the inclusion of alumni on lists that weight fame more heavily than societal contribution, are at odds with what Davidson considers to be markers of success.

Within the individual components from which overall rank is derived are measures that matter to Davidson, including retention, graduation rates, debt, and loan default rates. Unfortunately, Forbes provides no comparative information in its presentation of rankings, leading to perplexed ruminations about why, for example, Cornell College can be ranked well above Cornell University one year and drop to the bottom 50% the next, or how some schools that graduate fewer than 50% of their students do well on this list while NYU barely edges into the top 100.

Bottom line. The components of the Forbes rankings that matter to Davidson are collected, analyzed, and provided internally on an annual basis. The unreliable components of the Forbes rankings also happen to be those over which Davidson has no control and which often run counter to Davidson priorities. Given the unpredictable changes in rank from one year to the next, and the absence of details about how ranks are derived, Davidson's best response to the Forbes rankings is to acknowledge the volatility of the methodology and the pleasure of the company we keep.

Money Magazine

Rankings: Best Colleges for Your Money
How Davidson does: Not especially well (new this year)

Background. This is the first year for Money Magazine's rankings, and they include some measures that are important to Davidson (graduation rate, net cost of a degree), some that are gathered from questionable sources (payscale.com), and some that reflect priorities that are not necessarily aligned with Davidson's (alumni income). Money's unique contribution to the rankings landscape is its attempt to statistically control for large enrollments in academic programs that lead to high-paying careers. Money also adjusts some measures that have historically been misleading (average debt at graduation, for example, takes into account the percent of students graduating with debt), and that is a good thing.

Money is less than forthcoming about how some of these adjustments are made, and offers no information on how it calculates some of its predicted values. As a result, it is unsurprising, yet still puzzling, that schools that tend to cluster on other ranking lists are widely dispersed on this one. The ranks of Davidson and its peers range from 14 to 150. It is worth noting that Johns Hopkins is tied for rank 107 with the College of Our Lady of the Elms.

Bottom line. These rankings are new this year and yet another attempt to wrest the rankings crown from perennial giant US News. Although they seem to be sincere in wanting to adjust for outcome biases that favor schools with large engineering and business programs, nothing in the way the rankings are derived suggests that career success and alumni satisfaction have sources other than income. Their underlying assumption is still that income is the primary measure of success; they simply control for income potential given each school's mix of majors.

Princeton Review

Rankings: Best 379 Colleges and 62 top/worst/topic lists
How Davidson does: Appears sporadically and inconsistently on some of the topic lists and always in the alphabetically arranged Best 379 Colleges compilation.

Background. It must be said: The topic lists can be a lot of fun. It must also be said: The Princeton Review has no relationship to Princeton University. The Princeton Review began life as a college admissions preparatory company that happened to be located, at the time, in Princeton, New Jersey.

The lists that make up the Princeton Review rankings are based on a survey completed by visitors to the Princeton Review site (presumed to be students, and specifically students at the college they are rating, but no verification is requested). Given the small number of survey completions per campus, and the absence of demographic analysis, there is no reason to assume responses are representative.

Most of the lists are based on a single survey question. The Great Financial Aid list, for example, is based on the question, "How satisfied are you with your financial aid package?" Note that inclusion on this list is not based on actual financial aid data. A few of the lists (and they tend to be the more creatively labeled) are based on more than a single survey question. The Future Rotarians and Daughters of the American Revolution list, and its counterpart, the Birkenstock-Wearing, Tree-Hugging, Clove-Smoking Vegetarians list, are based on respondents' political identification and their perception of the use of drugs on campus, the popularity of student government, acceptance of the LGBT community, and how religious the school is perceived to be. There is so much volatility in the lists that a school could appear on the Birkenstock-Wearing... list one year and the Future Rotarians... list the next.

The Best 379 Colleges compilation, released simultaneously each year, does contain vetted and data-supported information for prospective students and their families. Colleges are not ranked; rather, ratings are assigned for selectivity, academic quality, cost, and extracurricular activities. Davidson consistently does well on all these ratings.

Bottom line. Since the Princeton Review's business depends on marketing its test preparation services, and since that marketing depends on amassing student e-mail addresses, the company has landed on a creative approach: draw attention to the web site through media-friendly lists, and require visitors to set up an account in order to see detailed information on colleges. As is true for US News as well, most of the actual college data are available elsewhere without setting up an account (and, in the case of US News, without paying a fee for it). So the Princeton Review adds a fun element and generates a lot of discussion. The lists should not, however, be taken seriously.

U.S. News and World Report

Rankings: Best Colleges
How Davidson does: Very well

Background. US News has been criticized, not unreasonably, for summarizing myriad and nuanced data points into a single and over-simplified number that is unduly influenced by reputation. It has been accused, not unfairly, of giving undue weight to higher budgets, higher selectivity, and higher profiles. "Movement" disproportionate to the effect that campus changes would likely create is built into the rankings as a way to increase interest (through changes in factors, weights, and scales or, more insidiously, through compressed variables and the ways in which tied scores are handled). To be fair, on this point even US News is on record with the admonishment that schools should not use language about "rising" or "falling" in the rankings.

US News is also on record regarding the subjectivity of the rankings algorithm. Robert Morse, the editor in charge of the rankings since their inception, was quoted in the February 14, 2011 issue of The New Yorker: "We're not saying that we're measuring educational outcomes," he explained. "We're not saying we're social scientists, or we're subjecting our rankings to some peer-review process. We're just saying we've made this judgment."

Yet more than virtually any other ranking, US News demonstrates its professed commitment to providing information to prospective students and their families through its detailed presentation of the rankings themselves and a robust search feature that enables comparison along dimensions that are important to those students and families. They can learn about the academic preparation of students at different colleges by looking at standardized test means and selectivity. They can get a sense of the academic experience by looking at class size and student-faculty ratio. They can see how well a school retains and graduates its students, a decent measure both of how well the admission process creates a good fit between applicants and the school and of the educational experience of the students. They can get a sense of how satisfied alumni are by looking at the alumni giving rate. Hundreds of data points are collected beyond those used in the rankings, making US News one of the best sources of information on colleges, especially for prospective Davidson students for whom class size, faculty interaction, and the academic challenge of other bright students in the classroom are important.

Where schools tend to err is in giving undue attention to inconsequential changes in rank from one year to the next. A school can increase its overall score and still maintain the same rank; it can increase its score and go down or up in rank. Within a range of approximately five ranks in either direction, most movement is a result of methodological maneuvering.

The best proof of the movement-by-methodology explanation is also the best proof that Davidson has long been and continues to be in excellent company in the US News rankings: in 20 years of these lists, the National Liberal Arts top 10 has been drawn from the same pool of 17 colleges. A handful have moved on or off the list at various times, but the pool from which it is drawn remains the same. Limiting the list to schools that have appeared in the National Liberal Arts top 10 at least twice in 20 years and at least once in the past ten years brings that pool down to 11 schools. That Davidson has remained in that pool for 20 years is more important than an individual rank that even US News acknowledges is volatile for reasons not under the control of the colleges.

Bottom line. Yes, schools like Davidson have a somewhat unfair advantage in terms of inputs, with higher budgets, academically prepared students, and already solid reputations. In fact, reputation is the primary driver of rank among schools appearing in the National Liberal Arts top 10. And yes, there is a greater focus on change in rank from one year to the next than is warranted by the data themselves. But the information US News collects has significant overlap with the information that is important to prospective Davidson students. This is a rankings list to which Davidson aspires, on which it keeps very good company, and for which careful analysis of comparative data and trajectories is worthwhile.

Washington Monthly

Rankings: America’s Best Colleges
How Davidson does: Anywhere from 11 to 36 since the rankings began in 2009, with no pattern from year to year

Background. The Washington Monthly rankings are particularly problematic for schools like Davidson because they are ostensibly driven by service and research, key words that mean something to Davidson's constituencies. However, the ways the Washington Monthly defines service and research reflect a very particular perspective that is, in many ways, at significant odds with Davidson's mission and priorities.

Some of the on-campus service measures make sense for Davidson: student participation in community service, academic courses that incorporate service, and the number of staff supporting community service. Here, the problem is not the measures but the data source. The Washington Monthly does not request the data from the schools but pulls them from applications made to the Corporation for National and Community Service for the President's Higher Education Community Service Honor Roll. If a school did not submit such an application the year the rankings were calculated, that school would get no credit for eight of the ten measures of service.

The remaining two measures of service are rather arbitrary: the percentage of alumni who join the Peace Corps and the percentage of students who serve in ROTC. Note that participation in no other service organization qualifies; in spite of the Washington Monthly's focus on what is "in [America's] public interest," organizations that focus on work in the U.S. are excluded.

The research measures reflect the most notorious biases in the Washington Monthly rankings. Unlike the service measures, where some adjustment for enrollment size is made, the primary research measure, research expenditures, includes no such adjustment. The Washington Monthly's response to this bias in favor of large universities is as follows: "...our research score rewards large schools for their size. This is intentional. It is the huge numbers of scientists, engineers, and PhDs that larger universities produce, combined with their enormous amounts of research spending, that will help keep America competitive in an increasingly global economy...This year's guide continues to reward large universities for their research productivity." When asked why PhDs in the sciences and engineering received greater weight than those in other fields, the Washington Monthly went on record with this statement: "obviously people working in those fields provide the most benefit to society."

Bottom line. Given the value Davidson places on service and research, and the fact that the Washington Monthly is less than forthcoming about how they are assessed, Davidson needs to be more proactive about calling out the methodology here than it would be for other rankings. Since rising in the Washington Monthly rankings would require a shift in Davidson's values, this should not be considered a list to which the college aspires. Given the uncomfortable position Davidson can find itself in with respect to the Washington Monthly, however, responses to questions about Davidson's place on this list need to be more detailed than might be warranted for other rankings.