College Rankings

Questions from constituencies about the various rankings sometimes call for detailed explanations of organizational mission, methodology, and statistical analyses. Other times what is most needed is a succinct response that acknowledges the prevalence, and proliferation, of these rankings and how Davidson views the contribution of any one of them to the best interests of its students. The purpose of this document is to bring the most critical elements together in one place and offer suggestions for responding in those instances when details are not required. A second document, appended here but also available separately, provides only quick identifying information on the various rankings and possible responses.

The various college rankings are based on variables that the organizations doing the rankings believe are important or reflect quality. There is no universal set of variables on which everyone agrees, and the disagreements about what matters can be quite energetic. It is not unusual, therefore, for a college to appear near the top of one list and near the bottom of another. The varying rigor with which the underlying information is collected can also produce dizzying changes in placement from one year to the next.

For some organizations, the focus is on factors that create significant differentiation among schools along a small, carefully circumscribed set of dimensions. Others, most notably the federal government, tend to whitewash differences not only among colleges but across higher education sectors, settling on a set of metrics that ostensibly apply to all (public, private, residential, online, vocational, liberal arts, selective, open) but in fact apply, in the aggregate, to none.

So, reader-unfriendly though it may be, methodology matters. In what is often a sincere effort to assist prospective students and their families as they navigate the overwhelming amount of information on an increasingly complex array of higher education options, correspondingly complex numbers are reduced to summary measures that, at their least harmful, mask important context or, at their most harmful, exploit it.

In combination, these two considerations (organizational agendas and the winnowing of complex data into sound bites) can be a disservice to the very students the ranking organizations wish to help.

Rankings reflect opinions about what matters in higher education and how well colleges meet presumed obligations. They are also, candidly, often a vehicle to draw attention, and web clicks, to organizations' agendas, which run the gamut from educational to political to commercial to controversy for controversy's sake. (When US News started its rankings, for example, they were a small item in the print magazine. The print magazine is long defunct, but the web site draws more than 10 million visitors for whom the first stop is most often one of the ranking lists, the number of which grows almost annually.) Methodology runs the gamut as well, from data-driven algorithms to anecdotes offered by individuals who may or may not be students, and may or may not even attend the school in question. The result of such variety in both philosophy and rigor is that no school, not a single one, appears at or near the top of every ranking, nor is there any guarantee that doing well one year means doing well the next.

Colleges have a choice. They can dismiss all rankings on methodological grounds. They can expend time and resources improving placement on a select few. Or they can focus on aspects of the various lists that, overall rank notwithstanding, resonate with campus mission and priorities. Which of these metrics matter to us? What can we learn from this comparison with our peers? With whom do we share proximity on the list?

Davidson asks these questions every time a rankings list is released. We take a critical look at methodology, and we assess the reliability of what is being measured. We take particular note of where Davidson is relative to schools with similar missions and resources. We are mindful that prospective students, donors, and alumni are unlikely to have delved into the often hard-to-find explanations of data collection and calculation, and that there is a fine line to walk between shining an objective light on those issues and defensively downgrading their impact.

The most productive approach Davidson can take to the proliferation of rankings is to have a solid understanding of what each of these organizations hopes to accomplish, a firm grasp on campus priorities and whether or not they are truly reflected in any particular ranking, the means for conveying accurate information to its various constituencies, and the standing to give invalid conclusions only the weight they deserve.

This last is critical. Schools that have complained most publicly about their own rankings have been, almost without exception, schools whose company Davidson does not wish to keep. The top schools, on the other hand, can say, "This is not a list to which we aspire," or greet inclusion among highly ranked schools with "We are pleased to be recognized as part of this group." Davidson is privileged to be counted among the schools able to dedicate time and resources not to what a rankings organization tells us matters but to what the college knows matters: to its mission, to its aspirations, and to its students.

That said, prospective Davidson students are not helped by a dry reiteration of methodology. Nor does the college want to appear to be dismissive or defensive when asked about rankings that grab the attention of the media. The following are suggestions for responding to questions about the various college rankings.

The Rankings: Background and Possible Responses

The alphabetical listing of the better-known rankings below is accompanied by statements that can be made about them as well as answers, as applicable, to the questions each is most likely to generate.

American Council of Trustees and Alumni (ACTA)

Rankings: What Will They Learn?
How Davidson does: Fairly well

Possible response. Compared to its peers, Davidson's academic requirements are somewhat more traditional, with the result that Davidson receives a somewhat higher, though still imperfect, grade from ACTA. However, those academic requirements reflect what the college's faculty believe best prepares the highly motivated, academically prepared students who enroll here, rather than the specific ACTA criteria of excellence, which tend to skew conservative, western, and preparatory. It is not a list we aspire to, nor are the schools that do well on it schools with which Davidson compares itself.

Background. ACTA's mission includes ensuring that "the next generation receives a philosophically rich, high-quality education at an affordable price." This translates into some strongly held beliefs about what constitutes an appropriate course of study for college students. Colleges that do well in the ACTA rankings have core curricula or general education requirements that include U.S. history, economics, and survey courses (that is, an introductory literature class rather than one focused on a particular literary topic, genre, or author). To receive credit for the composition requirement, a course must be at the introductory level and include grammar; writing-intensive and writing-across-the-curriculum courses are explicitly excluded. ACTA only recently removed a required course in Shakespeare as a criterion of excellence. Schools are assigned letter grades that reflect how closely they meet the ACTA definitions of quality.

Because the criteria for doing well in the ACTA rankings are rather prescriptive, they are also inconsistent with some aspects of Davidson's educational philosophy. It would require a fundamental shift in academic requirements to receive a grade of A from ACTA (and, in fact, only 22 of the 1,091 schools rated received one). It is also worth noting that Davidson's peers, the Ivies, and most of the colleges and universities widely recognized for their academic quality tend to do very poorly. Amherst, Bowdoin, and Berkeley were among the schools receiving an F from ACTA in 2013; Williams and Harvard received a D. Dartmouth joined Davidson with a grade of B. Schools that qualify for an A tend to be less selective and more fundamentally religious. Only a school's core curriculum or general education requirements count toward the ACTA grade, so it is possible for a school to receive an A while graduating fewer than 30% of its students, as do Kennesaw State University and Colorado Christian University.

Bottom line. Although Davidson's grade of B looks like a good result, this is not a list that should be touted. The schools whose company we like to keep tend to occupy real estate at the bottom of the ACTA list. Nor is ACTA an organization whose criteria for excellence Davidson should be striving to meet.

Forbes Magazine

Rankings: America’s Best Colleges
How Davidson does: Ranks in the past have ranged from 32 to 61

Possible response. Given the way the ranks are derived, it is not productive to focus on Davidson's specific rank or its change from year to year. The fact remains that it is difficult to assign urgency to rankings that depend on an unmonitored, MTV-sponsored site to determine student satisfaction, and that systematically exclude a near majority of Davidson alumni from one of the outcome measures (salary data are not included for professions that require graduate or professional degrees, nor for alumni who hold such degrees, regardless of their profession's requirements). There is no trajectory of improvement that can be measured through the Forbes rankings. Rather, Davidson notes with satisfaction that it sits comfortably in proximity to its peers and other schools that are widely acknowledged as exemplars of academic quality.

Background. Although the rankings are published in Forbes, they are created by the Center for College Affordability and Productivity (CCAP). CCAP's reliance on self-reported, unvetted data and non-representative sources explains a large part of the wide variation in rank from year to year (swings that apply to virtually all of the colleges ranked).

CCAP prides itself on using "outcomes/satisfaction-based criteria, not input/reputation criteria used by some other rankings services." Presumably this refers to US News and its inclusion of standardized test scores, selectivity, and peer assessment. However, CCAP's focus on satisfaction is driven by a philosophy that figures prominently in its explanation of methodology: "Asking students what they think about their courses is akin to what some agencies like Consumers Report or J.D. Powers do when they provide information on various good and services." Note that CCAP views this analogy as positive.

The CCAP philosophy also treats high salaries and high profiles as positives. The problems with the way these data are collected are myriad, but the larger issue is that salary, and the inclusion of alumni on lists that weight fame more heavily than societal contribution, are at odds with what Davidson considers to be markers of success.

Within the individual components from which overall rank is derived are measures that matter to Davidson, including retention, graduation rates, debt, and loan default rates. Unfortunately, Forbes provides no comparative information in its presentation of the rankings, leading to perplexed ruminations about why, for example, Cornell College can be ranked well above Cornell University one year and drop to the bottom 50% the next, or how some schools that graduate fewer than 50% of their students do well on this list while NYU barely edges into the top 100.

Bottom line. The components of the Forbes rankings that matter to Davidson are collected, analyzed, and provided internally on an annual basis. The unreliable components of the Forbes rankings happen also to be those over which Davidson has no control and which often run counter to Davidson priorities. Given the unpredictable changes in rank from one year to the next, and the absence of details related to how ranks are derived, Davidson's best response to the Forbes rankings is to acknowledge the volatility of the methodology and the pleasure of the company we keep.

Money Magazine

Rankings: Best Colleges for Your Money
How Davidson does: Not especially well

Possible response. Money Magazine positions itself as the list that accounts for biases favoring schools whose students are preparing for high-income professions. At the same time, income is the only outcome measure included, and most of the schools ranked toward the top of the list are large universities with engineering or business programs, or smaller schools specializing in one or both. Davidson and its peers are spread widely throughout the top of the list. Given how widely this ranking differentiates among schools that tend to group closely together on the usual vetted measures of quality, Davidson is taking a wait-and-see approach to this ranking's validity.

Background. Money Magazine's rankings include some measures that are important to Davidson (graduation rate, net cost of a degree), some that are gathered from questionable sources (payscale.com), and some that reflect priorities not necessarily aligned with Davidson's (alumni income). Money's unique contribution is its attempt to statistically control for large enrollments in academic programs that lead to high-paying careers. Money also makes some other adjustments to measures that have been historically misleading (average debt at graduation, for example, takes into account the percentage of students graduating with debt), and that is a good thing.

Money is less than forthcoming about how some of these adjustments are made, and offers no information on how it calculates some of its predicted values. As a result, it is perhaps unsurprising, yet still puzzling, that schools that tend to cluster on other ranking lists are widely dispersed on this one. The ranks of Davidson and its peers range from 14 to 150. The contrast between institutions given an identical rank is often noteworthy. Johns Hopkins, for example, is tied at rank 107 with the College of Our Lady of the Elms, a small college that accepts 80% of its applicants and is not listed among the top 100 schools in US News even when the comparison is limited to schools in the North region.

Bottom line. These rankings are yet another attempt to wrest the rankings crown from perennial giant US News. Although they seem to be sincere in wanting to adjust for outcome biases that favor schools with large engineering and business programs, nothing in the way the rankings are derived suggests that career success and alumni satisfaction have sources other than income. Their underlying assumption is still that income is the primary measure of success; they simply control for income potential given each school's mix of majors.

Princeton Review

Rankings: Best 379 Colleges and 62 top/worst/topic lists
How Davidson does: Appears sporadically and inconsistently on some of the topic lists and always in the alphabetically arranged Best 379 Colleges compilation.

Possible response. The lists that get the most attention each year when the Princeton Review rankings are released are the "Top...," "Worst...," and those with inventive titles ("Tree-Hugging, Birkenstock-Wearing Vegetarians," for example). Based on less-than-reliable sources, and no actual data, they're fun to read and interesting to discuss, but none are constructed in a way that should make Davidson, or any college, aspire to inclusion. What gets lost in the media attention on these lists, however, is that the Princeton Review also releases its Best 379 Colleges publication (and accompanying web site). There the Princeton Review does not rank the colleges; rather, prospective students and their families can find a wealth of information on admissions, academics, student life, and financial aid. Descriptions often include students' own words and, as such, provide an important window into life on campus. Davidson continues to receive high marks for the quality of its academic environment, the genuine interest faculty take in their students, and the variety of extracurricular opportunities available.

Background. It must be said: The topic lists can be a lot of fun. It must also be said: The Princeton Review has no relationship to Princeton University. The Princeton Review began life as a college admissions preparatory company that happened to be located, at the time, in Princeton, New Jersey.

The lists that make up the Princeton Review rankings are based on a survey completed by visitors to the Princeton Review site (presumed to be students, and specifically students at the college they are rating, but no verification is requested). Given the small number of survey completions per campus, and the absence of demographic analysis, there is no reason to assume responses are representative.

Most of the lists are based on a single survey question. The Great Financial Aid list, for example, is based on the question, "How satisfied are you with your financial aid package?" Note that inclusion on this list is not based on actual financial aid data. A few of the lists, and they tend to be the more creatively labeled, are based on more than a single survey question. The Future Rotarians and Daughters of the American Revolution list, and its counterpart, the Birkenstock-Wearing, Tree-Hugging, Clove-Smoking Vegetarians list, are based on respondents' political identification and their perceptions of drug use on campus, the popularity of student government, acceptance of the LGBT community, and how religious the school is perceived to be. There is so much volatility in the lists that a school could appear on the Birkenstock-Wearing... list one year and the Future Rotarians... list the next.

The Best 379 Colleges compilation released simultaneously each year does contain vetted and data-supported information for prospective students and their families. Colleges are not ranked; rather, ratings are assigned for selectivity, academic quality, cost, and extracurricular activities. Davidson consistently does well on all of these ratings.

Bottom line. Since the Princeton Review's business depends on marketing its test preparation services, and since that marketing depends on amassing student e-mail addresses, the company has landed on a creative approach: Draw attention to the web site through media-friendly lists, and require visitors to set up an account in order to see detailed information on colleges. As is true for US News as well, most of the actual college data are available elsewhere without the necessity of setting up an account (and, in the case of US News, paying a fee for it). So the Princeton Review adds a fun element and generates a lot of discussion. The lists should not, however, be taken seriously.

U.S. News and World Report

Rankings: Best Colleges
How Davidson does: Very well

Possible response. As the original big player among college rankings, US News has been the focus of attention from colleges, statisticians, and prospective students. As a result, it wields influence, primarily of the validation type: if a school is on a prospective student's radar, that school's appearance on US News can confirm that interest. What matters to Davidson is twofold: we want prospective students to have as much information as possible in order to make the best decision for their academic future, and the US News web site provides the ability to compare schools across many dimensions. We also want to continue to keep company with some of the best liberal arts colleges in the country. Even though US News itself is on record that schools should not talk about rising or falling in rank (the ways data are collected and used are not steady enough to allow that kind of comparison), the fact remains that Davidson is among a handful of schools that have consistently appeared among the very top liberal arts colleges in the country for more than 20 years.

Background. US News has been criticized, not unreasonably, for summarizing myriad and nuanced data points into a single, over-simplified number that is unduly influenced by reputation. It has been accused, not unfairly, of giving undue weight to higher budgets, higher selectivity, and higher profiles. "Movement" is built into the rankings (through changes in factors, weights, and scales or, more insidiously, through compressed variables and the ways in which tied scores are handled) in ways that are disproportionate to the effect campus changes would likely create, as a way to increase interest. To be fair, on this point even US News is on record with the admonishment that schools should not use language about "rising" or "falling" in the rankings.

US News is also on record regarding the subjectivity of the ranking's algorithm. Robert Morse, the editor in charge of the rankings since their inception, was quoted in the February 14, 2011 issue of The New Yorker: "We're not saying that we're measuring educational outcomes," he explained. "We're not saying we're social scientists, or we're subjecting our rankings to some peer-review process. We're just saying we've made this judgment."

Yet more than virtually any other ranking, US News demonstrates its professed commitment to providing information to prospective students and their families through its detailed presentation of the rankings themselves and a robust search feature that enables comparison along dimensions that are important to those students and families. They can learn about the academic preparation of students at different colleges by looking at standardized test means and selectivity. They can get a sense for the academic experience by looking at class size and student-faculty ratio. They can see how well a school retains and graduates its students, a decent measure both of how well the admission process creates a good fit between applicants and the school and of the educational experience of the students. They can get a sense of how satisfied alumni are by looking at the alumni giving rate. Hundreds of data points are collected beyond those used in the rankings, making US News one of the best sources of information on colleges, especially for prospective Davidson students for whom class size, faculty interaction, and the academic challenge of other bright students in the classroom are important.

Where schools tend to err is in giving undue attention to inconsequential changes in rank from one year to the next. A school can increase its overall score and still maintain the same rank; it can increase its score and move up or down in rank. Within a range of approximately five ranks in either direction, most movement is a result of methodological maneuvering.

The best proof of the movement-by-methodology explanation is also the best proof that Davidson has long been and continues to be in excellent company on the US News rankings: in 20 years of these lists, the same 17 colleges have comprised the National Liberal Arts top 10. A handful have moved on or off the list at various times, but it is the same pool of schools from which the top 10 is drawn. Limiting the list to schools that have appeared in the National Liberal Arts top 10 at least twice in 20 years and at least once in the past ten years brings that pool down to 11 schools. That Davidson has remained in that pool for 20 years is more important than an individual rank that even US News acknowledges is volatile for reasons not under the control of the colleges.

Bottom line. Yes, schools like Davidson have a somewhat unfair advantage in terms of inputs, with higher budgets, academically prepared students, and already solid reputations. In fact, reputation is the primary driver of rank among schools appearing in the National Liberal Arts top 10. And yes, there is a greater focus on change in rank from one year to the next than is warranted by the data themselves. But the information US News collects has significant overlap with the information that is important to prospective Davidson students. This is a rankings list to which Davidson aspires, on which it keeps very good company, and for which careful analysis of comparative data and trajectories is worthwhile.

Possible response addendum. In 20 years of US News rankings, the same 17 colleges have comprised the National Liberal Arts top 10. Some schools may move on or off the list at various times, but it is the same pool of 17 schools from which that top 10 is drawn. When that list is limited to schools that have appeared on the National Liberal Arts top 10 at least twice in 20 years and at least once in the past ten years, there are only 11 schools; Davidson is one of them. That Davidson has remained in that exceptional pool for 20 years is noteworthy.

Washington Monthly

Rankings: America’s Best Colleges
How Davidson does: Anywhere from 11 to 36 since the rankings began in 2009, with no pattern from year to year

Possible response. Although the words "service" and "research" figure prominently in the Washington Monthly rankings, they are defined and measured in ways that are not always in alignment with Davidson's mission. This is unfortunate because prospective students and potential funding partners, not being familiar with the Washington Monthly's very particular take on these concepts, may wonder why Davidson does not appear higher in this ranking. There are hints of that philosophy in the organization's press releases, which are editorial in tone, but the general public will not necessarily read this lack of neutrality as a red flag.

The Washington Monthly measures a college's commitment to service in arbitrary ways, and the only outcome related to service that qualifies is participation in the Peace Corps. The primary measure of a college's commitment to research is total dollars expended. That this measure privileges large universities, the Washington Monthly notes, is intentional. Further, producing a large number of PhDs in the sciences and engineering also benefits these schools because, according to the Washington Monthly, it is "obvious" that those professions have the greatest social benefit.

Davidson does not limit its commitment to service and research to areas and disciplines deemed appropriate by the Washington Monthly. It is not "obvious" to anyone at Davidson that Peace Corps volunteers are doing more for the public good than volunteers with AmeriCorps, Teach for America, or Public Allies. Nor is it considered a given that the country derives more benefit from a chemist employed by a plastics company than a social worker helping under-served children navigate a complex world.

Recognition for its efforts would, of course, be nice to have, but the Washington Monthly rankings are not the appropriate vehicle. Instead, Davidson will continue to focus its efforts on expanding its partnerships with the full range of service organizations, providing research opportunities across all disciplines, and recruiting students whose abilities and interests make them a good fit for the opportunities Davidson has to offer.

Background. The Washington Monthly rankings are particularly problematic for schools like Davidson because they are ostensibly driven by service and research, key words that mean something to Davidson's constituencies. However, the ways the Washington Monthly defines service and research reflect a very particular perspective that is, in many ways, at significant odds with Davidson's mission and priorities.

Some of the on-campus service measures make sense for Davidson: for example, student participation in community service, academic courses that incorporate service, and the number of staff supporting community service. Here, the problem is not the measures but the data source. The Washington Monthly does not request the data from the schools but pulls them from applications made to the Corporation for National and Community Service for the President's Higher Education Community Service Honor Roll. If a school did not submit such an application in the year the rankings were calculated, that school would get no credit for eight of the ten measures of service.

The remaining two measures of service are rather arbitrary: the percentage of alumni who join the Peace Corps and the percentage of students who serve in ROTC. Note that participation in no other service organization qualifies; in spite of the Washington Monthly's stated focus on what is in "[America's] public interest," organizations that focus on work in the U.S. are excluded.

The research measures reflect significant biases in the Washington Monthly rankings. Unlike the service measures, where some adjustment for enrollment size is made, the primary research measure, research expenditures, includes no such adjustment. The Washington Monthly's response to this bias in favor of large universities is as follows: "...our research score rewards large schools for their size. This is intentional. It is the huge numbers of scientists, engineers, and PhDs that larger universities produce, combined with their enormous amounts of research spending, that will help keep America competitive in an increasingly global economy...This year's guide continues to reward large universities for their research productivity." When asked why PhDs in the sciences and engineering received greater weight than other fields, the Washington Monthly went on record with this statement: "obviously people working in those fields provide the most benefit to society."

Bottom line. Given the value Davidson places on service and research, and the fact that the Washington Monthly is less than forthcoming about how they are assessed, Davidson needs to be more proactive in calling out the methodology than might be warranted for other rankings. Since rising in the Washington Monthly rankings would require a shift in Davidson's values, this should not be considered a list to which the college aspires. However, given the uncomfortable position Davidson can find itself in with respect to the Washington Monthly, responses to questions about Davidson's place on this list need to be more detailed than those offered for other rankings.

Wall Street Journal

Rankings: WSJ/Times Higher Education
How Davidson does: 76th in the inaugural year (2016)

Possible response. Schools that do well in this ranking tend to be those where faculty publish at high rates and where the school is well known to other published scholars; that enroll high proportions of diverse students (by race/ethnicity, international status, first-generation status, and income) and have diverse faculty; that offer a large number of majors; and where graduates who happen to have taken out student loans enter careers with high salaries. Although Davidson does well on objective measures such as graduation rate, spending per student, and student/faculty ratio, other aspects of the college do not align with the emphases of this particular ranking or the way its data were collected.

Background. The WSJ college ranking is a mixture of defined data equally available for all schools and data that are subjective, self-reported, or apply disproportionately to a particular type of institution. There is a reason the list is heavy on research universities (the first liberal arts college appears at rank 22). Major factors include graduation rate and some other straightforward metrics, as well as "value-added" measures of salary 10 years after graduation and of loan repayment. These are taken from College Scorecard data, meaning they are available only for graduates who took out federal student loans.

Academic reputation is based entirely on a survey that the ranking's co-sponsor, Times Higher Education, has conducted for a number of years for another set of rankings. The survey is sent to "only experienced, published scholars, who offer their views on excellence in research and teaching within their disciplines and at institutions with which they are familiar." The list of "experienced, published scholars" consists of those listed in the Elsevier database of publications. Research productivity of the faculty is measured by the number of research papers listed in Elsevier. Student engagement depends on responses to the US Student Survey. Nothing about that survey inspires confidence in its representativeness and, in any case, it is based on perception and self-report.

Bottom line. As the possible response above notes, schools that do well in this ranking tend to be large, research-intensive institutions: places where faculty publish at high rates and are well known to other published scholars, that enroll high proportions of diverse students and have diverse faculty, that offer a large number of majors, and where graduates who took out federal student loans enter high-salary careers. Davidson does well on the objective measures, but other aspects of the college do not align with the emphases of this particular ranking or the way its data were collected.