Ranking Christian Colleges (part 2)

Spending too much time thinking or worrying about college rankings seems like a clear path to insanity, so I’ll make this my last of three posts (plus one re-post) on the subject for this season…

In previous posts I’ve joined, well, almost everyone who looks at these things and criticized U.S. News for basing as much as 25% of its college ranking system on “reputation” — measured by asking college and university presidents, provosts, and admissions directors, plus high school college guidance counselors, to rate other schools. One might well suspect that those who answer this survey aren’t all that knowledgeable about the schools they’re rating; at the very least, this aspect of the magazine’s methodology helps explain why the ratings seem to change so little from year to year. And I doubt the reputation score actually indicates anything about, say, student learning.

But the data research director for U.S. News, Robert Morse, does point to two reasons for retaining a reputation component:

U.S. News knows that peer assessments are subjective, but a school’s reputation for academic quality is important to prospective students, since they know it could help them get their first job after graduation and make a good impression if they are planning to apply to graduate school.

Now, even if the purpose of a college education is job training (it’s not, at least at the school I work at), I’m going to hope that students don’t actually choose schools they think will impress employers without considering what kind of curriculum and community will best prepare them for employment.

I can, however, tentatively (and unhappily) buy the “apply to graduate school” argument, assuming that the opinions of presidents and provosts are not too dissimilar from graduate admissions committees that do seem to prefer undergraduates from certain feeder programs regarded, rightly or wrongly, as elite. In my discipline, for example, twenty-five colleges and universities account for the baccalaureate education of more than 25% of all doctoral students in History departments. The top 100 feeder programs provide 55% of PhD candidates; the top 200 make up about 70%. This from a 2005 study; it does fluctuate a bit over time, but it seems clear that, if you want to place undergraduates in a graduate program, reputation will play a significant role.

But there’s a different way of thinking about reputation than surveys of individual leaders, one based on how colleges and universities identify their peer institutions.

Each year schools can request a variety of data (e.g., finances, enrollment) from the U.S. Department of Education on schools they include in “comparison groups.” Just under 1,600 colleges and universities are part of this process, either asking for the data or having their data provided to peers. Recently the Chronicle of Higher Education took these requests and, via the PageRank algorithm used by Google, “[weighted] colleges on the basis of how many others chose them, and how many chose those colleges.” Click here to see the resulting map.

The reason this seems to make sense is that when colleges and universities pick schools for their comparison group, they not only include obvious peers (e.g., schools that compete for the same students) but “aspirant schools.” Noting that the average school selects sixteen “peers” that actually have larger endowments and budgets, higher SAT scores for entering students, and lower acceptance rates, Chronicle reporter Andrea Fuller concludes, “Colleges want to receive data reports on enrollments, graduation rates, student costs, faculty, and budgets for institutions they aspire to be more like.”
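
As a rough illustration of the Chronicle’s approach, here is a minimal PageRank sketch over a hypothetical peer-request graph. All school names and edges below are invented for illustration, and this is a bare-bones power-iteration version of the algorithm, not the Chronicle’s actual implementation: each comparison-group request is treated as a directed link from the requesting school to the school whose data it wants, so being named by many (and well-connected) schools raises a school’s score.

```python
def pagerank(links, damping=0.85, iterations=100):
    """links: dict mapping each school to the list of schools it requested data on."""
    nodes = set(links)
    for targets in links.values():
        nodes.update(targets)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new = {node: (1.0 - damping) / n for node in nodes}
        for src, targets in links.items():
            if targets:
                # Distribute this school's rank evenly over its peer list.
                share = damping * rank[src] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # A school that named no peers (like Princeton in the article)
                # spreads its rank evenly across everyone.
                for node in nodes:
                    new[node] += damping * rank[src] / n
        rank = new
    return rank

# Toy peer-request graph (hypothetical): every school names "Aspirant U"
# in its comparison group, while "Aspirant U" names no one.
requests = {
    "College A": ["Aspirant U", "College B"],
    "College B": ["Aspirant U"],
    "College C": ["Aspirant U", "College A"],
    "Aspirant U": [],
}
scores = pagerank(requests)
print(max(scores, key=scores.get))  # the widely named school ranks first
```

The point of using PageRank rather than a raw count of requests is visible even in this toy: a school gains more from being named by schools that are themselves frequently named than from the same number of requests by obscure ones.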

Building a ranking system on data collected for institutional research that’s already happening (rather than an added-on survey that participants may or may not take seriously — or, increasingly, even complete) and that indicates college leaders’ understanding of what institutions are similar to and models for theirs seems pretty reasonable to me, though certainly not fool-proof.

A few things to note about the Chronicle analysis before I dive into it more deeply:

  1. It excludes two-year schools and four-year colleges with enrollments under 500, but includes for-profit and online schools.
  2. A fair number of schools don’t take part in the “comparison group” request process. But they don’t necessarily suffer under the system, since others still request their data. Princeton University, for example, named no peers but still ranked #2.
  3. The #1 spot went to Carleton College, Bethel’s alphabetical neighbor on the Minnesota Private College Council (though Bethel didn’t name Carleton in its comparison group, nor vice versa). That underscores that, unlike the U.S. News and Washington Monthly rankings I’ve been writing about, the Chronicle peer system lumps all schools together rather than segregating them by category. Carleton is just one of several small private colleges to make the top 10, along with #3 Oberlin, #7 Bowdoin, and #8 Amherst.
Aerial view of Carleton College
Carleton College – Creative Commons (Dogs1337)

And this is already starting to sound a lot like the U.S. News results, with the exception of Harvard plummeting all the way down to #16. Does that generally hold true?

Here’s how my employer, Bethel University, compared to other Minnesota schools in the “Regional University (Midwest)” category, first in the U.S. News survey and then under the Chronicle peer comparison method (the two schools marked with an asterisk didn’t request comparison group data):


[Table: U.S. News rank (within category) vs. Chronicle rank (out of all types). Rows recoverable from the original: St. Catherine, St. Scholastica, U. Minnesota-Duluth, Winona State, Minn. State-Mankato, *Bemidji State, Concordia-St. Paul, St. Cloud State; the rank values themselves did not survive extraction.]

With the exception of Hamline and Bethel, private schools fared much worse under this method than in the U.S. News rankings, while state universities moved up. (Note: Augsburg is the only private school on this list that Bethel didn’t include in its comparison group.) Why does St. Kate’s fall so far? I’d wonder if it isn’t held back in the Chronicle method because it’s a women’s college, and so perhaps not viewed as a useful point of comparison by most coeducational schools… except that Smith and Vassar are in the top 50, and three other members of the “Seven Sisters” crack the top 100.

Then I looked at the group of schools most central to my own historical research: the 118 evangelical Christian colleges and universities that belong to the Council of Christian Colleges and Universities. No CCCU member made the top 100, but two did rank in the top 10%: Wheaton College (IL) and Seattle Pacific. Then sixteen more (including Bethel) were in the top 20%.

Blanchard Hall at Wheaton College
Blanchard Hall at Wheaton College – Creative Commons (Liscobeck)

For each of these eighteen schools, I’ll compare the Chronicle rating to the one from Forbes, which also lumps schools of different types together but doesn’t include reputation as a factor in its ratings (which instead place significant weight on student opinion and alumni achievement). Of the eighteen, only Gordon College did not take part in requesting data on “comparison groups.”


[Table: Chronicle rank vs. Forbes rank for the eighteen top-quintile CCCU schools. Rows recoverable from the original: Seattle Pacific, Azusa Pacific, Bethel (MN), Abilene Christian, Point Loma Nazarene, Anderson (IN), plus one “not ranked” cell; the remaining rows and the rank values themselves did not survive extraction.]

It’s also interesting to note which schools these high-ranking CCCU institutions included in their comparison groups. At one end of the spectrum, Houghton only requested data on fellow members of the Christian College Consortium (a smaller subset of CCCU members, mostly aligning with the above rankings), with the odd exception of Bethel. Bethel itself included almost all of the schools named above (all, I think, except Abilene Christian and Union), plus lower-ranked peers like Crown, Indiana Wesleyan, North Park, and the two Northwestern Colleges in the CCCU, and regional competitors as different as St. John’s and Hamline. Meanwhile, Wheaton didn’t include a single CCCU member among the 49 colleges it requested data on; instead, it listed mostly elite private colleges that were either nonsectarian or associated, however weakly, with mainline churches. In aggregate, just over 46% of the peers and aspirants identified by the eighteen schools making the Chronicle‘s top fifth were fellow CCCU institutions.

By comparison, when these eighteen colleges and universities had their data requested by other schools, three times in five it was by fellow CCCU members, perhaps suggesting that while more “elite” CCCU institutions are slightly more likely to look outside that network than within it for comparison schools, other evangelical colleges (both in and, like Regent or Columbia International, out of the CCCU) tend to look up to the Wheatons, Seattle Pacifics, and Calvins of the world.

Given the long history of anti-Catholicism within evangelical Protestantism, it’s also interesting to note that the CCCU schools that cracked the Chronicle‘s top 20% identified a healthy number of Catholic colleges and universities as peers or aspirants. About 13% of the schools for which they requested data were associated with a religious order or diocese, though none had its data requested by more than two of the elite CCCU schools (and then usually because it was a regional competitor).

Perhaps more striking is that 7% were Lutheran schools — and not just those affiliated with the more conservative Missouri Synod, but ELCA-related and independent Lutheran colleges and universities. Perhaps not surprisingly, given the still-large population of Lutherans in the Upper Midwest, seven of Bethel’s comparison schools are Lutheran colleges in Minnesota, Iowa, South Dakota, and Illinois. For example, Gustavus Adolphus was named as a peer by Bethel, but also by two high-ranked CCCU members outside of Minnesota: Messiah and Whitworth. Pacific Lutheran and Valparaiso showed up even more often in these comparison groups (Pacific for Abilene Christian, Azusa Pacific, George Fox, Messiah, and Seattle Pacific; Valpo for Calvin, Messiah, Seattle Pacific, and Union).

<<Read part one of this brief series

