It’s that time of year – the admissions season, and the time all those school rankings come out.  Some of us take a guilty pleasure in sneaking a peek at the rankings, not unlike the guilty pleasure involved in turning on just a few minutes of the Oscar awards ceremony or the Miss America pageant.  We know we should know better.  We know such arbitrary competitions are silly (at best) and even potentially destructive: the attempt to rationalize and quantify subjective criteria and come up with a single “winner” or “best” in each category threatens to homogenize taste and to overshadow options that don’t conform to a stereotype.  When it comes to the school rankings, most of us know that the “best” school is not necessarily best for every student.  And yet, it’s pretty hard for some of us to refrain from taking a peek at those rankings.  Is my alma mater still highly ranked?  Is my town’s high school among the top ten?  What school should my daughter or son plan to attend?

I’m going to plead guilty to this: I just love to look at those rankings.  (Full disclosure: I also take a perverse pleasure in the Oscars, the Emmys, the Grammys, and sometimes even the country music awards, the name of which I’ve forgotten.  Heck, I haven’t lived in New York for nearly thirty years, but I still care about the Tony and even the Obie awards.)  I know I shouldn’t worry about whether the value of my BA, or my son’s, slips if our shared alma mater drops in US News & World Report’s rankings – but it’s hard not to.  Similarly, despite all my best intentions, I made myself (and no doubt my family) absolutely crazy with this stuff when my son was applying to high schools and subsequently to colleges.

It’s my hope, as an educational consultant, to help other families avoid the more corrosive aspects of the school rankings game.  The first thing I encourage families to do: take a careful look at the methodology used by any given ranking.  Stop and think for a minute: what criteria are being applied?  Those schools that they say are the best – what, precisely, are they supposed to be the best at?  What values do these lists reflect?  Most important: if a ranked list is based on values that you do not share, why would you care about it?

This fall a couple of related news items caught my eye and reaffirmed my belief that the “usual” rankings reflect values quite different from mine.  First, LinkedIn came up with its set of university rankings.  According to LinkedIn, they derived their rankings by “analyzing employment patterns of over 300 million LinkedIn members from around the world.”  This, they claim, helped them identify “the desirable jobs within several professions and which graduates get those desirable jobs” and ultimately “to rank schools based on the career outcomes of their graduates.”  I admire LinkedIn for sharing as much information as they did about their methodology; the results, I have to say, strike me as underwhelming for one very important reason: there are many populations who don’t use LinkedIn.  The programs are ranked for each professional area, but many professional areas aren’t included.  Best places to study pre-med, pre-law, economics, sociology, education, environmental science, or music?  You won’t find them here.  Further, is an analysis of the employment patterns of people who take the time to join a social networking site really an accurate barometer of the worth of those individuals’ educational experiences?  Doesn’t this instrument entirely miss the population of people who simply have no time for, or interest in, LinkedIn?

Here in the Boston area, people like me who are addicted to the school rankings eagerly await the annual Boston magazine high school ranking issue.  This year was a shocker: Boston’s September issue included, as always, lists of the top private and public high schools.  Before the month was out, the magazine completely revised its list of public schools and retracted its private school list, taking the latter down from its website entirely.  Again, I applaud them for taking these steps and for sharing their rationale for doing so.  But casting one’s eye over the public school ranking that remains available online, it’s easy even for a casual reader to think of ways in which schools might “game” this system to their advantage.  Just one quick example: one of the metrics used is “student-teacher ratio.”  It’s an appropriate and important data point – but who counts as a “teacher”?  How are part-time faculty (if any) counted?  Are athletic coaches included?  Guidance counselors?  One can easily imagine a poorly administered survey coming up with varying responses, even if we assume that all the schools are doing their best to provide accurate answers.  I’m far from being the first person to note that, as laudable as Boston’s decision to take down a flawed ranking may be, the episode calls into question the integrity and value of the whole rankings game.

Now, here’s one ranking of which I’m particularly fond and which gets relatively little ink: the Washington Monthly College Guide.  Its values resonate with me: “We rate schools based on their contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and PhDs), and Service (encouraging students to give something back to their country).”  I still wish teaching were part of the formula, perhaps in addition to research, but this methodology makes a refreshing change from the process used by US News & World Report.  Washington Monthly even has a ranking for affordable elite colleges.  I urge you to check it out.

 

