November 30, 2012
By Scott Jaschik


While colleges regularly complain about rankings that they don’t like (and boast about those where they do well), one publication’s list of “dangerous” colleges has renewed a debate over whether the data on which it was based reflect actual danger or safety levels, and whether any crime statistics alone can evaluate campus safety.

Business Insider published its list of the 25 “most dangerous colleges in America” last week — and the links and local newspaper articles started to spread. Then, facing considerable criticism from the California colleges that appeared on the top of the list, Business Insider calculated the most dangerous colleges in another way, while insisting that “we’re standing by” the first list. At the same time, some colleges that made the first list but not the second were issuing celebratory press releases. California State University at Fresno (No. 19 on the first list and absent from the second) issued a statement calling the first list and headline “unfounded.”

So what are the two lists? And are they a good measure of campus safety?

The first list was based on the Federal Bureau of Investigation’s Uniform Crime Reports. Campus officials note numerous problems with the FBI data. For starters, they only come from a minority of campuses (they are voluntarily submitted by several hundred colleges with sworn police officers). Perhaps more important, they include incidents reported off-campus, which may seem reasonable when talking about an adjacent area where many students live. But as the University of California at Los Angeles noted, its police officers include crime in reports about clinics and facilities throughout Los Angeles — and some of those are in areas of significant crime. The Westwood neighborhood where the vast majority of UCLA students live and where most professors teach has very low crime rates.

UCLA (No. 1 in the list based on the FBI reports) issued a news release with the headline “UCLA a dangerous campus? Don’t believe it,” with a detailed critique of the data.

Many of those who complained about the FBI data suggested that a more accurate measure would be from the statistics required by the Clery Act, the federal law that requires colleges and universities to report certain types of crimes. In its second ranking, Business Insider used the Clery data. And the publication said that this calculation is part of why it stands behind the original list. The Clery-based list “contains many of the same schools” as the FBI-based rankings, the publication noted.

True, but many of the colleges are in very different spots on the lists. UCLA is still on the second list, but at No. 19, not No. 1. The University of California at Berkeley is No. 2 on the first list and No. 10 on the second. The university ranked as having the most crime on the Clery-based list (Howard University) doesn’t appear on the first list.

James E. Grant Jr., assistant vice chancellor for strategic communications of the University of California at Riverside (which made both lists), said he was bothered by an “intentionally inflammatory headline” that is now being “widely disseminated — right at the time families are deciding where to send their young scholars to college.”

Then there is the question of whether either list should be used, by itself, to judge colleges. S. Daniel Carter, director of the 32 National Campus Safety Initiative of the VTV Family Outreach Foundation (a group of family members of victims and of survivors of the 2007 tragedy at Virginia Tech), said that “you cannot look at statistics in a vacuum and say that any colleges are the most dangerous. I think these lists do a disservice to the consumer.”

Carter said that he puts more stock in the Clery data than the FBI data because more colleges are included and the definitions of what to report are more consistent. But he said that his current work has convinced him that lists alone are problematic. He was hired by the families of Virginia Tech victims to find a way to look at data, policies and policy enforcement together to measure campus safety.

He said that policies and their enforcement are key to understanding relative levels of safety. A college that decides to enforce alcohol laws will see an increase in alcohol-related arrests, but may have its alcohol issues under more control than a college that is looking the other way and making few arrests or citations.

This concern extends to violent crime, he said. Many advocates for victims of sexual assault have noted that colleges have not been consistent in their responses to attacks. “Colleges that do a good job of supporting victims will tend to have higher numbers than colleges that don’t support victims,” Carter said.

Rankings based on crime reports alone send the wrong message, he said. “You are punishing the institutions that are actually doing something about it.”