
Year Over Year: How Not to Talk About Crime Data



The Government of Alberta* just released a study it commissioned on safe consumption sites for intravenous drug users. Regardless of one's feelings - pro or con - it's important to talk about this study. Why? Because it is a classic example of how to do research very poorly on an important social policy issue.


Let's get the politics out of the way:


If you're against safe consumption sites, this study does not advance your cause because it is so methodologically flawed that it ought not be taken seriously by any half-sentient creature. In fact, it just gave those who support safe consumption sites a nicely wrapped, early holiday gift.


If you're for safe consumption sites, you'll have lots of fun picking it apart. Have at it.


Now let's get to the part that concerns me: there is a veritable landslide of problems with the study. In fact, I would argue that it violates many of the basic tenets of good research. As one Twitter commentator wryly observed, "it would get an automatic reject from any [quality**] journal."


I'm not going to go into the many, many flaws in this work. I prefer instead to focus on an issue I commonly find in shoddy research: the much-dreaded year-over-year comparison.


What is this? A year-over-year comparison is a technique in which one attempts to measure the effect of an event (such as the opening of a safe consumption site) by comparing, say, crime rates from one year to the next. To illustrate, here are some examples from the report:

[Tables from the report: year-over-year police calls for service near each new site and in the selected cities overall.]

Both of the tables above would seem to indicate that calls for service - which include everything, by the way, such as welfare checks, domestic violence calls, and missing persons - increased both near the new sites and in the selected cities* overall. Wow! Those sites are CRIME HOT SPOTS! WE BETTER DO SOMETHING NOW!


Oh wait ... except the year-over-year comparison does not tell you whether 2018 was a normal year for crime in that city. Say what?


Yup, what if crime rates were actually higher in 2016 and 2017, and 2018 just happened to be a low year?


The answer is: we have zero way of knowing any of that, because the study's authors didn't provide that information and presumably didn't ask for it either.
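
To make that scenario concrete, here is a minimal sketch in Python using entirely invented call counts (these are not figures from the report): a series in which 2016 and 2017 ran higher and 2018 happened to dip. The year-over-year comparison screams "increase," while the longer series shows the latest year is unremarkable.

```python
# Hypothetical calls-for-service counts -- NOT data from the Alberta report.
calls = {
    2013: 1240, 2014: 1195, 2015: 1310, 2016: 1405,
    2017: 1380, 2018: 1050,  # an unusually quiet year
    2019: 1290,              # the year being held up as evidence
}

# What a year-over-year comparison reports: 2019 against 2018 alone.
yoy_change = (calls[2019] - calls[2018]) / calls[2018] * 100
print(f"Year over year (2018 -> 2019): {yoy_change:+.1f}%")  # about +23%

# What a longer series shows: 2019 against the 2013-2017 average.
baseline = sum(calls[y] for y in range(2013, 2018)) / 5
longer_view = (calls[2019] - baseline) / baseline * 100
print(f"2013-2017 average: {baseline:.0f} calls")
print(f"2019 vs. that average: {longer_view:+.1f}%")  # about -1%
```

Same made-up data, opposite story: the "jump" is an artifact of comparing against a single dip year.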


A long time ago, when I was a young graduate pup, my esteemed supervisor, the late Dr. Richard Ericson, gave me a stern look, accompanied by some very regal stink eye, for presenting him with a year-over-year comparison. Richard intoned words of wisdom I have never forgotten: "You must have a minimum of 7 years when looking at crime data."


Now, others may vary in their opinion as to where the minimum cut-off should be, but I think the overall wisdom holds true: using a year-over-year comparison to infer "causality" is simply bad practice, because it can lead you to draw conclusions that may be, shall we say, less than valid. I miss Richard. Unlike me, he had such an elegant way of delivering the shade.
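
And to close with a concrete illustration of Richard's rule, here is a second small sketch (reusing the invented series from above, plus one more made-up year) of the kind of check a longer run of data makes possible: asking whether the latest year is unusual relative to the normal wobble of the preceding seven, rather than relative to a single neighbouring year.

```python
import statistics

# Invented annual counts -- illustrative only, not from the report.
history = [1240, 1195, 1310, 1405, 1380, 1050, 1290]  # the preceding 7 years
latest = 1335                                          # the year in question

mean = statistics.mean(history)
sd = statistics.stdev(history)
z = (latest - mean) / sd

print(f"7-year mean: {mean:.0f} calls, standard deviation: {sd:.0f}")
print(f"Latest year sits {z:+.2f} standard deviations from that baseline")
# With seven or more years in hand, you can at least ask whether the latest
# figure stands out from ordinary year-to-year variation before inferring
# that anything -- a new site included -- caused it.
```

None of this is sophisticated time-series analysis; it is simply the bare minimum of context that a single year-over-year comparison throws away.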




* https://open.alberta.ca/dataset/dfd35cf7-9955-4d6b-a9c6-60d353ea87c3/resource/11815009-5243-4fe4-8884-11ffa1123631/download/health-socio-economic-review-supervised-consumption-sites.pdf


** I suppose there's always the possibility it could be published by the online "Internet Journal of Criming" (or something like that).

