
Making it a Bit Harder to Accept Junk Research in Policing

This week CAN-SEBP officially launched Square 1 (as in, if you're doing evidence-based policing, here's a good place to get started).

What is it? It's what we call a rapid assessment tool for police programs. Before, when someone wanted to figure out the evidence base for a given program, they had to hunt through the research literature to piece it together. We've taken the guesswork out of it by presenting the information in an easy-to-use, straightforward format. Within a couple of minutes you can now know exactly what the evidence is for the program you're interested in.

Why do we need it? Policing - like many other institutions - is full of programs that are big on promises but fail to deliver on outcome evidence. You'll be told "we're saving lives" or "we're reducing crime", but when you ask for the proof, you get pointed to an internal evaluation that really only reveals how many files got shuffled around or how many hours were spent on a task. We're weeding out the hype for you.

Why do we REALLY need it? Square 1 asks five key questions:

1) Is the program based on existing research?
2) Has the program been independently evaluated?
3) Was the program rigorously tested? (level 4 or 5 on the Maryland Scientific Methods Scale and/or the Ratcliffe Scale)
4) Has the program evaluation been replicated/reproduced?
5) Was the program tested in Canada?

I want to be very, very clear here: what Square 1 is REALLY doing is establishing a new set of standards for weighing research evidence in Canadian policing. The days of junky evaluations, crappy pre-test/post-test designs using flawed data, cherry-picking literature or test results to say something works, and/or pointing to your one positive evaluation as SUCCESS need to be over. By asking those 5 little questions, the bar is set and the challenge is thrown. If your program is NOT a yes on all 5 criteria, we do NOT consider it an evidence-based program suitable for use in Canadian contexts. It's that simple.
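For readers who like to see the logic spelled out, here is a minimal sketch of that all-or-nothing rule, written in Python purely as an illustration. The question wording and the function name are assumptions for the sketch, not part of the actual Square 1 tool; the point is simply that one "no" anywhere means the program does not clear the bar.

# Hypothetical illustration of the decision rule described above;
# not part of the actual CAN-SEBP Square 1 tool.

SQUARE_1_QUESTIONS = [
    "Is the program based on existing research?",
    "Has the program been independently evaluated?",
    "Was the program rigorously tested (level 4 or 5, Maryland SMS and/or Ratcliffe Scale)?",
    "Has the program evaluation been replicated/reproduced?",
    "Was the program tested in Canada?",
]

def is_evidence_based(answers: dict[str, bool]) -> bool:
    """Return True only if every one of the five questions is answered 'yes'."""
    return all(answers.get(question, False) for question in SQUARE_1_QUESTIONS)

# Example: four yeses and one no (never tested in Canada) still fails the bar.
example = {question: True for question in SQUARE_1_QUESTIONS}
example["Was the program tested in Canada?"] = False
print(is_evidence_based(example))  # False - not all five criteria are met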

Ratcliffe Scale. Source: Ratcliffe (2019), http://www.jratcliffe.net/

Maryland Scientific Methods Scale. Source: Farrington et al. (2011), cited in Ross et al. (2011).


"But, that's too high!!!" "Lots of things are proven using different study types" "Things can be both proven and not proven scientifically"* No. No. Say ... whut? If you want to say something "works" then these are not only acceptable standards, they are widely accepted standards internationally in both science generally, and policing science more specifically. How do we do these assessments? We solicit interested policing researchers to provide the assessment and then send the assessment out to an expert in that area. All reviews are blinded, meaning that neither party knows who the other is until the assessment is published. In the name of transparency, we publish both the name and bio of the original assessor and of the reviewer. This way, not only do you know who is doing what, but you get exposure to the work of different experts in the field.

The assessment criteria are strict: answer the 5 questions using ONLY peer-reviewed studies published in reputable journals.

Why academic journal articles? Much of the grey literature in policing is not peer-reviewed, and it's junk. It's either evaluations done by consultants to keep program funding coming in, or it's material that people put out knowing it wasn't strong enough to withstand academic peer review. Until that changes, we're forced to rely on published journal articles.

Why the stipulation that it be a "credible" journal? Because there's also a host of online and/or predatory journals with very low standards that publish junk. If it's not one of the commonly accepted journals in the relevant field, with an impact factor and rankings, then we don't include it.

How is the policing community involved? In two key ways. First, we developed this tool with the Canadian Police Association (CPA). We wanted something that would meet the needs of every officer and police employee, so CPA input into the design was crucial. Second, we solicit ideas for new assessments from the policing community. This tool will not work if it doesn't meet your needs. The best way to get your input to us is through Twitter. Tweet your ideas!

*Believe it or not, someone actually said a version of that to me in a message. It's such an awesomely magnificent example of the kind of doublespeak nonsense surrounding a lot of programs that I might have to get the message framed one day.
