
Tipping Sacred Cows: The Evidence Base for Situation Tables and Hubs

Over the past few years the Ontario and Saskatchewan governments have been heavily promoting the concept of ‘community safety and well-being’ (CSWB). What is this? It’s generally understood as: “the combination of social, economic, environmental, cultural, and political conditions identified by individuals and their communities as essential for them to flourish and fulfil their potential” (Wiseman and Brasher 2008: 358). The definition employed by the Ontario Ministry of Community Safety and Corrections (MCSC 2017: 3) is a bit narrower: “identifying and responding to risks that increase the likelihood of criminal activity, victimization or harm, and working together to build local capacity and strong networks to implement proactive measures”. In practice, it’s an assemblage of collaborative, community-based responses to high-risk individuals and situations, variously known as ‘situation tables’ and ‘hubs.’

In short form: situation tables are meetings at which police, social services, education and other public sector groups come together to discuss strategies for responding to the needs of high-risk individuals. Such individuals may require mental health, addiction, housing or other services, but have consistently fallen through the cracks of the social safety net, often ending up as criminal justice ‘problems.’ The goal of the table is to redirect people into the services they need and thus, ideally, end their continued criminal or disorderly conduct.

How strongly does the Ontario Ministry of Community Safety and Corrections endorse this approach? Well, fairly strongly. As just one example, in 2015 then-Minister Naqvi gave a speech at a Public Safety Canada event at which he identified situation tables as a ‘best practice’. Not surprisingly, then, there are now, I was recently told, some 200 of these tables across the province.

Given the enormous investment in this approach, and the branding of it as a ‘best practice’, it is a fair question to ask: ‘what is the evidence base’? The answer, unfortunately, is not all that surprising: the evidence base is virtually NON-EXISTENT, making this another government initiative that falls into the UNPROVEN category.

The first time I became aware of the Prince Albert Model (the Saskatchewan version, which is also referred to as the ‘hub’ model) was in 2014 at the Law Enforcement in Public Health conference in Amsterdam. This approach, we were told, was based on the partnership working model in Scotland. My understanding is that it was brought to Canada in about 2010 and originally implemented in Prince Albert, Sask. From that time to the time of writing this blog, there has not been one single, independent, peer-reviewed evaluation of any version of a Canadian hub or table published in a credible research journal. Not one.

In searching the relevant literature, what did I find?

  • One peer-reviewed published piece in a policing journal on ‘lessons learned’ from a situation table in British Columbia

  • One peer-reviewed, theoretically oriented analysis of the situation tables as a tool of social control

  • One Master's thesis on the experiences of service providers on a situation table

  • One Master’s study subsequently published in an online journal dedicated to promoting CSWB, hubs and tables (supported directly and indirectly by the Ministries in Saskatchewan and Ontario and edited by CSWB advocates)*. That study, by the way, makes an inferential claim that the hub approach reduced crime in Prince Albert, Saskatchewan (a pre-test/post-test design using crime data), but is subject to all of the usual problems associated with confounding variables (i.e., it cannot rule out the possibility that other causes were driving down crime). Had it been a randomized controlled trial, it would have been at least one study with a claim to demonstrating causality.

  • A batch of non-rigorous evaluations measuring outputs and not outcomes. What does that mean? It means report writers can show that, in some cases, the number of individuals returning to the table as a ‘problem’ decreased or that people were successfully moved from one caseload to another. What they cannot show is that:

  1. Decreases in returning individuals had anything to do with the services provided through the table;

  2. That people’s lives improved as a result of the intervention (and that’s supposedly the point, right? After all, as advocates have told me, they are "saving lives");

  3. That no adverse consequences ensued from the intervention;

  4. That no other backfire effects were produced (such as other individuals being denied services that then caused them to become high risk; after all, there has been no significant increase in detox beds, mental health programs, etc., so somebody’s getting bumped);

  5. That the high-risk individuals didn’t just move out of town, seek help on their own initiative, die, or otherwise become a burden on another public (non-police) service.

  • Some evaluations shared online by consultants. Although some people will be upset with me for pointing this out: people who are hired specifically to evaluate a program that an agency or group likes or loves may have a more than slight interest in producing a report that will satisfy their funder (indeed, most of us in research have heard of, or directly experienced, situations in which funders have dictated what goes into a report or asked for negative information or analysis to be left out). In this case, much of the publicly available work on tables and hubs has been generated by consultants who also happen to promote this "product" and their services. Whenever financial incentives are involved, we have to consider the potential for a conflict of interest, and such potential conflicts need to be publicly identified. This is a basic rule of research ethics. Regardless of whether one agrees or not, it is clear that completely independent evaluations are also required.

* Disclosure: I briefly sat on the Editorial Board of this journal when it launched, but resigned over editorial differences.
