Inferring Serial Killers with Data: A Lesson from Vancouver

For those happily not in the know, my hometown of Vancouver was afflicted with a serial killer during the ’80s and ’90s who largely targeted marginalized women in the Downtown Eastside – the city’s (and one of the country’s) poorest neighborhoods.

The murderer – Robert Pickton – was ultimately caught in February 2002 and, in December 2007, was convicted on six counts of second-degree murder. He is accused of murdering an additional twenty women, and may be responsible for the deaths of a number more.

Presently there is an inquiry going on in Vancouver regarding the failure of the police to investigate and act earlier on the disappearing women. Up until now, the most dramatic part of the inquiry for me had been the heart-wrenching testimony of one female officer whose own efforts within the Police Department went largely ignored. But I’ve recently seen a new spate of articles that are more interesting and disturbing.

It turns out that during the late 1990s the Vancouver Police Department actually had an expert analyzing crime data – particularly regarding the disappearing women – and his assessment was that a serial murderer was at work in the city. The expert, Kim Rossmo, advised the police to issue a press release and begin to treat the case more seriously.

He was ignored.

The story is relatively short, but worth the read – it can be found here.

What’s particularly discouraging is looking back at past articles, such as this Canadian Press piece, which was published on June 26, 2001, less than a year before Pickton was caught:

Earlier that day, Hughes stood with six others outside a Vancouver courthouse and told passers-by she believes a serial killer is responsible.

Vancouver police officially reject the suggestion.

But former police officer Kim Rossmo supported it while he was a senior officer. He wanted to warn residents about the possible threat. Rossmo is now involved in a wrongful dismissal trial against the force in B.C. Supreme Court.

Last week, he testified he wanted to issue a public warning in 1998, but other officers strongly objected. The force issued a news release saying police did not believe a serial killer was behind the disappearances.

Indeed, Rossmo was not just ignored; other officers on the force actively made his life difficult. He was harassed, and data that would have helped him in his analysis was withheld from him. Of course, a few months later the murderer was caught, suggesting that his capture might have come much earlier had the force taken the potential problem seriously.

A few lessons from this:

1) Data matters. In this case, the use of data could have, quite literally, saved lives. Rossmo’s data model is now used by other forces, and he has become a professor in the United States.

2) The challenge with data is as often cultural as it is technical. As with the Moneyball story, the early advocates of using data to analyze and reassess a problem are often victimized. Their approach threatens entrenched interests, and the work is often conducted by people on the margins. Rossmo was the first PhD in Canada to become a police officer – I’m pretty sure that didn’t make him a popular guy. Moreover, his approach implicitly, and then explicitly, suggested the police were wrong. Police forces don’t deal well with error – but neither do many organizations or bureaucracies.

3) Finally, this case study says volumes about police forces’ capacity to deal with data. Indeed, some of you may remember that the other week I deconstructed the Vancouver Police Department’s misleading press release regarding its support for Bill C-30, which would dramatically increase the police’s power to monitor Canadians online. I find it ironic that the police are seeking access to more data when they have been unable to effectively use data that they can already legally acquire (or that, frankly, is open, such as the number and locations of murder/disappearance victims).
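For the curious: the model Rossmo developed – geographic profiling – is public, and its core formula is simple enough to sketch. Below is a minimal, illustrative Python version of his published distance-decay-plus-buffer-zone idea: sites near the crimes score high, but a "buffer zone" right around them scores lower, since offenders tend not to strike too close to home. The parameter values (B, f, g) and the example crime locations are my own assumptions for illustration, not anything used by an actual police force.

```python
def rossmo_score(x, y, crime_sites, B=2.0, f=1.2, g=1.2):
    """Relative likelihood that an offender anchored at grid cell (x, y)
    committed the crimes at crime_sites.

    Illustrative parameters (not calibrated to any real case):
      B -- radius of the buffer zone around the offender's anchor point
      f -- distance-decay exponent outside the buffer
      g -- exponent controlling how sharply scores fall inside the buffer
    """
    score = 0.0
    for (cx, cy) in crime_sites:
        d = abs(x - cx) + abs(y - cy)  # Manhattan (street-grid) distance
        if d > B:
            # Outside the buffer: likelihood decays with distance
            score += 1.0 / d ** f
        else:
            # Inside the buffer: likelihood drops toward the anchor itself,
            # peaking near the buffer's edge
            score += B ** (g - f) / (2 * B - d) ** g
    return score

# Score every cell of a small grid and find the most likely anchor point
sites = [(2, 3), (7, 4), (5, 8), (6, 1)]
best = max(
    ((x, y) for x in range(10) for y in range(10)),
    key=lambda p: rossmo_score(p[0], p[1], sites),
)
```

In practice the real systems built on this (and the data behind them) are far richer, but even this toy version shows the point of lesson 1: the locations of the disappearances were data, and data like this can be turned into an actionable map.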

3 thoughts on “Inferring Serial Killers with Data: A Lesson from Vancouver”

  1. Ian Bron

    Good post. From an accountability perspective, this case has also been deeply disturbing. The individuals responsible for killing the efforts to have the situation treated appropriately should face jail time. It’s unlikely they will, however.

    Sadly, this sort of thing isn’t isolated: for example, and on a much less tragic level, a whistleblower presented Manitoba Hydro with data showing it had lost hundreds of millions of dollars due to poor modelling. She, like Rossmo, was pushed out and punished.

    And, of course, Ottawa is no exception. Working with program evaluation, I’ve too often heard the joke about “decision-based evidence making.”

    Ian Bron
    Managing Director
    Canadians for Accountability

  2. Andrew Dyck

    The odd thing is that, judging from my own cursory analysis of the Vancouver Police Department crime statistics that *are* available through PDFs on their website, they have been extremely effective in curbing criminal activity over the past few years. Crime in the datasets I looked at took a nosedive much steeper than one would expect from the general downward trend in crime we see across the country.

    Why police departments do not try harder to get this data out into the public domain and advertise their success is difficult for me to explain.
