What if...? We speculate on how Parliamentary debate on an issue could have been improved had there been an Evaluator General.

Case Study

Evaluation of the Long Gun Registry

Launched in 1996, the Long Gun Registry consistently evoked strong emotional reactions. Public opinion was polarized for and against the registry.

In 2010, fourteen years after the registry was created, the Royal Canadian Mounted Police (RCMP), the organization that managed the registry, produced an evaluation of its effectiveness.

For at least three reasons, the evaluation was not a useful input to Parliamentary debate on the registry: it was not credible, it was delivered too late, and it was of mediocre quality.

1. Evaluation of the gun registry lacked credibility

Evaluation of a program by the organization that manages it may easily be perceived as biased. Not surprisingly, when the RCMP delivered a positive evaluation of the registry, critics attacked its credibility. As evidence that the organization could not be trusted to deliver unbiased information, one skeptic pointed to the RCMP's cover-up of its abuse of tasers in the Robert Dziekański incident.

Lesson 1: Evaluative information related to major decisions by Parliament should be compiled and communicated by an independent third party such as an Evaluator General.

2. The evaluation report was delivered too late to be useful to Parliament

In 2006, while examining the finances and administration of the Registry, the Auditor General pointed out the pressing need for data on its effectiveness in reducing deaths, injuries, or threats. The RCMP consequently commenced an evaluation study in 2007; three years later, in 2010, it delivered a report.

Meanwhile, in 2009, Parliament entertained a private member’s bill to eliminate the Gun Registry. Evaluation findings were not available to Parliament at the time.

Lesson 2: Evaluative information related to major decisions by Parliament is only useful if delivered on time.

3. The evaluation report was of mediocre quality

Despite being three years in the making (2007 to 2010), the RCMP's assessment of the effectiveness of the Long Gun Registry lacked many of the hallmarks of high-quality evaluation work.

The RCMP's evaluation report had no executive summary, so readers had to wrestle with a 140-page document to extract key messages. There was no mention of a guiding framework or of key evaluation questions. The program logic model was weak. Findings and recommendations were conflated. The appendices were confusing and did not seem to add value.

Lesson 3: To be useful to Parliament, evaluative information needs to be concise, balanced, logical, and focused on answering key questions.


An Evaluator General, as an officer of Parliament, could identify programs of critical concern and help ensure that they were evaluated by departments or agencies in a professional and timely manner. 

The Evaluator General would provide balanced, credible evaluative information as required to support parliamentary debate, drawing not only on departmental or agency study findings but also on relevant findings from studies around the world. 

Information Sources:

http://www.rcmp-grc.gc.ca/cfp-pcaf/rep-rap/index-eng.htm (see other reports, CFP Evaluation)