Assessing Canada's Security Intelligence Review Committee: Some preliminary data

Cross-referencing: NSL, ch.3; Best Practices in Security and Intelligence Article

After a term-long interruption to deal with classes and vice dean duties, I have returned to work on my article examining the Scheinin principles on best practices in security and intelligence and CSIS.  In light of anticipated developments related to the CSIS inspector general and the upcoming International Intelligence Review Agencies Conference, I have skipped ahead a few sections to conduct some number crunching on the Security Intelligence Review Committee, CSIS's review body.  My preliminary observations follow:


Canada’s system of review predates many of the developments undertaken by other democracies in relation to their own security intelligence sectors.  As a consequence, Canada – and particularly its SIRC – is sometimes viewed with considerable respect by review bodies abroad.  Within Canada, however, the CSIS review agencies have a low profile.  Between 1984 and the date of this writing, SIRC was mentioned in the major Canadian daily newspapers a mere 853 times.[1]  One hundred and fifty-seven of these mentions came in 1994, by far the year with the most coverage of SIRC.  This spike is explained by the controversy generated by the so-called “Bristow affair”.[2]  More typical years see 10 to 50 mentions in major Canadian dailies.  The CSIS inspector general has been even more invisible, with approximately 200 mentions in major Canadian dailies from 1984 to present.[3]

These results pale in comparison with those for other Canadian federal watchdog agencies.  For instance, there were more than 5,500 articles featuring the activities of the Access to Information Commissioner during the period 1984 to present.[4]  Meanwhile, Privacy Commissioner Jennifer Stoddart’s activities have been featured in over 1,600 newspaper stories in the last decade alone.[5]

Much of the coverage of SIRC and the inspector general is sparked by their public reports, some of which have been critical of CSIS.  That said, the 1994 coverage of SIRC was particularly critical of SIRC itself – the term “lapdog” appeared in association with SIRC in eight published items that year.[6]  The agency has also attracted at least one regular media critic.  Long-time CSIS critic and journalist Andrew Mitrovica has written that “[i]t’s time that SIRC stopped being a dumping ground for former politicians, and well-connected (and, no doubt, well meaning) business people, doctors and ex-bureaucrats with little or no experience in intelligence matters. It's time that SIRC be given the money, powers and experienced investigators it so desperately needs to do what on paper, at least, is an important job.”[7] In a 2000 story on SIRC, Mitrovica quoted critics complaining that:

  • SIRC was “too timid and too slow to investigate serious allegations of possible criminal wrongdoing and other disturbing revelations about CSIS” (opposition party national security affairs critics); and,
  • SIRC’s researchers and analysts were “easily intimidated and dismissed by CSIS senior managers” (anonymous “veteran CSIS agents”).[8]

More recently, Mitrovica has repeatedly warned that SIRC is grossly underfunded relative to its functions.[9]  In 2011, he urged that

SIRC needs more money and people to determine whether or not CSIS is playing by the rules. Today, SIRC has a pathetically small staff and budget. While the coffers of Canada's mushrooming security-intelligence apparatus reach into the hundreds of millions of dollars, SIRC has an operating budget of just over $2.6 million and a staff of 20.

Incredibly, a succession of timid SIRC chairs has remained mute on this score.

Instead, they have preferred not to rankle their political bosses about the obvious and long-standing need for a lot more money to hire a formidable team of experienced and determined investigators rather than neophytes culled from other civil service branches.[10]

Academic critics have been less pointed, but have also raised doubts about SIRC.  Professor Wesley Wark, for example, has noted that SIRC, as constituted at the time of this writing,

is small and is promised no new resources to enhance its reporting. The degree to which the Conservative government takes SIRC seriously is placed in doubt by its failure to replace the SIRC chair, a part-time position, following the departure last year of Arthur Porter for alleged improprieties. Not only is SIRC headless for the moment, but the other Privy Councillors appointed to the committee do not have inspiring backgrounds in federal politics, decision making or in terms of their knowledge of intelligence and national security issues.[11]

Wark was earlier quoted as questioning the appointment process for SIRC, and notably the pattern of appointing those with political or other pedigrees but no subject-matter expertise.  This pattern, Wark urged, “perhaps allows for open minds, but also potentially for empty ones, or for SIRC to be prone to subscribe to CSIS interpretations of their actions. To a certain extent this tendency is meant to be kept in check by the permanent SIRC (staff), but that staff is small.”[12]

Professor Kent Roach has also commented on the composition of SIRC, noting that  

[t]he bi-or usually tri-partisan nature of SIRC, as well as the reputation of the often well known and respected former Premiers and Cabinet ministers appointed to it, provides some public confidence in its operation. It should be noted, however, that not all of the federal government's appointments to SIRC have been well received over the years and the credibility of review bodies can be quickly increased or diminished by the quality of appointments to them.[13] 

Assessment of SIRC’s actual performance is less common, especially in the legal literature.  The agency is not mentioned in a single academic article archived in the major LegalTrac database of published legal articles and commentary.[14]  A single recent article on the SSRN network assesses the effectiveness of SIRC in addressing grievances by those wrongfully accused in national security investigations.[15]  That article sees virtue in SIRC’s ability to access information, but notes that its ability to “mete out effective justice is very limited”,[16] confined to post-hoc recommendations.

Several (now dated) articles have been authored by former SIRC members or employees, and explain features of SIRC’s operations[17] or, in one case, defend it from criticism.[18]

Perhaps the most comprehensive assessment of SIRC dates to a 1989 academic article in which UK security intelligence expert Peter Gill reviewed the impact of SIRC during the period 1984-1988.[19]  Gill probed SIRC’s performance with an eye to four questions:

  • Did SIRC have adequate resources?
  • Did SIRC have the will to use these resources energetically?
  • Could SIRC obtain the information necessary for effective review?
  • Measured by its impact on CSIS performance, did SIRC have political influence?

Gill’s assessment was largely positive.  He rejected the notion that SIRC had been intentionally under-resourced.  While SIRC’s work was largely reactive, responding to matters brought to its attention, it had produced important proactive reports.  Notably, this success was regarded as a “beneficial spinoff from the appeals process into review and in part also because SIRC has been able to make use of the Inspector General’s resources by tasking him under s.40 to carry out reviews”.[20]

Gill also commented favourably on SIRC’s propensity to publicize its criticisms of CSIS, regarding this as evidence of a will to have a real impact.  Gill regarded this public exposure – more than any particular originality in SIRC’s analysis – as an important driver of change at CSIS.   He also assessed SIRC’s access to sufficient information as real.  Finally, Gill drew some provisional conclusions about SIRC’s impact on CSIS, discerning some changes that he attributed to SIRC’s scrutiny.

Gill’s assessment of SIRC may constitute the high-water mark.  Academic assessments thereafter have been more critical, if less systematically comprehensive.  In 1992, York University political scientist Reg Whitaker acknowledged the “very significant public presence” of SIRC during its first five years, under the stewardship of its first chair, Ron Atkey.[21]  Whitaker observed that SIRC had not yet been co-opted by the agency it was charged to review, a perennial threat to SIRC-like institutions.  At the same time, he warned that Canada was entering an era of policy drift and institutional inertia, and that only public scandals were likely to shake this inertia.[22]

In a subsequent 1996 article, Whitaker assessed SIRC’s performance during the 1994 Bristow case, noted above.  He concluded that “[a]lthough SIRC has fulfilled most of the implicit expectations of the government in the affair, it has by no means emerged unscathed.  …[T]he spotlight cast upon the review body’s unrepresentative political make up…have done it some lasting harm”.[23]

More recent academic articles have commented on SIRC’s modest funding, describing it as outmatched by increased CSIS resources.[24]  Meanwhile, in an opinion piece written in 2002 on the aftermath of 9/11, Wark observed that “SIRC has been invisible and silent since Sept. 11. It failed to undertake an immediate review of Canadian security intelligence knowledge surrounding the attacks, one more sign that SIRC has lost its early edge.”[25]

c)  Measuring SIRC Performance

These are all important critiques.  Examining their merit is, however, a daunting undertaking.  Whether SIRC is functioning well or not is very difficult to test empirically given the inherently closed nature of national security review.  Nevertheless, as a first step to adding to this sparse contemporary literature on SIRC, I have sought to compile time series data on SIRC’s composition and performance, drawing on publicly available data in SIRC and CSIS annual reports and other reference databases.

1. Appointment and Composition

SIRC members are appointed by the governor-in-council (after consultation with the leaders of official parties in the Commons) for five-year terms, and sworn into the Queen’s Privy Council for Canada.  The Act imposes no qualification requirements. 

i) Political Affiliation

As discussed above, SIRC has been critiqued over the years for a membership comprising (at least in part) political partisans.  The implication is that partisanship affects performance – presumably, government-affiliated members might be inclined to meekness in the face of CSIS wrongdoing, while opposition affiliates might bring a more aggressive outlook.

For the purposes of this article, I reviewed the biographies of the 25 individuals appointed to SIRC from its inception to the time of this writing.[26]  Of these 25, 12 (48%) were either former politicians associated with the party in power at the time of their appointment, or otherwise had documented affiliations with that party (in the form, e.g., of electoral contributions).  Another 11 (44%) had analogous ties to opposition parties at the time of their appointment.  For the remaining 2 (8%), I was unable to find any clear partisan affiliations.

The tenures of these individuals were not evenly distributed or of a similar duration.  Averaged over the 27 years, government party affiliated members comprised 32.5% of the SIRC membership; opposition party affiliated members 50.8% of SIRC membership, while nonpartisan members occupied 16.8% of SIRC committee posts. 
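The tenure-weighted averages above can be reproduced with a simple calculation: for each year, compute each affiliation's share of the sitting membership, then average those shares across the period. The sketch below illustrates the method using hypothetical member records (the names, affiliations and dates are placeholders, not the actual SIRC roster):

```python
from collections import defaultdict

# Hypothetical member records: (name, affiliation, first year, last year).
# Placeholders only; real tenures would be drawn from SIRC annual reports.
members = [
    ("Member A", "government", 1984, 1989),
    ("Member B", "opposition", 1984, 1991),
    ("Member C", "nonpartisan", 1986, 1990),
]

def composition_shares(members, first_year, last_year):
    """Average each affiliation's share of sitting members across the period."""
    shares = defaultdict(float)
    n_years = last_year - first_year + 1
    for year in range(first_year, last_year + 1):
        sitting = [m for m in members if m[2] <= year <= m[3]]
        for _, affiliation, _, _ in sitting:
            # Each sitting member contributes an equal slice of that year's
            # membership, averaged over the number of years in the period.
            shares[affiliation] += 1 / (len(sitting) * n_years)
    return dict(shares)
```

Because the shares are weighted by years actually served, the result differs from a simple head count of appointees, which is why the 48/44/8 split of individuals diverges from the 32.5/50.8/16.8 split of committee time.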

These averaged figures mask several interesting patterns.  For one thing, because SIRC appointments endure for five years (and may be renewed), members sometimes presided during periods in which their party of appointment was no longer in office.  These political changes produced 5 of the 27 years for which data were available in which no government party-affiliated individual sat on SIRC.  In comparison, there has never been a year in which SIRC was without opposition-affiliated members.

Most of the years in which government party affiliates were absent from SIRC were in the mid-1990s, after the defeat of the Progressive Conservative Party by the Chrétien Liberals.  To some extent, this pattern (in more modest form) persisted throughout the Chrétien years.  From 1994 to 2003, opposition or non-partisan members dominated SIRC.

The more typical pattern both before and after this period has been 2 or 3 government affiliated members and 2 or 3 opposition affiliated or non-partisan members.  Put another way, the membership has split fairly evenly between government affiliated and non-government affiliated members. 

Determining the impact of affiliation on SIRC performance is difficult because “performance” is a difficult quality to measure quantitatively.  The single proxy indicator of performance that I was able to extract from the publicly available data is the number of reports issued by SIRC in discharge of its review and complaints functions.  Complaints that culminate in reports tend to be those in which SIRC found some action by CSIS worthy of commentary, and often of recommendations for change.  Quantity of reports is an imperfect indicator, since it does not speak to the quality of what is found in these (almost always) secret reports.  Nor can it measure the extent to which SIRC overlooked matters that should have produced a report.  Nevertheless, quantity of reports is the only indicator available on the public record.

In fact, there is no correlation between the opposition and non-partisan affiliations of SIRC members and the number of overall SIRC reports issued by year.[27]  Likewise, there is no meaningful correlation between opposition and non-partisan affiliations of SIRC members and the proportion of reports issued in response to complaints resolved or closed by SIRC in individual fiscal years.[28] 
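The correlations reported throughout this section were computed with Excel's CORREL function, which returns the Pearson product-moment coefficient. The same figure can be reproduced in a few lines of code; the series below are illustrative stand-ins, not the actual SIRC data:

```python
from math import sqrt

def correl(xs, ys):
    """Pearson correlation coefficient, the equivalent of Excel's CORREL."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Covariance numerator and the two standard-deviation denominators.
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative (not actual) series: opposition/non-partisan share of SIRC
# membership each fiscal year, and SIRC reports issued in the same years.
opposition_share = [0.6, 0.5, 0.8, 0.4, 0.7, 0.5]
reports_per_year = [9, 11, 8, 12, 10, 9]
r = correl(opposition_share, reports_per_year)
```

A coefficient near zero, as found here, indicates no linear relationship between the two series; a perfect positive correlation would be 1.0.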

ii) Expertise

The data on expertise reveal other interesting trends.  First, I was able to find only one instance in which a SIRC member’s biographical material suggested pre-existing intelligence expertise.  One of the first SIRC members, Saul Cherniack, served in military intelligence during the Second World War.  While it is entirely possible that other SIRC members acquired security intelligence expertise in a manner not evident from the biographical information (perhaps while in other government offices), it is still striking that effectively none have had “official” pre-existing involvement in the modern security intelligence sector.

In terms of on-the-job learning, the average number of “person years” each year on SIRC has been 16.7.  That is, the combined years of experience on SIRC of all of the sitting committee members on average equals 16.7 years.  As table 1.1 suggests, this measure of institutional knowledge has waxed and waned with time, appearing to move in rough cycles as new appointees acquire experience and experienced members depart.
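The person-years measure can be sketched as follows. The roster below is hypothetical (placeholder members and dates, not the actual SIRC rosters); for each year, the measure sums the completed years of service of every sitting member:

```python
# Hypothetical rosters: for each year, the sitting members mapped to the
# year they first joined SIRC. Placeholders, not actual SIRC data.
roster_by_year = {
    1990: {"A": 1986, "B": 1988, "C": 1990},
    1991: {"A": 1986, "B": 1988, "C": 1990, "D": 1991},
}

def person_years(year, roster):
    """Combined completed years of SIRC experience of all sitting members."""
    return sum(year - joined for joined in roster.values())
```

On this convention a newly appointed member contributes zero in the year of appointment; the 16.7 average reported above reflects the actual rosters, not these placeholders.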

A key question is whether levels of experience have a bearing on SIRC performance.  Again, I have used the issuance of reports as a proxy of performance.  There is no statistically meaningful correlation between these person years and the number of SIRC reports issued per year, suggesting that experience (or lack thereof) has not had an impact on the quantity of formal review work undertaken by SIRC.[29]  Nor is there any correlation between person years and the proportion of closed complaints that resulted in SIRC reports.[30] 

Table 1.1



A second trend relates to legal training.  Not surprisingly, given the quasi-judicial function performed by SIRC in its complaints role and the potentially legalistic matters raised in its review functions, SIRC has historically been heavy with lawyers.  Fourteen (56%) of SIRC members have been either practicing lawyers or otherwise legally trained.  This average obscures, however, important trends.  In particular, the proportion of legally trained SIRC members has fallen dramatically in the last decade.  This development is illustrated in table 1.2.  Since 2006-2007, no SIRC member has been a graduate of a law program, let alone a practicing lawyer.

This changing composition does not seem to have affected the number of reports issued by SIRC.  There is no correlation between legal training and the number of SIRC reports issued each year.[31]  Nor is there a correlation between legal training and the proportion of closed complaints resulting in SIRC reports.  Indeed, if anything, there was a very weak negative correlation between legal training and these complaint-sparked SIRC reports.[32]

Table 1.2


iii) Funding

As table 1.3 suggests, SIRC funding has always been modest relative to that of CSIS.  Between 1985 and 2009, the period for which data were available, it averaged 0.77% of CSIS funding.  At certain periods – especially in the early 1990s – it fell well below this level, before moving back to average or above-average figures in the early 2000s.  In 2004, it rose to its highest level ever – 0.97% of CSIS funding – after a 2002 request from SIRC that its funding be increased to reflect the increased size of CSIS post-9/11.[33]  More recently, however, SIRC spending has fallen to the lowest levels in its history, relative to that of CSIS.  In 2008-2009 and 2009-2010, SIRC spending was 0.56% and 0.51% of CSIS spending, respectively.  (Notably, my calculations for CSIS spending for 2009-2010 subtracted the $44 million spent on the new CSIS headquarters for that year, and so can be regarded as capturing only spending on personnel and operations.)[34]

Of interest, the estimates for SIRC during these two years envisaged a substantially higher spending level than was actually realized – a difference of approximately $0.5 million per year – suggesting that the committee did not conduct all of the activities it had anticipated in preparing its estimated expenditures, or that those activities were substantially less expensive than predicted.  Whatever the reasons, reduced SIRC spending during these years does not seem to be the product of an externally mandated spending restraint.

Table 1.3


The impact of budgetary matters on productivity is difficult to measure definitively.  However, using the index of reports issued by SIRC, the historical average of SIRC spending divided by number of reports is $0.28 million.  As table 1.4 suggests, that number has varied over time.  The budget/report figure fell during the 1990s and has risen in the last decade, suggesting that SIRC is producing fewer reports per unit of its spending than was the case in the 1990s.

Table 1.4
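The budget-per-report index above is a simple ratio of spending to output. A minimal sketch, using illustrative figures rather than the actual SIRC numbers:

```python
# Illustrative figures only: SIRC spending (millions of dollars) and
# reports issued, by fiscal year. Actual values come from annual reports.
sirc_spending = {"1995-96": 1.4, "2005-06": 2.9}
reports_issued = {"1995-96": 9, "2005-06": 8}

def cost_per_report(year):
    """SIRC spending per report issued in a fiscal year, in millions."""
    return sirc_spending[year] / reports_issued[year]
```

A rising value of this ratio over time, as the historical data suggest, means each report is absorbing a larger share of SIRC's budget.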

Perhaps even more meaningful is a measure of SIRC activity relative to the size of CSIS.  One could hypothesize that the tempo of SIRC activity would increase as the scale of CSIS activities increases, since with more activity comes a greater prospect of complaints and error.  In fact, however, the number of new complaints brought in relation to CSIS each fiscal year shows a weak negative correlation with the CSIS budget – that is, a bigger budget is weakly correlated with fewer complaints.[35]

Moreover, the number of reports issued by SIRC per million dollars of CSIS spending belies the notion that SIRC has been busier in response to a larger CSIS. Overall, for the period 1985 to 2009, there is no statistical correlation between the number of reports issued by SIRC and the CSIS budget.[36]  More recently, as table 1.5 suggests, the number of reports issued by SIRC per million dollars of CSIS funding seems to have decreased since the 1990s.  Put another way, SIRC activity – measured in reports – has been more or less static even while CSIS’s budget and scale of operations have increased.  As a result, the proportion of reports per unit of CSIS budget has fallen.

Table 1.5





[1]           Data produced by a search in ProQuest “Canadian Newsstand Major Dailies” database, using keyword “SIRC”.

[2]           In 1994, SIRC issued a report on a so-called agent provocateur associated with CSIS that generated substantial controversy, criticism and press coverage.  See, e.g., Clayton Ruby, “SIRC’s intolerable ‘limit of the tolerable’,” Toronto Star (21 Dec 1994), A25.

[3]           Data produced by a search in ProQuest “Canadian Newsstand Major Dailies” database, using keywords “inspector general” and “CSIS”.

[4]           Data produced by a search in ProQuest “Canadian Newsstand Major Dailies” database, using keywords “commissioner” and “access to information”.

[5]           Data produced by a search in ProQuest “Canadian Newsstand Major Dailies” database, using keywords “Privacy” and “Jennifer Stoddart”.

[6]           Data produced by a search in ProQuest “Canadian Newsstand Major Dailies” database, using keywords “SIRC” and “lapdog”.

[7]           Andrew Mitrovica, “Canada’s spy watchers ring alarm many years too late,” Toronto Star (1 Nov 2010) A17.  Mr. Mitrovica is probably the single most prolific critic of SIRC in the popular press.  Other items authored by him include: “Toothless bark from spy watchdog,” Toronto Star (1 Nov 2011) A19; “Casting light on our spies,” Toronto Star (15 Nov 2011) A23; “Same old torture story,” Toronto Star (13 Feb 2012) A15; “CSIS freed from final shreds of oversight,” Toronto Star (1 May 2012) A19.

[8]           Andrew Mitrovica, “Spy watchdog timid and slow, critics assert,” Globe and Mail (6 July 2000) A8.

[9]           See Andrew Mitrovica, “Toothless bark from spy watchdog,” Toronto Star (1 Nov 2011) A19.

[10]         Andrew Mitrovica, “Casting light on our spies,” Toronto Star (15 Nov 2011) A23.

[11]         Wesley Wark, “Don’t cut off the minister’s eyes and ears on CSIS,” Ottawa Citizen (1 May 2012) A11.

[12]         Wesley Wark, “We don’t need foreign spy service,” Ottawa Citizen (7 Feb 2011) A4.

[13]         Kent Roach, “Review and Oversight of National Security Activities and some reflections on Canada’s Arar Inquiry,” (2007) 29 Cardozo Law Review 53 at 65.

[14]         Note that not all articles accessed through this database are full text searchable.  It is not, therefore, definitive.  A search of Quicklaw’s legal literature database unearths a handful of articles mentioning SIRC, but generally with nothing more than passing reference.

[15]         Jasminka Kalajdzic, “Access to justice for the wrongfully accused in national security investigations,” (2009) 27 Windsor Y.B.  Access Just. 171.

[16]         Ibid at 188.

[17]         See, e.g., J.J. Blais, “The political accountability of intelligence agencies ‐ Canada,” (1989) 4:1 Intelligence and National Security 108; Murray Rankin, “National Security: Information, Accountability, and the Canadian Security Intelligence Service,” (1986) 36:3 University of Toronto Law Journal 249.

[18]         Maurice Archdeacon, “The heritage front affair,” (1996) 11:2 Intelligence and National Security 306.

[19]         Peter Gill, “Symbolic or real? The impact of the Canadian security intelligence review committee, 1984–88,” (1989) 4:3 Intelligence and National Security 550.

[20]         Ibid at 570.

[21]         Reg Whitaker, “The politics of security intelligence policy‐making in Canada: II 1984–91,” (1992) 7:2 Intelligence and National Security 53 at 59.

[22]         Ibid at 72.

[23]         Reg Whitaker, “The ‘Bristow affair’: A crisis of accountability in Canadian security intelligence,” (1996) 11:2 Intelligence and National Security 279 at 301.

[24]         Roy Rempel, “Canada’s Parliamentary Oversight of Security and Intelligence,” (2004) 17 International Journal of Intelligence and CounterIntelligence 634 at 638.

[25]         Wesley Wark, “Our security IQ needs testing,” Globe and Mail (28 Feb 2002) A19.

[26]         This biographical information was drawn from various editions of Canadian Who’s Who, keyword searches of the “Canadian Newsstand” database, Election Canada electoral contribution returns for a number of years, online biographical material posted where a SIRC member was also a member of the Order of Canada, and, for current SIRC members, the online biographies posted by SIRC itself.

[27]         SIRC reviews per annum were tabulated from the SIRC annual reports and the list of SIRC reviews found at  In some cases, the nomenclature used to date these reviews and the changing formats of annual reports made it hard to pinpoint the precise fiscal year in which a report was released.  This creates a minor degradation in the quality of the statistical analysis involving these reports.  With this caveat, the correlation between the summed total of opposition or non-partisan affiliated members as a percentage of SIRC membership each fiscal year and the number of reports per year is 0.08 (using the CORREL function in Excel) – effectively zero correlation.  (A perfect positive correlation would be 1.0.)

[28]         Data on the number of closed complaints are available in SIRC annual reports from 1988-89 forward.  Data on the number of complaints resulting in reports are available – or may easily be extrapolated – from SIRC annual reports from 1999-2000 forward.  Running the CORREL function in Excel on reports as a proportion of closed complaints and opposition or non-partisan affiliation, from 1999 forward, produced a correlation of 0.19 – at best, an extremely weak correlation.

[29]         See notes above for the source of these SIRC report data.  Overall correlation between person years and number of reports was 0.27 (using the CORREL function in Excel), suggesting only a weak correlation between the two variables.  A correlation lagged by one year in an effort to accommodate a delay in the impact of person years on report production produced an even weaker correlation of 0.18.  (A perfect positive correlation would be 1.0).

[30]         Running the CORREL function in Excel in relation to reports as a proportion of closed complaints and person years, from 1999 forward, produced a correlation of 0.03 – essentially no correlation.

[31]         Overall correlation between the proportion of SIRC members with legal training each fiscal year and the number of reports was -0.05, effectively zero.

[32]         The correlation between legal training and the proportion of closed complaints resulting in reports for 1999 forward was -0.20.

[33]         See discussion in Canada, SIRC Annual Report, 2004-2005, at

[34]         Data for these calculations were collected from the SIRC and CSIS annual reports available on the websites of these organizations.  SIRC annual reports report CSIS budgets throughout the 1980s and into the 1990s.  The period for which data on CSIS budgets were available was 1985-2009.

[35]         The Excel CORREL function produces a figure of -0.32 for the period for which data were available – 1988-2009.

[36]         The Excel CORREL function produces a figure of 0.0 for the period for which data were available – 1988-2009.