New Dissemination Model — Home page, Navigation and Data Tables
Archived information
Archived information is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.
In April 2012, Statistics Canada launched its multi-year New Dissemination Model project with the goal of modernizing the methods and framework for disseminating data via its website. The key objective is to create a user-centric website and to increase coherence, consistency and simplicity in dissemination activities.
As part of this project, Statistics Canada held consultations with Canadians in June 2015. This final pre-launch consultation evaluated the website's ease of navigation and user satisfaction with the refined design. Evaluation sessions focussed on testing the updates made to: the main menu; the data, analysis, reference and Census pages; and the geography mapping tool.
Consultation methodology
Statistics Canada conducted in-person usability consultations. Participants were asked to complete a series of tasks and to provide feedback on the proposed website.
How to get involved
This consultation is now closed.
Individuals who wish to obtain more information or to take part in a consultation may contact Statistics Canada by sending an email to consultations@statcan.gc.ca.
Please note that Statistics Canada selects participants for each consultation to ensure feedback is sought from a representative sample of the target population for the study. Not all applicants will be asked to participate in a given consultation.
Statistics Canada is committed to respecting the privacy of consultation participants. All personal information created, held or collected by the Agency is protected by the Privacy Act. For more information on Statistics Canada's privacy policies, please consult the Privacy notice.
Results
What worked
Most participants successfully completed a series of tasks on various pages of the proposed website. In terms of the main menu, the labels ‘About StatCan’, ‘Geography’, ‘Analysis’, and ‘Surveys and statistical programs’ were well received. Participants successfully used the horizontal ‘Key statistics’ layout and preferred the version that provided provincial data. The tab layout of the results page was understood by participants—they knew that when they clicked on a tab, the displayed results were limited to that tab.
The button labels for the two download options were also well received. Participants understood the different functionality of the two buttons and correctly selected the right option for downloading a table as displayed.
With the Geography mapping tool, most participants were able to go back to a map of Canada using an alternate path (they did not use the available button).
The ‘Sort by’ labels on the results page were understood by most participants and they preferred the current labels (‘Sort by relevance’ and ‘Sort by most popular’). Finally, as an indicator icon to obtain additional information, the participants preferred the information icon to the question mark icon. This icon would be placed next to key items.
Areas for improvement
Infrequent users were not aware of the National Household Survey, or how it related to the Census of Population.
The ‘More key statistics’ button was sometimes overlooked in the ‘Key statistics’ area.
The proposed layout of some Census and National Household Survey table results (‘grouped tables’) was not clear to some users and they were hesitant to click on the link to the appropriate table.
The proposed icons for ‘Revert-to-Canada’ and ‘Settings’ on the Geography mapping tool were not intuitive for participants.
Recommendations
Keep the current labels on the main menu.
Retain the current ‘Sort by’ labels.
Use the horizontal layout for ‘Key statistics’ and have the provinces clickable on the page. Increase the font size for the ‘More key statistics’ button.
Provide succinct explanations for the Census program and the relationship between the Census of Population and the National Household Survey and the Census of Agriculture.
On the Census program page, use tiles to illustrate the featured items. Use an information icon to explain what each tile is about.
In the layout of the ‘grouped tables’, link the table’s title as well as the individual table number.
Use ‘Download as displayed’ and ‘Download data series’ on the buttons within the ‘Download table’ options.
For the Geography mapping tool, use a button with Canada on it for the ‘Revert-to-Canada’ function and use a gear icon for the ‘Settings’ button.
Use an information icon to indicate that there is more information available about an item.
Statistics Canada thanks all those who took part in this consultation. Their insights guide the agency's web development and help ensure that the final products meet users' expectations.
Seasonally adjusted data – Frequently asked questions
By Susie Fortier and Guy Gellatly, Statistics Canada
This special edition article provides nontechnical answers to selected questions related to the use and interpretation of seasonally adjusted data. Organized as a set of Frequently Asked Questions (FAQ), it complements the more technical discussions of seasonal adjustment in Statistics Canada publications and reference manuals.
This reference document is divided into two sections. Section 1 is a review of concepts and definitions that are central to the theory and practice of seasonal adjustment. Section 2 is a discussion of selected issues that are related to the analysis and interpretation of seasonally adjusted data.
Section 1: Context, definitions and terminology
1. What is a time series?
A time series is a sequence of observations collected at regular time intervals. These data provide information on a well-defined statistical concept for a specific reference period, and are presented at different points in time. Most economic data disseminated by Statistics Canada are presented as a time series. Examples include the monthly data on consumer prices, retail sales, employment and gross domestic product. These data correspond to monthly reference periods that are available for a long sequence of months, to facilitate comparisons over time.
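To make the definition concrete, the short Python sketch below (an illustration added to this discussion, with made-up values rather than actual Statistics Canada data) builds a monthly time series of the kind described above.

```python
import pandas as pd

# A made-up monthly time series: one observation per monthly reference period.
months = pd.date_range(start="2011-01-01", periods=6, freq="MS")
retail_sales = pd.Series([38.1, 35.9, 39.4, 40.2, 41.5, 40.8],
                         index=months, name="retail_sales_billions")
print(retail_sales)  # a sequence of observations at regular monthly intervals
```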
2. What is a seasonally adjusted time series?
Monthly or quarterly time series data are sometimes influenced by seasonal and calendar effects. These effects bring about changes in the data that normally occur at the same time, and in about the same magnitude, every year. For example, monthly retail sales have historically been at their highest level for the year in December as a result of holiday shopping, and have then declined to lower levels in January. This occurs year after year and limits the extent to which trends in retail industries can be assessed by comparing raw sales data for these two months. A seasonally adjusted time series is a monthly or quarterly time series that has been modified to eliminate the effect of seasonal and calendar influences. The seasonally adjusted data allow for more meaningful comparisons of economic conditions from period to period. A raw time series is the equivalent series before seasonal adjustment; it is sometimes referred to as the original or unadjusted time series.
3. Why is seasonal adjustment needed?
Many users of economic and social statistics rely on time series data to understand changes in socio-economic phenomena over time. Important statistical properties of a time series include its direction and turning points, as well as its relationship to other socio-economic indicators. A seasonal pattern in a series can obscure these important features by making period-to-period movements in the data more difficult to interpret. Many users of time series data do not consider movements in the data that relate to seasonal and other calendar effects to be analytically meaningful. These seasonal and calendar effects can obscure "true" underlying movements in the data series related to the business cycle, or to non-seasonal events, such as strikes or unanticipated disruptions in production. Consequently, seasonal adjustment techniques that remove the effect of seasonal and calendar influences from the original data can sharpen the extent to which a time series can be used to evaluate meaningful changes in economic conditions over time.
4. Is seasonal adjustment always required?
Seasonal adjustment may not always be appropriate or required. It is not necessary to seasonally adjust a series that does not exhibit an identifiable seasonal pattern or other calendar-based influences. It is also not always advisable to use seasonally adjusted data when the raw estimate represents the true statistic of interest. For example, decision makers who rely on the Consumer Price Index (CPI) for indexation purposes are advised to use unadjusted data—as these reflect the actual price movements observed from period to period. However, data users who are more interested in analyzing underlying price trends in the economy are encouraged to use seasonally adjusted indexes.
Similarly, analysts who are interested in calculating the raw growth in the number of young adults working from April 2012 to May 2012 should examine the raw estimates for these two months, and calculate the difference. This month-to-month change in raw employment might not yield much useful information about changes in the labour market conditions facing young adults if seasonal or calendar effects have a significant influence on employment levels in either or both months. However, the raw data show the extent to which actual employment for this group grew, or contracted, from April to May—which may be useful information for other purposes.
5. How common is seasonal adjustment at Statistics Canada?
Statistics Canada seasonally adjusts almost all of its major sub-annual economic indicators, including quarterly and monthly estimates of gross domestic product, and monthly employment estimates from the Labour Force Survey. Although the vast majority of the agency's releases highlight seasonally adjusted data, both the seasonally adjusted series and unadjusted series are often made available.
6. How are seasonally adjusted data estimated?
Seasonally adjusted data are estimated by breaking down time series data into various components. Using well-established statistical techniques, this process involves decomposing a time series into four separate components: (1) the trend-cycle, (2) seasonal effects, (3) other calendar effects such as trading days and moving holidays, and (4) the irregular component. The seasonally adjusted series is the original time series with the estimated seasonal and calendar effects removed, or equivalently, the estimated combination of the trend-cycle and the irregular components.
7. What are the time series components?
A time series can be split into four separate time series components: (1) the trend-cycle, (2) seasonal effects, (3) other calendar effects such as trading days and moving holidays, and (4) the irregular component. Here is an overview of each:
The trend-cycle: This represents the smoothed version of the time series and indicates its general pattern or direction. The trend-cycle can be interpreted as the long-term movement in the time series, the result of different factors (or determinants) that condition long-run changes in the data over time. As its name suggests, the trend-cycle also reflects periodic expansions and contractions in economic activity, such as those associated with the business cycle.
Seasonal effects: These represent regular movements or patterns in time series data that occur in the same month or quarter every year. On the basis of past movements of the time series, these regular patterns repeat themselves from year to year. These seasonal patterns are fairly stable in terms of timing, direction and magnitude. Often these seasonal effects relate to well-established calendar-based variations in economic activity, such as the increase in retail sales in the lead-up to Christmas, or increases in construction employment in the spring. Seasonal effects identify these regularly occurring patterns in the data.
Other calendar effects such as trading days and moving holidays: Aside from seasonal effects, other systematic calendar-based effects can influence the level of economic activity in a specific period. The most important of these are the trading-day effects. These effects can be present when the level of economic activity varies depending on the day of the week. For example, retail sales are usually higher on Saturdays than on any other day of the week. Consequently, a five-Saturday month is more likely to result in higher retail sales than a month with only four Saturdays. Another common example of a calendar effect is the date of Easter, which can be expected to increase retail sales in March or April depending on the month in which it occurs. This particular calendar effect is referred to as a moving holiday effect.
The irregular component: This component includes unanticipated movements in the data that (1) are not part of the trend-cycle, and (2) are not related to current seasonal factors or calendar effects. The irregular component could relate to unanticipated economic events or shocks (for example, strikes, disruptions, unseasonable weather, etc.), or can simply arise from noise in the measurement of the unadjusted data (due to sampling and non-sampling errors).
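To make the decomposition concrete, the following sketch uses Python's statsmodels package to split a simulated monthly series into trend, seasonal and irregular parts. It is a minimal illustration on made-up data, not the procedure Statistics Canada uses in production (see Section 2, question 6 for the official method).

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Simulate a monthly series: trend + stable seasonal pattern + noise (additive).
rng = np.random.default_rng(42)
months = pd.date_range("2008-01-01", periods=96, freq="MS")
trend = np.linspace(100, 140, 96)                      # slow upward trend-cycle
seasonal = 10 * np.sin(2 * np.pi * months.month / 12)  # repeats every 12 months
irregular = rng.normal(0, 2, 96)                       # unanticipated noise
raw = pd.Series(trend + seasonal + irregular, index=months)

# Decompose into trend-cycle, seasonal effects and irregular component.
result = seasonal_decompose(raw, model="additive", period=12)

# The seasonally adjusted series is the raw series minus the seasonal effects,
# or equivalently the combination of the trend-cycle and irregular components.
seasonally_adjusted = raw - result.seasonal
print(seasonally_adjusted.head())
```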
8. Which components are included and excluded in a seasonally adjusted series?
Seasonal effects and other calendar effects such as trading days and moving holidays are excluded from seasonally adjusted series. Consequently, the seasonally adjusted series is the combination of the trend-cycle and the irregular component. The contribution of the irregular component is worth emphasizing, because seasonally adjusted data are sometimes misinterpreted as providing users with "pure" information on the trend-cycle.
9. Why are raw and seasonally adjusted data revised over time?
The raw data can be revised to take into consideration additional data that were reported late, to correct data that were initially misreported, or for various other reasons. In such cases, the seasonally adjusted data that are based on unadjusted data also need to be revised.
Hindsight is very important for time series analysis. Even when the raw series has not been revised, it is often useful to revise the seasonally adjusted data. To estimate the seasonal effects at any given point in time, statisticians use information from previous, current and future observations. Information about future observations is not available in real time, so seasonal adjustment is conducted using previous and current values, along with projected values. These projections are based on a statistical model that uses past information. As new data become available, the various time series components can be estimated more accurately. This results in revised, more accurate estimates of the seasonally adjusted data.
Periodically, the methods used to estimate time series components for specific data series are also reviewed. Each statistical program at Statistics Canada has its own revision strategy, and schedules are routinely made available to data users in advance of these revisions.
10. Do year-over-year comparisons of raw data work as well as more formal seasonal adjustment techniques?
Comparing raw data for the same period in each year provides information on long-term trends and economic cycles, but these comparisons do not necessarily remove all the seasonal patterns from the data. Certain holidays, like Easter, do not fall on the same date or even in the same month from year to year. If the timing of these holidays influences the variable being measured, such as monthly retail sales, raw year-over-year comparisons can be misleading. For example, in 2013, Easter was on March 31st, whereas in 2012, it was on April 8th. Thus, it may be misleading to conclude that the change in sales from March 2012 to March 2013 reflects underlying trends in retail industries, as differences in sales may have been influenced by the timing of the Easter holiday.
Similarly, year-over-year comparisons of raw data ignore the trading day effect, which occurs in many series, and can affect the validity of year-over-year comparisons. For example, many businesses generate less output on Saturday and Sunday than during weekdays. In 2011, October began on a Saturday, and included 5 full weekends and 21 weekdays. In 2012, October began on a Monday, and included 4 full weekends and 23 weekdays. A simple year-over-year comparison between these two months will not account for these differences, which could distort the analysis of changes in economic output over time.
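The day-of-week composition of a month is easy to verify. The short Python sketch below (an illustration added here, not part of the original article) counts weekdays in October 2011 and October 2012 using only the standard library.

```python
import calendar
from datetime import date

def weekday_counts(year: int, month: int) -> tuple[int, int]:
    """Return (weekdays, weekend days) for the given month."""
    _, days_in_month = calendar.monthrange(year, month)
    weekend = sum(1 for d in range(1, days_in_month + 1)
                  if date(year, month, d).weekday() >= 5)  # 5=Sat, 6=Sun
    return days_in_month - weekend, weekend

print(weekday_counts(2011, 10))  # (21, 10): October 2011 began on a Saturday
print(weekday_counts(2012, 10))  # (23, 8):  October 2012 began on a Monday
```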
Even when no other calendar effects are present in the data, comparing the same periods in each year can still be problematic. In general, it can be shown that this type of comparison lacks timeliness for the identification of turning points (the point at which a decreasing series, for example, begins to increase).
Comparing a current value with only one past value (the value of the series 12 months before the current reference month) can also be misleading if that particular value is unusual. For example, comparing economic data for British Columbia for February 2011 to data for February 2010 (the month in which the province hosted the Winter Olympics) may not yield useful information about changes in trends. To partially mitigate this effect, data for the current month (February 2011) can be compared with an average of the data for previous Februarys (for example, the past five years). A similar technique can be applied to examine month-to-month movements. For example, the December to January movement of this year could be compared with a historical average of December to January movements for the last five years. Although this method may yield some additional insight, some measure of caution is warranted as it does not take the place of more formal seasonal adjustment techniques.
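A minimal pandas sketch of this informal check follows. It is illustrative only: `monthly_series` is an assumed input (a Series with a monthly DatetimeIndex), not data from this article, and the function name is hypothetical.

```python
import pandas as pd

def february_vs_recent_average(monthly_series: pd.Series, year: int,
                               window: int = 5) -> float:
    """Compare February of `year` with the average of the `window`
    preceding Februarys. Returns the difference (current minus average)."""
    februarys = monthly_series[monthly_series.index.month == 2]
    current = februarys[februarys.index.year == year].iloc[0]
    past = februarys[(februarys.index.year < year) &
                     (februarys.index.year >= year - window)]
    return current - past.mean()
```

As the text notes, a comparison like this can add insight but is not a substitute for formal seasonal adjustment.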
References
Ladiray, D. and Quenneville, B. (2001). Seasonal Adjustment with the X-11 Method. Springer-Verlag, Lecture Notes in Statistics, vol. 158.
Section 2: Issues related to analysis and interpretation
1. How do I interpret period-to-period changes in seasonally adjusted data?
Period-to-period changes in raw data and period-to-period changes in seasonally adjusted data provide different information. To illustrate this, consider hypothetical employment data from a monthly industry survey. Every month, these data are collected and processed to obtain an estimate of total industry employment. This estimate is raw (not seasonally adjusted)—it is a measure of the number of people working in the industry in the reference month, without distinguishing between (or disentangling) the various time series components that contribute to this estimate.
Before publication, this estimate of industry employment is seasonally adjusted, to remove the influence of seasonal and calendar effects from the raw data (using current and past information on industry employment). This adjusted estimate is the official estimate of industry employment released in The Daily.
An important note about comparisons over time—the difference between the seasonally adjusted employment estimates for two consecutive months cannot be interpreted as the raw difference in the number of people actually working in the industry in these months. The raw difference is the difference in the unadjusted employment estimates obtained directly from the survey.
Rather, the difference in the month-to-month seasonally adjusted estimates is a direct measure of the change in the number of people working, after expected changes due to the variation in seasonal employment between these two months are taken into account. The resulting number may be less than the raw difference or it may be more, depending on how seasonal effects are changing from month to month.
The example below illustrates the distinction between raw and seasonally adjusted data, using hypothetical employment data for an industry, collected over two consecutive months. In this example, it is assumed that there are no other calendar effects.
Table 1: Industry employment, raw and seasonally adjusted (persons)

Time period | Unadjusted data | Seasonally adjusted data | Trend-cycle | Irregular component | Seasonal effects
Month 1 | 6,200 | 7,200 | 6,650 | 550 | -1,000
Month 2 | 5,400 | 6,800 | 6,500 | 300 | -1,400
Change (month 2 minus month 1) | -800 | -400 | -150 | -250 | -400

Source: Statistics Canada, authors' calculations.
In month 1, the unadjusted estimate of industry employment was 6,200; the seasonally adjusted employment estimate was larger, at 7,200. Accordingly, the employment attributed to seasonal effects in month 1 was -1,000. What does this mean?
It means that about 1,000 fewer employees were expected to be working in month 1 when compared with a generic average level of industry employment throughout the year. These "expected" and "average" levels are based on historical patterns that reflect typical seasonal movements in these data.
Accordingly, these 1,000 fewer employees are added back into the employment estimate for month 1, yielding a seasonally adjusted estimate that is larger than the unadjusted, or raw, estimate collected from the survey. Why is this done? Because the objective of seasonal adjustment is to make the month-to-month data more comparable, so that they provide better information about trends and cyclical movements. Seasonally adjusting the data puts month-to-month comparisons on an equal footing.
The estimate of industry employment for month 2 exhibits a similar pattern, with the final seasonally adjusted estimate exceeding the unadjusted estimate. In this month, 1,400 fewer employees would be expected to be working in the industry (compared with a generic average level of monthly employment throughout the year), based on regularly occurring seasonal movements. Adding this employment back into the unadjusted estimate from the survey data brings the published (seasonally adjusted) estimate to 6,800.
Both months are examples of "adding back" – supplementing the survey data with additional employment – because the seasonal effects are negative. In these cases, less employment is expected in the reference month because of past seasonal patterns, so employment has to be added back in to make the data comparable from month to month. For other months, the reverse could apply—because the seasonal factors are positive. In these months, more employees are expected to be working than in the hypothetical average month, so seasonal adjustment removes some employment from the unadjusted data to put these months (in statistical terms) on an equal footing with other months during the year.
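The arithmetic of this add-back can be checked directly against Table 1. The Python snippet below (illustrative only) reproduces the identities that an additive adjustment implies.

```python
# Components from Table 1 (additive model): raw = trend + irregular + seasonal,
# and the seasonally adjusted value = raw - seasonal = trend + irregular.
table = {
    "Month 1": {"trend": 6650, "irregular": 550, "seasonal": -1000},
    "Month 2": {"trend": 6500, "irregular": 300, "seasonal": -1400},
}

for month, c in table.items():
    raw = c["trend"] + c["irregular"] + c["seasonal"]
    adjusted = raw - c["seasonal"]  # the negative seasonal effect is "added back"
    print(f"{month}: raw={raw}, seasonally adjusted={adjusted}")
# Month 1: raw=6200, seasonally adjusted=7200
# Month 2: raw=5400, seasonally adjusted=6800
```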
2. How do seasonal patterns affect the interpretation of month-to-month changes?
The interpretation of month-to-month changes can be complex because it involves some of the more technical aspects of the data modelling used in seasonal adjustment routines. Seasonal patterns can be modelled "additively" or "multiplicatively". If seasonal patterns are modelled as additive, the extent to which month-to-month changes in employment are being influenced by changes in the seasonal effects can be examined in a fairly straightforward fashion.
To see this, consider again the hypothetical employment data used in the example in question 1 of this section. Seasonally adjusted employment fell from 7,200 in month 1 to 6,800 in month 2, a net decline of 400 workers.
This is different from the unadjusted change calculated directly from the survey data. The unadjusted estimate fell from 6,200 in month 1 to 5,400 in month 2, a net decline of 800 workers, or twice the decline in the seasonally adjusted data.
What accounts for the large difference in these two estimates? As noted above, both months had negative seasonal effects. This means that, in view of past patterns of seasonality, lower industry employment is expected in each of the two months when compared with an annual generic monthly average. But the negative seasonal effect in the second month was larger in absolute terms, by some 400 workers. While about 1,000 workers were added to the raw survey data in month 1 to obtain the seasonally adjusted estimate, some 1,400 workers were added back in month 2.
Numerically, about 40% of this reduction in the seasonally adjusted estimate can be attributed to changes in the trend-cycle, which fell by 150 of the 400 workers. The remaining 60% is due to the irregular component, which fell by 250.
3. Which estimate—seasonally adjusted or raw—is "correct"?
Both estimates are correct, as both derive from legitimate statistical processes. The choice of one over the other depends on the purpose of the analysis.
If users are interested in estimates of the actual level of industry employment in a particular period (the number of people working), or in the period-to-period changes in these actual employment levels, these estimates can be obtained directly from surveys without any seasonal adjustment.
A problem arises when trying to use these unadjusted data to interpret changes in economic conditions. The raw data reflect the combined effect of all components that contributed to the observed level of employment in a monthly or quarterly period. This includes the trend-cycle, the seasonal effects, the other calendar effects and the irregular component. In the example in question 1 of this section, it is correct to say that industry employment declined by 800 workers from month 1 to month 2—the decline tabulated directly from the raw data. But it is less appropriate to attribute this decline to specific factors, such as cyclical downturns, while ignoring the potential influence of other components, such as routine changes in seasonal hiring patterns, which also contribute to changes in the raw data.
The key point is that the choice between seasonally adjusted and raw data is context-driven. It depends on the issue that the data are attempting to inform, and whether period-to-period movements in these data that derive from seasonal influences are relevant to that issue.
4. How do I interpret seasonally adjusted data when an industry is undergoing structural change?
This question relates to the reliability of seasonally adjusted data. Two points warrant emphasis:
Seasonal effects reflect typical movements in time series data due to established seasonal patterns;
Seasonally adjusted data (which remove the seasonal component and the other calendar effects) are influenced by more than changes in the trend-cycle. They are also influenced by irregular events that, in many cases, have a large impact on the resulting estimate.
Structural change can refer to situations in which some fundamental aspect of an economy or industry is changing, resulting in new conditions that differ from past norms. These could involve major technological innovations that alter the nature of production. They could also involve more routine changes in hiring patterns in response to new administrative practices.
Both of these examples could bring about new seasonal patterns in an industry that contrast with traditional seasonal patterns. How are these reflected in the seasonally adjusted data?
In the short run, these shifts would be regarded as irregular movements in the data, to the extent that they deviate suddenly from expected patterns. Over time, these new patterns would become seasonal and gradually incorporated into the historical record, as new time series information on these changes becomes available. This assumes that these changes are becoming a regular feature of the data—and not the result of irregular events or shocks.
Accordingly, it can be more difficult to interpret movements in seasonally adjusted data when underlying seasonal patterns are evolving or changing rapidly. In such cases, irregular factors can exert a large impact on seasonally adjusted estimates.
5. How does seasonal adjustment account for "unseasonable" weather?
This question relates to a common misconception about seasonal adjustment—namely, that its sole purpose is to remove the effect of changes in weather or climate from the data. Seasonal adjustment removes the average or anticipated effect of seasonal factors from monthly or quarterly data, many of which have to do with changes in weather or climate. But it is more accurate to state that these seasonal factors relate to all things seasonal—weather and climate-related or otherwise—that have the potential to affect the analysis of trend or cyclical patterns in the data.
The idea of the "average" effect noted earlier is important, as the magnitude of these period-specific seasonal adjustments is again based on historical patterns. If weather or climate conditions are generally reflective of these past patterns, the seasonal adjustment routines can be expected to do a fairly complete job of factoring out movements in the unadjusted data that are attributable to weather or climate. But unseasonable weather, such as the very warm spring in eastern Canada in 2012, is, by definition, not indicative of the average pattern, and will influence seasonally adjusted estimates.
6. What method does Statistics Canada use to produce seasonally adjusted data?
Statistics Canada seasonally adjusts sub-annual time series data using the X-12-ARIMA method, which uses well-established statistical techniques to remove the effect of regular, calendar-related patterns from unadjusted data. Although less complex alternatives may be used, such as comparing the original data for the same period in each year, these techniques have limitations when it comes to removing calendar effects. Accordingly, Statistics Canada recommends the use of formal, established methods for dealing with seasonality. In practice, seasonal adjustment is performed following Statistics Canada Quality Guidelines.
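For readers who want to experiment with this family of methods, the successor program X-13ARIMA-SEATS can be driven from Python through statsmodels, assuming the external X-13ARIMA-SEATS binary from the U.S. Census Bureau is installed and locatable. This is a rough sketch under those assumptions (the input file name is hypothetical), not a description of Statistics Canada's production setup.

```python
import pandas as pd
from statsmodels.tsa.x13 import x13_arima_analysis

# `monthly_series.csv` is a hypothetical file holding one monthly series.
# x13_arima_analysis shells out to the Census Bureau's X-13ARIMA-SEATS binary
# (set the X13PATH environment variable or pass x12path=...).
raw = pd.read_csv("monthly_series.csv", index_col=0, parse_dates=True).squeeze()
raw = raw.asfreq("MS")  # make the monthly frequency explicit for the program

result = x13_arima_analysis(raw)
print(result.seasadj.head())  # the seasonally adjusted series
```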
7. Where can I find more information on selected issues?
As mentioned at the start, this document is intended as a practical guide that provides users with additional perspective on issues related to the use and interpretation of seasonally adjusted data. It is designed to complement a paper by Wyman (2010), who illustrated many of these points with Statistics Canada data. In addition, the extensive literature on seasonal adjustment can provide readers with a fuller examination of the issues discussed in this document.
References
Ladiray, D. and Quenneville, B. (2001). Seasonal Adjustment with the X-11 Method. Springer-Verlag, Lecture Notes in Statistics, vol. 158.
Evaluation of the Macroeconomic Accounts Program
This report presents the results of the evaluation of the Macroeconomic Accounts Program (MEAP). It covers three fiscal years, from 2010-2011 to 2012-2013. The evaluation was undertaken by the Evaluation and Performance Measurement Division.
This report was approved by the Departmental Evaluation Committee and the Chief Statistician on November 19, 2014.
In accordance with the accountability requirements in the Treasury Board 2009 Policy on Evaluation and its Directive, this report is available to the public and posted on the departmental website in both official languages.
Statistics Canada also shared this report with its program delivery partners and key stakeholders, including the National Statistical Council.
Evaluation scope, purpose and methodology
The evaluation covers all components of the MEAP, with a particular focus on the International Accounts and Statistics Division (IASD). It addresses a number of questions relating to the continued need for the program, its alignment with government priorities, its consistency with federal roles and responsibilities, the achievement of its expected outcomes, and the extent to which it demonstrates efficiency and economy. It was included in the Departmental Risk-based Audit and Evaluation Plan for 2012-2013 to 2016-2017, which was approved by the Departmental Evaluation Committee in March 2013.
In accordance with the Government of Canada's Policy on Evaluation, the purpose of the MEAP evaluation is to provide an evidence-based, neutral assessment of its value for money, and more specifically, its relevance and performance. In addition, the evaluation design has incorporated all components of the Standard on Evaluation for the Government of Canada to ensure quality, neutrality and utility. Finally, the list of questions addressed in this evaluation incorporates all core issues identified in the Government of Canada's Directive on the Evaluation Function.
Information from multiple sources was used to address the evaluation questions, including:
a document and literature review
a review of financial and administrative data
a series of key informant interviews with program representatives, internal and external users of MEAP data, as well as representatives of relevant international organizations
a survey of data users
case studies
a bibliometric and webometric analysis.
The Macroeconomic Accounts Program
The Macroeconomic Accounts Program (MEAP) is expected to provide a comprehensive set of statistics on economic activities occurring in Canada, as well as economic activities between Canada and the rest of the world. At the time of the evaluation, the program included several components falling into two main groupings: the core accounts and a set of additional accounts. Each of these accounts serves its own purpose, providing a specific perspective on the nature of the Canadian economy.
The four components of the MEAP's core accounts provide the fundamental statistics relating to all aspects of Canada's core economic activities. These accounts include the Canadian System of National Accounts, the Canadian Input-Output Tables, the Government Finance Statistics and the International Accounts. The additional accounts falling under the MEAP include the productivity accounts, the capital stock program, as well as satellite accounts covering different aspects of the Canadian economy, such as tourism and culture.
The MEAP is expected to support Statistics Canada's two strategic outcomes by ensuring that:
all Canadians have access to timely, relevant and quality statistical information on Canada's changing economy and society for informed debate, research and decision making on social and economic issues
specific client needs for high-quality and timely statistical services are met.
The Macroeconomic Accounts Branch is responsible for the overall management of the MEAP. Its work is primarily supported by four divisions:
International Accounts and Statistics Division (IASD)
Industry Accounts Division (IAD)
Public Sector Statistics Division (PSSD)
National Economic Accounts Division (NEAD).
During the three years covered by the evaluation, the level of resources allocated to the MEAP has fluctuated. The number of budgeted full-time equivalents (FTEs) has varied between 301 and 333, and its total budgeted expenditures have varied between $26 million and $30 million.
Evaluation conclusions and recommendations
The analysis of the information gathered as part of this evaluation resulted in findings and conclusions about the relevance and performance of the MEAP, which led to three recommendations.
Conclusions
Relevance
The MEAP plays a critical role in supporting a number of decisions related to Canada's economic growth. First, it provides the required statistical information for the implementation of a number of legislative requirements, including those related to equalization payments, harmonized tax systems, and the monitoring of foreign ownership and control of Canadian businesses. It also allows Canada to uphold a number of reporting commitments to the international community, including the International Monetary Fund (IMF) and the United Nations (UN).
Evaluation findings also confirm that the nature of the economic information being gathered, the process by which it is gathered, and the purpose for which it is gathered are all consistent with the roles and responsibilities of the federal government. Moreover, considering the economic crisis that Canada has had to face during the period covered by the evaluation, there was a strong alignment between the activities undertaken through the MEAP and the priorities of the federal government.
What evaluation findings also indicate, however, is that a number of drivers are fundamentally changing the nature of economic activities and economic relationships among countries, which creates expectations that different or new statistical data will be produced to adequately monitor the impact of these new trends. To meet these new information needs, the Macroeconomic Accounts Branch has undertaken a number of initiatives, including the implementation of new international standards. While these efforts are significant, evaluation findings also confirm that a number of information gaps remain and, in order to adequately address them, new activities will need to be undertaken.
Achieving expected outcomes
The MEAP has made significant progress towards its expected outcomes. First, the program has allowed all key stakeholders to readily access statistical information related to many dimensions of Canada's economy. While some metadata information requires updating, statistical users have had access to information supporting the appropriate use of the data. While these statistics are released within the prescribed timeframes, the evaluation found growing pressure to release this statistical information more quickly in order to meet the requirements and expectations of key stakeholders.
Second, the MEAP data has enhanced the level of knowledge of its key users. Evaluation findings point to a high level of satisfaction in relation to the interpretability of the data produced. This is particularly significant considering the increasing complexity of the information provided in light of the implementation of new international standards. A structural characteristic that works in favour of Statistics Canada is its high level of operational integration, which contributes to the coherence of the data produced. In this context, the MEAP has been in a position to produce data that have not required significant revisions. This is an indication of strong reliability, which is an important indicator of data accuracy.
Third, MEAP data are used in a wide range of settings and for multiple purposes. The evaluation findings confirm that it is primarily in applied settings that MEAP data are used to shape economic, fiscal and monetary policies. While the data are also used for research purposes, current practices are such that the data are often used without being directly or properly cited in academic publications. This creates significant challenges in attempting to adequately measure the extent to which the data are used in academic settings.
Efficiency and economy
The evaluation findings on economy and efficiency, which focus on IASD as an illustrative example, show that the Division has been required to maintain data releases and other information products within a reduced budget. The fact that the Division has maintained its schedule of data releases with reduced staff levels creates an apparent lowering of unit costs in the production of the international accounts and statistics. On the surface, it would appear that IASD has become more efficient. However, against these efficiency gains must be set an increased potential for errors and slippages in the schedule, which would negatively affect relevance and reputation. While this risk appears to have been managed through professional commitment, there is evidence that the information production processes are being progressively stretched.
A closely related issue is that the resource constraints may increasingly place the Division in a position of not being able to fulfill its entire mandate. The changing nature of the economy, in particular the globalization of manufacturing and services, means that government and industry decision-makers need new varieties of information to set policy and make investments. In addition to this demand for new information perspectives, the resource constraints also pose another relevancy risk, in that they limit the Division's ability to serve other federal government users who are often prepared to participate in a cost-recovery process. Also, the transition towards the centralization of survey data collection activities has proven to be more challenging than anticipated, particularly for smaller divisions such as IASD with highly specialized survey activities. As a result, the expected financial relief for the Division did not materialize as initially expected.
Two other important findings emerged in the evaluation. First, the financial administration system in general and the time recording processes in particular do not support the tracking of staff time (costs) through the diverse activities that lead to the range of outputs. Second, while the Division maintains procedures that document the activities needed to support data releases, these are not business process models of the activities that support explicit measurement of activities needed to support the realization of the diverse range of IASD outputs.
Recommendations
Based on all evaluation findings gathered as part of this evaluation, the following recommendations are submitted for consideration by the Macroeconomic Accounts Branch:
Recommendation 1 (performance measurement):
It is recommended that the MEAP proceed with the implementation of a performance measurement system to ensure the availability of timely performance information to support ongoing program management and decision making, demonstrate the achievement of the program's expected outcomes and support future evaluations (including up-to-date business process models for outputs, a list of outputs produced and released, and a register of data users).
Recommendation 2 (relevance, efficiency and economy):
It is recommended that IASD establish and implement an appropriate strategy to enhance its capacity and explore avenues to resume the provision of cost-recovery services, in order to deal with the current relevancy risk and the increased potential for errors and slippages in the schedule, and to address some of the unmet needs of users.
Recommendation 3 (performance):
To ensure that data users have a proper understanding of its macroeconomic products, it is recommended that the Macroeconomic Accounts Branch ensure the timely update of all applicable metadata information.
Management Response and Action Plan
Recommendation 1
Performance Measurement
Focus: IASD and MEAB
It is recommended that the MEAP proceed with the implementation of a performance measurement system to ensure the availability of timely performance information to support ongoing program management and decision making, demonstrate the achievement of the program's expected outcomes and support future evaluations (including up-to-date business process models for outputs, a list of outputs produced and released, and a register of data users).
Statement of Agreement / Disagreement
Management agrees with the proposed recommendation.
Management Response
Management has already started implementing a system to produce regular and timely performance measures that address all outcomes identified in the MEAB logic model. In addition, standardized corporate performance measures are being developed. Once these indicators have been developed, the MEAB will update its performance measurement strategy and ensure that it accurately describes the performance measures. In addition to the performance measures, IASD and the MEAB will create a catalogue of their various outputs and maintain an up-to-date list of data users who have contacted the agency for MEAB information.
Table 1: Recommendation 1

Timeline | Deliverable(s) | Responsible Party
April 2015 | Development of the MEAB-specific performance measures | Director General – MEA / Manager, National Accounts Integration Group
April 2015 | Incorporation of corporate performance measures into the MEAB Performance Measurement Strategy | Corporate task force on performance measurement
April 2015 | Register of data users based on individuals who have contacted the branch for MEA-related information | Director General – MEA / Manager, National Accounts Integration Group
Recommendation 2
Relevance, Efficiency and Economy
Focus: IASD and MEAB
It is recommended that IASD establish and implement an appropriate strategy to enhance its capacity and explore avenues to resume the provision of cost-recovery services, in order to deal with the current relevancy risk and the increased potential for errors and slippages in the schedule, and to address some of the unmet needs of users.
Statement of Agreement / Disagreement
Management agrees with the proposed recommendation.
Management Response
Work is already underway to enhance the capacity of the International Accounts and Statistics Division. The division recently merged with the International Trade Division. This merger has brought a great deal of expertise into the division, as well as an opportunity to streamline operations and create new data products, which will increase its capacity not only to deliver the current set of programs but also to take on cost-recovery work. In addition, the International Trade Division has a long history of undertaking extensive cost-recovery work; this expertise can be shared with other programs in the division. The branch has secured some new funding, and continues to seek more, to ensure the division has increased capacity to deliver its programs and expand them where required. Some of the funding has been provided by the Economic Statistics Field, while other funding opportunities are being discussed with other departments. Given the cross-cutting nature of this work (measuring global production, merchanting and international financial flows) and its impact on other program areas, the agency as a whole will need to find ways to increase its capacity to deal with these emerging issues.
The division has been engaged in a number of staffing processes over the last fiscal year and will be hiring a number of economists. Finally, the division has been actively participating in the development of the Macroeconomic Accounts training program. This training program will be used to develop the human capital within the division, allowing staff to perform their duties more effectively.
While there are some unmet user needs, it will be difficult for IASD to move forward on a number of data products until there is international agreement on concepts, methods and the implementation timeframe for these data products. Statistics Canada cannot unilaterally implement changes to its international accounts program until the changes are implemented by our major trading partners (e.g., the United States). If these changes were implemented unilaterally, international account asymmetries would arise, making the data confusing for our users and less relevant. The international accounts program will remain active on the international front to ensure these changes are coordinated among countries.
Table 2: Recommendation 2

Timeline | Deliverable(s) | Responsible Party
April 2014 | Merger of the International Trade Division and the International Accounts and Statistics Division | Assistant Chief Statistician – Economic Statistics / Director General – MEA / Director General – Economy-Wide Statistics / Director – International Accounts and Statistics / Director – International Trade Division
April 2015 | Secure additional funding for programs such as the Exporter/Importer Register, Foreign Affiliate Statistics and Securities programs to meet emerging user needs | Director General – MEA / Director – International Accounts and Statistics Division
April 2015 | Secure additional funding for programs to address the needs of the international community for data related to financial stability and international linkages (G20 data gaps initiative and SDDS+) | Assistant Chief Statistician – Economic Statistics / Director General – MEA / Director – International Accounts and Statistics / Director – International Trade Division / Director – Financial Planning Division
April 2015 | Completion of the Macroeconomic Accounts training curriculum | Director General – MEA / Director – International Accounts and Statistics Division
April 2015 | Hiring or promoting a number of economists within the division | Director – International Accounts and Statistics Division
Recommendation 3
Performance
Focus: MEAB
To ensure that data users have a proper understanding of its macroeconomic products, it is recommended that the Macroeconomic Accounts Branch ensure the timely update of all applicable metadata information.
Statement of Agreement / Disagreement
Management agrees with the proposed recommendation.
Management Response
The Macroeconomic Accounts Branch is developing a metadata model and an associated metadata base that will be used to store SNA metadata, display those metadata to internal and external users, and facilitate the loading, editing and auditing of those metadata. The model will be embodied in a relational metadata base and is presently a work in progress. Once the model has been completed, programs within the MEA will be targeted to populate the database with current metadata. Once populated, the information will be loaded into Statistics Canada's Integrated Metadata Base (IMDB) and made public via the Statistics Canada website, where it can be accessed directly from CANSIM (in association with the actual data). The International Accounts and Statistics programs will be among the first targeted to supply the updated metadata.
Table 3: Recommendation 3

Timeline | Deliverable(s) | Responsible Party
December 2015 | Release of the User Guide to the Macroeconomic Accounts | Director General – MEA
April 2015 | Release of updated and detailed BOP metadata in the IMDB | Director General – MEA / Director – International Accounts and Statistics Division
April 2016 | Release of updated and detailed IIP and other IATD metadata in the IMDB | Director General – MEA / Director – International Accounts and Statistics Division
Statistical Information Service Evaluation 2015/2016
Archived information
Archived information is provided for reference, research or recordkeeping purposes. It is not subject to the Government of Canada Web Standards and has not been altered or updated since it was archived. Please contact us to request a format other than those available.
During the 2015/2016 fiscal year, Statistics Canada's Statistical Information Service will be evaluated to assess its ability to meet the information needs of the Canadian public.
This evaluation is intended to allow users of this service to provide feedback and to express their level of satisfaction with the services they received.
Feedback will be used to help Statistics Canada further improve its service delivery.
How to get involved
Individuals who wish to obtain more information or to take part in a consultation should contact Statistics Canada by sending an email to consultations@statcan.gc.ca.
Please note that Statistics Canada selects participants for each consultation to ensure feedback is sought from a representative sample of the target population for the study. Not all applicants will be asked to participate in a given consultation.
Statistics Canada is committed to respecting the privacy of consultation participants. All personal information created, held or collected by the Agency is protected by the Privacy Act. For more information on Statistics Canada's privacy policies, please consult the Privacy notice.
Results
Results of the client satisfaction evaluation will be published online when available.
Statistical information is critical for effective decision making in a modern economy, and Statistics Canada plays an essential role in the production and dissemination of that information. Because of the importance of the information required, quality is fundamental to Statistics Canada's mandate. Data collection activities form the basis of much of the information Statistics Canada disseminates, and survey collection is the main point of contact the Agency has with the public on which it relies. As such, ensuring the quality of collection activities is critical to ensuring the accuracy of the information.
The objectives of the audit were to provide the Chief Statistician (CS) and the Departmental Audit Committee (DAC) with assurance that:
Statistics Canada has appropriate governance mechanisms in support of the planning and resource allocation of Collection and Regional Services.
Appropriate quality assurance mechanisms have been established and are consistently applied to ensure that the Collection and Regional Services Branch is gathering quality data and is compliant with the Statistics Canada Quality Guidelines.
The audit was conducted by Internal Audit Division in accordance with the Government of Canada's Policy on Internal Audit.
Key Findings
Responsibilities and accountabilities are defined and communicated in the Collection and Operations Services Agreement (COSA) and the SSO handbook and clearly defined mandates have been developed for collection oversight committees. Effective capacity planning takes place within Collection Planning and Management Division (CPMD) and the regional offices to ensure collection activities take place as planned.
While the risk management framework includes the identification of risks and high level mitigation strategies, it does not include detailed risk mitigation activities nor does it assign specific accountability for each activity or timelines for implementation.
Survey collection management in the regions and CPMD applies survey management reports inconsistently, which may result in undetected data quality issues. Currently, the primary focus of survey management reports is to ensure that response rate targets are met; specific quality reports or indicators are not being leveraged to assess the quality of collections.
CPMD and Regional offices have established an effective process for feedback and the escalation of issues. Communication occurs daily and the escalation of issues occurs at the appropriate level.
Basic interviewer and survey training is effective and the tools in place help ensure that interviewers understand survey procedures and concepts. However, training for the Business Register does not effectively prepare interviewers to make changes to the frame with confidence.
The use of the Quality Control Feedback System (QCFS) is not optimized to ensure monitoring targets are met at the regional and interviewer levels, and the results of monitoring activities are not being analyzed to ensure the quality of collections. Computer Assisted Personal Interview (CAPI) monitoring requires improvement to ensure that interviewers receive timely feedback, that interviewers and respondents are made aware of the importance of quality assurance, and that observational monitoring takes place in all regions—as outlined in the SSO handbook—to meet monitoring expectations and enhance the quality of CAPI collection activities.
Overall Conclusion
The governance framework in place in the Collection and Regional Services Branch is effective; responsibilities and accountabilities are documented, and effective capacity planning takes place within the Branch.
Although appropriate quality assurance mechanisms exist within the Collection and Regional Services Branch (CRSB), they are not adequately leveraged or applied. The Statistics Canada Quality Guidelines note that an effective collection management framework should strike a balance between survey performance and quality. Current survey collection monitoring and management practices focus primarily on collection performance and should be enhanced to better ensure the quality of data collected.
Conformance with Professional Standards
The audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which include the Institute of Internal Auditors (IIA) International Standards for the Professional Practice of Internal Auditing.
Sufficient and appropriate audit procedures have been conducted and evidence gathered to support the accuracy of the findings and conclusions in this report and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined and for the scope and time period covered by the audit.
Patrice Prud'homme
Chief Audit Executive
Introduction
Background
Statistics Canada was established to ensure that Canadians have access to a trusted source of Canadian statistics to meet their highest priority needs. Access to trusted statistical information underpins democratic societies, as it supports evidence-based decision-making in the public and private sectors, and informs debate on public policy issues. Under the Statistics Act, Statistics Canada is required to "collect, compile, analyze, abstract and publish statistical information relating to the commercial, industrial, financial, social, economic and general activities and conditions of the people of Canada", which requires the organization to collect data from a wide variety of Canadian businesses and households.
At Statistics Canada, Collection and Regional Services Branch (CRSB) is responsible for the majority of data collection activities. Its mandate is to:
Provide data collection services to Statistics Canada's statistical programs (Household surveys, Business surveys and Census collection);
Provide the Management Structure for the Separate Employer – Statistical Survey Operations (SSO) – which comprises Statistics Canada's interviewers;
Promote the availability and effective use of Statistics Canada's products and services through the Advisory Services and Communications program in the regions.
Within the branch, Collection Planning and Management Division (CPMD) is accountable for the coordinated planning of collection capabilities and the capacity of the collection infrastructure. It also acts as the front door for most survey partners who require collection support. CPMD is the interface between the subject matter divisions and the regions: it designs collection activities in response to client specifications, provides cost estimates, acquires requisite services from supplying divisions within the Field to execute the survey, monitors and reports on survey progress, and conducts post-mortems of completed projects.
There are three regions that manage survey data collection operations: the Eastern Region, with offices in Halifax, Montreal and Sherbrooke; the Central Region, with offices in Ottawa, Toronto and Sturgeon Falls; and the Western Region and Northern Territories, with offices in Edmonton, Regina, Calgary, Winnipeg and Vancouver. Within these regions, approximately 2,000 interviewers and public servants provide services and conduct surveys.
CRSB began piloting a branch renewal project in January 2014, with full implementation scheduled for April 1, 2014. Under this project, the roles and responsibilities of regional offices and CPMD will change: CPMD will transfer the management of surveys to the regions, and subject matter areas will communicate directly with the regions. CPMD will continue to do the planning and prepare the training materials for surveys, while regional offices will be fully responsible for the resolution of survey collection issues.
Audit objectives
The objectives of the audit were to provide the Chief Statistician (CS) and the Departmental Audit Committee (DAC) with assurance that:
Statistics Canada has appropriate governance mechanisms in support of the planning and resource allocation of Collection and Regional services.
Appropriate quality assurance mechanisms have been established and are consistently applied to ensure that Collection and Regional Services Branch is gathering quality data and is compliant with the Statistics Canada Quality Guidelines.
Scope
The scope of this audit included an examination of the adequacy and effectiveness of the quality controls in place for the collection of data conducted by Collection and Regional Services Branch. Specific areas examined include the quality assurance controls currently in place within the Computer Assisted Telephone Interview (CATI) and Computer Assisted Personal Interview (CAPI) collection environments to ensure compliance with the Statistics Canada Quality Guidelines.
The audit also examined the effectiveness and adequacy of the governance and capacity planning for Collection and Regional Services. During the course of the audit, the audit team examined Collection Planning and Management Division at head office, as well as the regional offices in Edmonton, Sherbrooke and Sturgeon Falls. The audit covered the period from January 2013 to January 2014. The audit did not include infrastructure elements related to Shared Services Canada, as they will be examined in a separate engagement.
Approach and methodology
The audit consisted of a comprehensive review and analysis of relevant documentation, as well as interviews with key management and staff from CRSB at headquarters and regional offices. The field work included a review, assessment, and testing of the processes and procedures in place to ensure the Agency provides quality statistical information.
This audit included interviews with personnel in all of the regions and in subject matter divisions, as well as interviews with CRSB staff located at headquarters. Document review included survey reports and documentation from each of the regions and headquarters. Testing included a random sample of interviewer files drawn from each of the three sites visited, examination of Quality Control Feedback System (QCFS) reports for specific months over the period under examination, and validation that training was provided to interviewers and that it occurred prior to collection activities. The audit also examined the procedures in place to ensure that SSO employees were informed of, and acknowledged, the code of ethics and conduct for SSO employees. Additionally, because the Edmonton regional office was the only office visited that is responsible for CAPI interviewers, a sample of CAPI interviewer files was examined.
This audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which includes the Institute of Internal Auditors (IIA) International Professional Practices Framework.
Authority
This audit was conducted under the authority of the approved Statistics Canada Integrated Risk-Based Audit and Evaluation Plan 2013/14 to 2017/18.
Findings, Recommendations and Management Responses
Objective 1: Statistics Canada has appropriate governance mechanisms in support of the planning and resource allocation of Collection and Regional services.
Governance
Responsibilities and accountabilities are defined and communicated in the COSA and the SSO handbook.
Clearly defined mandates have been developed for collection oversight committees and effective capacity planning takes place within CPMD and the Regional offices to ensure collection activities take place as planned.
While the risk management framework includes the identification of risks and high level mitigation strategies, it does not include detailed risk mitigation activities nor does it assign specific accountability for each activity or timelines for implementation.
A robust governance framework is essential to ensure that Collection and Regional Services Branch (CRSB) provides quality data collection services to Statistics Canada's statistical programs. Authorities, responsibilities and accountabilities should be clearly defined and understood at all levels to support effective collection and capacity management, and a well-developed approach to risk management should be in place.
Responsibilities and accountabilities for CPMD and RO staff are defined and communicated
Responsibilities and accountabilities of the Collection Planning and Management Division (CPMD) and Regional offices (ROs) have been clearly defined in the Collection and Operations Services Agreement (COSA), which is a formalized agreement that has been developed to define roles, responsibilities and accountabilities of key stakeholders in the survey process.
The COSA describes the project, details the collection and operation activities, outlines the areas responsible for each activity and provides timelines. It also details what the deliverables are, and how they will be measured and monitored, including what management information reports will be used. The COSA includes details of the governance and communications processes that will be implemented to manage collection and operations activities. All new surveys are required to have a COSA completed and signed, but because COSA is new and is being implemented in stages, most ongoing surveys do not yet have an approved COSA in place.
Regional collection staff roles and responsibilities are outlined in the SSO employee handbook, which sets out key tasks, expectations and responsibilities for field and office interviewers and for senior interviewers.
Clearly defined mandates have been developed for collection oversight committees
Effective oversight bodies are important to ensure management's direction, plans and actions are appropriate and responsible. In order for oversight bodies to be effective, they should be provided with timely and accurate information to adequately fulfil their oversight function.
The audit noted that three oversight committees have been established for survey collections. These are:
Collection Planning Committee (CPC);
Social Collection Steering Committee (SCSC); and,
Business and Agriculture Surveys Collection Steering Committee (BACSC).
The mandate of the Collection Planning Committee is primarily to review and recommend strategies for collection, respondent relations and operational efficiencies related to cost, quality and timeliness. The committee is also responsible for reviewing demands and priorities for collection services to recommend capacity adjustments. The CPC meets monthly and is chaired by two directors general. Membership is comprised of directors and assistant directors from within CRSB, as well as key stakeholder divisions from statistical infrastructure, informatics, and subject matter divisions. This committee reports to Executive Management Board at Statistics Canada.
The audit found that the committee is fulfilling its objectives with respect to collection strategies, respondent relations, quality and timeliness, and that it reviews capacity demands. A review of the committee's minutes and interviews with key CPC members revealed that various issues related to the mandate are discussed and action items are assigned as required to ensure follow-up occurs. Additionally, representatives from working groups are invited to CPC meetings to make informational presentations regarding the status of initiatives or research.
The Social Collection Steering Committee and the Business and Agriculture Collection Steering Committee have clearly defined mandates. Both committees provide oversight of household, business and agriculture surveys collection and are required to report back to CPC on business or household survey collection issues, committee initiatives or decisions. SCSC and BACSC membership is at the director level, or their delegates, from subject matter and statistical infrastructure divisions, and representatives from CRSB are also members of these committees. The audit reviewed the committees' minutes for the last fiscal year and confirmed that the SCSC and the BACSC are meeting the obligations noted in their mandates.
Effective capacity planning takes place
The Statistics Canada Quality Guidelines note that capacity planning is an important step in the survey process and that effective capacity planning should be used as a tool to help ensure the quality of collection activities.
The audit found that capacity planning within CPMD is formalized and aligned with data collection planning in the regions. Interviews and document review revealed that business rules are in place and planning assumptions are documented. Current and future needs for resources are assessed on a regular basis and the information provided by headquarters is sufficient for regional offices to undertake their planning activities. Within the regions, labour market conditions are monitored to ensure offices have adequate staffing capacity to carry out collection activities.
CRSB and regional office risk management activities require further development
A well-developed approach to risk management should include identification and assessment of risks, development of mitigation plans to reduce the likelihood of risks, and the on-going monitoring of conditions to ensure risk management strategies are working as intended.
Within Statistics Canada, integrated risk management is performed at the program, division or branch level. Each year, areas identify the risks that may impact Statistics Canada's ability to accomplish objectives, along with the potential likelihood, impact and mitigation strategies.
The audit noted that risk management is developed at the Collection and Regional Services Branch level. Eight key risks facing the branch that may preclude the attainment of objectives were noted. Broadly, these risks can be organized into risks related to respondent relations, human resource risks, competing priorities, interdependencies, and the stewardship of information. High-level mitigation strategies have been identified, but do not outline the specific risk management activities to be performed, nor assign the accountabilities or timelines for implementation.
Recommendations:
The Assistant Chief Statistician, Census, Operations and Communications field should ensure that:
Specific risk mitigation activities are established, accountability is assigned and timelines for implementation are documented.
Management Response:
Management agrees with the recommendations.
The Director General, CRSB will modify the comprehensive Branch level risk profile to provide additional detail including specific risk mitigation activities, with appropriate designated accountabilities and specific timelines.
Deliverables and Timeline: Revised CRS Branch Risk Profile will be completed by March 2015.
Objective 2: Appropriate quality assurance mechanisms have been established and are consistently applied to ensure that Collection and Regional Services Branch is gathering quality data and is compliant with the Statistics Canada Quality Guidelines.
Survey Collection Management
Survey collection managers in the regions and in CPMD apply survey reports inconsistently, which may result in undetected data quality issues. Currently, the primary focus of survey management reports is to ensure response rate targets are met; specific quality reports or indicators are not being leveraged to assess the quality of collections.
CPMD and Regional offices have established an effective process for feedback and the escalation of issues. Communication occurs daily and the escalation of issues occurs at the appropriate level.
The Statistics Canada Quality Guidelines note that data collection and capture operations have an impact on the accuracy of data. Given this, there should be a balance between the quality and performance measurement tools used to manage data collections to help ensure the accuracy of the information. The Guidelines outline several reports that can be used as measures of quality and further note that these measures help support decisions regarding the need to amend collection processes or redesign collection tools.
There are inconsistent approaches to survey collection management by Regional Offices and CPMD
Survey collection management includes the monitoring and controlling of collection activities. It should ensure that survey production plans are implemented and that necessary corrections or adjustments are made and communicated to subject matter and other relevant stakeholders. Regional office staff manage regional data collection activities, and CPMD project officers manage collections as well as monitor and report on overall survey progress.
Reports related to overall collection activities are prepared in the regions by data collection managers (DCMs), district managers and regional program managers (RPMs). Within CPMD, project officers are responsible for the preparation of reports to support survey management. The audit also found that, for some of the surveys examined, subject matter divisions also prepare and use collection reports. These reports analyze response rates, outcome codes, and other paradata to support survey collection management.
DCMs and RPMs within the regional offices prepare and review a number of system-generated reports designed to monitor survey performance and interviewer productivity. These reports include outcome codes and daily progress reports, time per unit, and cost versus claims reports. The audit found inconsistencies in the use and frequency of review of these reports: some DCMs reviewed some or all of these reports, while others reviewed none of them. The variation in survey management approaches within the regions increases the risk of discrepancies in the reports created by regions for CPMD and other partners, which may affect CPMD's ability to make effective decisions.
In addition, the reports used by CPMD for collection management purposes vary depending on the project officer. The audit found that some CPMD project officers rely solely on the daily response rate reports to monitor their surveys. Others indicated that they reviewed an array of reports, such as reports on results of conversion and tracing efforts. No procedural documentation exists outlining what reporting should be reviewed by CPMD project officers for effective collections management.
Interviews with subject-matter representatives revealed that the differences in collection management approaches affect the usefulness of the information they receive regarding the status and quality of a survey's collection activities, with the result that quality issues may go undetected.
Quality assurance tools are insufficiently leveraged in the management of collection activities
The Statistics Canada Quality Guidelines outline several reports that may be used as survey collection management tools by both CPMD and Regional Offices. These reports include processing error rates, follow-up rates, rates of non-response by reason, response rates, and capture/coding error rates.
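To illustrate how such indicators can be derived, the following is a minimal Python sketch; the record layout, field names and values are hypothetical and do not come from any Statistics Canada system.

```python
from collections import Counter

# Hypothetical outcome records, one per sampled unit. Field names are
# illustrative only and do not reflect any Statistics Canada system.
records = [
    {"outcome": "complete", "capture_errors": 0},
    {"outcome": "refusal", "capture_errors": 0},
    {"outcome": "complete", "capture_errors": 2},
    {"outcome": "no_contact", "capture_errors": 0},
]

outcomes = Counter(r["outcome"] for r in records)
total = len(records)

# Response rate: completed interviews over all sampled units.
response_rate = outcomes["complete"] / total

# Rates of non-response by reason, one of the indicators named in the
# Quality Guidelines.
nonresponse_by_reason = {
    reason: count / total
    for reason, count in outcomes.items()
    if reason != "complete"
}

# Capture/coding error rate: errors per completed interview.
capture_error_rate = sum(
    r["capture_errors"] for r in records if r["outcome"] == "complete"
) / outcomes["complete"]

print(f"Response rate: {response_rate:.0%}")
print(f"Non-response by reason: {nonresponse_by_reason}")
print(f"Capture errors per completed interview: {capture_error_rate:.2f}")
```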
Management noted that, for many surveys, response rates are declining. The audit found that, as a result of this decline, significant effort is placed on ensuring response rates for surveys, as these rates are indicators of both performance and data quality. At the same time, adequate response rates alone do not guarantee data quality. The Statistics Canada Quality Guidelines note that an effective collection management framework should also include tools specifically designed to monitor quality.
The audit team examined the reports prepared in CPMD and the regions. Within each of the three regions examined, the primary focus of the survey reporting that is prepared and reviewed is to ensure that response rate targets are being met and that interviewers are meeting productivity targets. Specific reports being used include outcome codes, time per unit and cost versus claim reports. While these reports form part of an overall suite of quality assurance indicators, they are used as performance management tools and are not sufficiently leveraged in the management of data quality.
Within CPMD, the audit found no indication that specific reporting on the quality of the collected data is being created or quality is being monitored. Reports designed to examine overall collections quality would help with the early identification and mitigation of issues affecting the quality of the data being collected.
The Quality Control Feedback System (QCFS) is used to monitor individual interviewers, but it also has reporting capabilities that enable analysis of the quality of data collections: users can create and analyze reports by error type, errors by survey, trends in interviewer performance, and so on. The audit noted that this type of analysis, which would help interviewers understand survey requirements and serve as a tool to monitor interviewer collection performance, is not done within CPMD or in the regions.
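As an illustration of the kind of analysis these reporting capabilities would support, the following is a minimal Python sketch; the monitoring records, interviewer identifiers and error types are hypothetical and not drawn from the QCFS itself.

```python
from collections import defaultdict

# Hypothetical QCFS-style monitoring records; interviewer IDs, months
# and error types are illustrative only.
# Each row: (survey, interviewer, month, error_type)
monitoring_errors = [
    ("SHS", "int_014", "2013-09", "question_wording"),
    ("SHS", "int_014", "2013-10", "question_wording"),
    ("CCHS", "int_027", "2013-09", "coding"),
]

errors_by_type = defaultdict(int)
errors_by_survey = defaultdict(int)
interviewer_trend = defaultdict(lambda: defaultdict(int))

for survey, interviewer, month, error_type in monitoring_errors:
    errors_by_type[error_type] += 1
    errors_by_survey[survey] += 1
    interviewer_trend[interviewer][month] += 1

# A recurring error type across interviewers may point to a training
# gap; a concentration of errors in one survey may point to a
# questionnaire or application issue.
print(dict(errors_by_type))
print(dict(errors_by_survey))
print({name: dict(months) for name, months in interviewer_trend.items()})
```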
The audit noted that, in order to address survey management and quality reporting weaknesses, one subject matter division has been requesting additional reports from CPMD in order to conduct its own paradata analysis in an attempt to identify quality issues. Through this analysis, it has been able to provide feedback to CPMD on areas requiring investigation. This exercise has proven useful and, in one instance, led to the identification of serious interviewing gaps.
An effective process for feedback and the escalation of issues has been established
Communication of issues between the regional offices and CPMD takes place daily. Specifically, the audit found evidence that survey-specific issues with potential impacts on collection activities are reported and escalated within the regions and to CPMD for action. These issues include capacity and budgets, labour and environmental issues, technical issues, and survey-specific information requests.
The audit found that CPMD and Regional offices have established an effective process for feedback and the escalation of issues. Communication occurs daily and issues escalation occurs at the appropriate level.
Recommendations:
The Assistant Chief Statistician, Census, Operations and Communications field should ensure that:
A collection management approach, including the identification of reports to be produced, is established and communicated, to ensure consistency in the collection services provided to partners;
Quality indicators, such as those outlined in the Statistics Canada Quality Guidelines, are included within the mandatory reporting requirements, and information from QCFS is regularly analyzed in order to identify training weaknesses and other survey issues.
Management Response:
Management agrees with the recommendations.
The Director, Collection Planning and Research Division will develop a set of quality indicators and reports that can be used to identify issues during or after collection activities and to maintain or improve the quality of data collection. The development of these indicators will be done in collaboration with the Social Survey Collection Steering Committee and the Business and Agriculture Collection Steering Committee. Should the Branch be unable to absorb the costs associated with developing, implementing and maintaining the expanded quality indicators, a business proposal will be brought forward through the Department's long term planning process.
Deliverables and Timeline: Development of the Collection Data Framework will be completed by December 2014.
Training and Monitoring in Support of Quality
Basic interviewer and survey training is effective and the tools in place help ensure interviewers understand survey procedures and concepts. However, training for the Business Register does not effectively prepare interviewers to make changes to the frame with confidence.
The use of QCFS is not optimized to ensure monitoring targets are met at the regional and interviewer level and the results of monitoring activities are not being analyzed to ensure the quality of collections.
The audit revealed that Computer Assisted Personal Interviewing (CAPI) monitoring requires improvement to ensure that interviewers receive timely feedback, interviewers and respondents are made aware of the importance of quality assurance, and observational monitoring takes place in all regions—as outlined in the SSO handbook—to meet monitoring expectations and enhance the quality of CAPI collection activities.
Training and Tools
The Statistics Canada Quality Guidelines note that interviewer manuals and training must be carefully prepared and planned, since they provide the best way to guarantee data quality (e.g., high response rates and accurate responses), ensure the comprehension of survey concepts and subject matter, and ensure proper answers to questions from respondents.
Effective basic and survey-specific training and tools for interviewers are in place; however, Business Register training should be enhanced to ensure interviewers understand requirements
The audit examined basic training given to all interviewers upon hire. This training effectively outlines Statistics Canada's mandate and objectives, the roles and responsibilities of interviewers and senior interviewers, and the importance of data collection. It also describes collection activities and quality control.
Survey-specific training is to be given to interviewers assigned to that survey. To ensure interviewers understand survey methods and concepts and the survey application, and have the opportunity to do mock interviews, training should take place before they begin work on the survey. The audit examined two business surveys and three social surveys and found that training materials outline specific concepts, procedures and applications. In accordance with the Statistics Canada Quality Guidelines, survey training combined several approaches, including home study, classroom training and mock interviews. Audit testing confirmed that interviewers received basic and survey-specific training and that it took place prior to interviewers beginning any collection activities.
The Business Register is the survey frame for business and agricultural surveys. Prior to 2011, any changes to the frame were made by staff in the Business Register Division and were subject to validation with subject matter areas. Beginning in 2012, the responsibility for making live updates to the register was transferred to interviewers in the regions. Interviewers in all regions visited noted that the Business Register training currently provided does not effectively prepare them to make changes to the register with confidence. Interviews with subject-matter areas substantiated this, noting that interviewers in the regions have made inaccurate changes to the frame. The audit examined the Business Register training materials and noted that they assume a level of knowledge of industrial organization and classification that staff in the regions may not have. Without effective training for interviewers, there is a risk that invalid changes made to the Business Register may go undetected and affect the quality of business and agricultural surveys.
Monitoring
The Statistics Canada Quality Guidelines note that the 'interviewing skills of interviewers should be monitored to ensure that they conform to a pre-established list of standards,' and that, '...monitoring should also be used to identify strengths and weaknesses in the interviewer's skill set, to provide feedback to the interviewers and to focus training on weaker areas.'
The QCFS is designed to capture the results of quality control monitoring operations for both Computer Assisted Telephone Interviewing (CATI) and Computer Assisted Personal Interviewing (CAPI), and to provide quality estimates and feedback reports to interviewers and managers. It also generates interviewer sampling plans required for the efficient administration of quality control procedures and monitoring. Each CATI and CAPI interviewer is assigned to a monitoring plan based on the interviewer's past performance. Interviewer plans and the required number of monthly monitoring sessions for CATI and CAPI interviewers in each plan are classified in QCFS as follows:
Plan A: Experienced/excellent. Monitoring sessions required: CATI-1, CAPI-1
Plan B: Very good. Monitoring sessions required: CATI-2, CAPI-2
Plan C: Acceptable/new interviewers. Monitoring sessions required: CATI-4, CAPI-3
Plan R: Unacceptable—re-training. Monitoring sessions required: CATI-6, CAPI-4.
During a given monitoring session, the quality of the interviewer's work is evaluated against a pre-defined set of criteria. If an interviewer makes an error or shows poor interviewing practices for any question, the information is recorded in QCFS.
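For illustration, a minimal Python sketch of a compliance check against these plan targets follows; the interviewer identifiers and completed-session counts are hypothetical, and only the plan targets themselves are taken from the list above.

```python
# Required monthly monitoring sessions per QCFS plan, as listed above.
REQUIRED_SESSIONS = {
    "A": {"CATI": 1, "CAPI": 1},
    "B": {"CATI": 2, "CAPI": 2},
    "C": {"CATI": 4, "CAPI": 3},
    "R": {"CATI": 6, "CAPI": 4},
}


def meets_plan(plan: str, mode: str, sessions_done: int) -> bool:
    """Return True if the completed sessions satisfy the plan target."""
    return sessions_done >= REQUIRED_SESSIONS[plan][mode]


# Hypothetical interviewer files: (assigned plan, mode, sessions this month).
interviewers = {
    "int_014": ("C", "CATI", 1),
    "int_027": ("A", "CAPI", 1),
}

for name, (plan, mode, done) in interviewers.items():
    target = REQUIRED_SESSIONS[plan][mode]
    status = "meets" if meets_plan(plan, mode, done) else "below"
    print(f"{name}: plan {plan} ({mode}) {status} target ({done}/{target})")
```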
CATI interviewers work out of regional offices across the country and, using a computer survey application, telephone respondents to conduct interviews. Monitoring is carried out by senior interviewers, who use a telephone listening device to listen to and assess the live interaction between the interviewer and the respondent while observing a simultaneous duplicate image of the interviewer's computer screen. Respondents are advised that the call may be listened to by a supervisor.
CAPI interviewers work in the field and go to respondents' residences to complete interviews. Survey responses are captured by the interviewer in a computer survey application. Monitoring is carried out by reviewing digitally recorded segments of interviews. Each recorded question is saved in a separate audio file and transmitted daily to the regional offices, where monitoring activities can take place. Respondents are asked by the interviewer if they consent to being recorded, allowing them the option to decline. If they do not give consent, the recording is turned off.
The audit team observed eight monitoring sessions and confirmed that monitoring activities are performed according to the QCFS process. The audit noted that monitors actively listen to interviewers to ensure data are coded as required, interviewers ask questions as per the questionnaire and that the interviewers' conduct is professional. Errors and comments from each monitoring session were subsequently entered into the system, and for CATI monitoring sessions feedback was provided immediately after each monitoring session.
Monitoring of CATI interviewers should be enhanced to optimize quality
The audit assessed the monitoring activities within each of the sites visited to confirm whether monitoring occurred, whether it was relevant and whether feedback was given to interviewers. Of the 53 CATI interviewer files tested across the regions, only three interviewers were monitored as frequently as required by their monitoring plans. Interviews with regional management noted that senior interviewers have day-to-day responsibilities in addition to monitoring activities and that monitoring targets are still considered onerous, despite recent reductions. Regional management also stated that the plans of interviewers requiring more monitoring due to identified quality issues are not reviewed to ensure the plans are implemented as required.
The audit noted that feedback from monitoring sessions is provided to CATI interviewers in a timely manner. However, regions do not use QCFS to identify performance trends for interviewers as QCFS is considered to be a coaching tool only and not a performance management tool. Management in the regions noted that the focus of QCFS monitoring is to meet monitoring targets set by head office and explained that, because feedback is given to interviewers after each monitoring session, quality is assured. Without analysis of monitoring results, the need for specific training or specific survey issues may not be identified or addressed.
CAPI monitoring is inconsistent across regions
There are several methods of monitoring CAPI interviews to ensure the quality of interviewer collections, including monitoring using the QCFS, validation monitoring and observation monitoring. Of the 15 CAPI surveys currently in the field, the QCFS is used to monitor CAPI interviewers for only two surveys: the Canadian Community Health Survey (CCHS) and the Survey of Household Spending (SHS). As a result, only CAPI interviewers working on these two surveys can be monitored using the QCFS.
The audit examined the QCFS monitoring consent-rate report for the SHS produced in CPMD, which detailed, for each CAPI interviewer, the rate of consent for recording of the interview. It showed that, on average, 29% of CAPI respondents refuse to give consent; however, the audit noted significant variation across interviewers. For example, 24% of CAPI interviewers had refusal rates of more than 50%. As a result, there is often little sample available to monitor. The DCM responsible for CAPI interviewers in the Edmonton region noted that these reports are provided for informational purposes only and that no action is taken to determine why consent rates vary. During interviews, it was noted that there have been several instances of interviewers coaching respondents to decline having the interview recorded.
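To illustrate how a consent-rate report could be turned into actionable flags, the following is a minimal Python sketch; the interviewer identifiers and counts are hypothetical, and the 50% threshold mirrors the variation noted above.

```python
# Hypothetical per-interviewer recording-consent counts; interviewer IDs
# and counts are illustrative only.
consent_counts = {
    "int_014": {"consented": 18, "refused": 4},
    "int_027": {"consented": 6, "refused": 9},
}

# The audit highlighted interviewers with refusal rates above 50%.
REFUSAL_FLAG_THRESHOLD = 0.50

for interviewer, counts in consent_counts.items():
    total = counts["consented"] + counts["refused"]
    refusal_rate = counts["refused"] / total
    flag = "  <- investigate" if refusal_rate > REFUSAL_FLAG_THRESHOLD else ""
    print(f"{interviewer}: refusal rate {refusal_rate:.0%}{flag}")

# An aggregate refusal rate, comparable to the 29% average observed in
# the SHS consent-rate report.
overall_refused = sum(c["refused"] for c in consent_counts.values())
overall_total = sum(c["consented"] + c["refused"] for c in consent_counts.values())
print(f"Overall refusal rate: {overall_refused / overall_total:.0%}")
```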
CAPI interviewers are not monitored as frequently as required by their plans. The reason given was that little sample is available to monitor, because respondents have the option to decline being recorded. Additionally, CAPI monitors stated that providing timely feedback to CAPI interviewers is difficult: because CAPI interviewers work in the field, the monitor must call and leave a message requesting that the interviewer call back. Many CAPI interviewers are reluctant to call back and avoid receiving feedback because it is perceived negatively. Without effective feedback mechanisms for CAPI interviewers, risks to the quality of collections resulting from ineffective interviewing skills are unlikely to be adequately addressed.
Validation monitoring consists of telephoning respondents to verify that a Statistics Canada employee has been to their residence and has done an interview. Key data are confirmed and the respondent is asked about the overall professional impression made by the interviewer. The audit tested and confirmed that validation monitoring takes place at the Edmonton regional office, the only one of the three sites visited that is responsible for CAPI interviewers.
Observation monitoring occurs when a senior interviewer accompanies a CAPI interviewer in the field and observes collection activities. The CAPI Statistical Survey Operations (SSO) employee handbook notes that observational monitoring must be conducted at least once every two years. Within the Eastern and Western regions, the audit confirmed that observational monitoring is taking place. In the Central Region, staff noted that observational monitoring had been discontinued and has not taken place since 2010.
Recommendations:
The Assistant Chief Statistician, Census, Operations and Communications field and the Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure should ensure that:
Business Register Training is adapted for SSO staff to ensure quality frame maintenance.
Management Response:
Management agrees with this recommendation.
The Director of the Collection Planning and Research Division, in collaboration with the Director of the Business Register Division, will ensure the Business Register training for interviewers is adjusted to meet the users' needs.
Deliverables and Timeline:
New streamlined course based on needs identified through consultation with CRSB. This was completed in December 2013.
First iteration of new course delivered nationally. This was completed in March 2014.
New training materials made available to SSO/CRSB. This was completed in May 2014.
To ensure all interviewers and senior interviewers are properly trained, CRSB and the regions use an Excel listing of staff trained, or to be trained, on the BR. This will be an ongoing activity.
Recommendations:
The Assistant Chief Statistician, Census, Operations and Communications field should ensure that:
The CATI and CAPI interviewer monitoring plan targets are met and that the results of monitoring are examined at the interviewer and survey level to ensure the quality of operations.
For CAPI operations, feedback from QCFS monitoring is provided to interviewers in a timely manner, interviewers and respondents are made aware of the importance of monitoring as a quality assurance tool in order to minimize refusals to be monitored, and the observation monitoring requirements are clarified.
Management Response:
Management agrees with this recommendation.
For CATI, the Assistant Directors/Operations will work with Collection Planning and Research Division to update the monitoring targets, and develop a plan to ensure those targets are met and tracked. Feedback mechanisms to interviewers will be examined as a part of the plan.
Deliverables and Timeline: Updated CATI monitoring targets and a plan for assessing the program will be developed by March 2015.
For CAPI, Regional Directors will work with Collection Planning and Research Division to develop and cost out a revised monitoring program, similar to the CATI program, and develop an implementation plan. If necessary, an LTP proposal may be developed to fund CAPI monitoring. This plan will set time requirements for feedback and clarify monitoring requirements for CAPI interviewers.
Deliverables and Timeline: CAPI monitoring program proposal and implementation plan will be developed by March 2015.
Appendices
Appendix A: Audit criteria
Control Objective / Core Controls / Criteria
Sub-Criteria
Policy Instrument
1) Statistics Canada has appropriate governance mechanisms in support of the planning and resource allocation of Collection and Regional services.
1.1 Oversight bodies for Collection and Regional Services Branch have been established and their roles and responsibilities have been formalized. (G-3, G-4 & G-6)
Clearly defined mandates have been established and communicated for the oversight bodies responsible for Collections and Regional Services.
Oversight bodies request and receive sufficient, complete, timely and accurate information for decision-making purposes.
The oversight bodies have the appropriate authority level for effective decision making.
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
1.2 Roles, responsibilities and accountabilities for Collection and Regional Services Branch staff are clear, communicated and understood. (AC-1).
Responsibilities and accountabilities for CPMD staff are formally defined and clearly communicated.
Responsibilities and accountabilities for regional collection staff are formally defined and clearly communicated.
Roles and responsibilities regarding collection activities are exercised as intended.
Statistics Act
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
Directive on Informing Survey Respondents
Directive on the Security of Sensitive Statistical Information
SSO Employee Handbook – Office
SSO Employee Handbook – Field
CATI Quality Control Monitoring – Monitor's Manual
CAPI Quality Control Monitoring – Monitor's Manual
COSA
1.3 Effective capacity resource planning takes place at the branch-level and is aligned with survey collection planning. (PPL-1, PPL-2, PPL-4)
Capacity planning is aligned with data collection planning.
Capacity planning is documented and communicated and includes analysis of current and future resource requirements.
Changing labour market conditions are monitored and addressed on a regular basis by Regional Office Management.
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
1.4 An effective risk management framework exists to identify the key risks, including emerging risks, facing the collection activities, and adequate risk management strategies have been developed and communicated. (RM-2,3,6)
Management identifies and periodically assesses the risks that may preclude the achievement of collection objectives.
Risk management strategies have been developed and communicated to the key stakeholders of Regional Collection Services.
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
Directive on Informing Survey Respondents
Directive on the Security of Sensitive Statistical Information
2) Appropriate quality assurance mechanisms have been established and are consistently applied to ensure that Collections and Regional Services Branch is gathering quality data and is compliant with the Statistics Canada Quality Guidelines.
2.1 Collections and Regional Services Branch provides employees with the necessary training to support the discharge of their responsibilities. (PPL-4)
2.1.1 CPMD staff are provided with adequate training to ensure the quality of data collection of the surveys for which they are responsible.
2.1.2 CPMD provides adequate and timely training to all interviewers assigned to a survey.
2.1.3 Regional Operations staff are provided with adequate training regarding fundamental data collection (i.e., training required for all surveys).
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
Directive on Informing Survey Respondents
2.2 Collections and Regional Services Branch provides employees with the necessary tools to support the discharge of their responsibilities. (PPL-4)
2.2.1 CPMD staff are provided with adequate tools (SOPs) to ensure the quality of data collection of the surveys for which they are responsible.
2.2.2 CPMD provides adequate and timely tools (FAQs, definitions, interview guides, etc.) to all interviewers assigned to a survey.
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
Directive on Informing Survey Respondents
Statistics Canada – Quality Guidelines
2.3 There is effective supervision over the collection of data to ensure the integrity of the collection process. (PP-3, PPL-8, PSV-5)
2.3.1 Monitoring of CATI and CAPI interviews is being conducted frequently and issues are addressed in a timely manner within Regional Offices.
2.3.2 The monitoring includes an assessment of interviewers' compliance with the Directive on Informing Survey Respondents, as well as with Statistics Canada principles of integrity and ethical values.
2.3.3 Regional Office Management performs timely periodic reviews of the monitoring performed over collections and ensures that issues have been communicated and addressed per established processes.
2.3.4 Feedback and information provided to CPMD by stakeholders regarding improvements in the collection of data have been formally communicated and addressed in a timely manner.
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
Directive on Informing Survey Respondents
Statistics Canada – Quality Guidelines
Directive on the Transmission of Protected Information
2.4 An effective monitoring and reporting framework exists to ensure collection objectives are met and issues are addressed in a timely manner. (PP-3, ST-20, RP-3)
2.4.1 Effective reporting exists and includes relevant quality assurance indicators, such as those described in the Statistics Canada Quality Guidelines (e.g., rejection rates, coding error rates, non-response rates, average interview length reports).
2.4.2 Issues regarding the achievement of collection results are identified, escalated to appropriate authority-levels and addressed in a timely manner.
2.4.3 The results of data collection are reported to subject matter divisions in a timely and consistent manner.
TBS Audit Criteria related to the Management Accountability Framework: A Tool for Internal Auditors
Statistics Canada – Quality Guidelines
Directive on the Transmission of Protected Information
CATI Quality Control Monitoring – Monitor's Manual
CAPI Quality Control Monitoring – Monitor's Manual
Appendix B: Acronyms
Acronym – Description
BACSC – Business and Agriculture Collection Steering Committee
CAPI – Computer Assisted Personal Interview
CATI – Computer Assisted Telephone Interview
CCHS – Canadian Community Health Survey
COSA – Collection and Operations Services Agreement
CPC – Collection Planning Committee
CPMD – Collection Planning and Management Division
CRSB – Collection and Regional Services Branch
CS – Chief Statistician
DAC – Departmental Audit Committee
DCM – Data Collection Manager
QCFS – Quality Control Feedback System
RO – Regional Office
RPM – Regional Program Manager
SCSC – Social Collection Steering Committee
SHS – Survey of Household Spending
SSO – Statistical Survey Operations
The landscape for Information Technology (IT) has changed for Statistics Canada since 2011. The introduction of Shared Services Canada (SSC) transferred ownership of IT infrastructure and telecommunications from 43 departments and agencies, including Statistics Canada, to SSC. This change has resulted in an increased dependence on a third party for the provision of these services and has required Statistics Canada, specifically IT Branch, to work with a newly established service provider to maintain stable IT services while managing the ambitious change agenda of transformation set out by TBS and SSC. While Statistics Canada recognizes and supports the Government of Canada Modernization Agenda, there have been challenges, as SSC's mandate, priorities and timelines are not necessarily aligned with those of Statistics Canada.
With the transfer of the control over these services to SSC, there are increased risks affecting Statistics Canada's ability to meet its operational requirements, including the 2016 Census. Additionally, as an organization which collects and maintains sensitive information about individuals and businesses, the introduction of SSC has increased Statistics Canada's inherent risk in relation to the security of sensitive statistical information. With these risks in mind, the Internal Audit Division has conducted a review to identify and assess the current governance structure in place to manage and oversee the relationship between SSC and Statistics Canada.
The objectives of this engagement were to proactively examine the governance framework, risk management program and control activities in place relative to the management of the relationship between Statistics Canada and SSC, as the outsourced service provider of IT infrastructure services, and to provide recommendations for management's consideration in order to improve the current management control framework.
This review was conducted by Internal Audit Division in accordance with the Government of Canada's Policy on Internal Audit.
Key Findings
Given the inherent risks created with the introduction of SSC, a governance framework has been established by Statistics Canada to oversee the relationship with SSC, including escalation mechanisms. While this governance framework is in place and operating, it is not formally documented to ensure roles and responsibilities are understood between the two departments. Within Statistics Canada, internal governance mechanisms have been enhanced to ensure an efficient and effective approach to identifying and escalating issues.
A risk management framework has been established and is being maintained to document and proactively mitigate the risks, to the extent possible, associated with SSC's responsibilities in managing key elements of Statistics Canada's IT infrastructure and telecommunications systems.
Although governance and risk management frameworks are in place to oversee the relationship with SSC, as this relationship matures, internal processes should be reviewed and updated to include consideration of the impact(s) of the relationship with SSC on key processes.
Overall Conclusion
Statistics Canada has taken a proactive approach in working with SSC. The governance structure, although undocumented, ensures that Statistics Canada is in the best position to have its voice heard and to mitigate the risk associated with its loss of control over its IT infrastructure and telecommunications. Representatives within SSC highlighted that the engagement by Statistics Canada and their proactive approach to dialogue on issues is considered a leading practice and has been used as a case study with other partner organizations.
Conformance with Professional Standards
The review was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which includes the Institute of Internal Auditors (IIA) International Standards for the Professional Practice of Internal Auditing.
Sufficient and appropriate procedures have been conducted and evidence gathered to support the accuracy of the findings and conclusions in this report. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established review criteria. The findings and conclusions are applicable to the entity examined and for the scope and time period covered by the review.
Patrice Prud'homme
Chief Audit Executive
Introduction
Background
Shared Services Canada (SSC) was created in 2011 with the mandate to fundamentally transform how the Government manages its Information Technology (IT) infrastructure. Per its mandate, SSC currently delivers email, data centre and telecommunication services to 43 federal departments and agencies, including Statistics Canada. The creation of SSC brought together people, technology resources and assets from the 43 federal departments. As of August 2011, 157 IT employees transferred from Statistics Canada to SSC in order to provide the above noted services.
SSC's Report on Plans and Priorities for 2013-14 outlines that they will continue to renew the Government of Canada's IT infrastructure focusing on the procurement of a single email solution, enhancing IT security across the Government of Canada, and finalizing its consolidation strategies for data centres and networks.
Specifically affecting Statistics Canada, SSC has initially focused on the centralization of email services, data centres and networks. With the transfer of the control over these services to SSC, there are increased risks affecting Statistics Canada's ability to meet its operational and service delivery requirements, including the 2016 Census. As SSC establishes itself to provide these services to 43 departments, Statistics Canada has faced challenges in getting attention, responsiveness and priority from SSC. Additionally, as an organization which collects and maintains sensitive, confidential information about individuals and businesses, the introduction of SSC has increased Statistics Canada's inherent risk in relation to the security of sensitive statistical information.
With these risks in mind, and now that two full years have passed since the inception of SSC, Statistics Canada's Internal Audit Division conducted a review engagement to identify and assess the current governance structures in place to manage and oversee the relationship between SSC and Statistics Canada. The review assessed the governance framework, risk management mechanisms and control activities in place within Statistics Canada, with the intent of recommending opportunities for improvement as Statistics Canada works toward achieving its strategic objectives in a new reality of outsourced IT infrastructure support.
Review Objectives
The objectives of this engagement were to proactively examine the governance framework, risk management program and control activities in place relative to the management of the relationship between Statistics Canada and SSC, as the outsourced service provider of IT infrastructure services, and to provide recommendations for management's consideration to improve the current management control framework.
Scope
The scope of the engagement included a review of:
The governance framework in place to manage the relationship with SSC;
The sufficiency and adequacy of the risk management program developed to mitigate the risks associated with the introduction of SSC; and
The appropriateness of the control activities established to ensure that Statistics Canada's needs are being met given the changes in control over Statistics Canada's IT infrastructure.
The governance, risk management and control activities relative to Statistics Canada's relationship with SSC were assessed based on evidence provided during the period from January to April 2014.
Approach and methodology
The review engagement included gaining an understanding of the key risks associated with the transfer of services to SSC and the existing governance control frameworks, risk management approaches and control activities that were designed and implemented to mitigate the identified risks associated with SSC. This was achieved through the conduct of a comprehensive review and analysis of relevant documentation, including relevant guidelines, risk management and performance reporting, organization charts, etc., and the conduct of interviews with key management and staff from IT, Census program, other stakeholders within Statistics Canada and, as required, key contacts at SSC.
This review was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which includes the Institute of Internal Auditors (IIA) International Standards for the Professional Practice of Internal Auditing.
Authority
This engagement was conducted under the authority of Statistics Canada's integrated Risk-Based Audit and Evaluation Plan 2013/14 to 2017/18, approved by the Departmental Audit Committee.
Findings, Recommendations and Management Responses
Objectives: To proactively examine the governance framework, risk management program and control activities in place relative to the management of the relationship between Statistics Canada and SSC, as the outsourced service provider of IT infrastructure services and to provide recommendations in order to improve the current management control framework.
Governance Framework to Oversee the Relationship with SSC
A governance framework has been established by Statistics Canada to oversee the relationship with SSC. While this governance framework is in place and operating, it is not formally documented to ensure roles and responsibilities are understood between the two departments.
Statistics Canada's internal governance mechanisms have been enhanced to ensure an efficient and effective approach to identifying and escalating issues.
In situations where the service provider is an external entity, a robust governance framework is essential to the management of the relationship. This governance structure should be documented and have in place adequate mechanisms to communicate the organization's plans and priorities to the service provider and have an effective method for the escalation of issues to ensure operational objectives are met.
Effective oversight bodies have been established; however, the governance framework between Statistics Canada and Shared Services Canada has not been formally documented
Management at Statistics Canada has recognized the need to manage the relationship with Shared Services Canada (SSC) in order to work effectively with the new organization. As a result, a formal governance framework has been established in which Statistics Canada meets regularly with SSC counterparts to discuss departmental priorities and requirements and to proactively address concerns or issues.
Governance bodies with membership from both Statistics Canada and Shared Services Canada have been created at various levels. At the senior management level, Assistant Deputy Ministers from both departments meet regularly, and the Deputy Ministers meet on an as-needed basis, to discuss the risk environment and significant elements of the relationship, including critical projects and administrative issues. The Director General (DG) of the Industry Portfolio at SSC and the Chief Information Officer (CIO) at Statistics Canada meet monthly to discuss ongoing projects and issues. At the director level, weekly meetings take place between the Director of Information Technology at Statistics Canada, the Director of Network and Security at SSC, and the Director of the Client Relationship unit (SSC) to discuss day-to-day operational issues.
At the working level, Statistics Canada created a liaison position to coordinate and manage interactions with SSC. An Assistant Director of Information Technology Operations Division was named as the Liaison Officer for Statistics Canada in April 2013. In this role, the Liaison Officer takes part in the operational-level meetings between Statistics Canada and SSC, works directly with the SSC Relationship Manager to prioritize needs and resolve issues as they arise, and acts as a single point of contact for the communication of needs and issues to SSC.
Additionally, for large-scale projects such as the Census, project-level governance structures have been established. Integrated project teams comprising both Statistics Canada and SSC employees report to a joint management/steering committee on project-related progress and issues. Ongoing concerns are also communicated to the Statistics Canada Liaison Officer to be addressed within the larger Statistics Canada/Shared Services Canada governance framework.
Statistics Canada and Shared Services Canada Governance Structure for Relationship Management
Deputy Ministers level – annual and ad hoc meetings – minutes taken
Assistant Deputy Minister / Assistant Chief Statistician, Corporate Services – every six weeks – minutes taken
Director General / Chief Information Officer (STC) – monthly – register of action items
Director level – weekly – register of action items
Operations (STC Liaison Officer, SSC operations and SSC Client Relationship Manager) – multiple times per week – discussion of emerging operational issues
Large-scale projects (STC Liaison Officer and integrated project teams) – regularly, as required by the project – reports to Steering Committee
Relative to telecommunications, a separate governance process has been established. The Director of Corporate Support Services (CSSD) and the Chief of Telecommunications at Statistics Canada discuss day-to-day operational issues with the Director of Telephony at SSC. More specifically, the Director of CSSD liaises bi-weekly with the SSC telephony team to address operational issues that arise. Should any issues require escalation, the Director of CSSD will address them at the operational meeting between SSC and Statistics Canada. If required, any further escalation would follow the same governance structure set up for IT infrastructure.
The review noted that there had been some challenges with the telecommunications governance mechanism; however, with changes in SSC representatives, this mechanism seems to be working better, and Statistics Canada representatives believe that the responses they have been receiving take into consideration Statistics Canada's operating environment. To increase the visibility of telecommunications needs and issues, the governance structure in place for IT infrastructure has been expanded to include telephony issues and, since January 2014, the Statistics Canada Director of Corporate Support Services has been invited to the weekly operational meetings to highlight or escalate any telephony issues.
Although governance frameworks are in place for IT and telephony between Statistics Canada and Shared Services Canada, no formal documentation of the governance framework exists, and roles and responsibilities, escalation protocols and decision-making authorities have not been formally established between the two departments. As a result, instances have occurred in which modifications to the IT environment were not appropriately authorized and required subsequent intervention and resolution. In the absence of formal documentation, there is an increased risk that decisions relative to SSC activities could be made that have not been appropriately authorized and are not aligned with the priorities of the organization.
Statistics Canada's internal governance structure is used to communicate and address concerns with SSC
In order to manage the relationship with SSC and to meet the needs and expectations of SSC, Statistics Canada has enhanced its internal committee structure and escalation process for addressing issues with SSC. The review noted that, at the senior management level, the Assistant Chief Statistician (ACS) of Corporate Services may escalate issues related to SSC to the Executive Management Board at Statistics Canada to ensure they are addressed in a timely manner.
Other internal Statistics Canada committees that help address issues and aid in the management of the relationship with SSC include:
Field Information Technology Managers (FITM) – Issues related to IT and SSC are identified and discussed at FITM meetings. These meetings are attended by the Statistics Canada Liaison Officer, who includes items on the issues log for discussion with SSC at the regular meetings. These meetings help ensure that issues identified are addressed and solutions documented.
Informatics Committee – This committee considers impacts of SSC changes on Statistics Canada's operations. If an infrastructure incident has occurred, an incident report is generated with recommendations; this report is escalated to the Informatics Committee or Security Coordination Committee for oversight.
Corporate Business Architecture (CBA) Committee – Most DGs within Statistics Canada are represented on this committee, which examines key transformational projects. If there are issues with SSC, they are escalated to the ACS level to ensure they are brought to the attention of SSC.
Additionally, an Infrastructure Gatekeeping Committee has been established to prioritize and approve the short-term infrastructure requests to be processed by SSC. While it is overseen by the Liaison Officer at Statistics Canada, this committee does not have a formal mandate or formal delegated authority for decision making.
Internal governance committees within Statistics Canada have adapted processes to ensure issues related to the IT and telecommunications with SSC are logged, addressed and monitored.
Considerations for management:
It is management's responsibility to determine the appropriateness of control activities and to implement corrective measures if deemed necessary. Potential considerations outlined below should not be considered formal recommendations, but should facilitate discussions related to the adaptation of internal control activities that reflect new IT and telecommunications realities.
The governance framework between Statistics Canada and SSC should be formally documented. Over the long term, this should be formalized in an overall Memorandum of Understanding (MOU) or Service Level Agreement (SLA). In the absence of such joint mechanisms, Statistics Canada should document the governance framework and the associated levels of authority for decision making, so that the governance, escalation and decision-making authorities relative to IT infrastructure and telecommunications are communicated and understood by stakeholders within Statistics Canada.
Risk Management
A risk management framework has been established and is being monitored to document and proactively mitigate the risks associated with SSC's responsibilities in managing key elements of Statistics Canada's IT infrastructure and telecommunications systems.
An effective risk management framework includes formal risk management and institutionalized practices that enable management to assess, mitigate and monitor the internal and external risk environments.
The risk management framework in place works to proactively mitigate the risks associated with SSC as an external service supplier
Statistics Canada has recognized that there is an increased risk to the successful delivery of its programs given the responsibility and control that Shared Services Canada has over Statistics Canada's IT infrastructure and telecommunications system and the dependence the organization has on these elements. At the highest level within Statistics Canada, risks associated with the relationship with SSC have been reflected within the existing Corporate Risk Profile (2012–2014), specifically in Risk #2 – Loss of Reputation and Public Trust. This corporate risk highlights the heightened potential threat of a breach of Statistics Canada's informatics infrastructure with the creation of SSC. High-level mitigation action plans have been identified, including: establishing SLAs with SSC; coordinating with SSC's IT infrastructure roadmap to ensure Statistics Canada's needs and priorities are reflected; and monitoring the informatics infrastructure through quality reviews, evaluations and business continuity plans.
During the course of the review, additional risk management activities were identified relative to the management of the relationship between SSC and Statistics Canada. These activities include:
Each program develops and maintains a risk register as part of the corporate risk planning exercise. The 2013 Informatics Branch Risk Register has highlighted SSC as a risk – specifically S10 – Interdependency – External. Specified mitigation strategies and action plans include establishing an SSC-Statistics Canada project plan to manage the government-wide change agenda (relative to IT infrastructure) and ensuring that SSC understands Statistics Canada priorities and that they have been factored into the whole of government initiatives.
Corporate Support Services completes a risk register as part of the risk planning exercise. Since ownership of telephony services moved to SSC, external risks affect the Corporate Support Services program. The 2013 risk register identifies external dependencies as a medium risk. However, the only mitigation strategy specified to address risks associated with SSC is "to put in place a software maintenance contract SLA with IT and touchpoints in SSC."
Project-related risks involving SSC are documented and escalated as part of the overall project governance. For example, due to its size and complexity, the Census project has its own risk register, in which the risks associated with the reliance on SSC have been identified. Another example of project-specific risk management is the Space Optimization Project (Workplace 2.0): meeting minutes from the project team confirm that risks and issues (including those relative to SSC's role) are discussed, documented and escalated if required.
Minutes of the FITM committee meetings demonstrate that risks associated with SSC/Statistics Canada IT infrastructure are being highlighted to the Statistics Canada Liaison Officer for inclusion on the issues log for discussion with SSC and as necessary, escalation within Statistics Canada.
All ongoing operational issues and risks are expected to be communicated to the Statistics Canada Liaison Officer. This includes project-related issues unless the project has a dedicated SSC Project Manager (i.e. Census). However, if a project has been assigned a dedicated SSC Project Manager, there is a mechanism within SSC for the Relationship Manager to maintain awareness of project status and issues for discussion with Statistics Canada as part of the regular meetings, as necessary. For telecommunications, although there is a separate point of contact for discussions and issue resolution on a daily basis, should escalation be required, the issue is to be brought to the attention of the Statistics Canada Liaison Officer.
Having a single point of contact for documenting, communicating and escalating issues relative to the relationship with SSC (i.e. the Statistics Canada Liaison Officer) is a good practice and ensures that the existing governance structure and risk management approach work as intended. The Liaison Officer, in consultation with programs, determines when and if escalation of an issue is required. This ensures a consistent approach with respect to the escalation of issues.
Control Activities
Key internal business processes have not been reviewed and updated to include consideration of the impacts of SSC as the external service provider.
In a sound internal control environment, control activities should be integrated into business practices to manage risks associated with the services delivered by external parties to ensure the organization can meet its strategic and operational objectives.
Some key internal processes have not been adapted to take into consideration the introduction of an external service provider
With the introduction of SSC and its responsibility for the IT infrastructure and telecommunications systems in place within Statistics Canada, internal business processes have been affected and require updating to address SSC processes and priorities. The review identified the following processes that should be considered for revision to ensure the organization takes into consideration the responsibilities and authorities transferred to SSC:
Long-Term Planning:
The Integrated Strategic Planning Process (ISPP) is an annual six-step process that begins with a review of the strategic planning priorities and concludes with the allocation of resources for approved projects. Typically, projects that result from this process are transformative in nature and include an IT component. Within this IT component, an IT infrastructure or telecommunications impact is likely (e.g., server capacity); therefore, these projects require the involvement of SSC.
The ISPP does not currently incorporate a mechanism to share the long-term plan and investment decisions with SSC for consideration of impact, alignment with Government of Canada (GoC) priorities, and realistic time horizons given GoC and Other Government Department (OGD) initiatives and limited resources within SSC. Although Statistics Canada management noted that they have provided SSC senior managers with Statistics Canada's Integrated Plan, SSC CRM representatives stated that they do not have timely insight into Statistics Canada's long-term plan, which limits SSC's ability to consider Statistics Canada's requirements in its own long-term planning process. This elevates the risk that Statistics Canada's investment plans, resource allocation decisions and established timeframes for projects requiring IT infrastructure will not be aligned with SSC plans and priorities, which could affect SSC's ability to support an initiative or meet project timelines.
Project Management:
The departmental guidance for project management is the Departmental Project Management Framework (DPMF) which is a set of standard project management processes, templates and tools to be used throughout a project's life cycle to initiate, plan, execute, control and close a project. All projects valued at over $150,000 are expected to follow the DPMF. Similar to the ISPP, the majority of projects that follow this process have an IT component and involve an IT infrastructure or telecommunications element.
Currently, based on discussions with SSC's CRM representatives, this framework does not incorporate or take into consideration the gating process in place within SSC. Specifically, in order for SSC to support a project with an IT infrastructure or telecommunications component, information should be provided to SSC as early as possible so that it can be submitted to the SSC Project Execution Committee for approval.
Although still maturing and subject to change, SSC has developed templates for its customer departments to communicate their needs. Historically, when Statistics Canada controlled its own IT infrastructure, the organization could define the IT infrastructure solution. However, SSC expects clients to submit only their business requirements, and SSC will determine the optimal solution. This change in process affected the Census program, whose representatives provided SSC with IT documentation based on what they had used for the previous Census. SSC would not accept this and requested that they complete their business requirements (consistent with other requests), with SSC determining the solution; this created delays in the process.
Without aligning the existing project management/gating process (including tools and templates) within Statistics Canada to the SSC gating process, delays may be experienced and projects put on hold while SSC puts the IT component of projects through its own approval process.
Short-Term Infrastructure Needs Assessment:
Consistent with the communication of Statistics Canada's long-term planning needs to SSC, timely communication of short-term, operational infrastructure needs has been requested by SSC. As a result, within the operational management of the relationship with SSC, process changes have been made to ensure that consistent, timely short-term (one-year) operational requirements are communicated to SSC. For fiscal year 2014/15, the Liaison Officer within Statistics Canada initiated a process whereby all IT Field Managers identify and document the upcoming year's operational requirements in a standard template. The templates are then consolidated and reviewed for duplication, and the final document is shared with SSC for information and planning purposes. This change in process and the provision of short-term infrastructure requirements were noted as a leading practice by SSC.
In addition to the communication of short-term needs, the process to request in-year infrastructure (not previously approved by SSC) has also been modified. All current requests for infrastructure are initiated through the Statistics Canada Portal (service request system). Once a service request is identified as an infrastructure need, it is automatically forwarded to the Statistics Canada Liaison Officer. These requests are consolidated and prioritized by the newly established Gatekeeping Committee. Once prioritized, the Liaison Officer provides the listing to SSC for action. The current arrangement with SSC is that a service request is closed before a new one is opened. However, if a request is of a critical nature, it is forwarded to the Liaison Officer, who will release it for immediate action by SSC. This change in process for fiscal year 2014/15 supports SSC's ability to efficiently plan and coordinate the short-term infrastructure needs of its partner departments.
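To make this flow concrete, the following is a minimal sketch of the triage logic in Python. The class and method names (ServiceRequest, GatekeepingQueue, submit, close_request) are illustrative assumptions; the actual Portal and gatekeeping processes are administrative, not code.

```python
from dataclasses import dataclass

@dataclass
class ServiceRequest:
    request_id: str
    description: str
    priority: int           # assigned by the Gatekeeping Committee; lower = more urgent
    critical: bool = False  # critical requests bypass the one-at-a-time rule

class GatekeepingQueue:
    """Sketch of the in-year infrastructure request flow: requests are
    consolidated and prioritized, then released to SSC one at a time,
    except for critical requests, which are released immediately."""

    def __init__(self):
        self.backlog = []          # prioritized, waiting to be released
        self.open_with_ssc = None  # at most one non-critical request open at SSC

    def submit(self, req: ServiceRequest) -> None:
        if req.critical:
            self._send_to_ssc(req)  # released immediately by the Liaison Officer
            return
        self.backlog.append(req)
        self.backlog.sort(key=lambda r: r.priority)
        self._release_next()

    def close_request(self) -> None:
        """SSC closes the open request; only then is the next one released."""
        self.open_with_ssc = None
        self._release_next()

    def _release_next(self) -> None:
        if self.open_with_ssc is None and self.backlog:
            self.open_with_ssc = self.backlog.pop(0)
            self._send_to_ssc(self.open_with_ssc)

    def _send_to_ssc(self, req: ServiceRequest) -> None:
        print(f"Released to SSC: {req.request_id} - {req.description}")
```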
Incident Management/Change Management:
In spring 2014, Statistics Canada began implementing an information technology incident management framework for issues relating to SSC. This framework builds upon the 2011 framework and now addresses the role that SSC plays in information technology. The framework is intended to standardize the prioritization and escalation of IT incidents within the organization in order to restore normal service operation as quickly as possible and to minimize the impact on the business operations of Statistics Canada's mission-critical programs and key service areas. The framework does not cover desktop-related issues or existing application and systems maintenance. It outlines the roles and responsibilities of the Incident Coordinator Team, whose membership consists of representatives from both the program areas and IT, and who develop action plans to address incidents. The Director of ITOD is responsible for ensuring timely follow-up on recommendations.
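The framework's actual prioritization rules are not reproduced in this report. For illustration only, the sketch below shows one common way such frameworks rank incidents, using a hypothetical impact/urgency matrix; the categories and thresholds are assumptions, not the Agency's rules.

```python
from enum import IntEnum

class Impact(IntEnum):
    LOW = 1       # single user or non-critical service
    MEDIUM = 2    # key service area degraded
    HIGH = 3      # mission-critical program affected

class Urgency(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3

def incident_priority(impact: Impact, urgency: Urgency) -> int:
    """Classic impact x urgency matrix: priority 1 is the most severe
    and would be escalated immediately to the Incident Coordinator Team."""
    score = impact * urgency  # 1..9
    if score >= 6:
        return 1              # escalate immediately
    if score >= 3:
        return 2              # action plan within the business day
    return 3                  # handle in the normal queue

# Example: an outage affecting a mission-critical program, needed now.
assert incident_priority(Impact.HIGH, Urgency.HIGH) == 1
```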
With respect to IT application change management, Statistics Canada uses the Jira change, issue and risk tracking tool to manage IT application changes. However, interviews noted that program areas have created their own processes for determining how and when this tool is used, and when IT personnel should be consulted. For infrastructure changes, SSC has implemented a Change Advisory Board to which STC program representatives have been invited, although it was unclear whether Statistics Canada has a role in the decision making or attends for information purposes only.
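As an illustration of what standardized change logging might look like, the sketch below files a change ticket through Jira's generic REST interface (POST /rest/api/2/issue). The base URL, credentials, project key and issue type are placeholders; the Agency's actual Jira configuration and workflow are not described in this report.

```python
import requests

JIRA_BASE = "https://jira.example.gc.ca"  # placeholder URL
AUTH = ("svc_account", "app_password")     # placeholder credentials

def log_change(summary: str, description: str) -> str:
    """Create a change ticket so that every IT application change is
    recorded the same way, regardless of the program area."""
    payload = {
        "fields": {
            "project": {"key": "CHG"},        # hypothetical project key
            "issuetype": {"name": "Change"},  # hypothetical issue type
            "summary": summary,
            "description": description,
        }
    }
    resp = requests.post(f"{JIRA_BASE}/rest/api/2/issue", json=payload, auth=AUTH)
    resp.raise_for_status()
    return resp.json()["key"]  # e.g., "CHG-123"
```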
Given the creation of SSC as a single point of contact for all infrastructure needs and the decentralized nature of Statistics Canada, without standardized IT change management processes aligned to SSC's needs/processes, there is an increased risk that infrastructure decisions will be made and actioned without the appropriate authority and assessment of impact on the organization prior to implementation.
Additionally, the review noted that for telecommunications, all service requests and issues are being forwarded to SSC. Interviews indicated that it is not clear which responsibilities remain within Statistics Canada for telecommunications. As a result, it is unclear which requests should be actioned by Statistics Canada representatives (e.g., password resets for voicemail and cell phones) and which require escalation to SSC.
Opportunities to Enhance Service
Given that the introduction of SSC has required Statistics Canada to work with a new service provider, one element of a successful relationship is continuous improvement. This can be achieved in different ways including active engagement with the service provider as well as sharing practices and experiences internally.
Recognizing that SSC has been assigned a significant GoC mandate for transformation and supports 43 departments, the CIO at Statistics Canada has been proactive in volunteering to participate in CIO committees and GoC initiative working groups (e.g., network transformation, evaluation of e-mail technology) to ensure Statistics Canada's voice is heard at the table. As a participant, the CIO is kept up to date on decisions made on these initiatives and is able to identify potential impacts on Statistics Canada.
Although the governance structure within Statistics Canada provides multiple ongoing touch points with SSC representatives, opportunities exist to enhance it by periodically soliciting lessons learned and best practices for managing the relationship with SSC, both from within Statistics Canada and from SSC, so that approaches and techniques that are working well are shared across areas.
Considerations for management:
It is management's responsibility to determine the appropriateness of control activities and to implement corrective measures if deemed necessary. Potential considerations outlined below should not be considered formal recommendations, but should facilitate discussions related to the adaptation of internal control activities that reflect new IT and telecommunications realities.
The current long-term planning process is reviewed to incorporate consideration of the impact of proposed projects and priorities on SSC, through communication with and engagement of SSC in the process, in order to confirm that priorities and time horizons are aligned with SSC's expectations and capacity.
Statistics Canada's DPMF is reviewed and revised to align the existing gating process with SSC's gating process when the project has an IT infrastructure component. This would include the early identification of an IT infrastructure component so that SSC can be engaged as early as possible and can initiate its own gating process, minimizing delays to Statistics Canada projects. Further, the project management guidance should be reviewed to ensure that SSC's information needs and formats (e.g., template, level of detail) are reflected, to avoid inefficiencies and time delays.
Formal Agency-wide IT change management processes (including assignment of roles and responsibilities between SSC and Statistics Canada) are defined and implemented which align with the processes and definitions in place within SSC.
The existing governance framework incorporates a formal process to periodically solicit lessons learned/best practices from within Statistics Canada and SSC for purposes of collaborating and sharing this information to encourage continuous improvement of the relationship.
Appendices
Appendix A: Review Objectives
Control Objective / Core Controls / Criteria
Sub-Criteria
1) Proactively examine the governance framework, risk management program and control activities in place relative to the management of the relationship between Statistics Canada and SSC, as the outsourced provider of IT infrastructure services, and provide recommendations to improve the current management control framework.
1.1 Effective oversight bodies are established. (G-1)
A formal governance framework is in place to manage the relationship with SSC.
The formal governance framework is appropriate given Statistics Canada's mandate and the role of SSC.
The governance framework is communicated and understood by all stakeholders within Statistics Canada.
Evidence is in place to demonstrate use of the existing governance framework.
1.2 Management identifies the risks that may preclude the achievement of its objectives. (RM-2)
A risk management framework has been established and is being maintained to document and proactively mitigate the risks associated with SSC managing key elements of Statistics Canada's IT infrastructure.
1.3 Management identifies and assesses the existing controls that are in place to manage its risks. (RM-3)
1.4 The organization leverages, where appropriate, collaborative opportunities to enhance service. (CFS-3)
Key processes and approaches are being tailored and revised to allow Statistics Canada to continue to achieve its mandate while being reliant on SSC to provide key IT infrastructure services.
The Research Data Centre (RDC) at Dalhousie University is one of 27 RDCs operating across Canada. The RDCs were established through the efforts of Statistics Canada, the Social Sciences and Humanities Research Council, the Canadian Institutes of Health Research, the Canada Foundation for Innovation and university consortia, to strengthen Canada's social research capacity and support the policy research community. The Dalhousie RDC facility first opened in 2001.
The mandate of RDCs is to promote and facilitate social science research using Statistics Canada's confidential microdata, while protecting the confidentiality of data through effective operational and analytical policies and procedures that create a culture of confidentiality.
The RDCs are staffed by Statistics Canada employees and are operated under the provisions of the Statistics Act in accordance with all confidentiality rules, and are accessible only to researchers with approved research projects who have been sworn in under the Statistics Act. Day-to-day monitoring of the environment and physical security within the RDC is the responsibility of RDC analysts. RDC analysts administer the operation of the RDCs and ensure that the activities are consistent with Statistics Canada's mandate.
The objectives of this audit are to provide the Chief Statistician (CS) and the Departmental Audit Committee (DAC) with assurance that the RDC at Dalhousie University:
has effective practices and mechanisms in place to ensure that the confidentiality of data is protected in the delivery of services; and
complies with applicable Treasury Board Secretariat (TBS) and Statistics Canada policies and standards regarding Information Technology (IT) and physical security, to ensure that confidentiality of data is protected in the delivery of services.
The audit was conducted by Internal Audit Division (IA) in accordance with the Government of Canada's Policy on Internal Audit.
Key findings
The administration of research contracts is supported by roles and responsibilities that are well defined and communicated both at the program level and within the RDC. Communiqués are an effective means to inform staff in the regions of changes in policies and procedures.
Staff at the Dalhousie RDC follows procedures to ensure that researchers become deemed employees prior to being approved for research contracts. There are complete records supporting contract management on file at headquarters, but there are opportunities to improve procedures for contract management to ensure that acknowledgements of the Values and Ethics Code are signed by all researchers and are on file.
There are regular discussions on risks between the regional manager, the RDC analyst and MAD management, and they are considered within the context of the annual risk register exercise. Integration of RDC program risk information by CSSD into a risk based inspection approach would improve and optimize inspection activities.
Processes and procedures for confidentiality vetting are in place and are effective in protecting the confidentiality of the data. The analyst also maintains an audit trail of all vetted documents which cannot be modified by researchers; this has been deemed a good practice.
Roles, responsibilities and accountabilities for Dalhousie IT support staff are outlined in service level agreements, and are aligned to government policies for IT and the Statistics Act. IT general controls, including access, identification and authentication safeguards were embedded in systems and software configurations and were operating as intended at the Dalhousie RDC.
Staff at the Dalhousie RDC is employing an effective practice by using a reliable USB key strictly dedicated to transferring electronic files to the server. Currently, there are no documented procedures or protocols in place for the RDC program to mitigate security threats associated with the use of external USB keys. All RDCs could benefit from this practice to further protect the data stored on the closed network.
Physical security measures in place within the Dalhousie University RDC comply with applicable TBS policies and Statistics Canada's Security Practices Manual. Key controls such as a secure perimeter with intrusion detection and restricted access to the centre are in place and effective to ensure the security of the information held in the facility.
Overall conclusion
The RDC at Dalhousie University is well managed and has effective practices and mechanisms in place to ensure that the confidentiality of data is protected in the delivery of services. There are opportunities for improvement in areas of acknowledgments for values and ethics, and by formalizing risk management and integrating the results into the RDC inspection strategy.
The RDC's physical and IT environments comply with TBS, as well as Statistics Canada policies and standards, and are effective in protecting the confidentiality of data in the delivery of services.
Conformance with professional standards
The audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which includes the Institute of Internal Auditors (IIA) International Standards for the Professional Practice of Internal Auditing.
Sufficient and appropriate audit procedures have been conducted and evidence gathered to support the accuracy of the findings and conclusions in this report and to provide an audit level of assurance. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria. The findings and conclusions are applicable to the entity examined and for the scope and time period covered by the audit.
Patrice Prud'homme
Chief Audit Executive
Introduction
Background
The Research Data Centres (RDCs) are part of an initiative by Statistics Canada, the Social Sciences and Humanities Research Council (SSHRC), the Canadian Institutes of Health Research (CIHR), the Canada Foundation for Innovation and university consortia to strengthen Canada's social research capacity and to support the policy research community. The SSHRC is a federal agency that promotes and supports university-based research and training in the social sciences and humanities disciplines. CIHR is the major federal agency responsible for funding health research in Canada.
The Microdata Access Division (MAD) provides restricted access to confidential microdata through RDCs at universities across the country and the federal RDC in Ottawa. MAD is responsible for ensuring the confidentiality of information provided by Canadians. Currently, there are 27 RDCs: 26 are located in a secure setting on university campuses, and one is located within a research institute. These RDCs provide researchers with access to microdata from population and household surveys, meaning that researchers do not need to travel to Ottawa to access Statistics Canada microdata. In addition to centres located on campuses, the Federal Research Data Centre (FRDC) in Ottawa provides microdata access to researchers from federal policy departments. For the RDC program as a whole, functional authority is formally delegated to the manager/director of the RDC Program. At the regional level, functional authority resides with the RDC regional manager.
The mandate of RDCs is to promote and facilitate social science research using Statistics Canada's confidential microdata, while protecting the confidentiality of data through effective operational and analytical policies and procedures that create a culture of confidentiality. The RDCs are staffed by Statistics Canada employees and are operated under the provisions of the Statistics Act in accordance with all confidentiality rules and are accessible only to researchers with approved research projects, who have been sworn in under the Statistics Act. Day-to-day monitoring of the environment and physical security within the RDC is the responsibility of RDC analysts. RDC analysts administer the operation of the RDCs and ensure that the activities are consistent with Statistics Canada's mandate.
The Statistics Canada Risk-Based Audit and Evaluation Plan requires that the Internal Audit Division complete an audit of one RDC per year. Over the past three years, the RDCs located at the following universities were audited: University of Calgary and University of Lethbridge (2011); University of Alberta (2012); and McMaster University (2013).
Audit objectives
The objectives of the audit were to provide the Chief Statistician (CS) and the Departmental Audit Committee (DAC) with assurance that the RDC at Dalhousie University:
has effective practices and mechanisms in place to ensure that the confidentiality of data is protected in the delivery of services; and
complies with applicable Treasury Board Secretariat (TBS) and Statistics Canada (StatCan) policies and standards regarding Information Technology (IT) and Physical Security, to ensure that confidentiality of data is protected in the delivery of services.
Scope
The scope of this audit included a detailed examination of the systems and practices of the RDC at Dalhousie University for the protection of data, the use of technology and the physical security.
The audit focused on: the confidentiality vetting of data output by the on-site Statistics Canada employees; deemed employee status and security clearance requirements for access to microdata; the research proposal process for the RDC; microdata research contracts; physical security of the RDC site in compliance with applicable TBS and Statistics Canada policies and standards; and IT protection in compliance with applicable TBS and Statistics Canada policies and standards.
Approach and methodology
The audit work consisted of an examination of documents, interviews with key program management, and personnel within the Microdata Access Division (MAD), Information Technology Operations Division (ITOD) and Dalhousie University, as well as a review for compliance with relevant policies and guidelines.
The field work included a review, assessment, and testing of the processes and procedures in place to ensure physical security, use of technology and the protection of data at Dalhousie University. A sample of microdata research contracts (completed, in progress, and microdata research contracts in evaluation) was examined to ensure coverage of contract types, data sources, multiple contract holders and research purpose. A combination of judgemental and systematic samples, totaling 43 out of 89 contracts having a start date between 2010 and 2014, was selected for testing, representing nearly 50% of recent microdata research contracts for this RDC.
This audit was conducted in accordance with the Internal Auditing Standards for the Government of Canada, which includes the Institute of Internal Auditors (IIA) International Professional Practices Framework.
Authority
The audit was conducted under the authority of Statistics Canada's Integrated Risk-Based Audit and Evaluation Plan for 2014/15 to 2018/19.
Findings, Recommendations and Management Responses
Objective 1: The Dalhousie University RDC has effective practices and mechanisms in place to ensure that the confidentiality of data is protected in the delivery of services.
Administration of microdata research contracts
The administration of research contracts is supported by roles and responsibilities that are well defined and communicated both at the program level and within the RDC. Communiqués are an effective means to inform staff in the regions of changes in policies and procedures.
Staff at the Dalhousie RDC follows procedures to ensure that researchers become deemed employees prior to being approved for research contracts. There are complete records supporting contract management on file at headquarters, but there are opportunities to improve procedures for contract management to ensure that acknowledgements of the Values and Ethics Code are signed by all researchers and are on file.
There are regular discussions on risks between the regional manager, the RDC analyst and MAD management, and they are considered within the context of the annual risk register exercise. Integration of RDC program risk information by CSSD into a risk based inspection approach would improve and optimize inspection activities.
Processes and procedures for confidentiality vetting are in place and are effective in protecting the confidentiality of the data. The analyst also maintains an audit trail of all vetted documents which cannot be modified by researchers; this has been deemed a good practice.
Effective management of research contracts is key to ensuring that only approved data are used by the authorized individuals and for purposes that are in line with program objectives. Program and RDC staff have a shared responsibility in ensuring that researchers become deemed employees with valid security clearance. As RDCs are located within university campuses, a collaborative approach to contract management, ongoing communication and risk management are required to protect the confidentiality of data.
Roles and responsibilities
Roles, responsibilities and accountabilities should be clearly defined and communicated. Effective means of communication are also necessary to ensure that staff is up to date on current policies, practices and procedures.
The roles and responsibilities for the management of the Microdata Research Contracts (MRCs), access to confidential microdata and confidentiality vetting are defined and communicated to stakeholders in policies, guidelines, standards and detailed guides. At the program level, authority is formally delegated to the RDC manager in Statistics Canada's Security Practices Manual, which states that the RDC manager:
"is responsible for establishing and maintaining an inventory of administrative information on research projects involving deemed employees for headquarters, the regional offices and the research data centres. Information includes research proposals and other information throughout the life-cycle of the project and certification that required procedures have been followed."
Additionally, the Policy on the Security of Sensitive Statistical Information assigns to Directors,
"the responsibility for controlling and protecting all sensitive statistical information obtained or held by their respective areas in the pursuit of their program objectives. When access to sensitive statistical information is provided in a Research Data Centre or equivalent, the Manager, Research Data Centre Program, assumes these responsibilities."
At the Dalhousie RDC, there is one full-time RDC analyst and two part-time statistical assistants. The RDC analyst reports to the regional manager, and the statistical assistants report to the full-time RDC analyst. The regional manager responsible for the Dalhousie RDC is located at the University of Guelph.
Information from RDC contracts is compiled by the Head Office Operations Unit (HOOU) in the Client Relationship Management System (CRMS) central database. Information entered in the system includes contract status, approval dates, names of researchers, reviewers and review outcomes, contract end dates and data approved for access. Staff working in RDCs cannot access this system and do not use its outputs in their daily operations; however, they do receive periodic reports generated from the CRMS and are asked to validate the information against their records. Information drawn from the CRMS is used by headquarters to manage the final stages of project completion, to follow up on security clearance renewals, and to monitor program growth and data usage. Information on the number of deemed employees and the use of datasets is also compiled and reported to Subject Matter Divisions (SMD). The audit team tested the accuracy of the information entered in CRMS on a sample of contracts. Test results revealed a small number of immaterial data-entry errors, which did not impact the RDC's operations.
Communiqués are posted on the RDC extranet and provide information on new policies, processes, procedures and best practices to RDC staff. During interviews, it was established that staff at the Dalhousie RDC knew where to find these communiqués and were also able to provide examples of recent messages communicated through this means.
The administration of research contracts is supported by roles and responsibilities that are well defined and communicated both at the program level and within the RDC. Communiqués are an effective means to inform staff in the regions of changes in policies and procedures.
Deemed employees
A key management control relied upon to ensure the confidentiality of information within RDCs is the deemed employee status which all researchers must obtain prior to accessing the RDC. In addition to having an approved project, each researcher must undergo a security screening and be sworn in under the Statistics Act.
As per the Policy on the Use of Deemed Employees, revised in August 2007, researchers wishing to access an RDC are required to become deemed employees: they must undergo reliability security screening pursuant to subsections 5(2) and 5(3) of the Statistics Act, and take an oath or affirmation of office and secrecy pursuant to subsection 6(1) of the Statistics Act. They must also sign an acknowledgment that they have read and understood the Statistics Canada Values and Ethics Code for the Public Service. These actions are to be completed before the MRC is signed by Statistics Canada. Once researchers have successfully completed these requirements and attended an orientation session, they are officially deemed employees of Statistics Canada.
Testing was conducted to ensure that all required documentation was in place and valid for the 31 researchers associated with 15 sampled contracts. Results revealed that valid security clearances were in place and oaths of office and secrecy had been signed by all researchers. Additional testing was conducted to ensure that researcher acknowledgements of the Values and Ethics Code for the Public Service were on file. Of the 31 researchers associated with the sampled contracts, 24 had signed this acknowledgement and copies were on file. For the other seven researchers, no signed copies of the acknowledgements were on file, nor were they subsequently found. According to program management, researchers were allowed to submit their acknowledgements of the Values and Ethics Code after the research contracts were approved. Signed Values and Ethics forms attest that researchers have read and agreed to the terms and conditions set out by the RDC program. To ensure that these forms are on file, the RDC program has recently adopted a new approach whereby the forms are required as part of the proposal package sent to headquarters for evaluation. This practice is expected to ensure that acknowledgements of the Values and Ethics Code are signed by all researchers and are on file.
Staff at the Dalhousie RDC follows procedures to ensure that researchers become deemed employees prior to being approved for research contracts. There are complete records supporting contract management on file at headquarters, but there are opportunities to improve procedures for contract management to ensure that acknowledgements of the Values and Ethics Code are signed by all researchers and are on file.
Contract management
MRCs are signed either by the Director of MAD or a delegated manager within the RDC program once project proposals have been evaluated and approved by the program. The audit tested compliance with the contract processing procedures by reviewing a sample of 31 active and completed contracts associated with the Dalhousie RDC. All proposals, project descriptions or course syllabuses and signed MRCs were found to be on file at headquarters. The audit also determined that contracts were signed by the appropriate authority at Statistics Canada. According to procedures established for the RDC program, a new contract cannot be approved until all deliverables for the researcher's other contracts are completed. Researchers have 12 months after the contract expiry date to complete and remit end products (publications or results). Contracts for which the end product has not been remitted become delinquent in the CRMS. MAD has implemented a checklist, completed by RDC program staff prior to approval of all new contracts, which includes verification that all deliverables for previous contracts have been submitted.
File documentation supporting MRCs was found to be complete. Contracts were signed by the appropriate authority and there are mechanisms in place to ensure that new contracts are signed only for researchers with projects in good standing.
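For illustration, the approval controls described above can be expressed as a short validation routine. The sketch below is a hypothetical rendering in Python; the record fields and function names are assumptions and do not describe the CRMS or any actual RDC program system.

```python
from dataclasses import dataclass

@dataclass
class Contract:
    contract_id: str
    researcher: str
    proposal_on_file: bool
    signed_mrc_on_file: bool
    values_ethics_ack_on_file: bool
    deliverables_outstanding: int = 0  # end products not yet remitted

def can_approve_new_contract(researcher: str, existing: list[Contract]) -> bool:
    """A new MRC may be approved only if the researcher's previous
    contracts have no outstanding deliverables (no delinquent contracts)."""
    return all(
        c.deliverables_outstanding == 0
        for c in existing
        if c.researcher == researcher
    )

def pre_signature_checklist(c: Contract) -> list[str]:
    """Returns the list of missing documents; an empty list means the
    contract is ready for signature by the Director or designate."""
    missing = []
    if not c.proposal_on_file:
        missing.append("research proposal")
    if not c.signed_mrc_on_file:
        missing.append("signed MRC")
    if not c.values_ethics_ack_on_file:
        missing.append("Values and Ethics acknowledgement")
    return missing
```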
Risk management
Risk management practices involving both program management and RDC staff in the regions are essential to identify, assess and respond to risks that may preclude the achievement of RDC program objectives.
Interviews with program management and RDC staff revealed that regular discussions on risks are held between the regional manager, the RDC analyst and MAD, and they are considered within the context of the annual risk register exercise.
IT and physical security inspection reports are also used as a means to identify and mitigate risks that are specific to individual RDCs. Inspections are led by CSSD with the help of ITOD staff. Two physical security inspections and two IT security inspections have been conducted for the Dalhousie University RDC: prior to opening the centre in 2001, and again in 2011. The calendar of security inspections for RDCs is currently established by CSSD. This approach neither considers nor integrates risks identified by the RDC program, such as changes to the IT environment. MAD management has stated that security inspections for RDCs could be improved and optimized by integrating risk information obtained from the RDC program, which is better positioned to identify such risks, and by adopting a risk-based approach to RDC inspections.
There are regular discussions on risks between the regional manager, the RDC analyst and MAD management, and they are considered within the context of the annual risk register exercise. Integration of RDC program risk information into a risk based inspection approach would improve and optimize inspection activities.
Confidentiality vetting
RDCs are repositories of Statistics Canada microdata files that are accessible to researchers with approved projects. Effective and appropriate processes and procedures for confidentiality vetting should be in place and adhered to in order to significantly reduce the risk of unwanted disclosure.
Confidentiality vetting is the process of screening research outputs, syntax or any confidential data-related material to assess the risk of a prohibited disclosure. This is done by analysing whether obvious identification of individual cases, or information about individual cases, can be inferred or deduced from the statistical output. The RDC analyst's primary responsibility with respect to confidentiality vetting is to ensure confidentiality is not breached when allowing research outputs to leave the RDC. The analyst reviews all materials that the researcher wishes to remove from the RDC, and the final responsibility and decision to release the output rest with the analyst. Confidentiality vetting is conducted using the survey-specific guidelines for all surveys housed in the RDCs. Questions or concerns related to the vetting process or to unfamiliar statistical techniques are addressed with the RDC regional manager or the RDC Vetting Committee.
During the orientation session, researchers receive training on the confidentiality vetting process and the required documentation for vetting requests. This documentation includes descriptions of variables, weighted and non-weighted counts, syntax and a completed disclosure request form for every output request. A detailed draft document entitled Disclosure Control Rules for Outputs from Survey Data at RDCs provides instructions on how to conduct confidentiality vetting, including guidelines on disclosure risk analysis for various data types: descriptive or tabular output, variance-covariance and correlation matrices, graphs, and models.
Confidentiality vetting guidelines and processes are found in the Researcher Guide. An important part of the process is for researchers to complete the Vetting Request Form, which provides the required information for the analyst to conduct and document the vetting request. Information required from the researcher includes:
the name of the output file, survey and cycles used
characteristics of the population being analyzed
the statistical procedure and weights used
a description of the variables
weighted and unweighted outputs.
Once the vetting is complete, output deemed non-confidential is released to the researcher.
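For illustration, the required information listed above can be modelled as a simple record with a completeness check. This is a hypothetical sketch in Python; the Vetting Request Form itself is a document, not software, and the names below are assumptions.

```python
from dataclasses import dataclass, fields

@dataclass
class VettingRequest:
    """Fields mirror the information required on the Vetting Request Form."""
    output_file: str
    survey_and_cycles: str
    population: str             # characteristics of the population analyzed
    procedure_and_weights: str  # statistical procedure and weights used
    variable_descriptions: str
    weighted_output: str
    unweighted_output: str

def missing_fields(req: VettingRequest) -> list[str]:
    """A request cannot be vetted until every required field is supplied."""
    return [f.name for f in fields(req) if not getattr(req, f.name)]

# Example: a request missing its variable descriptions would be returned
# to the researcher before vetting begins.
```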
At the Dalhousie RDC, all confidentiality vetting is completed by the full-time RDC analyst. This analyst has worked with the RDC program for several years and understands Statistics Canada data and the confidentiality requirements. The analyst also holds active MRCs; to ensure segregation of duties, the confidentiality vetting of her outputs is completed at headquarters.
Using judgemental sampling to ensure the inclusion of a variety of data sets, researchers and contract types, 14 completed, active, delinquent and withdrawn contracts were selected to verify that confidentiality vetting takes place at Dalhousie and that the method used is appropriate. Results showed that for all contracts, confidentiality vetting forms were completed and the method used for each survey was in accordance with the established process. The required supporting documentation for previously vetted material, variables used, definitions of derived variables, and weighted and unweighted counts was provided. Evidence was in place that the analyst used the appropriate vetting techniques and effectively applied vetting procedures.
Additionally, the analyst has implemented a procedure to ensure that finalized vetted material remains under the analyst's control. This control helps ensure that, should vetting need to be recreated or additional vetting be required, the analyst has a record that cannot be modified by a researcher.
Processes and procedures for confidentiality vetting are in place and are effective in protecting the confidentiality of the data. The analyst also maintains an audit trail of all vetted documents which cannot be modified by researchers; this has been deemed a good practice.
Recommendations
It is recommended that the Assistant Chief Statistician of Social, Health and Labour Statistics ensure that:
procedures for contract management are strengthened to ensure that acknowledgements of the Values and Ethics Code are signed by all researchers and are on file;
discussions on risks between the regional manager, the RDC analyst and MAD management are used to inform and determine the physical and IT inspections strategy.
Management response:
Management agrees with the recommendations.
The contracts missing signed Values and Ethics forms mainly date from 2009 to 2012. A number of improvements to the contract processing system, including the Values and Ethics Code acknowledgement component, have been made since 2012. These have strengthened the process to ensure that all necessary documents are completed and stored on file or integrated with the research contracts. The following three improvements were made in the past year to address the deficiency, which had been identified internally prior to the audit, and will remain in place to address this situation:
Introduced procedures for sending documents to Head Office, as outlined in communiqué 2013-05. Contracts are not processed by Head Office unless all documentation is received (since October 2013).
Introduced the use of a checklist when the Microdata Research Contract (MRC) is signed by the Director or designate. The checklist indicates that Head Office staff have received and processed all documents, and it must be completed before an MRC is signed (since December 2013).
Implemented a new MRC to integrate the Network Use Form and Values and Ethics Code into the contract (since July 2014).
The Director of MAD will work with IT and physical security to develop a more integrated schedule for the RDCs, taking into account risk management and mitigation strategies for the RDC program. Focus will be placed on IT inspections, to be conducted over the next two years in each of the research data centres connected to the Wide Area Network, with a view to ensuring that new and upgraded equipment and configurations meet TBS security requirements. While no on-site physical inspections will be done unless a new centre is built or a centre has been renovated, physical security inspectors will meet with RDC analysts by phone to confirm that no changes have occurred at their centre since the last inspection. MAD management will continue to meet with IT and physical security several times per year and will review the schedule and inspection strategy with the Security Coordination Committee as required.
Deliverables and Timeline: A revised strategy for IT and physical security inspections, and an integrated risk-based schedule will be developed between January 2015 and September 2016.
Objective 2: The RDC at Dalhousie University complies with applicable TBS and Statistics Canada policies and standards regarding Information Technology Security and Physical Security, to ensure that confidentiality of data is protected in the delivery of services.
Information technology security
Roles, responsibilities and accountabilities for Dalhousie IT support staff are outlined in service level agreements, and are aligned to government policies for IT and the Statistics Act. IT general controls, including access, identification and authentication safeguards were embedded in systems and software configurations and were operating as intended at the Dalhousie RDC.
Staff at the Dalhousie RDC is employing an effective practice by using a reliable USB key strictly dedicated to transferring electronic files to the server. Currently, there are no documented procedures or protocols in place for the RDC program to mitigate security threats associated with the use of external USB keys. All RDCs could benefit from this practice to further protect the data stored on the closed network.
Information technology security in RDCs should be compliant with applicable TBS policies, such as the Operational Security Standard: Management of Information Technology Security, and with Statistics Canada's Security Practices Manual. Roles, responsibilities, and accountabilities should be clearly defined and communicated. In the context of RDCs, IT security should include controls for the protection of the information system; communications with and within the information system; access controls that ensure the ability to permit or deny user access to the systems; and identification and authentication controls that allow unique identification and authentication of each user.
Roles and responsibilities
Within Statistics Canada the Information Technology Operations Division (ITOD) provides guidance and directives on IT security requirements for the RDC program.
At the Dalhousie RDC, IT services are provided locally by university IT staff members. These staff members respond to the RDC analyst's requests as required and ensure that workstation computers, the RDC server and other IT equipment are configured to adhere to Statistics Canada directives and policies.
Roles, responsibilities and accountabilities for Dalhousie IT support staff are outlined in service level agreements, and are aligned to government policies for IT and the Statistics Act.
IT systems safeguards and software configurations
In 2013, all computer systems and hardware at the Dalhousie RDC were replaced and the IT environment was configured to join the headquarters domain. As a result of this change, user accounts are now created by the HOOU, and only the perimeters of access are controlled locally at the RDC. The server at the Dalhousie RDC has recently been updated and is a stand-alone setup with open directories, using the BASIS Proxcard-II Access System to grant permissions. Apart from the wide-area network (WAN), the server has no external connection; as a result, remote access to the server from outside the RDC is not possible.
During the on-site visit, there were nine functional stand-alone workstations available for use by the researchers. These were new computer systems acquired in the last year, all identically configured with the same hardware and software. The audit team selected a number of key computer safeguards and tested them to ensure they were functioning as intended. Results of the audit team's inspection were as follows:
Computers in the research lab are not connected to the Internet (Internet access is only available to RDC employees in the RDC analysts' offices), and data and researcher folders are stored on the server.
Software is installed on each workstation by Dalhousie's IT support, and each workstation runs the Deepfreeze application, which ensures no residual data remains on the computer after logout. The audit team tested Deepfreeze and confirmed it to be effective: documents saved to a local temporary folder were erased when the user logged back in.
McAfee anti-virus is the prescribed software and was operational on all computers. For all computers linked to the closed network, regular updates are processed by headquarters every week via the WAN. For the computer with Internet access in the RDC analyst's office, the anti-virus application is updated daily via the Internet.
Password configuration and validation have been set to standards that are in accordance with Statistics Canada's IT requirements.
The server and back-up drives are locked in a cabinet housed inside the RDC secured area, for which only the analyst has the keys; this complies with the program's security requirements.
Printouts of researchers' work can only be produced from a network printer which is located in the RDC analyst's office.
USB ports on computers in the research lab are programmed to detect and reject all devices that have storage capacity and/or communication media. The audit team connected a cell phone and a USB key to a sample of three of the nine functional lab computers; all devices were rejected by the system (an illustrative sketch of this type of filtering follows this list).
In all cases, IT general controls were embedded in systems and software configurations and were operating as intended.
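For illustration only, the following Python sketch models the type of device-class filtering applied at the USB ports described above. The class codes are standard USB-IF base class codes; the blocked set and function name are hypothetical assumptions, not the audited workstations' actual configuration.

```python
# Illustrative sketch only: models USB device-class filtering of the kind
# described in the findings above. Class codes are standard USB-IF base
# classes; the blocked set and function name are hypothetical.

BLOCKED_USB_CLASSES = {
    0x02,  # Communications / CDC control (modems, tethered phones)
    0x06,  # Still image (cameras and phones exposing media transfer)
    0x08,  # Mass storage (USB keys, external drives)
    0x0A,  # CDC data (the data channel paired with class 0x02)
}

def device_is_permitted(interface_classes):
    """Reject any device exposing a storage or communications interface."""
    return not (set(interface_classes) & BLOCKED_USB_CLASSES)

# A USB key (0x08) and a cell phone (0x02, 0x06) are rejected, consistent
# with the audit team's test; a keyboard (HID, class 0x03) is permitted.
assert not device_is_permitted({0x08})
assert not device_is_permitted({0x02, 0x06})
assert device_is_permitted({0x03})
```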
Access, identification and authentication safeguards
Procedures specify that user accounts should be created only when a contract is approved and becomes active. Access should be removed upon the expiry date of the MRC and password configuration should meet Statistics Canada standards.
The RDC analyst is responsible for configuring accounts to ensure that only data approved in the final contract are accessible for the duration of the project, and has administrative privileges to access the system. The statistical assistants do not have administrative privileges. Individual userIDs are created for each researcher and for each contract: when researchers are associated with more than one research project, a separate userID is created for each project. Access user accounts are set to reflect the contract start and end dates and the security clearance expiry date, and the per-project separation prevents researchers from moving files between projects. Password configuration for access user accounts is set in accordance with Statistics Canada standards.
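This account model can be summarized in a brief sketch. The names and structures below are hypothetical and serve only to illustrate the rules described above: one userID per researcher per contract, access limited to the data sets in the final contract, and the account window capped by the security clearance expiry.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical sketch of the per-contract account model described above;
# field and function names are illustrative, not the actual system's.
@dataclass(frozen=True)
class AccessAccount:
    user_id: str          # unique per researcher and per contract
    datasets: frozenset   # only data sets approved in the final contract
    start: date           # contract start date
    end: date             # earlier of contract end and clearance expiry

def provision(researcher, contract_id, datasets,
              contract_start, contract_end, clearance_expiry):
    return AccessAccount(
        user_id=f"{researcher}-{contract_id}",    # separate ID per project
        datasets=frozenset(datasets),
        start=contract_start,
        end=min(contract_end, clearance_expiry),  # clearance caps access
    )
```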
Although not Statistics Canada employees, the Dalhousie IT support staff are deemed employees. They ensure that systems within the RDC are configured to StatCan requirements and have administrative privileges for IT systems only. This access allows them to modify computer system configurations, such as installing software, printers or other local devices. IT staff at Dalhousie also use administrative privileges to service IT problems, workstation issues and all server-related requirements. They do not have administrative privileges to modify access perimeters for the individual userID accounts created for researchers.
The audit team tested a sample of 43 of the 89 contracts with start dates from 2010 to 2014, comparing the data sets noted in the MRC to the information recorded in CRMS and to the data sets included in the IT system parameters for the selected users. Results showed that the list of data sets entered in the system for access privileges agreed with the list of data sets included in the contract or its subsequent amendments.
The audit team also tested a random sub-sample of nine individual researchers with active contracts to ensure that expiry dates did not exceed the valid security clearance period and that access was granted only for the period stipulated in the MRC. Results confirmed that when the security clearance expired before the contract end date, the date entered was the security clearance expiry date; this control is effective in ensuring that individuals hold valid security clearance in order to access Statistics Canada data. All other dates were consistent with the MRC, as required.
The audit team also tested data file access perimeters entered in the system for contracts indicating a status of suspended, incomplete, withdrawn or inactive and verified that access was disabled. Tests confirmed that, as required, userID accounts were disabled.
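The logic behind these tests can be expressed compactly. The status values and record layouts below are illustrative assumptions, not those of CRMS or the RDC systems; the expiry-date rule itself is reflected in the provisioning sketch earlier.

```python
# Hedged sketch of the compliance checks the audit team performed; status
# values and record layouts are illustrative, not those of CRMS.
INACTIVE_STATUSES = {"suspended", "incomplete", "withdrawn", "inactive"}

def check_contract(mrc_datasets, granted_datasets, status, account_enabled):
    findings = []
    # Granted access must match the contract and its amendments exactly.
    if granted_datasets - mrc_datasets:
        findings.append("access granted to data sets not in the MRC")
    if mrc_datasets - granted_datasets:
        findings.append("contracted data sets not provisioned")
    # Non-active contracts must have their userID accounts disabled.
    if status in INACTIVE_STATUSES and account_enabled:
        findings.append("account still enabled for a non-active contract")
    return findings

# An exact match on an active contract yields no findings.
assert check_contract({"LFS", "GSS"}, {"LFS", "GSS"}, "active", True) == []
```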
The audit determined that applicable IT security measures are in place and adhere to Statistics Canada's standards for safeguarding and protecting confidential data. IT access, identification and authentication safeguards are embedded in systems and software configurations at the Dalhousie RDC and are working as intended.
Protection of RDC servers against IT threats and protocol for reporting incidents
The analyst and statistical assistants control which electronic files can be added to research project files on the closed network. Making researchers' electronic documents and tables available on the closed network is a necessary part of daily operations. Researchers bring their files on personal USB keys. Because workstations in the research lab are configured so that USB keys cannot be accessed, researchers' files are retrieved from computers under the RDC staff's control.
At the program level, procedures on the RDC extranet explain the terms and conditions under which a researcher can have data, documentation or other information added to their research project. However, these procedures do not explain how to safely transfer researchers' electronic documents from an external USB key onto the server.
In order to reduce the risk of a computer virus, malware or other spyware coming into contact with the data stored on the closed network, specific procedures have been implemented at the Dalhousie RDC for electronic file transfers from a researcher's USB key. The researcher's USB key is inserted into the RDC analyst's Internet computer, and a McAfee virus scan is run on the entire key. Copies of the files that the researcher wants transferred to his or her project folder on the server are then saved to the desktop, and the individual files are scanned a second time for viruses. The RDC analyst transfers the files and folders from the desktop of the Internet computer to a dedicated RDC USB key and then, using the computer linked to the closed network, transfers them from the dedicated key to the researcher's account on the server. Using a separate USB key dedicated to the RDC was recommended by Dalhousie's IT staff.
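The ordering of these steps is itself the control: the researcher's key never touches a closed-network machine; only the dedicated RDC key does. The sketch below restates the procedure as an ordered checklist; the step wording is paraphrased and none of it corresponds to real commands or tooling.

```python
# Illustrative restatement of the Dalhousie transfer procedure as an ordered
# checklist. The steps are performed manually by the RDC analyst; the strings
# are paraphrases, not commands or APIs.
TRANSFER_STEPS = (
    "Insert the researcher's USB key into the analyst's Internet computer",
    "Run a full anti-virus scan on the entire key",
    "Copy only the requested files to the desktop",
    "Scan each copied file a second time",
    "Copy the scanned files to the dedicated RDC USB key",
    "From the closed-network computer, move the files from the dedicated "
    "key to the researcher's project folder on the server",
)

def print_transfer_checklist():
    # The ordering is the control: the researcher's key never touches a
    # closed-network machine; only the dedicated RDC key does.
    for number, step in enumerate(TRANSFER_STEPS, start=1):
        print(f"Step {number}: {step}")
```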
Interviews with ITOD confirmed that docking a USB key owned by an external user onto a computer linked to the closed network increases the risk of inadvertently transferring viruses, spyware and/or malware, and should be prohibited. Using a reliable USB key strictly dedicated to transferring files to the server is a good practice that significantly reduces this risk, as the dedicated key never comes into contact with computers outside the RDC facility.
Staff at the Dalhousie RDC employ an effective practice by using a reliable USB key strictly dedicated to transferring electronic files to the server. Currently, there are no documented procedures or protocols in place to mitigate security threats associated with the use of external USB keys for the RDC program. All RDCs could benefit from this practice and further protect the data stored on the closed network.
Recommendations:
It is recommended that the Assistant Chief Statistician of Social, Health and Labour Statistics ensure that:
Procedures are documented on the use of USB keys to transfer researchers' data files onto the closed network.
Management response:
Management agrees with the recommendation.
This is a best practice that is utilized in every centre in the program, but it has not been systematically documented. The Director of MAD will prepare and distribute a Communiqué to all RDC staff and researchers to document procedures on the use of USB keys to transfer researchers' data files onto the closed network.
Deliverables and Timeline: Procedures will be posted to the communications network by December 31, 2014.
Physical security
Physical security measures in place within the Dalhousie University RDC comply with applicable TBS policies and Statistics Canada's Security Practices Manual. Key controls such as a secure perimeter with intrusion detection and restricted access to the centre are in place and are effective in ensuring the security of the information held in the facility.
The physical security measures required in RDCs are intended to safeguard the confidentiality of the information held in the facility. Physical security inspections verify that security measures comply with applicable TBS policies, such as the Government Policy on Security (GPS) and Statistics Canada's Security Practices Manual. Physical security should include key controls such as the establishment of a secure perimeter with intrusion detection, restricted access to the centre, and ongoing monitoring through observation during business hours.
Departmental physical security inspections
Physical security inspections are completed upon the initial opening of an RDC, and Statistics Canada management has determined that subsequent RDC inspections will take place every four years. Departmental security staff perform the physical inspections and provide recommendations to the RDC regional managers and headquarters staff. The last Dalhousie University RDC physical security inspection was conducted in 2011. As a result of this inspection, one recommendation was issued: that access system reports be printed and reviewed on a regular basis. RDC staff are able to log in to a security system supported by the Dalhousie Security office and print access system reports. At the Dalhousie RDC, access system reports are produced monthly, mainly to track the hours recorded for projects involving fee-for-service work. The swipe card log reports are available for examination if the comings and goings of researchers need to be investigated more thoroughly. By printing and reviewing access system reports monthly, staff at the Dalhousie RDC have addressed the recommendation issued in the 2011 physical inspection.
Through comparison with the information provided in the 2011 physical site inspection and through interviews with MAD and RDC staff, it was confirmed that the RDC has not relocated or made any significant changes to its physical structural environment since it was last inspected. In light of this, the audit team selected a number of key physical controls within the RDC's physical environment and verified compliance with RDC physical security requirements. Results of the audit team's inspection were as follows:
The facility has secure locked storage cabinets for storing researchers' files.
There are no printers located in the research area.
The printer/fax/scanning device is for RDC staff use only and is located in the RDC analyst's office.
In all cases, key physical controls were set up in accordance with RDC physical security requirements.
Secure perimeter with intrusion detection
The RDC is located on the main floor at Dalhousie University and was constructed in compliance with Statistics Canada's requirements for perimeter security for 'shared floor occupancy.' Physical access in and out of the RDC is through a single door entrance, accessible from the university library. In accordance with security requirements for RDCs, the centre is secured by a steel door with a deadbolt with a one-inch throw. RDC staff and campus security have keys to the facility. The RDC does not have exterior windows. Interior windows are frosted, except the one in the RDC analyst's office, which faces the library and has vertical blinds that are kept closed.
Campus security provides 24/7 monitoring of the university facilities. The RDC also has an alarm system and motion detectors, which are functional and safeguard the facility after working hours. If the alarm or motion sensor system is triggered, campus security is notified as first responder and then notifies the RDC analyst.
Restricted access to the RDC
At the Dalhousie RDC, there is one full-time RDC analyst and two part-time statistical assistants. RDC staff are present during business hours and monitor researchers' presence at the facility to ensure that access to the centre is restricted to authorised individuals. The presence of RDC staff during business hours is a critical control in safeguarding assets within the secure area. RDC staff at Dalhousie set work schedules to ensure continuous presence at the centre, covering vacation leave and personal appointments. In the rare cases where coverage amongst staff is not possible for extended periods beyond health breaks and brief absences during lunch, the RDC at Dalhousie is closed.
An electronic swipe card access system is also in place. Identification cards are assigned to authorised researchers, and there are two swipe readers, located on the exterior and interior sides of the door to the RDC. The card must be swiped to unlock the door when entering and leaving the secured area; in doing so, the system records all swipe entries and exits. RDC staff have access to the electronic swipe card access system's logs from the analyst's computer through remote login to the Dalhousie security system. The audit team examined the swipe access logs for the three days the auditors were on site, as well as the system's reports for the month of June 2014. Tests confirmed that the system accurately recorded all swipe entries and exits for the centre. However, reports generated by the access card system do not reflect all individuals' entries and exits during business hours: individuals can enter or leave the centre during operating hours without swiping their own card, for example, when several individuals leave the RDC at the same time or when someone enters the facility just as someone else exits. Additional tests showed that the manual sign-in log is an effective compensatory control, ensuring that a list of all visitors to the centre is maintained.
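As a sketch of this compensatory control, assuming (hypothetically) that a day's swipe records and sign-in entries can be compared as simple sets of names:

```python
# Hedged sketch of the compensatory control described above: because swipe
# records can miss entries (group exits, tailgating), the manual sign-in log
# is combined with them to produce the complete visitor list. All names and
# structures are illustrative.
def visitors_for_day(swipe_records, signin_log):
    return {
        "all_visitors": set(swipe_records) | set(signin_log),
        # Names only in the sign-in log flag visits the card system missed.
        "missed_swipes": set(signin_log) - set(swipe_records),
    }

# Example: one visitor signed in but entered while the door was already open.
day = visitors_for_day({"A. Smith"}, {"A. Smith", "B. Jones"})
assert day["missed_swipes"] == {"B. Jones"}
```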
Ongoing monitoring
In order to effectively monitor activities in the RDC, the analyst should have a clear view of researchers while they are in the centre. At Dalhousie, the RDC analyst can easily observe researchers using the workstations located at the front of the RDC; from the analyst's office, however, it is difficult to observe researchers at workstations located in the rear of the centre. As a mitigation strategy, RDC staff regularly walk through the RDC and observe the workstations from various positions. Program management has stated that it is a challenge to strike the appropriate balance between protecting the confidentiality of the information and affording researchers a certain level of privacy to carry out their work. Each RDC is configured differently; RDC managers are encouraged to use a risk-based approach for each RDC.
Physical security measures in place within the Dalhousie University RDC comply with applicable TBS policies and Statistics Canada's Security Practices Manual. Key controls such as a secure perimeter with intrusion detection and restricted access to the centre are in place and are effective in ensuring the security of the information held in the facility.
Appendices
Appendix A: Audit criteria
Each control objective below is presented with its criteria, the related sub-criteria, and the applicable policy instruments.
1) The Dalhousie University RDC complies with applicable TBS and Statistics Canada policies and standards regarding Information Technology Security and Physical Security to ensure that confidentiality of data is protected in the delivery of services.
1.1 The physical environment in which the RDC operates complies with current StatCan policies.
1.1.1 Policies, directives and procedures are in place detailing physical security requirements.
1.1.2 Physical access controls are in place to ensure that the physical environment is effective in safeguarding sensitive data.
1.1.3 Both manual and automated controls exist to ensure physical security (e.g., card readers, alarm systems, sign-in sheets).
1.1.4 Ongoing monitoring of the environment takes place to ensure changes/new risks can be quickly addressed.
1.1.5 Regular periodic physical inspections are conducted by corporate service functions to ensure compliance with policies. Results from physical inspections are recorded and remedial action is taken when non-compliance is found.
Policy instruments:
TBS Government Policy on Security
TBS Standard on Physical Security
TBS Directive on Departmental Security Management
Statistics Canada Security Practices Manual
Internal RDC physical security documentation
Security of Sensitive Statistical Information
Statistics Act
Discretionary Disclosure Directive
Policy on Deemed Employees
1.2 The IT environment in which the RDC operates complies with current StatCan policies.
1.2.1 Policies, directives and procedures are in place detailing IT security requirements.
1.2.2 IT hardware, software and general computer controls are in place to safeguard sensitive data.
1.2.3 Each workstation is configured to ensure compliance with StatCan and TBS IT security requirements.
1.2.4 Passwords must be entered to activate the computer systems, passwords are regularly updated, and session time-outs are functional.
1.2.5 Regular periodic IT inspections are conducted by corporate service functions to ensure compliance with policies. Results from IT inspections are recorded and remedial action is taken when non-compliance is found.
Policy instruments:
TBS Government Policy on Security
TBS Directive on Departmental Security Management
Statistics Canada Security Practices Manual
Statistics Canada IT Security Policy
Security of Sensitive Statistical Information
Statistics Act
Discretionary Disclosure Directive
Policy on Deemed Employees
1.3 Access to sensitive statistical information is restricted to authorized individuals using IT and environmental control systems that are maintained in accordance with applicable policies.
Physical and IT access is controlled to ensure that only authorized individuals can physically access the RDC premises, and workstation log-ins restrict access to authorized data sets only.
1.3.1 Physical and IT Access control listings are regularly updated and validated.
1.3.2 University staff who maintain the IT systems understand the IT security requirements and ensure access to data is restricted to authorized uses only.
1.3.3 RDC staff monitor researchers and workstations to ensure researchers comply with procedures.
1.3.4 Researchers authorized to use the RDC have access to only those data sets noted in the MRC.
1.3.5 Accounts are automatically disabled upon contract expiry dates and all external ports have been disabled on researcher workstations.
1.3.6 Access to files is restricted to researchers having MRCs in good standing.
Policy instruments:
Security of Sensitive Statistical Information
Discretionary Disclosure Directive
TBS Directive on Departmental Security Management
Statistics Canada Security Practices Manual
Statistics Canada IT Security Policy
TBS Government Policy on Security
TBS Standard on Physical Security
1.4 The confidentiality of sensitive statistical information is protected through continuous vetting of written documentation before it leaves the RDC.
1.4.1 Practices of continuous data vetting are applied to ensure documents containing sensitive statistical information do not leave the RDC.
Policy instruments:
Statistics Canada Security Practices Manual
Directive on Sensitive Statistical Information
2) The Dalhousie University RDC has effective practices and mechanisms in place to ensure that the confidentiality of data is protected in the delivery of services.
2.1 Accountabilities in support of the RDC's operations and collaborative initiatives shared among RDC staff, university staff and researchers are formally defined.
2.1.1 Agreements, terms of reference or equivalent documents are in place outlining roles, responsibilities and accountabilities for functions involving Dalhousie University staff, including the responsibilities of the:
Regional Manager
RDC analyst and statistical assistant
Academic Director
IT support
2.1.2 Microdata Research Contract (MRC) templates outline roles and responsibilities for StatCan and researchers with respect to the treatment and usage of Statistics Canada survey data and information. MRC templates are used for new contracts and are signed by those with the appropriate authority.
2.1.3 Oaths and values and ethics documents outline requirements related to the confidentiality of Statistics Canada data and are in place prior to granting access to data.
2.1.4 Documentation is in place outlining confidentiality vetting responsibilities for statistical outputs.
2.1.5 RDC staff receive regular communications and updates from headquarters on new processes and procedures, process changes, issues and other items related to RDC operations, MRC management and confidentiality requirements.
2.1.6 RDC staff receive regular information updates related to confidentiality vetting, and when new data sets arrive in the RDC locations, vetting requirements are communicated.
Policy instruments:
Internal RDC roles and responsibilities documentation
RDC documentation for Academic Directors
RDC Researcher Guide
Policy on Deemed Employees
MRC contracts templates
Oath / Affirmation of Secrecy
Values and Ethics documents
Internal Confidentiality Vetting documents
2.2 As deemed employees, researchers, RDC staff and university staff formally acknowledge compliance with Statistics Canada's corporate values and ethics, code of conduct or equivalent policies as they pertain to the confidentiality of sensitive statistical information.
2.2.1 Upon commencement of a new contract with the RDC, researchers are required to sign a statement acknowledging understanding and compliance with the policies and the Statistics Act through the following activities:
Orientation sessions are held to ensure researchers understand confidentiality requirements.
Values and ethics forms are signed by all researchers and are on file.
Documentation is provided to all researchers accessing the RDC to remind them of their responsibilities related to the confidentiality of sensitive statistical information.
University IT support staff and academic directors have signed oaths and are deemed employees.
Policy instruments:
Values and Ethics documents
Documentation distributed at orientation session
RDC Researcher Guide
Evidence that orientation sessions were provided, and acknowledgement of compliance with values and ethics/code of conduct
Evidence of security clearance for researchers
2.3 Management identifies, assesses and responds to the risks that may preclude the achievement of its objectives and assesses the effectiveness of existing controls.
2.3.1 Regular discussions are held between the Regional Manager, the RDC Analyst and MAD.
2.3.2 Management-led IT and physical security inspections are conducted for the Dalhousie University RDC and are used to determine risks and mitigation strategies.
Policy instruments:
RDC Security Inspections
Confidentiality Vetting guidelines
Audit trail/files kept within the RDC in support of vetting activities.
2.4 Management has established processes to develop and manage relevant agreements, Memorandum of Understandings (MoUs), and/or contracts, for the purposes of the RDC Program in the region.
2.4.1 Microdata Research Contracts exist, are up to date, outline the data requirements and the research being conducted, contain relevant dates (start and expiry), and outline the terms and conditions for RDC usage.
2.4.2 All security screening is in place for contracts.